Sunday, December 21, 2008

ARTICLE UPDATE - Decoding face information in time, frequency and space from direct intracranial recordings of the human brain.

Tsuchiya N, Kawasaki H, Oya H, Howard MA 3rd, Adolphs R.

PLoS One, in press

Faces are processed by a neural system with distributed anatomical components, but the roles of these components remain unclear. A dominant theory of face perception postulates independent representations of invariant aspects of faces (e.g., identity) in ventral temporal cortex including the fusiform gyrus, and of changeable aspects of faces (e.g., emotion) in lateral temporal cortex including the superior temporal sulcus. Here we recorded neuronal activity directly from the cortical surface in 9 neurosurgical subjects undergoing epilepsy monitoring while they viewed static and dynamic facial expressions. Applying novel decoding analyses to the power spectrogram of electrocorticograms (ECoG) from over 100 contacts in ventral and lateral temporal cortex, we found better representation of both invariant and changeable aspects of faces in ventral than in lateral temporal cortex. Critical information for discriminating faces from geometric patterns was carried by power modulations between 50 and 150 Hz. For both static and dynamic face stimuli, we obtained higher decoding performance in ventral than in lateral temporal cortex. For discriminating fearful from happy expressions, critical information was carried by power modulations between 60 and 150 Hz and below 30 Hz, and was again better decoded in ventral than in lateral temporal cortex. Task-relevant attention improved decoding accuracy by more than 10% across a wide frequency range in ventral, but not at all in lateral, temporal cortex. Spatial searchlight decoding showed that decoding performance was highest around the middle fusiform gyrus. Finally, we found that the right hemisphere, in general, showed superior decoding to the left hemisphere. Taken together, our results challenge the dominant model of independent representation of invariant and changeable face aspects: information about both face attributes was better decoded from a single region in the middle fusiform gyrus.
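
To give a concrete picture of this style of analysis, here is a minimal Python sketch (numpy, scipy, scikit-learn) that classifies simulated 1-s trials (face vs. geometric pattern) from 50-150 Hz band power with a cross-validated linear model. This is not the authors' pipeline: the synthetic data, sampling rate, band limits, and classifier choice are all illustrative assumptions.

import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 1000                                      # sampling rate (Hz), assumed
n_trials, n_channels = 200, 8                  # toy "contacts"
t = np.arange(fs) / fs                         # 1-s epochs
labels = rng.integers(0, 2, n_trials)          # 0 = geometric pattern, 1 = face

# Synthetic ECoG: face trials carry extra high-frequency power (a 100 Hz component)
trials = rng.standard_normal((n_trials, n_channels, fs))
for i in np.where(labels == 1)[0]:
    phase = rng.uniform(0, 2 * np.pi, (n_channels, 1))
    trials[i] += 0.8 * np.sin(2 * np.pi * 100 * t + phase)

# Features: mean log power per contact in the 50-150 Hz band (Welch PSD)
freqs, psd = welch(trials, fs=fs, nperseg=256, axis=-1)
band = (freqs >= 50) & (freqs <= 150)
features = np.log(psd[..., band].mean(axis=-1))   # shape (n_trials, n_channels)

# 5-fold cross-validated decoding accuracy
clf = LogisticRegression(max_iter=1000)
print("decoding accuracy:", cross_val_score(clf, features, labels, cv=5).mean())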

ARTICLE UPDATE - EEG-MEG evidence for early differential repetition effects for fearful, happy and neutral faces.

Morel S, Ponz A, Mercier M, Vuilleumier P, George N.

Brain Research, in press

To determine how emotional information modulates subsequent traces for repeated stimuli, we combined simultaneous electro-encephalography (EEG) and magneto-encephalography (MEG) measures during long-lag incidental repetition of fearful, happy, and neutral faces. Repetition effects were modulated by facial expression in three different time windows, starting as early as 40-50 ms in both EEG and MEG, then arising at the time of the N170/M170, and finally between 280 and 320 ms in MEG only. The very early repetition effect, observed at 40-50 ms over occipito-temporo-parietal regions, showed a different MEG topography according to the facial expression. This differential response to fearful, happy and neutral faces suggests the existence of very early discriminative visual processing of expressive faces, possibly based on the low-level physical features typical of different emotions. The N170 and M170 face-selective components both showed repetition enhancement selective to neutral faces, with greater amplitude for emotional than neutral faces on the first but not the second presentation. These differential repetition effects may reflect valence acquisition for the neutral faces due to repetition, and suggest a combined influence of emotion- and experience-related factors on the early stage of face encoding. Finally, later repetition effects consisted of an enhanced M300 (MEG) between 280 and 320 ms for fearful relative to happy and neutral faces, which occurred on the first presentation but levelled out on the second presentation. This effect may correspond to the higher arousal value of fearful stimuli, which might habituate with repetition. Our results reveal that multiple stages of face processing are affected by the repetition of emotional information.

ARTICLE UPDATE - Dissociable neural effects of stimulus valence and preceding context during the inhibition of responses to emotional faces.

Schulz KP, Clerkin SM, Halperin JM, Newcorn JH, Tang CY, Fan J.

Human Brain Mapping, in press

Socially appropriate behavior requires the concurrent inhibition of actions that are inappropriate in the context. This self-regulatory function requires an interaction of inhibitory and emotional processes that recruits brain regions beyond those engaged by either process alone. In this study, we isolated brain activity associated with response inhibition and emotional processing in 24 healthy adults using event-related functional magnetic resonance imaging (fMRI) and a go/no-go task that independently manipulated the context preceding no-go trials (i.e., number of go trials) and the valence (i.e., happy, sad, and neutral) of the face stimuli used as trial cues. Parallel quadratic trends were seen in correct inhibitions on no-go trials preceded by increasing numbers of go trials and associated activation for correct no-go trials in inferior frontal gyrus pars opercularis, pars triangularis, and pars orbitalis, temporoparietal junction, superior parietal lobule, and temporal sensory association cortices. Conversely, the comparison of happy versus neutral faces and sad versus neutral faces revealed valence-dependent activation in the amygdala, anterior insula cortex, and posterior midcingulate cortex. Further, an interaction between inhibition and emotion was seen in valence-dependent variations in the quadratic trend in no-go activation in the right inferior frontal gyrus and left posterior insula cortex. These results suggest that the inhibition of responses to emotional cues involves the interaction of partly dissociable limbic and frontoparietal networks that encode emotional cues and use these cues to exert inhibitory control over the motor, attention, and sensory functions needed to perform the task, respectively.

Saturday, December 06, 2008

ARTICLE UPDATE - Working memory capacity and the self-regulation of emotional expression and experience.

Schmeichel BJ, Volokhov RN, Demaree HA.

Journal of Personality and Social Psychology, 95, 1526-1540

This research examined the relationship between individual differences in working memory capacity and the self-regulation of emotional expression and emotional experience. Four studies revealed that people higher in working memory capacity suppressed expressions of negative emotion (Study 1) and positive emotion (Study 2) better than did people lower in working memory capacity. Furthermore, compared to people lower in working memory capacity, people higher in capacity more capably appraised emotional stimuli in an unemotional manner and thereby experienced (Studies 3 and 4) and expressed (Study 4) less emotion in response to those stimuli. These findings indicate that cognitive ability contributes to the control of emotional responding.

ARTICLE UPDATE - Emotions in Go/NoGo conflicts.

Schacht A, Nigbur R, Sommer W.

Psychological Research, in press

On the basis of current emotion theories, and of the functional and neurophysiological ties between the processing of conflicts and errors on the one hand and between errors and emotions on the other, we predicted that conflicts between prepotent Go responses and occasional NoGo trials in the Go/NoGo task would induce emotions. Skin conductance responses (SCRs), corrugator muscle activity, and startle blink responses were measured in three experiments requiring speeded Go responses intermixed with NoGo trials of different relative probability, and in a choice reaction experiment serving as a control. NoGo trials affected several of these emotion-sensitive indicators: SCRs and startle blinks were reduced, whereas corrugator activity was prolonged, as compared to Go trials. From the pattern of findings we suggest that NoGo conflicts are not aversive. Instead, they appear to be appraised as obstructive to the response goal and as less action relevant than Go trials.

ARTICLE UPDATE - Visual Awareness, Emotion, and Gamma Band Synchronization.

Luo Q, Mitchell D, Cheng X, Mondillo K, McCaffrey D, Holroyd T, Carver F, Coppola R, Blair J.

Cerebral Cortex, in press

What makes us become aware? A popular hypothesis is that if cortical neurons fire in synchrony at a certain frequency band (gamma), we become aware of what they are representing. We tested this hypothesis adopting brain-imaging techniques with good spatiotemporal resolution and frequency-specific information. Specifically, we examined the degree to which increases in event-related synchronization (ERS) in the gamma band were associated with awareness of a stimulus (its detectability) and/or the emotional content of the stimulus. We observed increases in gamma band ERS within prefrontal-anterior cingulate, visual, parietal, posterior cingulate, and superior temporal cortices to stimuli available to conscious awareness. However, we also observed increases in gamma band ERS within the amygdala, visual, prefrontal, parietal, and posterior cingulate cortices to emotional relative to neutral stimuli, irrespective of their availability to conscious access. This suggests that increased gamma band ERS is related to, but not sufficient for, consciousness.
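
For readers unfamiliar with the measure, event-related synchronization (ERS) is the percent change of band-limited power relative to a pre-stimulus baseline. The Python sketch below illustrates that computation on simulated single-sensor data; the 30-50 Hz band, the windows, the filter settings, and the injected 40 Hz burst are assumptions for the example, not the study's parameters.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 600                                        # sampling rate (Hz), assumed
t = np.arange(-0.5, 1.0, 1 / fs)                # epoch: -500 ms to +1000 ms
rng = np.random.default_rng(1)
epochs = rng.standard_normal((100, t.size))     # 100 trials of one sensor
# Inject a 40 Hz burst from 100-400 ms so the ERS is visible in the toy data
burst = (t >= 0.1) & (t <= 0.4)
epochs[:, burst] += 0.7 * np.sin(2 * np.pi * 40 * t[burst])

# Band-pass 30-50 Hz, then instantaneous power from the analytic signal
b, a = butter(4, [30, 50], btype="bandpass", fs=fs)
power = np.abs(hilbert(filtfilt(b, a, epochs, axis=-1), axis=-1)) ** 2

baseline = power[:, t < 0].mean()               # pre-stimulus reference power
post = power[:, burst].mean()                   # post-stimulus analysis window
print("gamma ERS: %.1f %%" % (100 * (post - baseline) / baseline))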

ARTICLE UPDATE - Attentional selectivity for emotional faces: Evidence from human electrophysiology.

Holmes A, Bradley BP, Kragh Nielsen M, Mogg K.

Psychophysiology, in press

This study investigated the temporal course of attentional biases for threat-related (angry) and positive (happy) facial expressions. Electrophysiological (event-related potential) and behavioral (reaction time [RT]) data were recorded while participants viewed pairs of faces (e.g., angry face paired with neutral face) shown for 500 ms and followed by a probe. Behavioral results indicated that RTs were faster to probes replacing emotional versus neutral faces, consistent with an attentional bias for emotional information. Electrophysiological results revealed that attentional orienting to threatening faces emerged earlier (early N2pc time window; 180-250 ms) than orienting to positive faces (after 250 ms), and that attention was sustained toward emotional faces during the 250-500-ms time window (late N2pc and SPCN components). These findings are consistent with models of attention and emotion that posit rapid attentional prioritization of threat.
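
To make the N2pc logic concrete, here is a toy Python computation of the contralateral-minus-ipsilateral difference wave at lateral posterior electrodes, averaged over the early (180-250 ms) and sustained (250-500 ms) windows mentioned above. The simulated voltages, electrode labels, and trial counts are assumptions, not the authors' data.

import numpy as np

fs = 500                                         # sampling rate (Hz), assumed
t = np.arange(-0.1, 0.6, 1 / fs)                 # -100 to +600 ms epochs
rng = np.random.default_rng(2)
n_trials = 120
# erp[trial, electrode, time]; electrode 0 = PO7 (left), 1 = PO8 (right)
erp = rng.standard_normal((n_trials, 2, t.size))
emo_side = rng.integers(0, 2, n_trials)          # 0 = emotional face in the left field

# Contralateral = hemisphere opposite the emotional face; ipsilateral = same side
contra = erp[np.arange(n_trials), 1 - emo_side]
ipsi = erp[np.arange(n_trials), emo_side]
diff_wave = (contra - ipsi).mean(axis=0)         # grand-average N2pc difference wave

early = (t >= 0.18) & (t <= 0.25)                # early N2pc window
late = (t >= 0.25) & (t <= 0.50)                 # late N2pc / SPCN window
print("early window mean:", diff_wave[early].mean(), "late window mean:", diff_wave[late].mean())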

Saturday, November 22, 2008

ARTICLE UPDATE - See no evil: Directing visual attention within unpleasant images modulates the electrocortical response.

Dunning JP, Hajcak G.

Psychophysiology, in press

The late positive potential (LPP) is larger for emotional than neutral stimuli, and reflects increased attention to motivationally salient stimuli. Recent studies have shown that the LPP can also be modulated by stimulus meaning and task relevance. The present studies sought to determine whether the magnitude of the LPP can be manipulated by directing attention to more or less arousing aspects within an emotional stimulus. To this end, trials included a passive viewing and directed attention portion. In both Studies 1 and 2, unpleasant compared to neutral images were associated with an increased LPP during passive viewing; additionally, directing attention to non-arousing compared to highly arousing areas of unpleasant images resulted in a decreased LPP. Results are discussed in terms of the utility of using the LPP to understand emotion-cognition interactions, especially with regard to directed visual attention as an emotion regulation strategy.

ARTICLE UPDATE - Electrophysiological correlates of decreasing and increasing emotional responses to unpleasant pictures.

Moser JS, Krompinger JW, Dietz J, Simons RF.

Psychophysiology, in press

We examined event-related brain potential (ERP) modulations during the anticipation and processing of unpleasant pictures under instructions to cognitively decrease and increase negative emotion. Instructions to decrease and increase negative emotion modulated the ERP response to unpleasant pictures in the direction of emotional intensity beginning around 400 ms and lasting several seconds. Decrease, but not increase, instructions also elicited enhanced frontal negativity associated with orienting and preparation prior to unpleasant picture onset. Last, ERP modulation by unpleasant pictures began around 300 ms, just prior to regulation effects, suggesting that appraisal of emotion occurs before emotion regulation. Together, the current findings underscore the utility of ERPs in illuminating the time course of emotion modulation and regulation that may help to refine extant theoretical models.

ARTICLE UPDATE - Stereotype threat and executive resource depletion: Examining the influence of emotion regulation.

Johns M, Inzlicht M, Schmader T.

Journal of Experimental Psychology: General, 137, 691-705

Research shows that stereotype threat reduces performance by diminishing executive resources, but less is known about the psychological processes responsible for these impairments. The authors tested the idea that targets of stereotype threat try to regulate their emotions and that this regulation depletes executive resources, resulting in underperformance. Across 4 experiments, they provide converging evidence that targets of stereotype threat spontaneously attempt to control their expression of anxiety and that such emotion regulation depletes executive resources needed to perform well on tests of cognitive ability. They also demonstrate that providing threatened individuals with a means to effectively cope with negative emotions--by reappraising the situation or the meaning of their anxiety--can restore executive resources and improve test performance. They discuss these results within the framework of an integrated process model of stereotype threat, in which affective and cognitive processes interact to undermine performance.

ARTICLE UPDATE - Validation of affective and neutral sentence content for prosodic testing.

Russ JB, Gur RC, Bilker WB.

Behavior Research Methods, 40, 935-939

Conducting a study of emotional prosody often requires that one have a valid set of stimuli for assessing perceived emotion in vocal intonation. In this study, we created a list of sentences with both affective and neutral content, and then validated them against rater opinion. Participants read sentences with content that implied happiness, sadness, anger, fear, or neutrality and rated how well they could imagine each sentence being expressed in each emotion. Coefficients of variation and intraclass correlations were calculated to narrow the list to affective sentences that had high agreement and neutral sentences that had low agreement. We found that raters could easily identify most emotional content and did not ascribe any unique emotion to most neutral content. We also found differences between the intensity of male and female ratings. The final list of sentences is available on the Internet (www.med.upenn.edu/bbl/) and can be recorded for use as stimuli for prosodic studies.
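
The two screening statistics named here are straightforward to compute. The Python sketch below derives a per-sentence coefficient of variation and a two-way ICC(2,1) from a placeholder ratings matrix; the matrix size, the 1-9 scale, and the specific ICC form are assumptions for illustration rather than details taken from the paper.

import numpy as np

rng = np.random.default_rng(3)
ratings = rng.uniform(1, 9, size=(20, 12))     # 20 sentences x 12 raters, 1-9 scale

# Coefficient of variation per sentence: lower CV = tighter rater agreement
cv = ratings.std(axis=1, ddof=1) / ratings.mean(axis=1)

# ICC(2,1): two-way random effects, absolute agreement, single rater
n, k = ratings.shape
grand = ratings.mean()
ss_total = ((ratings - grand) ** 2).sum()
ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()    # sentences
ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()    # raters
ms_rows = ss_rows / (n - 1)
ms_cols = ss_cols / (k - 1)
ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
icc = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
print("median CV: %.2f  ICC(2,1): %.2f" % (np.median(cv), icc))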

ARTICLE UPDATE - Exploring the motivational brain: effects of implicit power motivation on brain activation in response to facial expressions of emotion.

Schultheiss OC, Wirth MM, Waugh CE, Stanton SJ, Meier EA, Reuter-Lorenz P.

Social Cognitive and Affective Neuroscience, in press

This study tested the hypothesis that implicit power motivation (nPower), in interaction with power incentives, influences activation of brain systems mediating motivation. Twelve individuals low (lowest quartile) and 12 individuals high (highest quartile) in nPower, as assessed per content coding of picture stories, were selected from a larger initial participant pool and participated in a functional magnetic resonance imaging study during which they viewed high-dominance (angry faces), low-dominance (surprised faces) and control stimuli (neutral faces, gray squares) under oddball-task conditions. Consistent with hypotheses, high-power participants showed stronger activation in response to emotional faces in brain structures involved in emotion and motivation (insula, dorsal striatum, orbitofrontal cortex) than low-power participants.

ARTICLE UPDATE - Rapid influence of emotional scenes on encoding of facial expressions: an ERP study.

Righart R, de Gelder B.

Social Cognitive and Affective Neuroscience, 3, 270-278

In daily life, we perceive a person's facial reaction as part of the natural environment surrounding it. Because most studies have investigated how facial expressions are recognized by using isolated faces, it is unclear what role the context plays. Although it has been observed that the N170 for facial expressions is modulated by the emotional context, it was not clear whether individuals use context information at this stage of processing to discriminate between facial expressions. The aim of the present study was to investigate how the early stages of face processing are affected by emotional scenes when explicit categorizations of fearful and happy facial expressions are made. Emotion effects were found for the N170, with larger amplitudes for faces in fearful scenes as compared to faces in happy and neutral scenes. Critically, N170 amplitudes were significantly increased for fearful faces in fearful scenes as compared to fearful faces in happy scenes, and this effect was expressed in left occipito-temporal scalp topography differences. Our results show that the information provided by the facial expression is combined with the scene context during the early stages of face processing.

Sunday, November 09, 2008

ARTICLE UPDATE - Emotional modulation of visual and motor areas by dynamic body expressions of anger.

Pichon S, de Gelder B, Grezes J.

Social Neuroscience, 3, 199-212

The ability to detect emotional meaning in others' behavior constitutes a central component of social competence. Expressions of anger in particular present salient signals that play a major role in the regulation of social interactions. Investigations of human anger signals have to date used still pictures of facial expressions, but the neurobiological basis of the bodily communication of anger remains largely unknown. Using functional magnetic resonance imaging, the present study investigated the neural bases involved in perceiving anger signals emanating from the whole body. Our study also investigated what the presence of dynamic information adds to the perception of body expressions of anger. Participants were scanned while viewing stimuli (stills or videos) of angry and neutral whole-body expressions. Whole-body expressions of anger elicited activity in regions including the amygdala and the lateral orbitofrontal cortex, which play a role in the affective evaluation of the stimuli. Importantly, the perception of dynamic body expressions of anger additionally engaged the hypothalamus, the ventromedial prefrontal cortex, the temporal pole and the premotor cortex, brain regions that are coupled with autonomic reactions and motor responses related to defensive behaviors.

ARTICLE UPDATE - Effective connectivity between amygdala and orbitofrontal cortex differentiates the perception of facial expressions.

Liang X, Zebrowitz LA, Aharon I.

Social Neuroscience, in press

Emotion research is guided both by the view that emotions are points in a dimensional space, such as valence or approach-withdrawal, and by the view that emotions are discrete categories. We determined whether effective connectivity of amygdala with medial orbitofrontal cortex (MOFC) and lateral orbitofrontal cortex (LOFC) differentiates the perception of emotion faces in a manner consistent with the dimensional and/or categorical view. Greater effective connectivity from left MOFC to amygdala differentiated positive and neutral expressions from negatively valenced angry, disgust, and fear expressions. Greater effective connectivity from right LOFC to amygdala differentiated emotion expressions conducive to perceiver approach (happy, neutral, and fear) from angry expressions that elicit perceiver withdrawal. Finally, consistent with the categorical view, there were unique patterns of connectivity in response to fear, anger, and disgust, although not in response to happy expressions, which did not differ from neutral ones.

Sunday, November 02, 2008

ARTICLE UPDATE - Both predator and prey: emotional arousal in threat and reward.

Löw A, Lang PJ, Smith JC, Bradley MM.

Psychological Science, 19, 865-873

This research examined the psychophysiology of emotional arousal anticipatory to potentially aversive and highly pleasant outcomes. Human brain reactions (event-related potentials) and body reactions (heart rate, skin conductance, the probe startle reflex) were assessed along motivational gradients determined by apparent distance from sites of potential punishment or reward. A predator-prey survival context was simulated using cues that signaled possible money rewards or possible losses; the cues appeared to loom progressively closer to the viewer, until a final step when a rapid key response could ensure reward or avoid a punishing loss. The observed anticipatory response patterns of heightened vigilance and physiological mobilization are consistent with the view that the physiology of emotion is founded on action dispositions that evolved in mammals to facilitate survival by dealing with threats or capturing life-sustaining rewards.

ARTICLE UPDATE - Constructing emotion: the experience of fear as a conceptual act.

Lindquist KA, Barrett LF.

Psychological Science, 19, 898-903

This study examined the hypothesis that emotion is a psychological event constructed from the more basic elements of core affect and conceptual knowledge. Participants were primed with conceptual knowledge of fear, conceptual knowledge of anger, or a neutral prime and then proceeded through an affect-induction procedure designed to induce unpleasant, high-arousal affect or a neutral affective state. As predicted, only those individuals for whom conceptual knowledge of fear had been primed experienced unpleasant core affect as evidence that the world was threatening. This study provides the first experimental support for the hypothesis that people experience world-focused emotion when they conceptualize their core affective state using accessible knowledge about emotion.

ARTICLE UPDATE - How does negative emotion cause false memories?

Brainerd CJ, Stein LM, Silveira RA, Rohenkohl G, Reyna VF.

Psychological Science, 19, 919-925

Remembering negative events can stimulate high levels of false memory, relative to remembering neutral events. In experiments in which the emotional valence of encoded materials was manipulated with their arousal levels controlled, valence produced a continuum of memory falsification. Falsification was highest for negative materials, intermediate for neutral materials, and lowest for positive materials. Conjoint-recognition analysis produced a simple process-level explanation: As one progresses from positive to neutral to negative valence, false memory increases because (a) the perceived meaning resemblance between false and true items increases and (b) subjects are less able to use verbatim memories of true items to suppress errors.

ARTICLE UPDATE - Attention, emotion, and deactivation of default activity in inferior medial prefrontal cortex.

Geday J, Gjedde A.

Brain and Cognition, in press

Attention deactivates the inferior medial prefrontal cortex (IMPC), but it is uncertain if emotions can attenuate this deactivation. To test the extent to which common emotions interfere with attention, we measured changes of a blood flow index of brain activity in key areas of the IMPC with positron emission tomography (PET) of labeled water (H(2)(15)O) uptake in the brains of 14 healthy subjects. The subjects performed either a less demanding or a more demanding task of attention while they watched neutral and emotive images of people in realistic indoor or outdoor situations. In the less demanding task, subjects used the index finger to press any key when a new image appeared. In the more demanding task, subjects chose the index or middle finger to press separate keys for outdoor and indoor scenes. Compared to the less demanding task, in a global search of all gray matter, the more demanding task significantly lowered blood flow (rCBF) in left IMPC, left and right insula, and right amygdala, and significantly raised blood flow in motor cortex and right precuneus. Restricted searches of rCBF changes by emotion, at coordinates of significant effect in previous studies of the medial prefrontal and temporal cortices, revealed significant activation in the fusiform gyrus, independently of the task. In contrast, we found no effect of emotional content in the IMPC, where emotions failed to override the effect of the task. The results are consistent with a role of the IMPC in the selection among competitive inputs from multiple brain regions, as predicted by the theory of a default mode of brain function. The absence of emotional interference with the deactivation of the default state suggests that the inferior prefrontal cortex continued to serve attention rather than submit to distraction.

Monday, October 20, 2008

ARTICLE UPDATE - Emotion Modulates Early Auditory Response to Speech.

Wang J, Nicol T, Skoe E, Sams M, Kraus N.

Journal of Cognitive Neuroscience, in press

In order to understand how emotional state influences the listener's physiological response to speech, subjects looked at emotion-evoking pictures while 32-channel EEG evoked responses (ERPs) to an unchanging auditory stimulus ("danny") were collected. The pictures were selected from the International Affective Picture System database. They were rated by participants and differed in valence (positive, negative, neutral), but not in dominance and arousal. Effects of viewing negative emotion pictures were seen as early as 20 msec (p = .006). An analysis of the global field power highlighted a time period of interest (30.4-129.0 msec) where the effects of emotion are likely to be the most robust. At the cortical level, the responses differed significantly depending on the valence ratings the subjects provided for the visual stimuli, which divided them into the high valence intensity group and the low valence intensity group. The high valence intensity group exhibited a clear divergent bivalent effect of emotion (ERPs at Cz during viewing neutral pictures subtracted from ERPs during viewing positive or negative pictures) in the time region of interest (r = .534, p < .01). Moreover, group differences emerged in the pattern of global activation during this time period. Although both groups demonstrated a significant effect of emotion (ANOVA, p = .004 and .006, low valence intensity and high valence intensity, respectively), the high valence intensity group exhibited a much larger effect. Whereas the low valence intensity group exhibited its smaller effect predominantly in frontal areas, the larger effect in the high valence intensity group was found globally, especially in the left temporal areas, with the largest divergent bivalent effects (ANOVA, p < .00001) in high valence intensity subjects around the midline. Thus, divergent bivalent effects were observed between 30 and 130 msec, and were dependent on the subject's subjective state, whereas the effects at 20 msec were evident only for negative emotion, independent of the subject's behavioral responses. Taken together, it appears that emotion can affect auditory function early in the sensory processing stream.
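
Global field power (GFP), used above to select the window of interest, is simply the spatial standard deviation of the scalp voltage across all electrodes at each time point. The short Python sketch below computes it on simulated 32-channel data; the sampling rate, the injected 10 Hz pattern, and the noise level are assumptions for the example.

import numpy as np

fs = 500                                        # sampling rate (Hz), assumed
t = np.arange(-0.05, 0.3, 1 / fs)               # -50 to +300 ms
rng = np.random.default_rng(4)
erp = 0.2 * rng.standard_normal((32, t.size))   # toy grand-average ERP, 32 channels
# Add a dipolar 10 Hz pattern so the toy GFP has some structure
erp += np.sin(2 * np.pi * 10 * t) * np.linspace(-1, 1, 32)[:, None]

gfp = erp.std(axis=0)                           # GFP: spatial SD at each sample
window = (t >= 0.0304) & (t <= 0.129)           # 30.4-129.0 ms window from the abstract
peak_ms = 1000 * t[window][gfp[window].argmax()]
print("GFP peak within the window at %.0f ms" % peak_ms)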

ARTICLE UPDATE - The valence strength of negative stimuli modulates visual novelty processing: Electrophysiological evidence from an event-related potential study.

Yuan J, Yang J, Meng X, Yu F, Li H.

Neuroscience, in press

In natural settings, the occurrence of unpredictable infrequent events is often associated with emotional reactions in the brain. Previous research suggested a special sensitivity of the brain to valence differences in emotionally negative stimuli. Thus, the present study hypothesized that valence changes in infrequent negative stimuli would have differential effects on visual novelty processing. Event-related potentials (ERPs) were recorded for highly negative (HN), moderately negative (MN) and Neutral infrequent stimuli, and for the frequent standard stimulus, while subjects performed a frequent/infrequent categorization task, irrespective of the emotional valence of the infrequent stimuli. The infrequent-frequent difference waves, which index visual novelty processing, displayed larger N2 amplitudes in the HN condition than in the MN condition, which, in turn, elicited greater N2 amplitudes than the Neutral condition. Similarly, in the infrequent-frequent difference waves, the frontocentral P3a and parietal LPC (late positive complex) elicited by the HN condition were more negative than those elicited by the MN condition, which in turn were more negative than those elicited by the Neutral condition. This suggests that negative emotions of diverse strength, as induced by negative stimuli of varying valences, clearly differ in their impact on visual novelty processing. Novel stimuli of increased negativity elicited more attentional resources during early novelty detection, and recruited increased inhibitive and evaluative processing during the later stages of response decision and reaction readiness, relative to novel stimuli of reduced negativity.

ARTICLE UPDATE - The role of valence and frequency in the emotional Stroop task.

Kahan TA, Hely CD.

Psychonomic Bulletin & Review, 15, 956-960

People are generally slower to name the color of emotion-laden words than they are to name that of emotionally neutral words. However, an analysis of this emotional Stroop effect (Larsen, Mercer, & Balota, 2006) indicates that the emotion-laden words used are sometimes longer, have lower frequencies, and have smaller orthographic neighborhoods than the emotionally neutral words. This difference in word characteristics raises the possibility that the emotional Stroop effect is partly caused by lexical rather than by emotional aspects of the stimuli, a conclusion supported by the finding that reaction times to name the color of low-frequency words are longer than those for high-frequency words (Burt, 2002). To examine the relative contributions of valence and frequency in color naming, we had 64 participants complete an experiment in which each of these variables was manipulated in a 3 x 2 factorial design; length, orthographic neighborhood density, and arousal were balanced. The data indicate that valence and word frequency interact in contributing to the emotional Stroop effect.

Wednesday, October 15, 2008

ARTICLE UPDATE - How Does Reward Expectation Influence Cognition in the Human Brain?

Rowe JB, Eckstein D, Braver T, Owen AM.

Journal of Cognitive Neuroscience, 20, 1980-1992

The prospect of reward changes how we think and behave. We investigated how this occurs in the brain using a novel continuous performance task in which fluctuating reward expectations biased cognitive processes between competing spatial and verbal tasks. Critically, effects of reward expectancy could be distinguished from induced changes in task-related networks. Behavioral data confirm specific bias toward a reward-relevant modality. Increased reward expectation improves reaction time and accuracy in the relevant dimension while reducing sensitivity to modulations of stimuli characteristics in the irrelevant dimension. Analysis of functional magnetic resonance imaging data shows that the proximity to reward over successive trials is associated with increased activity of the medial frontal cortex regardless of the modality. However, there are modality-specific changes in brain activity in the lateral frontal, parietal, and temporal cortex. Analysis of effective connectivity suggests that reward expectancy enhances coupling in both early visual pathways and within the prefrontal cortex. These distributed changes in task-related cortical networks arise from subjects' representations of future events and likelihood of reward.

Saturday, October 11, 2008

SPECIAL ISSUE - Music & Emotion

Behavioral and Brain Sciences, Volume 31, Issue 5.

ARTICLE UPDATE - Fear relevancy, strategy use, and probabilistic learning of cue-outcome associations.

Thomas LA, LaBar KS.

Learning & Memory, 15, 777-784

The goal of this study was to determine how the fear relevancy of outcomes during probabilistic classification learning affects behavior and strategy use. Novel variants of the "weather prediction" task were created, in which cue cards predicted either looming fearful or neutral outcomes in a between-groups design. Strategy use was examined by goodness-of-fit estimates of response patterns across trial blocks to mathematical models of simple, complex, and nonidentifiable strategies. Participants in the emotional condition who were fearful of the outcomes had greater skin conductance responses compared with controls and performed worse, used suboptimal strategies, and had less insight into the predictive cue features during initial learning. In contrast, nonfearful participants in the emotional condition used more optimal strategies than the other groups by the end of the two training days. Results have implications for understanding how individual differences in fear relevancy alter the impact of emotion on feedback-based learning.

ARTICLE UPDATE - Electrophysiological correlates of affective blindsight.

Gonzalez Andino SL, Grave de Peralta Menendez R, Khateb A, Landis T, Pegna AJ.

Neuroimage, in press

An EEG investigation was carried out in a patient with complete cortical blindness who presented affective blindsight, i.e. who performed above chance when asked to guess the emotional expressions on a series of faces. To uncover the electrophysiological mechanisms involved in this phenomenon we combined multivariate pattern recognition (MPR) with local field potential estimates provided by electric source imaging (ELECTRA). All faces, including neutral faces, elicited distinctive oscillatory EEG patterns that were correctly identified by the MPR algorithm as belonging to the class of facial expressions actually presented. Consequently, neural responses in this patient are not restricted to emotionally laden faces. Earliest non-specific differences between faces occur from 70 ms onwards in the superior temporal polysensory area (STP). Emotion-specific responses were found after 120 ms in the right anterior areas with right amygdala activation observed only later (approximately 200 ms). Thus, affective blindsight might be mediated by subcortical afferents to temporal areas as suggested in some studies involving non-emotional stimuli. The early activation of the STP in the patient constitutes evidence for fast activation of higher order visual areas in humans despite bilateral V1 destruction. In addition, the absence of awareness of any visual experience in this patient suggests that neither the extrastriate visual areas, nor the prefrontal cortex activation alone are sufficient for conscious perception, which might require recurrent processing within a network of several cerebral areas including V1.

ARTICLE UPDATE - The combined effect of gaze direction and facial expression on cueing spatial attention.

Pecchinenda A, Pes M, Ferlazzo F, Zoccolotti P.

Emotion, 8, 628-634

Empirical evidence shows an effect of gaze direction on cueing spatial attention, regardless of the emotional expression shown by a face, whereas a combined effect of gaze direction and facial expression has been observed on individuals' evaluative judgments. In 2 experiments, the authors investigated whether gaze direction and facial expression affect spatial attention depending upon the presence of an evaluative goal. Disgusted, fearful, happy, or neutral faces gazing left or right were followed by positive or negative target words presented either at the spatial location looked at by the face or at the opposite spatial location. Participants responded to target words based on affective valence (i.e., positive/negative) in Experiment 1 and on letter case (lowercase/uppercase) in Experiment 2. Results showed that participants responded much faster to targets presented at the spatial location looked at by disgusted or fearful faces but only in Experiment 1, when an evaluative task was used. The present findings clearly show that negative facial expressions enhance the attentional shifts due to eye-gaze direction, provided that there was an explicit evaluative goal present.

ARTICLE UPDATE - Directed forgetting of emotional words.

Minnema MT, Knowlton BJ.

Emotion, 8, 643-652

Emotional material may induce processing limitations affecting memory performance. In the present study, the authors investigated how the emotional content of words influences the degree to which participants can be directed to forget them. In Experiment 1, the authors found that negative-valence words were recalled better when participants were told to forget them than when they were told to remember them. This effect was only obtained when a study-list of negative words was presented after the cue to remember or forget the first list. The effect was correlated with negative mood as assessed by the PANAS. Similar results were obtained in Experiment 2, in which the induction of negative arousal by a mild stressor abolished the directed forgetting of words when the following study list was comprised of negative words. These results support the idea that directed forgetting relies on cognitive control processes that may be disrupted by negative emotion.

ARTICLE UPDATE - Trouble crossing the bridge: Altered interhemispheric communication of emotional images in anxiety.

Compton RJ, Carp J, Chaddock L, Fineman SL, Quandt LC, Ratliff JB.

Emotion, 8, 684-692.

Worry is thought to involve a strategy of cognitive avoidance, in which internal verbalization acts to suppress threatening emotional imagery. This study tested the hypothesis that worry-prone individuals would exhibit patterns of between-hemisphere communication that reflect cognitive avoidance. Specifically, the hypothesis predicted slower transfer of threatening images from the left to the right hemisphere among worriers. Event-related potential (ERP) measures of interhemispheric transfer time supported this prediction. Left-to-right hemisphere transfer times for angry faces were relatively slower for individuals scoring high in self-reported worry compared with those scoring low, whereas transfer of happy and neutral faces did not differ between groups. These results suggest that altered interhemispheric communication may constitute one mechanism of cognitive avoidance in worry.
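
One common ERP estimate of interhemispheric transfer time is the latency difference of a posterior component between the hemispheres contralateral and ipsilateral to the stimulated visual field. The Python sketch below illustrates that logic on simulated hemispheric averages for a right-visual-field face (left-to-right transfer); the peak latencies, component width, and search window are invented for the example and are not the authors' values.

import numpy as np

fs = 500                                           # sampling rate (Hz), assumed
t = np.arange(0, 0.4, 1 / fs)
rng = np.random.default_rng(5)

def average_with_peak(latency):
    # negative-going component (~30 ms wide) plus measurement noise
    return (-np.exp(-((t - latency) ** 2) / (2 * 0.015 ** 2))
            + 0.05 * rng.standard_normal(t.size))

left_hemi = average_with_peak(0.170)    # contralateral to a right-field face
right_hemi = average_with_peak(0.185)   # ipsilateral: response after left-to-right transfer

window = (t >= 0.13) & (t <= 0.25)      # search window for the component peak
lat_contra = t[window][left_hemi[window].argmin()]
lat_ipsi = t[window][right_hemi[window].argmin()]
print("estimated left-to-right transfer time: %.0f ms" % (1000 * (lat_ipsi - lat_contra)))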

ARTICLE UPDATE - Interpretation bias in social anxiety as detected by event-related brain potentials.

Moser JS, Hajcak G, Huppert JD, Foa EB, Simons RF.

Emotion, 8, 693-700

Little is known about psychophysiological correlates of interpretation bias in social anxiety. To address this issue, the authors measured event-related brain potentials (ERPs) in high and low socially anxious individuals during a task wherein ambiguous scenarios were resolved with either a positive or negative ending. Specifically, the authors examined modulations of the P600, an ERP that peaks approximately 600 ms following stimulus onset and indexes violations of expectancy. Low-anxious individuals were characterized by an increased P600 to negative in comparison with positive sentence endings, suggesting a positive interpretation bias. In contrast, the high-anxious group evidenced equivalent P600 magnitude for negative and positive sentence endings, suggesting a lack of positive interpretation bias. Similar, but less reliable results emerged in earlier time windows, that is, 200-500 ms poststimulus. Reaction time, occurring around 900 ms poststimulus, failed to show a reliable interpretation bias. Results suggest that ERPs can detect interpretation biases in social anxiety before the emission of behavioral responses.

Saturday, September 20, 2008

ARTICLE UPDATE - Is emotional contagion special? An fMRI study on neural systems for affective and cognitive empathy.

Nummenmaa L, Hirvonen J, Parkkola R, Hietanen JK.

Neuroimage, in press

Empathy allows us to simulate others' affective and cognitive mental states internally, and it has been proposed that the mirroring or motor representation systems play a key role in such simulation. As emotions are related to important adaptive events linked with benefit or danger, simulating others' emotional states might constitute a special case of empathy. In this functional magnetic resonance imaging (fMRI) study we tested if emotional versus cognitive empathy would facilitate the recruitment of brain networks involved in motor representation and imitation in healthy volunteers. Participants were presented with photographs depicting people in neutral everyday situations (cognitive empathy blocks), or suffering serious threat or harm (emotional empathy blocks). Participants were instructed to empathize with specified persons depicted in the scenes. Emotional versus cognitive empathy resulted in increased activity in limbic areas involved in emotion processing (thalamus), and also in cortical areas involved in face (fusiform gyrus) and body (extrastriate cortex) perception, as well as in networks associated with mirroring of others' actions (inferior parietal lobule). When brain activation resulting from viewing the scenes was controlled, emotional empathy still engaged the mirror neuron system (premotor cortex) more than cognitive empathy. Further, thalamus and primary somatosensory and motor cortices showed increased functional coupling during emotional versus cognitive empathy. The results suggest that emotional empathy is special. Emotional empathy facilitates somatic, sensory, and motor representation of other people's mental states, and results in more vigorous mirroring of the observed mental and bodily states than cognitive empathy.

Sunday, September 14, 2008

ARTICLE UPDATE - The human amygdala is involved in general behavioral relevance detection: Evidence from an event-related functional magnetic resonance imaging study.

Ousdal OT, Jensen J, Server A, Hariri AR, Nakstad PH, Andreassen OA.

Neuroscience, in press

The amygdala is classically regarded as a detector of potential threat and as a critical component of the neural circuitry mediating conditioned fear responses. However, it has been reported that the human amygdala responds to multiple expressions of emotions as well as to emotionally neutral stimuli of a novel, uncertain or ambiguous nature. Thus, it has been proposed that the function of the amygdala may be more general, i.e. that it acts as a detector of behaviorally relevant stimuli [Sander D, Grafman J, Zalla T (2003) The human amygdala: an evolved system for relevance detection. Rev Neurosci 14:303-316]. To investigate this putative function of the amygdala, we used event-related functional magnetic resonance imaging (fMRI) and a modified Go-NoGo task composed of behaviorally relevant and irrelevant letter and number stimuli. Analyses revealed bilateral amygdala activation in response to letter stimuli that were behaviorally relevant as compared with letters with less behavioral relevance. Similar results were obtained for relatively infrequent NoGo relevant stimuli as compared with more frequent Go stimuli. Our findings support a role for the human amygdala in general detection of behaviorally relevant stimuli.

ARTICLE UPDATE - Natural selective attention: Orienting and emotion.

Bradley MM.

Psychophysiology, in press

The foundations of orienting and attention are hypothesized to stem from activation of defensive and appetitive motivational systems that evolved to protect and sustain the life of the individual. Motivational activation initiates a cascade of perceptual and motor processes that facilitate the selection of appropriate behavior. Among these are detection of significance, indexed by a late centro-parietal positivity in the event-related potential; enhanced perceptual processing, indexed by an initial cardiac deceleration; and preparation for action, indexed by electrodermal changes. Data exploring the role of stimulus novelty and significance in orienting are presented, indicating that different components of the orienting response habituate at different rates. Taken together, these findings suggest that orienting is mediated by activation of fundamental motivational systems that have evolved to support survival.

ARTICLE UPDATE - Neural Circuitry Underlying the Regulation of Conditioned Fear and Its Relation to Extinction.

Delgado MR, Nearing KI, LeDoux JE, Phelps EA.

Neuron, 59, 829-838

Recent efforts to translate basic research to the treatment of clinical disorders have led to a growing interest in exploring mechanisms for diminishing fear. This research has emphasized two approaches: extinction of conditioned fear, examined across species; and cognitive emotion regulation, unique to humans. Here, we sought to examine the similarities and differences in the neural mechanisms underlying these two paradigms for diminishing fear. Using an emotion regulation strategy, we examine the neural mechanisms of regulating conditioned fear using fMRI and compare the resulting activation pattern with that observed during classic extinction. Our results suggest that the lateral PFC regions engaged by cognitive emotion regulation strategies may influence the amygdala, diminishing fear through similar vmPFC connections that are thought to inhibit the amygdala during extinction. These findings further suggest that humans may have developed complex cognition that can aid in regulating emotional responses while utilizing phylogenetically shared mechanisms of extinction.

ARTICLE UPDATE - Mapping the Semantic Space for the Subjective Experience of Emotional Responses to Odors.

Chrea C, Grandjean D, Delplanque S, Cayeux I, Le Calvé B, Aymard L, Velazco MI, Sander D, Scherer KR.

Chemical Senses, in press

Two studies were conducted to examine the nature of the verbal labels that describe emotional effects elicited by odors. In Study 1, a list of terms selected for their relevance to describe affective feelings induced by odors was assessed while participants were exposed to a set of odorant samples. The data were submitted to a series of exploratory factor analyses to 1) reduce the set of variables to a smaller set of summary scales and 2) get a preliminary sense of the differentiation of affective feelings elicited by odors. The goal of Study 2 was to replicate the findings of Study 1 with a larger sample of odorant samples and participants and to validate the preliminary model obtained in Study 1 by using confirmatory factor analysis. Overall, the findings point to a structure of affective responses to odors that differs from the classical taxonomies of emotion such as posited by discrete or bidimensional emotion theories. These findings suggest that the subjective affective experiences or feelings induced by odors are structured around a small group of dimensions that reflect the role of olfaction in well-being, social interaction, danger prevention, arousal or relaxation sensations, and conscious recollection of emotional memories.
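
As a rough illustration of the exploratory step described here, the Python sketch below reduces a participants-by-terms rating matrix to a few latent affective dimensions with scikit-learn's FactorAnalysis (varimax rotation). The placeholder ratings, the matrix size, and the choice of three factors are assumptions; the published scales are not reproduced.

import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(6)
n_raters, n_terms = 150, 24
# Placeholder ratings built from three latent dimensions plus noise
true_loadings = rng.normal(size=(3, n_terms))
latent = rng.normal(size=(n_raters, 3))
ratings = latent @ true_loadings + 0.5 * rng.normal(size=(n_raters, n_terms))

fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(ratings)
# Each row of components_ holds one factor's loadings on the 24 feeling terms
top = np.argsort(-np.abs(fa.components_), axis=1)[:, :5]
print("highest-loading term indices per factor:\n", top)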

Sunday, September 07, 2008

ARTICLE UPDATE - Music-induced mood modulates the strength of emotional negativity bias: An ERP study.

Chen J, Yuan J, Huang H, Chen C, Li H.

Neuroscience Letters, in press

The present study investigated the effect of music-elicited moods on subsequent affective processing through a music-primed valence categorization task. Event-related potentials were recorded for positive and negative emotional pictures that were primed by happy or sad music excerpts. The reaction time data revealed longer reaction times (RTs) for pictures following negative versus positive music pieces, irrespective of the valence of the picture. Additionally, positive pictures elicited faster response latencies than negative pictures, irrespective of the valence of the musical prime. Moreover, the main effect of picture valence and the music by picture valence interaction were both significant for P2 amplitudes and for the averaged amplitudes in the 500-700 ms interval. Negative pictures elicited smaller P2 amplitudes than positive pictures, and the amplitude differences between negative and positive pictures were larger with negative musical primes than with positive musical primes. Similarly, compared to positive pictures, negative pictures elicited more negative deflections during the 500-700 ms interval across prime types. The amplitude differences between negative and positive pictures were again larger under negative versus positive music primes in this interval. Therefore, the present study observed a clear emotional negativity bias under either prime condition, and extended previous findings by showing increased strength of the negativity bias under negative mood primes. This suggests that the neural sensitivity of the brain to negative stimuli varies with individuals' mood states, and that this bias is particularly intensified by negative mood states.

Monday, September 01, 2008

ARTICLE UPDATE - Visual search for faces with emotional expressions.

Frischen A, Eastwood JD, Smilek D.

Psychological Bulletin, 134, 662-676

The goal of this review is to critically examine contradictory findings in the study of visual search for emotionally expressive faces. Several key issues are addressed: Can emotional faces be processed preattentively and guide attention? What properties of these faces influence search efficiency? Is search moderated by the emotional state of the observer? The authors argue that the evidence is consistent with claims that (a) preattentive search processes are sensitive to and influenced by facial expressions of emotion, (b) attention guidance is influenced by a dynamic interplay of emotional and perceptual factors, and (c) visual search for emotional faces is influenced by the emotional state of the observer to some extent. The authors also argue that the way in which contextual factors interact to determine search performance needs to be explored further to draw sound conclusions about the precise influence of emotional expressions on search efficiency. Methodological considerations (e.g., set size, distractor background, task set) and ecological limitations of the visual search task are discussed. Finally, specific recommendations are made for future research directions.

ARTICLE UPDATE - Individual differences in learning the affective value of others under minimal conditions.

Bliss-Moreau E, Barrett LF, Wright CI.

Emotion, 8, 479-493.

This paper provides the first demonstration that people can learn about the positive and negative value of other people (e.g., neutral faces) under minimal learning conditions, with stable individual differences in this learning. In four studies, participants viewed neutral faces paired with sentences describing positive, negative or neutral behaviors on either two (Study 1) or four (Studies 2, 3, and 4) occasions. Participants were later asked to judge the valence of the faces alone. Studies 1 and 2 demonstrated that learning does occur under minimal conditions. Studies 3 and 4 further demonstrated that the degree of learning was moderated by Extraversion. Finally, Study 4 demonstrated that initial learning persisted over a period of 2 days. Implications for affective processing and person perception are discussed.

ARTICLE UPDATE - Emotion Theory and Research: Highlights, Unanswered Questions, and Emerging Issues.

Izard CE.

Annual Review of Psychology, in press

Emotion feeling is a phase of neurobiological activity, the key component of emotions and emotion-cognition interactions. Emotion schemas, the most frequently occurring emotion experiences, are dynamic emotion-cognition interactions that may consist of momentary/situational responding or enduring traits of personality that emerge over developmental time. Emotions play a critical role in the evolution of consciousness and the operations of all mental processes. Types of emotion relate differentially to types or levels of consciousness. Unbridled imagination and the ability for sympathetic regulation of empathy may represent both potential gains and losses from the evolution and ontogeny of emotion processes and consciousness. Unresolved issues include psychology’s neglect of levels of consciousness that are distinct from access or reflective consciousness and use of the term “unconscious mind” as a dumpster for all mental processes that are considered unreportable. The relation of memes and the mirror neuron system to empathy, sympathy, and cultural influences on the development of socioemotional skills are unresolved issues destined to attract future research.

ARTICLE UPDATE - Differential Influences of Emotion, Task, and Novelty on Brain Regions Underlying the Processing of Speech Melody.

Ethofer T, Kreifelts B, Wiethoff S, Wolf J, Grodd W, Vuilleumier P, Wildgruber D.

Journal of Cognitive Neuroscience, in press

We investigated the functional characteristics of brain regions implicated in processing of speech melody by presenting words spoken in either neutral or angry prosody during a functional magnetic resonance imaging experiment using a factorial habituation design. Subjects judged either affective prosody or word class for these vocal stimuli, which could be heard for either the first, second, or third time. Voice-sensitive temporal cortices, as well as the amygdala, insula, and mediodorsal thalami, reacted stronger to angry than to neutral prosody. These stimulus-driven effects were not influenced by the task, suggesting that these brain structures are automatically engaged during processing of emotional information in the voice and operate relatively independent of cognitive demands. By contrast, the right middle temporal gyrus and the bilateral orbito-frontal cortices (OFC) responded stronger during emotion than word classification, but were also sensitive to anger expressed by the voices, suggesting that some perceptual aspects of prosody are also encoded within these regions subserving explicit processing of vocal emotion. The bilateral OFC showed a selective modulation by emotion and repetition, with particularly pronounced responses to angry prosody during the first presentation only, indicating a critical role of the OFC in detection of vocal information that is both novel and behaviorally relevant. These results converge with previous findings obtained for angry faces and suggest a general involvement of the OFC for recognition of anger irrespective of the sensory modality. Taken together, our study reveals that different aspects of voice stimuli and perceptual demands modulate distinct areas involved in the processing of emotional prosody.

Sunday, August 24, 2008

ARTICLE UPDATE - Affective valence, stimulus attributes, and P300: Color vs. black/white and normal vs. scrambled images.

Cano ME, Class QA, Polich J.

International Journal of Psychophysiology, in press

Pictures from the International Affective Picture System (IAPS) were selected to manipulate affective valence (unpleasant, neutral, pleasant) while keeping arousal level the same. The pictures were presented in an oddball paradigm, with a visual pattern used as the standard stimulus. Subjects pressed a button whenever a target was detected. Experiment 1 presented normal pictures in color and black/white. Control stimuli were constructed for both the color and black/white conditions by randomly rearranging 1 cm square fragments of each original picture to produce a "scrambled" image. Experiment 2 presented the same normal color pictures with large, medium, and small scrambled conditions (2, 1, and 0.5 cm squares). The P300 event-related brain potential demonstrated larger amplitudes over frontal areas for positive compared to negative or neutral images for normal color pictures in both experiments. Attenuated and nonsignificant valence effects were obtained for black/white images. Scrambled stimuli in each study yielded no valence effects but demonstrated typical P300 topography that increased from frontal to parietal areas. The findings suggest that P300 amplitude is sensitive to affective picture valence in the absence of stimulus arousal differences, and that stimulus color contributes to ERP valence effects.
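
The scrambling procedure described above is easy to reproduce in principle: cut the image into square blocks and randomly rearrange them, which preserves low-level content while destroying recognizable structure. The Python sketch below does this for a placeholder grayscale image; the block size in pixels merely stands in for the paper's 0.5-2 cm squares.

import numpy as np

def scramble(image: np.ndarray, block: int, rng: np.random.Generator) -> np.ndarray:
    """Shuffle non-overlapping block x block tiles of a 2-D (grayscale) image."""
    h, w = (image.shape[0] // block) * block, (image.shape[1] // block) * block
    img = image[:h, :w]
    tiles = (img.reshape(h // block, block, w // block, block)
                .swapaxes(1, 2)
                .reshape(-1, block, block))
    rng.shuffle(tiles)                       # permutes tiles along the first axis
    return (tiles.reshape(h // block, w // block, block, block)
                 .swapaxes(1, 2)
                 .reshape(h, w))

rng = np.random.default_rng(7)
picture = rng.integers(0, 256, size=(300, 400)).astype(np.uint8)  # placeholder image
scrambled = scramble(picture, block=20, rng=rng)
print("same pixels, new arrangement:", scrambled.sum() == picture.sum())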

ARTICLE UPDATE - Visual search is not blind to emotion.

Gerritsen C, Frischen A, Blake A, Smilek D, Eastwood JD.

Perception and Psychophysics, 70, 1047-1059

A series of three visual search tasks revealed more efficient search for hostile than for peaceful faces among neutral face distractors. Given that this effect has been observed inconsistently in prior literature, meta-analytic methods were employed for evaluating data across three experiments in order to develop a more valid estimate of the potentially small effect size. Furthermore, in the present experiments, different emotional meanings were conditioned to identical faces across observers, thus eliminating confounds between the physical characteristics and the emotional valences of the face stimuli. On the basis of the present findings, we argue that the visual system is capable of determining a face's emotional valence before the face becomes the focus of attention, and that emotional valence can be used by the visual system to determine subsequent attention allocation. However, meta-analytic results indicate that emotional valence makes a relatively small contribution to search efficiency in the present search context.
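
The meta-analytic pooling referred to above can be illustrated with a generic inverse-variance (fixed-effect) combination of standardized mean differences across experiments. The sketch below uses assumed per-experiment effect sizes and sample sizes (placeholders, not the published data) and is not the authors' exact procedure.

import numpy as np

def fixed_effect_meta(d, n1, n2):
    # Inverse-variance (fixed-effect) pooling of Cohen's d values.
    # d: per-experiment standardized mean differences; n1, n2: the two sample
    # (or condition) sizes contributing to each d.
    d, n1, n2 = map(np.asarray, (d, n1, n2))
    # approximate sampling variance of d (Hedges & Olkin style)
    var = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    w = 1.0 / var
    pooled = np.sum(w * d) / np.sum(w)     # weighted mean effect size
    se = np.sqrt(1.0 / np.sum(w))          # standard error of the pooled effect
    return pooled, se, pooled / se         # pooled d, SE, and z statistic

# placeholder values for three experiments, not the article's data
print(fixed_effect_meta(d=[0.25, 0.18, 0.30], n1=[24, 24, 30], n2=[24, 24, 30]))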

Saturday, August 16, 2008

ARTICLE UPDATE - Integration of cross-modal emotional information in the human brain: An fMRI study.

Park JY, Gu BM, Kang DH, Shin YW, Choi CH, Lee JM, Kwon JS.

Cortex, in press

The interaction of information derived from the voice and facial expression of a speaker contributes to the interpretation of the emotional state of the speaker and to the formation of inferences about information that may have been merely implied in the verbal communication. Therefore, we investigated the brain processes responsible for the integration of emotional information originating from different sources. Although several studies have reported possible sites for integration, further investigation using a neutral emotional condition is required to locate emotion-specific networks. Using functional magnetic resonance imaging (fMRI), we explored the brain regions involved in the integration of emotional information from different modalities in comparison to those involved in integrating emotionally neutral information. There was significant activation in the superior temporal gyrus (STG); inferior frontal gyrus (IFG); and parahippocampal gyrus, including the amygdala, under the bimodal versus the unimodal condition, irrespective of the emotional content. We confirmed the results of previous studies by finding that the bimodal emotional condition elicited strong activation in the left middle temporal gyrus (MTG), and we extended this finding to locate the effects of emotional factors by using a neutral condition in the experimental design. We found anger-specific activation in the posterior cingulate, fusiform gyrus, and cerebellum, whereas we found happiness-specific activation in the MTG, parahippocampal gyrus, hippocampus, claustrum, inferior parietal lobule, cuneus, middle frontal gyrus (MFG), IFG, and anterior cingulate. These emotion-specific activations suggest that each emotion uses a separate network to integrate bimodal information and shares a common network for cross-modal integration.

ARTICLE UPDATE - I feel how you feel but not always: the empathic brain and its modulation.

Hein G, Singer T.

Current Opinion in Neurobiology, in press

The ability to share the other's feelings, known as empathy, has recently become the focus of social neuroscience studies. We review converging evidence that empathy with, for example, the pain of another person, activates part of the neural pain network of the empathizer, without first hand pain stimulation to the empathizer's body. The amplitude of empathic brain responses is modulated by the intensity of the displayed emotion, the appraisal of the situation, characteristics of the suffering person such as perceived fairness, and features of the empathizer such as gender or previous experience with pain-inflicting situations. Future studies in the field should address inter-individual differences in empathy, development and plasticity of the empathic brain over the life span, and the link between empathy, compassionate motivation, and prosocial behavior.

ARTICLE UPDATE - How emotional arousal and valence influence access to awareness.

Sheth BR, Pham T.

Vision Research, in press


Emotional stimuli attract attention and potentiate the effect of attention on contrast sensitivity, a feature of early vision. The amygdala, a key structure in emotional processing, responds to emotional content prior to awareness and projects to visual cortex. In light of evidence that the primary visual cortex does not have direct access to awareness, we hypothesize that emotion can affect the processing of a visual stimulus even before awareness. Moreover, emotion varies along at least two dimensions: arousal and affect (valence). Dissociating their effects is important to understanding the link between emotion and perception. We examined these effects in binocular rivalry. Pairs of images (54 total) were selected from a known database of natural images (IAPS). Pictures of a pair differed significantly along only one emotional dimension, creating two types: iso-valence and iso-arousal pairs. Pictures of a given pair were presented side by side in a rivalry setup for trials lasting 1 min each. The duration for which each eye's image was dominant in a trial (dominant phase duration) was obtained from 12 observers. Our results showed (1) a main effect of arousal: the dominant phase durations for more arousing pictures of the iso-valence pairs were significantly longer than those for the less arousing pictures; (2) no main effect of affect: the dominant phase durations of pleasant and unpleasant pictures of iso-arousal pairs did not differ significantly; and (3) an interaction between arousal and affect: for low arousal-level stimuli, the more pleasant image of the pair dominated significantly, whereas for high arousal-level stimuli, the more unpleasant image dominated significantly. Our findings suggest that the limbic system acts on visual signals early in processing. While emotional arousal and valence interactively affect access to visual awareness, only arousal exerts an independent control of such access.

ARTICLE UPDATE - A common anterior insula representation of disgust observation, experience and imagination shows divergent functional connectivity pathways.

Jabbi M, Bastiaansen J, Keysers C.

PLoS One, in press

Similar brain regions are involved when we imagine, observe and execute an action. Is the same true for emotions? Here, the same subjects were scanned while they (a) experienced, (b) viewed someone else experiencing, and (c) imagined experiencing gustatory emotions (through script-driven imagery). Capitalizing on the fact that disgust is repeatedly inducible within the scanner environment, we scanned the same participants while they (a) viewed actors tasting the content of a cup and looking disgusted, (b) tasted unpleasant bitter liquids to induce disgust, and (c) read and imagined scenarios involving disgust and their neutral counterparts. To reduce habituation, we intermixed trials of positive emotions in all three scanning experiments. We found voxels in the anterior insula and adjacent frontal operculum to be involved in all three modalities of disgust, suggesting that simulation in the context of social perception and mental imagery of disgust share a common neural substrate. Using effective connectivity, this shared region was nevertheless found to be embedded in distinct functional circuits during the three modalities, suggesting why observing, imagining and experiencing an emotion feels so different.

Wednesday, August 13, 2008

ARTICLE UPDATE - The human amygdala is sensitive to the valence of pictures and sounds irrespective of arousal: an fMRI study

Silke Anders, Falk Eippert, Nikolaus Weiskopf and Ralf Veit

Social Cognitive and Affective Neuroscience, in press

With the advent of studies showing that amygdala responses are not limited to fear-related or highly unpleasant stimuli, studies began to focus on stimulus valence and stimulus-related arousal as predictors of amygdala activity. Recent studies in the chemosensory domain found amygdala activity to increase with the intensity of negative and positive chemosensory stimuli. This has led to the proposal that amygdala activity might be an indicator of emotional arousal, at least in the chemosensory domain. The present study investigated amygdala activity in response to visual and auditory stimuli. By selecting stimuli based on individual valence and arousal ratings, we were able to dissociate stimulus valence and stimulus-related arousal, both on the verbal and the peripheral physiological level. We found that the amygdala was sensitive to stimulus valence even when arousal was controlled for, and that increased amygdala activity was better explained by valence than by arousal. The proposed difference in the relation between amygdala activity and stimulus-related arousal between the chemosensory and the audiovisual domain is discussed in terms of the amygdala's embedding within these sensory systems and the processes by which emotional meaning is derived.

Saturday, August 09, 2008

ARTICLE UPDATE - Emotional experience modulates brain activity during fixation periods between tasks.

Pitroda S, Angstadt M, McCloskey MS, Coccaro EF, Phan KL.

Neuroscience Letters, in press

Functional imaging studies have begun to identify a set of brain regions whose activity is greater during 'rest' (e.g., fixation) states than during cognitive tasks. It has been posited that these regions constitute a network that supports the brain's default mode, which is temporarily suspended during specific goal-directed behaviors. Exogenous tasks that require cognitive effort are thought to command reallocation of resources away from the brain's default state. However, it remains unknown whether brain activity during fixation periods between active task periods is influenced by the emotional content of the preceding task. We examined brain activity during periods of FIXATION (viewing and rating gray-scale images) interspersed among periods of viewing and rating complex images ('PICTURE') with positive, negative, and neutral affective content. We show that a selected group of brain regions (PCC, precuneus, IPL, vACC) exhibit activity that is greater during FIXATION (>PICTURE); these regions have previously been implicated in the "default brain network". In addition, we report that activity within the precuneus and IPL during the FIXATION period is attenuated by the preceding processing of images with positive and negative emotional content, relative to non-emotional content. These data suggest that activity within regions implicated in the default network is modulated by the presence of environmental stimuli with motivational salience and, thus, adds to our understanding of brain function during periods of low cognitive, emotional, or sensory demand.

ARTICLE UPDATE - Functional neuroimaging of reward processing and decision-making: A review of aberrant motivational and affective processing in addiction and mood disorders.

Diekhof EK, Falkai P, Gruber O.

Brain Research Reviews, in press

The adequate integration of reward- and decision-related information provided by the environment is critical for behavioral success and subjective well being in everyday life. Functional neuroimaging research has already presented a comprehensive picture on affective and motivational processing in the healthy human brain and has recently also turned its interest to the assessment of impaired brain function in psychiatric patients. This article presents an overview on neuroimaging studies dealing with reward processing and decision-making by combining most recent findings from fundamental and clinical research. It provides an outline on the neural mechanisms guiding context-adequate reward processing and decision-making processes in the healthy brain, and also addresses pathophysiological alterations in the brain's reward system that have been observed in substance abuse and mood disorders, two highly prevalent classes of psychiatric disorders. The overall goal is to critically evaluate the specificity of neurophysiological alterations identified in these psychiatric disorders and associated symptoms, and to make suggestions concerning future research.

Saturday, August 02, 2008

ARTICLE UPDATE - The selective processing of emotional visual stimuli while detecting auditory targets: An ERP analysis.

Schupp HT, Stockburger J, Bublatzky F, Junghöfer M, Weike AI, Hamm AO.

Brain Research, in press

Event-related potential studies revealed an early posterior negativity (EPN) for emotional compared to neutral pictures. Exploring the emotion-attention relationship, a previous study observed that a primary visual discrimination task interfered with the emotional modulation of the EPN component. To specify the locus of interference, the present study assessed the fate of selective visual emotion processing while attention is directed towards the auditory modality. While simply viewing a rapid and continuous stream of pleasant, neutral, and unpleasant pictures in one experimental condition, processing demands of a concurrent auditory target discrimination task were systematically varied in three further experimental conditions. Participants successfully performed the auditory task as revealed by behavioral performance and selected event-related potential components. Replicating previous results, emotional pictures were associated with a larger posterior negativity compared to neutral pictures. Of main interest, increasing demands of the auditory task did not modulate the selective processing of emotional visual stimuli. With regard to the locus of interference, selective emotion processing as indexed by the EPN does not seem to reflect shared processing resources of visual and auditory modality.

ARTICLE UPDATE - Communicating emotion: Linking affective prosody and word meaning.

Nygaard LC, Queen JS.

Journal of Experimental Psychology: Human Perception & Performance, 34, 1017-1030

The present study investigated the role of emotional tone of voice in the perception of spoken words. Listeners were presented with words that had either a happy, sad, or neutral meaning. Each word was spoken in a tone of voice (happy, sad, or neutral) that was congruent, incongruent, or neutral with respect to affective meaning, and naming latencies were collected. Across experiments, tone of voice was either blocked or mixed with respect to emotional meaning. The results suggest that emotional tone of voice facilitated linguistic processing of emotional words in an emotion-congruent fashion. These findings suggest that information about emotional tone is used in the processing of linguistic content influencing the recognition and naming of spoken words in an emotion-congruent manner.

Saturday, July 26, 2008

ARTICLE UPDATE - Neural processing of vocal emotion and identity.

Spreckelmeyer KN, Kutas M, Urbach T, Altenmüller E, Münte TF.

Brain & Cognition, in press

The voice is a marker of a person's identity which allows individual recognition even if the person is not in sight. Listening to a voice also affords inferences about the speaker's emotional state. Both these types of personal information are encoded in characteristic acoustic feature patterns analyzed within the auditory cortex. In the present study 16 volunteers listened to pairs of non-verbal voice stimuli with happy or sad valence in two different task conditions while event-related brain potentials (ERPs) were recorded. In an emotion matching task, participants indicated whether the expressed emotion of a target voice was congruent or incongruent with that of a (preceding) prime voice. In an identity matching task, participants indicated whether or not the prime and target voice belonged to the same person. Effects based on the expressed emotion occurred earlier than those based on voice identity. Specifically, P2 (approximately 200 ms) amplitudes were reduced for happy voices when primed by happy voices. Identity match effects, by contrast, did not start until around 300 ms. These results show a task-specific, emotion-based influence on the early stages of auditory sensory processing.

Friday, July 18, 2008

ARTICLE UPDATE - Discriminating between changes in bias and changes in accuracy for recognition memory of emotional stimuli.

Grider RC, Malmberg KJ.

Memory & Cognition, 36, 933-946

A debate has emerged as to whether recognition of emotional stimuli is more accurate or more biased than recognition of nonemotional stimuli. Teasing apart changes in accuracy versus changes in bias requires a measurement model. However, different models have been adopted by different researchers, and this has contributed to the current debate. In this article, different measurement models are discussed, and the signal detection model that is most appropriate for recognition is adopted to investigate the effects of valence and arousal on recognition memory performance, using receiver operating characteristic analyses. In addition, complementary two-alternative forced choice experiments were conducted in order to generalize the empirical findings and interpret them under a relatively relaxed set of measurement assumptions. Across all experiments, accuracy was greater for highly valenced stimuli and stimuli with high arousal value. In addition, a bias to endorse positively valenced stimuli was observed. These results are discussed within an adaptive memory framework that assumes that emotion plays an important role in the allocation of attentional resources.
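
The distinction between accuracy and bias that drives this debate is usually quantified with signal detection indices. Below is a minimal equal-variance sketch computing d' (accuracy) and criterion c (bias) from hit and false-alarm rates; the rates are illustrative only, and the article's ROC analyses rest on a richer model than this simplification.

from scipy.stats import norm

def sdt_indices(hit_rate, fa_rate):
    # Equal-variance signal detection: d' indexes accuracy, c indexes bias.
    # A negative c indicates a liberal criterion (more 'old' responses).
    # Rates of exactly 0 or 1 should be corrected before use (e.g., 1/(2N)).
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa
    criterion = -0.5 * (z_hit + z_fa)
    return d_prime, criterion

# illustrative rates only, not data from the article
print(sdt_indices(hit_rate=0.80, fa_rate=0.30))   # higher accuracy, near-neutral bias
print(sdt_indices(hit_rate=0.85, fa_rate=0.45))   # lower accuracy, more liberal bias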

ARTICLE UPDATE - Emotional states influence the neural processing of affective language.

Pratt NL, Kelly SD.

Social Neuroscience, 3, 1-9

The present study investigated whether emotional states influence the neural processing of language. Event-related potentials recorded the brain's response to positively and negatively valenced words (e.g., love vs. death) while participants were directly induced into positive and negative moods. ERP electrodes in frontal scalp regions of the brain distinguished positive and negative words around 400 ms poststimulus. The amplitude of this negative waveform showed a larger negativity for positive words compared to negative words in the frontal electrode region when participants were in a positive, but not negative, mood. These findings build on previous research by demonstrating that people process affective language differently when in positive and negative moods, and lend support to recent views that emotion and cognition interact during language comprehension.

ARTICLE UPDATE - Asymmetrical frontal ERPs, emotion, and behavioral approach/inhibition sensitivity.

Peterson CK, Gable P, Harmon-Jones E.

Social Neuroscience, 3, 113-124

The present study sought to extend past research on frontal brain asymmetry and individual differences by examining relationships of individual differences in behavioral inhibition/approach system (BIS/BAS) sensitivity with asymmetrical frontal event-related brain responses to startle probes presented during viewing of affective pictures. One hundred and ten participants were shown unpleasant, neutral, and pleasant affective pictures, and startle probes were presented during picture presentation. Individual differences in BIS sensitivity related to relatively greater right frontal N100 amplitude to startle probes presented during pleasant and unpleasant pictures, whereas individual differences in BAS sensitivity related to reduced left frontal P300 amplitude to startle probes presented during pleasant pictures. The results of this study suggest that BIS sensitivity is related to greater relative right frontal cortical activity during affective states, while BAS sensitivity is related to greater relative left frontal cortical activity during appetitive states.

ARTICLE UPDATE - Friend or foe? Brain systems involved in the perception of dynamic signals of menacing and friendly social approaches.

Carter EJ, Pelphrey KA.

Social Neuroscience, 3, 151-163

During every social approach, humans must assess each other's intentions. Facial expressions provide cues to assist in these assessments via associations with emotion, the likelihood of affiliation, and personality. In this functional magnetic resonance imaging (fMRI) study, participants viewed animated male characters approaching them in a hallway and making either a happy or an angry facial expression. An expected increase in amygdala and superior temporal sulcus activation to the expression of anger was found. Notably, two other social brain regions also had an increased hemodynamic response to anger relative to happiness, including the lateral fusiform gyrus and a region centered in the middle temporal gyrus. Other brain regions showed little differentiation or an increased level of activity to the happy stimuli. These findings provide insight into the brain mechanisms involved in reading the intentions of other human beings in an overtly social context. In particular, they demonstrate brain regions sensitive to social signals of dominance and affiliation.

Thursday, July 10, 2008

ARTICLE UPDATE - Distinguishing expected negative outcomes from preparatory control in the human orbitofrontal cortex.

Ursu S, Clark KA, Stenger VA, Carter CS.

Brain Research, in press

The human orbitofrontal cortex (OFC) plays a critical role in adapting behavior according to the context provided by expected outcomes of actions. However, several aspects of this function are still poorly understood. In particular, it is unclear to what degree any subdivisions of the OFC are specifically engaged when negatively valenced outcomes are expected, and to what extent such areas might be involved in preparatory active control of behavior. We examined these issues in two complementary functional magnetic resonance imaging (fMRI) studies in which we simultaneously and independently manipulated monetary incentives for correct performance, and demands for active preparation of cognitive control. In both experiments, preparation for performance was associated with lateral PFC activity in response to high incentives, regardless of their valence, as well as in response to increased task demands. In contrast, areas of the OFC centered around the lateral orbital sulcus responded maximally to negatively perceived prospects, even when such prospects were associated with decreases in preparatory cognitive control. These results provide direct support for theoretical models which posit that the OFC contributes to behavioral regulation by representing the value of anticipated outcomes, but does not implement active control aimed at avoiding or pursuing outcomes. Furthermore, they provide additional converging evidence that the lateral OFC is involved in representing specifically the affective impact of anticipated negative outcomes.

ARTICLE UPDATE - The effect of appraisal level on processing of emotional prosody in meaningless speech.

Bach DR, Grandjean D, Sander D, Herdener M, Strik WK, Seifritz E.

Neuroimage, in press

In visual perception of emotional stimuli, low- and high-level appraisal processes have been found to engage different neural structures. Beyond emotional facial expression, emotional prosody is an important auditory cue for social interaction. Neuroimaging studies have proposed a network for emotional prosody processing that involves a right temporal input region and explicit evaluation in bilateral prefrontal areas. However, the comparison of different appraisal levels has so far relied upon using linguistic instructions during low-level processing, which might confound effects of processing level and linguistic task. In order to circumvent this problem, we examined processing of emotional prosody in meaningless speech during gender labelling (implicit, low-level appraisal) and emotion labelling (explicit, high-level appraisal). While bilateral amygdala, left superior temporal sulcus and right parietal areas showed stronger blood oxygen level-dependent (BOLD) responses during implicit processing, areas with stronger BOLD responses during explicit processing included the left inferior frontal gyrus, bilateral parietal, anterior cingulate and supplemental motor cortex. Emotional versus neutral prosody evoked BOLD responses in right superior temporal gyrus, bilateral anterior cingulate, left inferior frontal gyrus, insula and bilateral putamen. Basal ganglia and right anterior cingulate responses to emotional versus neutral prosody were particularly pronounced during explicit processing. These results are in line with an amygdala-prefrontal-cingulate network controlling different appraisal levels, and suggest a specific role of the left inferior frontal gyrus in explicit evaluation of emotional prosody. In addition to brain areas commonly related to prosody processing, our results suggest specific functions of anterior cingulate and basal ganglia in detecting emotional prosody, particularly when explicit identification is necessary.

ARTICLE UPDATE - Regulating the expectation of reward via cognitive strategies.

Delgado MR, Gillis MM, Phelps EA.

Nature Neuroscience, in press

Previous emotion regulation research has been successful in altering aversive emotional reactions. It is unclear, however, whether such strategies can also efficiently regulate expectations of reward arising from conditioned stimuli, which can at times be maladaptive (for example, drug cravings). Using a monetary reward-conditioning procedure with cognitive strategies, we observed attenuation in both the physiological (skin conductance) and neural correlates (striatum) of reward expectation as participants engaged in emotion regulation.

ARTICLE UPDATE - Individual differences in disgust sensitivity modulate neural responses to aversive/disgusting stimuli.

Mataix-Cols D, An SK, Lawrence NS, Caseras X, Speckens A, Giampietro V, Brammer MJ, Phillips ML.

European Journal of Neuroscience, 27, 3050-3058.

Little is known about how individual differences in trait disgust sensitivity modulate the neural responses to disgusting stimuli in the brain. Thirty-seven adult healthy volunteers completed the Disgust Scale (DS) and viewed alternating blocks of disgusting and neutral pictures from the International Affective Picture System while undergoing fMRI scanning. DS scores correlated positively with activations in brain regions previously associated with disgust (anterior insula, ventrolateral prefrontal cortex-temporal pole, putamen-globus pallidus, dorsal anterior cingulate, and visual cortex) and negatively with brain regions involved in the regulation of emotions (dorsolateral and rostral prefrontal cortices). The results were not confounded by biological sex, anxiety or depression scores, which were statistically controlled for. Disgust sensitivity, a behavioral trait that is normally distributed in the general population, predicts the magnitude of the individual's neural responses to a broad range of disgusting stimuli. The results have implications for disgust-related psychiatric disorders.

ARTICLE UPDATE - How emotion affects older adults' memories for event details.

Kensinger EA.

Memory, in press

As adults age, they tend to have problems remembering the details of events and the contexts in which events occurred. This review presents evidence that emotion can enhance older adults' abilities to remember episodic detail. Older adults are more likely to remember affective details of an event (e.g., whether something was good or bad, or how an event made them feel) than they are to remember non-affective details, and they remember more details of emotional events than of non-emotional ones. Moreover, in some instances, emotion appears to narrow the age gap in memory performance. It may be that memory for affective context, or for emotional events, relies on cognitive and neural processes that are relatively preserved in older adults.

Sunday, June 29, 2008

ARTICLE UPDATE - Functional grouping and cortical-subcortical interactions in emotion: A meta-analysis of neuroimaging studies.

Kober H, Barrett LF, Joseph J, Bliss-Moreau E, Lindquist K, Wager TD.

Neuroimage, in press

We performed an updated quantitative meta-analysis of 162 neuroimaging studies of emotion using a novel multi-level kernel-based approach, focusing on locating brain regions consistently activated in emotional tasks and their functional organization into distributed functional groups, independent of semantically defined emotion category labels (e.g., "anger," "fear"). Such brain-based analyses are critical if our ways of labeling emotions are to be evaluated and revised based on consistency with brain data. Consistent activations were limited to specific cortical sub-regions, including multiple functional areas within medial, orbital, and inferior lateral frontal cortices. Consistent with a wealth of animal literature, multiple subcortical activations were identified, including amygdala, ventral striatum, thalamus, hypothalamus, and periaqueductal gray. We used multivariate parcellation and clustering techniques to identify groups of co-activated brain regions across studies. These analyses identified six distributed functional groups, including medial and lateral frontal groups, two posterior cortical groups, and paralimbic and core limbic/brainstem groups. These functional groups provide information on potential organization of brain regions into large-scale networks. Specific follow-up analyses focused on amygdala, periaqueductal gray (PAG), and hypothalamic (Hy) activations, and identified frontal cortical areas co-activated with these core limbic structures. While multiple areas of frontal cortex co-activated with amygdala sub-regions, a specific region of dorsomedial prefrontal cortex (dmPFC, Brodmann's Area 9/32) was the only area co-activated with both PAG and Hy. Subsequent mediation analyses were consistent with a pathway from dmPFC through PAG to Hy. These results suggest that medial frontal areas are more closely associated with core limbic activation than their lateral counterparts, and that dmPFC may play a particularly important role in the cognitive generation of emotional states.
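
The mediation result at the end (a pathway from dmPFC through PAG to Hy) rests on the standard product-of-coefficients logic: regress the mediator on the predictor, regress the outcome on both, and test the indirect path a*b. The sketch below is a generic single-mediator example on simulated data, not the authors' multi-level procedure; the variable names are stand-ins for regional activation estimates.

import numpy as np

def simple_mediation(x, m, y):
    # Product-of-coefficients mediation for x -> m -> y.
    # a: effect of x on m; b: effect of m on y controlling for x.
    # Returns the indirect effect a*b and an approximate Sobel z.
    def ols(design, target):
        coef, res, *_ = np.linalg.lstsq(design, target, rcond=None)
        dof = design.shape[0] - design.shape[1]
        sigma2 = res[0] / dof
        se = np.sqrt(sigma2 * np.diag(np.linalg.inv(design.T @ design)))
        return coef, se

    X = np.column_stack([np.ones_like(x), x])
    (_, a), (_, sa) = ols(X, m)                    # mediator on predictor
    XM = np.column_stack([np.ones_like(x), x, m])
    (_, _, b), (_, _, sb) = ols(XM, y)             # outcome on predictor + mediator
    ab = a * b
    sobel_z = ab / np.sqrt(b**2 * sa**2 + a**2 * sb**2)
    return ab, sobel_z

# simulated activations standing in for dmPFC (x), PAG (m), and Hy (y)
rng = np.random.default_rng(0)
x = rng.normal(size=200)
m = 0.6 * x + rng.normal(size=200)
y = 0.5 * m + rng.normal(size=200)
print(simple_mediation(x, m, y))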

ARTICLE UPDATE - The role of the orbitofrontal cortex in the pursuit of happiness and more specific rewards.

Burke KA, Franz TM, Miller DN, Schoenbaum G.

Nature, in press

Cues that reliably predict rewards trigger the thoughts and emotions normally evoked by those rewards. Humans and other animals will work, often quite hard, for these cues. This is termed conditioned reinforcement. The ability to use conditioned reinforcers to guide our behaviour is normally beneficial; however, it can go awry. For example, corporate icons, such as McDonald's Golden Arches, influence consumer behaviour in powerful and sometimes surprising ways, and drug-associated cues trigger relapse to drug seeking in addicts and animals exposed to addictive drugs, even after abstinence or extinction. Yet, despite their prevalence, it is not known how conditioned reinforcers control human or other animal behaviour. One possibility is that they act through the use of the specific rewards they predict; alternatively, they could control behaviour directly by activating emotions that are independent of any specific reward. In other words, the Golden Arches may drive business because they evoke thoughts of hamburgers and fries, or instead, may be effective because they also evoke feelings of hunger or happiness. Moreover, different brain circuits could support conditioned reinforcement mediated by thoughts of specific outcomes versus more general affective information. Here we have attempted to address these questions in rats. Rats were trained to learn that different cues predicted different rewards using specialized conditioning procedures that controlled whether the cues evoked thoughts of specific outcomes or general affective representations common to different outcomes. Subsequently, these rats were given the opportunity to press levers to obtain short and otherwise unrewarded presentations of these cues. We found that rats were willing to work for cues that evoked either outcome-specific or general affective representations. Furthermore the orbitofrontal cortex, a prefrontal region important for adaptive decision-making, was critical for the former but not for the latter form of conditioned reinforcement.

ARTICLE UPDATE - A comparison of two lists providing emotional norms for English words (ANEW and the DAL).

Whissell C.

Psychological Reports, 102, 597-600

Although different in terms of purpose, word-selection procedures, and rating scales, both the ANEW (n = 1034) and DAL (n = 8742) lists, which have 633 words in common, provide normative emotional ratings for English words. This research compared the lists and cross-validated the two main lexical dimensions of affect. Parallel representatives of the two dimensions (Valence and Pleasantness, Arousal and Activation) were correlated across lists (rs = .86, .63). In tune with their separate purposes, the ANEW list, which was designed to describe emotional words, included more rare words, while the DAL, which was designed for natural language applications, included more common ones. The Valence-Activation scatterplot for ANEW was C-shaped and included fewer Arousing words of medium Valence, such as "awake," "debate," and "proves," while the DAL included fewer less common words descriptive of emotion such as "maniac," "corrupt," and "lavish." In view of these differences, list similarities strongly support the generalizability of the two main lexical dimensions of affect.
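
Operationally, the comparison reduces to merging the two norm lists on their shared words and correlating the matched dimensions (Valence with Pleasantness, Arousal with Activation). A sketch assuming the norms are available as CSV files; the file and column names are hypothetical, since neither list is distributed in exactly this format.

import pandas as pd

# hypothetical file and column names, used only for illustration
anew = pd.read_csv("anew.csv")          # columns: word, valence, arousal
dal = pd.read_csv("dal.csv")            # columns: word, pleasantness, activation

# keep only the words the two lists share, then correlate the matched dimensions
common = anew.merge(dal, on="word", how="inner")
r_valence = common["valence"].corr(common["pleasantness"])   # reported r of about .86
r_arousal = common["arousal"].corr(common["activation"])     # reported r of about .63
print(len(common), r_valence, r_arousal)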

ARTICLE UPDATE - Preferences for emotional information in older and younger adults: A meta-analysis of memory and attention tasks.

Murphy NA, Isaacowitz DM.

Psychology and Aging, 23, 263-286

The authors conducted a meta-analysis to determine the magnitude of older and younger adults' preferences for emotional stimuli in studies of attention and memory. Analyses involved 1,085 older adults from 37 independent samples and 3,150 younger adults from 86 independent samples. Both age groups exhibited small to medium emotion salience effects (i.e., preference for emotionally valenced stimuli over neutral stimuli) as well as positivity preferences (i.e., preference for positively valenced stimuli over neutral stimuli) and negativity preferences (i.e., preference for negatively valenced stimuli over neutral stimuli). There were few age differences overall. Type of measurement appeared to influence the magnitude of effects; recognition studies indicated significant age effects, with older adults showing smaller emotion salience and negativity preference effects than younger adults.

ARTICLE UPDATE - Effects of semantic relatedness on recall of stimuli preceding emotional oddballs.

Smith RM, Beversdorf DQ.

Journal of the International Neuropsychological Society, 14, 620-628.

Semantic and episodic memory networks function as highly interconnected systems, both relying on the hippocampal/medial temporal lobe complex (HC/MTL). Episodic memory encoding triggers the retrieval of semantic information, serving to incorporate contextual relationships between the newly acquired memory and existing semantic representations. While emotional material augments episodic memory encoding at the time of stimulus presentation, interactions between emotion and semantic memory that contribute to subsequent episodic recall are not well understood. Using a modified oddball task, we examined the modulatory effects of negative emotion on semantic interactions with episodic memory by measuring the free-recall of serially presented neutral or negative words varying in semantic relatedness. We found increased free-recall for words related to and preceding emotionally negative oddballs, suggesting that negative emotion can indirectly facilitate episodic free-recall by enhancing semantic contributions during encoding. Our findings demonstrate the ability of emotion and semantic memory to interact to mutually enhance free-recall.

Sunday, June 22, 2008

ARTICLE UPDATE - Mirror neuron activation is associated with facial emotion processing.

Enticott PG, Johnston PJ, Herring SE, Hoy KE, Fitzgerald PB.

Neuropsychologia, in press

Theoretical accounts suggest that mirror neurons play a crucial role in social cognition. The current study used transcranial magnetic stimulation (TMS) to investigate the association between mirror neuron activation and facial emotion processing, a fundamental aspect of social cognition, among healthy adults (n=20). Facial emotion processing of static (but not dynamic) images correlated significantly with an enhanced motor response, proposed to reflect mirror neuron activation. These correlations did not appear to reflect general facial processing or pattern recognition, and provide support to current theoretical accounts linking the mirror neuron system to aspects of social cognition. We discuss the mechanism by which mirror neurons might facilitate facial emotion recognition.

Saturday, June 14, 2008

ARTICLE UPDATE - Electrocortical and electrodermal responses covary as a function of emotional arousal: A single-trial analysis.

Keil A, Smith JC, Wangelin BC, Sabatinelli D, Bradley MM, Lang PJ.

Psychophysiology, in press

Electrophysiological studies of human visual perception typically involve averaging across trials distributed over time during an experimental session. Using an oscillatory presentation, in which affective or neutral pictures were presented for 6 s, flickering on and off at a rate of 10 Hz, the present study examined single trials of steady-state visual evoked potentials. Moving window averaging and subsequent Fourier analysis at the stimulation frequency yielded spectral amplitude measures of electrocortical activity. Cronbach's alpha reached values >.79 across electrodes. Single-trial electrocortical activation was significantly related to the size of the skin conductance response recorded during affective picture viewing. These results suggest that individual trials of steady-state potentials may yield reliable indices of electrocortical activity in visual cortex and that amplitude modulation of these indices varies with emotional engagement.
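
The single-trial measure (moving-window averaging of the 10 Hz steady-state response followed by Fourier analysis at the stimulation frequency) can be sketched as follows, under assumed sampling and window parameters rather than the authors' exact pipeline.

import numpy as np

def ssvep_amplitude(trial, fs=500.0, f_stim=10.0, win_cycles=4):
    # Spectral amplitude at the stimulation frequency for one ssVEP trial.
    # trial: 1-D voltage trace for a single trial at one electrode
    # fs: sampling rate in Hz (assumed value); f_stim: flicker frequency
    # Windows spaced one stimulation cycle apart are averaged to improve the
    # signal-to-noise ratio, then the FFT bin at f_stim is read out.
    cycle = int(round(fs / f_stim))            # samples per stimulation cycle
    win = cycle * win_cycles                   # moving-window length in samples
    starts = range(0, len(trial) - win + 1, cycle)
    avg = np.mean([trial[s:s + win] for s in starts], axis=0)
    spectrum = np.abs(np.fft.rfft(avg)) / (len(avg) / 2)   # amplitude scaling
    freqs = np.fft.rfftfreq(len(avg), d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - f_stim))]

# simulated 6-s trial: a 10 Hz response of amplitude 2 embedded in noise
fs = 500.0
t = np.arange(0, 6, 1 / fs)
trial = 2.0 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)
print(ssvep_amplitude(trial, fs=fs))           # roughly 2 for this simulation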

ARTICLE UPDATE - Stimulus-driven and strategic neural responses to fearful and happy facial expressions in humans.

Williams MA, McGlone F, Abbott DF, Mattingley JB.

European Journal of Neuroscience, in press

The human amygdala responds selectively to consciously and unconsciously perceived facial expressions, particularly those that convey potential threat such as fear and anger. In many social situations, multiple faces with varying expressions confront observers yet little is known about the neural mechanisms involved in encoding several faces simultaneously. Here we used event-related fMRI to measure neural activity in pre-defined regions of interest as participants searched multi-face arrays for a designated target expression (fearful or happy). We conducted separate analyses to examine activations associated with each of the four multi-face arrays independent of target expression (stimulus-driven effects), and activations arising from the search for each of the target expressions, independent of the display type (strategic effects). Comparisons across display types, reflecting stimulus-driven influences on visual search, revealed activity in the amygdala and superior temporal sulcus (STS). By contrast, strategic demands of the task did not modulate activity in either the amygdala or STS. These results imply an interactive threat-detection system involving several neural regions. Crucially, activity in the amygdala increased significantly when participants correctly detected the target expression, compared with trials in which the identical target was missed, suggesting that the amygdala has a limited capacity for extracting affective facial expressions.

ARTICLE UPDATE - Sequential modulations of valence processing in the emotional Stroop task.

Kunde W, Mauer N.

Experimental Psychology, 55, 151-156

This study investigated trial-to-trial modulations of the processing of irrelevant valence information. Participants (N = 126) responded to the frame color of pictures with positive, neutral, or negative affective content--a procedure known as an emotional Stroop task (EST). As is typically found, positive and negative pictures delayed responses as compared to neutral pictures. However, the type and extent of this valence-based interference depended on the irrelevant picture valence in the preceding trial. Whereas preceding exposure to negative valence prompted interference from positive and negative pictures, such interference was removed after neutral trials. Following positive pictures, interference from negative but not from positive pictures was observed. We suggest that these sequential modulations reflect automatic self-regulatory selection processes that help to keep the balance between attending to task-relevant information and task-irrelevant information that signals important changes in the environment.

ARTICLE UPDATE - The Montreal Affective Voices: a validated set of nonverbal affect bursts for research on auditory affective processing.

Belin P, Fillion-Bilodeau S, Gosselin F.

Behavior Research Methods, 40, 531-539

The Montreal Affective Voices consist of 90 nonverbal affect bursts corresponding to the emotions of anger, disgust, fear, pain, sadness, surprise, happiness, and pleasure (plus a neutral expression), recorded by 10 different actors (5 of them male and 5 female). Ratings of valence, arousal, and intensity for eight emotions were collected for each vocalization from 30 participants. Analyses revealed high recognition accuracies for most of the emotional categories (mean of 68%). They also revealed significant effects of both the actors' and the participants' gender: The highest hit rates (75%) were obtained for female participants rating female vocalizations, and the lowest hit rates (60%) for male participants rating male vocalizations. Interestingly, the mixed situations--that is, male participants rating female vocalizations or female participants rating male vocalizations--yielded similar, intermediate ratings. The Montreal Affective Voices are available for download at vnl.psy.gla.ac.uk/ (Resources section).

ARTICLE UPDATE - Unpacking the cognitive architecture of emotion processes.

Grandjean D, Scherer KR.

Emotion, 8, 341-351.

The results of 2 electroencephalographic studies confirm Component Process Model (CPM) predictions that different appraisal checks have specific brain state correlates, occur rapidly in a brief time window after stimulation, and produce results that occur in sequential rather than parallel fashion. The data are compatible with the assumption that early checks (novelty and intrinsic pleasantness) occur in an automatic, unconscious mode of processing, whereas later checks, specifically goal conduciveness, require more extensive, effortful, and controlled processing. Overall, this work, combined with growing evidence for the CPM's response patterning predictions concerning autonomic physiological signatures, facial muscle movements, and vocalization changes, suggests that this model provides an appropriate basis for the unpacking of the cognitive architecture of emotion and its computational modeling.