Sunday, November 02, 2008

ARTICLE UPDATE - Constructing emotion: the experience of fear as a conceptual act.

Lindquist KA, Barrett LF.

Psychological Science, 19, 898-903

This study examined the hypothesis that emotion is a psychological event constructed from the more basic elements of core affect and conceptual knowledge. Participants were primed with conceptual knowledge of fear, conceptual knowledge of anger, or a neutral prime and then proceeded through an affect-induction procedure designed to induce unpleasant, high-arousal affect or a neutral affective state. As predicted, only those individuals for whom conceptual knowledge of fear had been primed experienced unpleasant core affect as evidence that the world was threatening. This study provides the first experimental support for the hypothesis that people experience world-focused emotion when they conceptualize their core affective state using accessible knowledge about emotion.

ARTICLE UPDATE - How does negative emotion cause false memories?

Brainerd CJ, Stein LM, Silveira RA, Rohenkohl G, Reyna VF.

Psychological Science, 19, 919-925

Remembering negative events can stimulate high levels of false memory, relative to remembering neutral events. In experiments in which the emotional valence of encoded materials was manipulated with their arousal levels controlled, valence produced a continuum of memory falsification. Falsification was highest for negative materials, intermediate for neutral materials, and lowest for positive materials. Conjoint-recognition analysis produced a simple process-level explanation: As one progresses from positive to neutral to negative valence, false memory increases because (a) the perceived meaning resemblance between false and true items increases and (b) subjects are less able to use verbatim memories of true items to suppress errors.

ARTICLE UPDATE - Attention, emotion, and deactivation of default activity in inferior medial prefrontal cortex.

Geday J, Gjedde A.

Brain and Cognition, in press

Attention deactivates the inferior medial prefrontal cortex (IMPC), but it is uncertain if emotions can attenuate this deactivation. To test the extent to which common emotions interfere with attention, we measured changes of a blood flow index of brain activity in key areas of the IMPC with positron emission tomography (PET) of labeled water (H(2)(15)O) uptake in the brains of 14 healthy subjects. The subjects performed either a less demanding or a more demanding task of attention while they watched neutral and emotive images of people in realistic indoor or outdoor situations. In the less demanding task, subjects used the index finger to press any key when a new image appeared. In the more demanding task, subjects chose the index or middle finger to press separate keys for outdoor and indoor scenes. Compared to the less demanding task, in a global search of all gray matter, the more demanding task significantly lowered regional cerebral blood flow (rCBF) in left IMPC, left and right insula, and right amygdala, and significantly raised blood flow in motor cortex and right precuneus. Restricted searches of rCBF changes by emotion, at coordinates of significant effect in previous studies of the medial prefrontal and temporal cortices, revealed significant activation in the fusiform gyrus, independently of the task. In contrast, we found no effect of emotional content in the IMPC, where emotions failed to override the effect of the task. The results are consistent with a role of the IMPC in the selection among competitive inputs from multiple brain regions, as predicted by the theory of a default mode of brain function. The absence of emotional interference with the deactivation of the default state suggests that the inferior prefrontal cortex continued to serve attention rather than submit to distraction.

Monday, October 20, 2008

ARTICLE UPDATE - Emotion Modulates Early Auditory Response to Speech.

Wang J, Nicol T, Skoe E, Sams M, Kraus N.

Journal of Cognitive Neuroscience, in press

In order to understand how emotional state influences the listener's physiological response to speech, subjects looked at emotion-evoking pictures while 32-channel EEG evoked responses (ERPs) to an unchanging auditory stimulus ("danny") were collected. The pictures were selected from the International Affective Picture System database. They were rated by participants and differed in valence (positive, negative, neutral), but not in dominance and arousal. Effects of viewing negative emotion pictures were seen as early as 20 msec (p = .006). An analysis of the global field power highlighted a time period of interest (30.4-129.0 msec) where the effects of emotion are likely to be the most robust. At the cortical level, the responses differed significantly depending on the valence ratings the subjects provided for the visual stimuli, which divided them into the high valence intensity group and the low valence intensity group. The high valence intensity group exhibited a clear divergent bivalent effect of emotion (ERPs at Cz during viewing neutral pictures subtracted from ERPs during viewing positive or negative pictures) in the time region of interest (r = .534, p < .01). Moreover, group differences emerged in the pattern of global activation during this time period. Although both groups demonstrated a significant effect of emotion (ANOVA, p = .004 and .006, low valence intensity and high valence intensity, respectively), the high valence intensity group exhibited a much larger effect. Whereas the low valence intensity group exhibited its smaller effect predominantly in frontal areas, the larger effect in the high valence intensity group was found globally, especially in the left temporal areas, with the largest divergent bivalent effects (ANOVA, p < .00001) in high valence intensity subjects around the midline. Thus, divergent bivalent effects were observed between 30 and 130 msec, and were dependent on the subject's subjective state, whereas the effects at 20 msec were evident only for negative emotion, independent of the subject's behavioral responses. Taken together, it appears that emotion can affect auditory function early in the sensory processing stream.

ARTICLE UPDATE - The valence strength of negative stimuli modulates visual novelty processing: Electrophysiological evidence from an event-related potential study.

Yuan J, Yang J, Meng X, Yu F, Li H.

Neuroscience, in press

In natural settings, the occurrence of unpredictable infrequent events is often associated with emotional reactions in the brain. Previous research suggested a special sensitivity of the brain to valence differences in emotionally negative stimuli. Thus, the present study hypothesizes that valence changes in infrequent negative stimuli would have differential effects on visual novelty processing. Event-related potentials (ERPs) were recorded for highly negative (HN), moderately negative (MN) and Neutral infrequent stimuli, and for the frequent standard stimulus while subjects performed a frequent/infrequent categorization task, irrespective of the emotional valence of the infrequent stimuli. The infrequent-frequent difference waves, which index visual novelty processing, displayed larger N2 amplitudes during the HN condition than during the MN condition, which, in turn, elicited greater N2 amplitudes than the Neutral condition. Similarly, in the infrequent-frequent difference waves, the frontocentral P3a and parietal LPC (late positive complex) elicited by the HN condition were more negative than those elicited by MN stimuli, which elicited more negative amplitudes than the Neutral condition. This suggests that negative emotions of diverse strength, as induced by negative stimuli of varying valences, are clearly different in their impact on visual novelty processing. Novel stimuli of increased negativity attracted more attentional resources during early novelty detection, and recruited increased inhibitive and evaluative processing during the later stages of response decision and reaction readiness, relative to novel stimuli of reduced negativity.
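
Since several of these ERP effects rest on the same infrequent-minus-frequent subtraction, here is a minimal sketch of how such a difference wave and a window-averaged amplitude might be computed, assuming epoched single-electrode data stored as NumPy arrays; the condition labels, sampling rate, and N2 window are illustrative choices, not values taken from the paper.

import numpy as np

# Assumed setup: epochs as (n_trials, n_samples) arrays for one electrode,
# sampled at 500 Hz from -100 to 800 ms. All numbers here are synthetic.
fs = 500
t = np.arange(-0.1, 0.8, 1.0 / fs)
rng = np.random.default_rng(0)
epochs = {
    "frequent": rng.normal(size=(400, t.size)),
    "HN": rng.normal(size=(80, t.size)),
    "MN": rng.normal(size=(80, t.size)),
    "neutral": rng.normal(size=(80, t.size)),
}

# Average across trials to get per-condition ERPs.
erp = {name: data.mean(axis=0) for name, data in epochs.items()}

# Infrequent-minus-frequent difference waves index novelty processing.
diff = {name: erp[name] - erp["frequent"] for name in ("HN", "MN", "neutral")}

# Mean amplitude in an assumed N2 window (250-350 ms post-stimulus).
n2 = (t >= 0.25) & (t <= 0.35)
for name, wave in diff.items():
    print(f"{name}: mean N2-window amplitude = {wave[n2].mean():.3f} (a.u.)")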

ARTICLE UPDATE - The role of valence and frequency in the emotional Stroop task.

Kahan TA, Hely CD.

Psychonomic Bulletin & Review, 15, 956-960

People are generally slower to name the color of emotion-laden words than they are to name that of emotionally neutral words. However, an analysis of this emotional Stroop effect (Larsen, Mercer, & Balota, 2006) indicates that the emotion-laden words used are sometimes longer, have lower frequencies, and have smaller orthographic neighborhoods than the emotionally neutral words. This difference in word characteristics raises the possibility that the emotional Stroop effect is partly caused by lexical rather than by emotional aspects of the stimuli, a conclusion supported by the finding that reaction times to name the color of low-frequency words are longer than those for high-frequency words (Burt, 2002). To examine the relative contributions of valence and frequency in color naming, we had 64 participants complete an experiment in which each of these variables was manipulated in a 3 x 2 factorial design; length, orthographic neighborhood density, and arousal were balanced. The data indicate that valence and word frequency interact in contributing to the emotional Stroop effect.
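
To make the design concrete, here is a hedged sketch of how a valence-by-frequency interaction on color-naming reaction times could be tested with a two-way ANOVA in Python; the column names, cell means, and simulated data are assumptions for illustration, not the authors' dataset or analysis code.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
rows = []
for valence in ("negative", "neutral", "positive"):
    for frequency in ("low", "high"):
        # Toy effect: extra slowing for low-frequency negative words.
        base = 620 + (25 if valence == "negative" else 0) + (15 if frequency == "low" else 0)
        extra = 20 if (valence == "negative" and frequency == "low") else 0
        for _ in range(64):  # 64 simulated participants, one mean RT per cell
            rows.append({"valence": valence, "frequency": frequency,
                         "rt": base + extra + rng.normal(0, 40)})
df = pd.DataFrame(rows)

# The 3 x 2 ANOVA; the valence-by-frequency interaction is the term of interest.
model = smf.ols("rt ~ C(valence) * C(frequency)", data=df).fit()
print(anova_lm(model, typ=2))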

Wednesday, October 15, 2008

ARTICLE UPDATE - How Does Reward Expectation Influence Cognition in the Human Brain?

James B. Rowe, Doris Eckstein, Todd Braver and Adrian M. Owen

Journal of Cognitive Neuroscience, 20, 1980-1992

The prospect of reward changes how we think and behave. We investigated how this occurs in the brain using a novel continuous performance task in which fluctuating reward expectations biased cognitive processes between competing spatial and verbal tasks. Critically, effects of reward expectancy could be distinguished from induced changes in task-related networks. Behavioral data confirm a specific bias toward the reward-relevant modality. Increased reward expectation improves reaction time and accuracy in the relevant dimension while reducing sensitivity to modulations of stimulus characteristics in the irrelevant dimension. Analysis of functional magnetic resonance imaging data shows that the proximity to reward over successive trials is associated with increased activity of the medial frontal cortex regardless of the modality. However, there are modality-specific changes in brain activity in the lateral frontal, parietal, and temporal cortex. Analysis of effective connectivity suggests that reward expectancy enhances coupling in both early visual pathways and within the prefrontal cortex. These distributed changes in task-related cortical networks arise from subjects' representations of future events and likelihood of reward.

Saturday, October 11, 2008

SPECIAL ISSUE - Music & Emotion

Behavioral and Brain Sciences, Volume 31, Issue 5.

ARTICLE UPDATE - Fear relevancy, strategy use, and probabilistic learning of cue-outcome associations.

Thomas LA, LaBar KS.

Learning & Memory, 15, 777-784

The goal of this study was to determine how the fear relevancy of outcomes during probabilistic classification learning affects behavior and strategy use. Novel variants of the "weather prediction" task were created, in which cue cards predicted either looming fearful or neutral outcomes in a between-groups design. Strategy use was examined by goodness-of-fit estimates of response patterns across trial blocks to mathematical models of simple, complex, and nonidentifiable strategies. Participants in the emotional condition who were fearful of the outcomes had greater skin conductance responses compared with controls and performed worse, used suboptimal strategies, and had less insight into the predictive cue features during initial learning. In contrast, nonfearful participants in the emotional condition used more optimal strategies than the other groups by the end of the two training days. Results have implications for understanding how individual differences in fear relevancy alter the impact of emotion on feedback-based learning.

ARTICLE UPDATE - Electrophysiological correlates of affective blindsight.

Gonzalez Andino SL, Grave de Peralta Menendez R, Khateb A, Landis T, Pegna AJ.

Neuroimage, in press

An EEG investigation was carried out in a patient with complete cortical blindness who presented affective blindsight, i.e. who performed above chance when asked to guess the emotional expressions on a series of faces. To uncover the electrophysiological mechanisms involved in this phenomenon we combined multivariate pattern recognition (MPR) with local field potential estimates provided by electric source imaging (ELECTRA). All faces, including neutral faces, elicited distinctive oscillatory EEG patterns that were correctly identified by the MPR algorithm as belonging to the class of facial expressions actually presented. Consequently, neural responses in this patient are not restricted to emotionally laden faces. Earliest non-specific differences between faces occur from 70 ms onwards in the superior temporal polysensory area (STP). Emotion-specific responses were found after 120 ms in the right anterior areas, with right amygdala activation observed only later (approximately 200 ms). Thus, affective blindsight might be mediated by subcortical afferents to temporal areas as suggested in some studies involving non-emotional stimuli. The early activation of the STP in the patient constitutes evidence for fast activation of higher order visual areas in humans despite bilateral V1 destruction. In addition, the absence of awareness of any visual experience in this patient suggests that neither extrastriate visual areas nor prefrontal cortex activation alone is sufficient for conscious perception, which might require recurrent processing within a network of several cerebral areas including V1.

ARTICLE UPDATE - The combined effect of gaze direction and facial expression on cueing spatial attention.

Pecchinenda A, Pes M, Ferlazzo F, Zoccolotti P.

Emotion, 8, 628-634

Empirical evidence shows an effect of gaze direction on cueing spatial attention, regardless of the emotional expression shown by a face, whereas a combined effect of gaze direction and facial expression has been observed on individuals' evaluative judgments. In 2 experiments, the authors investigated whether gaze direction and facial expression affect spatial attention depending upon the presence of an evaluative goal. Disgusted, fearful, happy, or neutral faces gazing left or right were followed by positive or negative target words presented either at the spatial location looked at by the face or at the opposite spatial location. Participants responded to target words based on affective valence (i.e., positive/negative) in Experiment 1 and on letter case (lowercase/uppercase) in Experiment 2. Results showed that participants responded much faster to targets presented at the spatial location looked at by disgusted or fearful faces but only in Experiment 1, when an evaluative task was used. The present findings clearly show that negative facial expressions enhance the attentional shifts due to eye-gaze direction, provided that there was an explicit evaluative goal present.

ARTICLE UPDATE - Directed forgetting of emotional words.

Minnema MT, Knowlton BJ.

Emotion, 8, 643-652

Emotional material may induce processing limitations affecting memory performance. In the present study, the authors investigated how the emotional content of words influences the degree to which participants can be directed to forget them. In Experiment 1, the authors found that negative-valence words were recalled better when participants were told to forget them than when they were told to remember them. This effect was only obtained when a study-list of negative words was presented after the cue to remember or forget the first list. The effect was correlated with negative mood as assessed by the PANAS. Similar results were obtained in Experiment 2, in which the induction of negative arousal by a mild stressor abolished the directed forgetting of words when the following study list was comprised of negative words. These results support the idea that directed forgetting relies on cognitive control processes that may be disrupted by negative emotion.

ARTICLE UPDATE - Trouble crossing the bridge: Altered interhemispheric communication of emotional images in anxiety.

Compton RJ, Carp J, Chaddock L, Fineman SL, Quandt LC, Ratliff JB.

Emotion, 8, 684-692.

Worry is thought to involve a strategy of cognitive avoidance, in which internal verbalization acts to suppress threatening emotional imagery. This study tested the hypothesis that worry-prone individuals would exhibit patterns of between-hemisphere communication that reflect cognitive avoidance. Specifically, the hypothesis predicted slower transfer of threatening images from the left to the right hemisphere among worriers. Event-related potential (ERP) measures of interhemispheric transfer time supported this prediction. Left-to-right hemisphere transfer times for angry faces were relatively slower for individuals scoring high in self-reported worry compared with those scoring low, whereas transfer of happy and neutral faces did not differ between groups. These results suggest that altered interhemispheric communication may constitute one mechanism of cognitive avoidance in worry.
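
As a rough illustration of how an ERP estimate of interhemispheric transfer time can be obtained, the sketch below takes the peak-latency difference between homologous left- and right-hemisphere electrodes; the electrode roles, search window, and synthetic waveforms are assumptions, not the authors' pipeline.

import numpy as np

fs = 500
t = np.arange(0.0, 0.5, 1.0 / fs)  # 0-500 ms epoch
rng = np.random.default_rng(3)

def peak_latency(erp, window=(0.13, 0.22)):
    # Latency of the most negative deflection (e.g., N170) within the window.
    mask = (t >= window[0]) & (t <= window[1])
    return t[mask][np.argmin(erp[mask])]

# Synthetic averages standing in for a left (e.g., P7) and right (e.g., P8)
# posterior electrode after right-visual-field presentation of a face: the
# left hemisphere receives the input directly, the right via callosal transfer.
erp_left = -np.exp(-((t - 0.16) ** 2) / 0.0004) + rng.normal(0, 0.05, t.size)
erp_right = -np.exp(-((t - 0.18) ** 2) / 0.0004) + rng.normal(0, 0.05, t.size)

ihtt = peak_latency(erp_right) - peak_latency(erp_left)
print(f"estimated left-to-right IHTT: {ihtt * 1000:.0f} ms")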

ARTICLE UPDATE - Interpretation bias in social anxiety as detected by event-related brain potentials.

Moser JS, Hajcak G, Huppert JD, Foa EB, Simons RF.

Emotion, 8, 693-700

Little is known about psychophysiological correlates of interpretation bias in social anxiety. To address this issue, the authors measured event-related brain potentials (ERPs) in high and low socially anxious individuals during a task wherein ambiguous scenarios were resolved with either a positive or negative ending. Specifically, the authors examined modulations of the P600, an ERP that peaks approximately 600 ms following stimulus onset and indexes violations of expectancy. Low-anxious individuals were characterized by an increased P600 to negative in comparison with positive sentence endings, suggesting a positive interpretation bias. In contrast, the high-anxious group evidenced equivalent P600 magnitude for negative and positive sentence endings, suggesting a lack of positive interpretation bias. Similar, but less reliable results emerged in earlier time windows, that is, 200-500 ms poststimulus. Reaction time, occurring around 900 ms poststimulus, failed to show a reliable interpretation bias. Results suggest that ERPs can detect interpretation biases in social anxiety before the emission of behavioral responses.

Saturday, September 20, 2008

ARTICLE UPDATE - Is emotional contagion special? An fMRI study on neural systems for affective and cognitive empathy.

Nummenmaa L, Hirvonen J, Parkkola R, Hietanen JK.

Neuroimage, in press

Empathy allows us to simulate others' affective and cognitive mental states internally, and it has been proposed that the mirroring or motor representation systems play a key role in such simulation. As emotions are related to important adaptive events linked with benefit or danger, simulating others' emotional states might constitute a special case of empathy. In this functional magnetic resonance imaging (fMRI) study we tested if emotional versus cognitive empathy would facilitate the recruitment of brain networks involved in motor representation and imitation in healthy volunteers. Participants were presented with photographs depicting people in neutral everyday situations (cognitive empathy blocks), or suffering serious threat or harm (emotional empathy blocks). Participants were instructed to empathize with specified persons depicted in the scenes. Emotional versus cognitive empathy resulted in increased activity in limbic areas involved in emotion processing (thalamus), and also in cortical areas involved in face (fusiform gyrus) and body (extrastriate cortex) perception, as well as in networks associated with mirroring of others' actions (inferior parietal lobule). When brain activation resulting from viewing the scenes was controlled, emotional empathy still engaged the mirror neuron system (premotor cortex) more than cognitive empathy. Further, thalamus and primary somatosensory and motor cortices showed increased functional coupling during emotional versus cognitive empathy. The results suggest that emotional empathy is special. Emotional empathy facilitates somatic, sensory, and motor representation of other people's mental states, and results in more vigorous mirroring of the observed mental and bodily states than cognitive empathy.

Sunday, September 14, 2008

ARTICLE UPDATE - The human amygdala is involved in general behavioral relevance detection: Evidence from an event-related functional magnetic resonance imaging Go-NoGo task.

Ousdal OT, Jensen J, Server A, Hariri AR, Nakstad PH, Andreassen OA.

Neuroscience, in press

The amygdala is classically regarded as a detector of potential threat and as a critical component of the neural circuitry mediating conditioned fear responses. However, it has been reported that the human amygdala responds to multiple expressions of emotions as well as emotionally neutral stimuli of a novel, uncertain or ambiguous nature. Thus, it has been proposed that the function of the amygdala may be of a more general nature, i.e. that it acts as a detector of behaviorally relevant stimuli [Sander D, Grafman J, Zalla T (2003) The human amygdala: an evolved system for relevance detection. Rev Neurosci 14:303-316]. To investigate this putative function of the amygdala, we used event-related functional magnetic resonance imaging (fMRI) and a modified Go-NoGo task composed of behaviorally relevant and irrelevant letter and number stimuli. Analyses revealed bilateral amygdala activation in response to letter stimuli that were behaviorally relevant as compared with letters with less behavioral relevance. Similar results were obtained for relatively infrequent NoGo relevant stimuli as compared with more frequent Go stimuli. Our findings support a role for the human amygdala in general detection of behaviorally relevant stimuli.

ARTICLE UPDATE - Natural selective attention: Orienting and emotion.

Bradley MM.

Psychophysiology, in press

The foundations of orienting and attention are hypothesized to stem from activation of defensive and appetitive motivational systems that evolved to protect and sustain the life of the individual. Motivational activation initiates a cascade of perceptual and motor processes that facilitate the selection of appropriate behavior. Among these are detection of significance, indexed by a late centro-parietal positivity in the event-related potential, enhanced perceptual processing, indexed by an initial cardiac deceleration, and preparation for action, indexed by electrodermal changes. Data exploring the role of stimulus novelty and significance in orienting are presented that indicate that different components of the orienting response habituate at different rates. Taken together, it is suggested that orienting is mediated by activation of fundamental motivational systems that have evolved to support survival.

ARTICLE UPDATE - Neural Circuitry Underlying the Regulation of Conditioned Fear and Its Relation to Extinction.

Delgado MR, Nearing KI, Ledoux JE, Phelps EA.

Neuron, 59, 829-838

Recent efforts to translate basic research to the treatment of clinical disorders have led to a growing interest in exploring mechanisms for diminishing fear. This research has emphasized two approaches: extinction of conditioned fear, examined across species; and cognitive emotion regulation, unique to humans. Here, we sought to examine the similarities and differences in the neural mechanisms underlying these two paradigms for diminishing fear. Using an emotion regulation strategy, we examine the neural mechanisms of regulating conditioned fear using fMRI and compare the resulting activation pattern with that observed during classic extinction. Our results suggest that the lateral PFC regions engaged by cognitive emotion regulation strategies may influence the amygdala, diminishing fear through similar vmPFC connections that are thought to inhibit the amygdala during extinction. These findings further suggest that humans may have developed complex cognition that can aid in regulating emotional responses while utilizing phylogenetically shared mechanisms of extinction.

ARTICLE UPDATE - Mapping the Semantic Space for the Subjective Experience of Emotional Responses to Odors.

Chrea C, Grandjean D, Delplanque S, Cayeux I, Le Calvé B, Aymard L, Velazco MI, Sander D, Scherer KR.

Chemical Senses, in press

Two studies were conducted to examine the nature of the verbal labels that describe emotional effects elicited by odors. In Study 1, a list of terms selected for their relevance to describe affective feelings induced by odors was assessed while participants were exposed to a set of odorant samples. The data were submitted to a series of exploratory factor analyses to 1) reduce the set of variables to a smaller set of summary scales and 2) get a preliminary sense of the differentiation of affective feelings elicited by odors. The goal of Study 2 was to replicate the findings of Study 1 with a larger sample of odorant samples and participants and to validate the preliminary model obtained in Study 1 by using confirmatory factor analysis. Overall, the findings point to a structure of affective responses to odors that differs from the classical taxonomies of emotion such as posited by discrete or bidimensional emotion theories. These findings suggest that the subjective affective experiences or feelings induced by odors are structured around a small group of dimensions that reflect the role of olfaction in well-being, social interaction, danger prevention, arousal or relaxation sensations, and conscious recollection of emotional memories.
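
A minimal sketch of the exploratory step is given below: reducing a matrix of odor-feeling ratings to a few latent dimensions with a rotated factor analysis in scikit-learn. The rating matrix is simulated and the number of factors is an arbitrary assumption; the confirmatory step of Study 2 would typically be run in dedicated structural-equation software instead.

import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(4)
n_ratings, n_terms, n_factors = 300, 12, 3  # e.g., participant-x-odor ratings of 12 feeling terms

# Simulate ratings generated by a few latent affective dimensions plus noise.
latent = rng.normal(size=(n_ratings, n_factors))
loadings = rng.normal(size=(n_factors, n_terms))
ratings = latent @ loadings + rng.normal(0, 0.5, size=(n_ratings, n_terms))

# Exploratory factor analysis with varimax rotation; rows of components_ show
# how strongly each feeling term loads on each summary dimension.
fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(ratings)
print(np.round(fa.components_, 2))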

Sunday, September 07, 2008

ARTICLE UPDATE - Music-induced mood modulates the strength of emotional negativity bias: An ERP study.

Chen J, Yuan J, Huang H, Chen C, Li H.

Neuroscience Letters, in press

The present study investigated the effect of music-elicited moods on the subsequent affective processing through a music-primed valence categorization task. Event-related potentials were recorded for positive and negative emotional pictures that were primed by happy or sad music excerpts. The reaction time data revealed longer reaction times (RTs) for pictures following negative versus positive music pieces, irrespective of the valence of the picture. Additionally, positive pictures elicited faster response latencies than negative pictures, irrespective of the valence of the musical prime. Moreover, the main effect of picture valence, and the music by picture valence interaction effect were both significant for P2 amplitudes and for the averaged amplitudes at 500-700ms interval. Negative pictures elicited smaller P2 amplitudes than positive pictures, and the amplitude differences between negative and positive pictures were larger with negative musical primes than with positive musical primes. Similarly, compared to positive pictures, negative pictures elicited more negative deflections during the 500-700ms interval across prime types. The amplitude differences between negative and positive pictures were again larger under negative versus positive music primes at this interval. Therefore, the present study observed a clear emotional negativity bias during either prime condition, and extended the previous findings by showing increased strength of the negative bias under negative mood primes. This suggests that the neural sensitivity of the brain to negative stimuli varies with individuals' mood states, and this bias is particularly intensified by negative mood states.

Monday, September 01, 2008

ARTICLE UPDATE - Visual search for faces with emotional expressions.

Frischen A, Eastwood JD, Smilek D.

Psychological Bulletin, 134, 662-676

The goal of this review is to critically examine contradictory findings in the study of visual search for emotionally expressive faces. Several key issues are addressed: Can emotional faces be processed preattentively and guide attention? What properties of these faces influence search efficiency? Is search moderated by the emotional state of the observer? The authors argue that the evidence is consistent with claims that (a) preattentive search processes are sensitive to and influenced by facial expressions of emotion, (b) attention guidance is influenced by a dynamic interplay of emotional and perceptual factors, and (c) visual search for emotional faces is influenced by the emotional state of the observer to some extent. The authors also argue that the way in which contextual factors interact to determine search performance needs to be explored further to draw sound conclusions about the precise influence of emotional expressions on search efficiency. Methodological considerations (e.g., set size, distractor background, task set) and ecological limitations of the visual search task are discussed. Finally, specific recommendations are made for future research directions.

ARTICLE UPDATE - Individual differences in learning the affective value of others under minimal conditions.

Bliss-Moreau E, Barrett LF, Wright CI.

Emotion, 8, 479-493.

This paper provides the first demonstration that people can learn about the positive and negative value of other people (e.g., neutral faces) under minimal learning conditions, with stable individual differences in this learning. In four studies, participants viewed neutral faces paired with sentences describing positive, negative or neutral behaviors on either two (Study 1) or four (Studies 2, 3, and 4) occasions. Participants were later asked to judge the valence of the faces alone. Studies 1 and 2 demonstrated that learning does occur under minimal conditions. Studies 3 and 4 further demonstrated that the degree of learning was moderated by Extraversion. Finally, Study 4 demonstrated that initial learning persisted over a period of 2 days. Implications for affective processing and person perception are discussed.

ARTICLE UPDATE - Emotion Theory and Research: Highlights, Unanswered Questions, and Emerging Issues.

Izard CE.

Annual Review of Psychology, in press

Emotion feeling is a phase of neurobiological activity, the key component of emotions and emotion-cognition interactions. Emotion schemas, the most frequently occurring emotion experiences, are dynamic emotion-cognition interactions that may consist of momentary/situational responding or enduring traits of personality that emerge over developmental time. Emotions play a critical role in the evolution of consciousness and the operations of all mental processes. Types of emotion relate differentially to types or levels of consciousness. Unbridled imagination and the ability for sympathetic regulation of empathy may represent both potential gains and losses from the evolution and ontogeny of emotion processes and consciousness. Unresolved issues include psychology’s neglect of levels of consciousness that are distinct from access or reflective consciousness and use of the term “unconscious mind” as a dumpster for all mental processes that are considered unreportable. The relation of memes and the mirror neuron system to empathy, sympathy, and cultural influences on the development of socioemotional skills are unresolved issues destined to attract future research.

ARTICLE UPDATE - Differential Influences of Emotion, Task, and Novelty on Brain Regions Underlying the Processing of Speech Melody.

Ethofer T, Kreifelts B, Wiethoff S, Wolf J, Grodd W, Vuilleumier P, Wildgruber D.

Journal of Cognitive Neuroscience, in press

We investigated the functional characteristics of brain regions implicated in processing of speech melody by presenting words spoken in either neutral or angry prosody during a functional magnetic resonance imaging experiment using a factorial habituation design. Subjects judged either affective prosody or word class for these vocal stimuli, which could be heard for either the first, second, or third time. Voice-sensitive temporal cortices, as well as the amygdala, insula, and mediodorsal thalami, reacted more strongly to angry than to neutral prosody. These stimulus-driven effects were not influenced by the task, suggesting that these brain structures are automatically engaged during processing of emotional information in the voice and operate relatively independently of cognitive demands. By contrast, the right middle temporal gyrus and the bilateral orbito-frontal cortices (OFC) responded more strongly during emotion than during word classification, but were also sensitive to anger expressed by the voices, suggesting that some perceptual aspects of prosody are also encoded within these regions subserving explicit processing of vocal emotion. The bilateral OFC showed a selective modulation by emotion and repetition, with particularly pronounced responses to angry prosody during the first presentation only, indicating a critical role of the OFC in detection of vocal information that is both novel and behaviorally relevant. These results converge with previous findings obtained for angry faces and suggest a general involvement of the OFC for recognition of anger irrespective of the sensory modality. Taken together, our study reveals that different aspects of voice stimuli and perceptual demands modulate distinct areas involved in the processing of emotional prosody.

Sunday, August 24, 2008

ARTICLE UPDATE - Affective valence, stimulus attributes, and P300: Color vs. black/white and normal vs. scrambled images.

Cano ME, Class QA, Polich J.

International Journal of Psychophysiology, in press

Pictures from the International Affective Picture System (IAPS) were selected to manipulate affective valence (unpleasant, neutral, pleasant) while keeping arousal level the same. The pictures were presented in an oddball paradigm, with a visual pattern used as the standard stimulus. Subjects pressed a button whenever a target was detected. Experiment 1 presented normal pictures in color and black/white. Control stimuli were constructed for both the color and black/white conditions by randomly rearranging 1 cm square fragments of each original picture to produce a "scrambled" image. Experiment 2 presented the same normal color pictures with large, medium, and small scrambled conditions (2, 1, and 0.5 cm squares). The P300 event-related brain potential demonstrated larger amplitudes over frontal areas for positive compared to negative or neutral images for normal color pictures in both experiments. Attenuated and nonsignificant valence effects were obtained for black/white images. Scrambled stimuli in each study yielded no valence effects but demonstrated typical P300 topography that increased from frontal to parietal areas. The findings suggest that P300 amplitude is sensitive to affective picture valence in the absence of stimulus arousal differences, and that stimulus color contributes to ERP valence effects.
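
For readers who want to reproduce this kind of control stimulus, below is a hedged sketch of scrambling an image by randomly rearranging square fragments; the block size is specified in pixels here (the paper defines fragments in centimeters on screen), and the input image is a synthetic stand-in rather than an actual IAPS picture.

import numpy as np

def scramble(img, block, seed=0):
    # Shuffle non-overlapping block x block tiles of an H x W x C image.
    h = (img.shape[0] // block) * block
    w = (img.shape[1] // block) * block
    img = img[:h, :w]  # crop so the image divides into whole tiles
    tiles = [img[r:r + block, c:c + block]
             for r in range(0, h, block) for c in range(0, w, block)]
    order = np.random.default_rng(seed).permutation(len(tiles))
    tiles = [tiles[i] for i in order]
    cols = w // block
    rows = [np.hstack(tiles[i:i + cols]) for i in range(0, len(tiles), cols)]
    return np.vstack(rows)

# Usage: block=40 px is an assumption for what "1 cm" would correspond to at a
# given screen resolution and viewing distance.
picture = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
scrambled = scramble(picture, block=40)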

ARTICLE UPDATE - Visual search is not blind to emotion.

Gerritsen C, Frischen A, Blake A, Smilek D, Eastwood JD.

Perception and Psychophysics, 70, 1047-1059

A series of three visual search tasks revealed more efficient search for hostile than for peaceful faces among neutral face distractors. Given that this effect has been observed inconsistently in prior literature, meta-analytic methods were employed for evaluating data across three experiments in order to develop a more valid estimate of the potentially small effect size. Furthermore, in the present experiments, different emotional meanings were conditioned to identical faces across observers, thus eliminating confounds between the physical characteristics and the emotional valences of the face stimuli. On the basis of the present findings, we argue that the visual system is capable of determining a face's emotional valence before the face becomes the focus of attention, and that emotional valence can be used by the visual system to determine subsequent attention allocation. However, meta-analytic results indicate that emotional valence makes a relatively small contribution to search efficiency in the present search context.
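
As a rough illustration of the kind of small-sample meta-analysis described above, the sketch below pools a within-subject effect size across experiments with fixed-effect, inverse-variance weighting; the d values, sample sizes, and variance approximation are illustrative assumptions, not the reported data.

import numpy as np

# Hypothetical (cohen_d, n) pairs for three experiments.
experiments = [(0.30, 24), (0.18, 28), (0.25, 26)]

d = np.array([e[0] for e in experiments])
n = np.array([e[1] for e in experiments])
var = 1.0 / n + d ** 2 / (2 * n)  # rough variance approximation for a within-subject d
w = 1.0 / var                     # inverse-variance weights

d_pooled = np.sum(w * d) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled d = {d_pooled:.2f}, "
      f"95% CI = [{d_pooled - 1.96 * se:.2f}, {d_pooled + 1.96 * se:.2f}]")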

Saturday, August 16, 2008

ARTICLE UPDATE - Integration of cross-modal emotional information in the human brain: An fMRI study.

Park JY, Gu BM, Kang DH, Shin YW, Choi CH, Lee JM, Kwon JS.

Cortex, in press

The interaction of information derived from the voice and facial expression of a speaker contributes to the interpretation of the emotional state of the speaker and to the formation of inferences about information that may have been merely implied in the verbal communication. Therefore, we investigated the brain processes responsible for the integration of emotional information originating from different sources. Although several studies have reported possible sites for integration, further investigation using a neutral emotional condition is required to locate emotion-specific networks. Using functional magnetic resonance imaging (fMRI), we explored the brain regions involved in the integration of emotional information from different modalities in comparison to those involved in integrating emotionally neutral information. There was significant activation in the superior temporal gyrus (STG); inferior frontal gyrus (IFG); and parahippocampal gyrus, including the amygdala, under the bimodal versus the unimodal condition, irrespective of the emotional content. We confirmed the results of previous studies by finding that the bimodal emotional condition elicited strong activation in the left middle temporal gyrus (MTG), and we extended this finding to locate the effects of emotional factors by using a neutral condition in the experimental design. We found anger-specific activation in the posterior cingulate, fusiform gyrus, and cerebellum, whereas we found happiness-specific activation in the MTG, parahippocampal gyrus, hippocampus, claustrum, inferior parietal lobule, cuneus, middle frontal gyrus (MFG), IFG, and anterior cingulate. These emotion-specific activations suggest that each emotion uses a separate network to integrate bimodal information and shares a common network for cross-modal integration.

ARTICLE UPDATE - I feel how you feel but not always: the empathic brain and its modulation.

Hein G, Singer T.

Current Opinions in Neurobiology, in press

The ability to share the other's feelings, known as empathy, has recently become the focus of social neuroscience studies. We review converging evidence that empathy with, for example, the pain of another person, activates part of the neural pain network of the empathizer, without first hand pain stimulation to the empathizer's body. The amplitude of empathic brain responses is modulated by the intensity of the displayed emotion, the appraisal of the situation, characteristics of the suffering person such as perceived fairness, and features of the empathizer such as gender or previous experience with pain-inflicting situations. Future studies in the field should address inter-individual differences in empathy, development and plasticity of the empathic brain over the life span, and the link between empathy, compassionate motivation, and prosocial behavior.

ARTICLE UPDATE - How emotional arousal and valence influence access to awareness.

Sheth BR, Pham T.

Vision Research, in press


Emotional stimuli attract attention and potentiate the effect of attention on contrast sensitivity, a feature of early vision. The amygdala, a key structure in emotional processing, responds to emotional content prior to awareness and projects to visual cortex. In light of evidence that the primary visual cortex does not have direct access to awareness, we hypothesize that emotion can affect the processing of a visual stimulus even before awareness. Moreover, emotion varies along at least two dimensions: arousal and affect (valence). Dissociating their effects is important to understanding the link between emotion and perception. We examined these effects in binocular rivalry. Pairs of images (54 total) were selected from a known database of natural images (IAPS). Pictures of a pair differed significantly along only one emotional dimension, creating two types: iso-valence and iso-arousal pairs. Pictures of a given pair were presented side by side in a rivalry setup for trials lasting 1 min each. The duration for which each eye's image was dominant in a trial (dominant phase duration) was obtained from 12 observers. Our results showed (a) a main effect of arousal: the dominant phase durations for more arousing pictures of the iso-valence pairs were significantly longer than those for the less arousing pictures; (b) no main effect of affect: the dominant phase durations of pleasant and unpleasant pictures of iso-arousal pairs did not differ significantly; and (c) an interaction between arousal and affect: for low arousal-level stimuli, the more pleasant image of the pair dominated significantly, whereas for high arousal-level stimuli, the more unpleasant image dominated significantly. Our findings suggest that the limbic system acts on visual signals early in processing. While emotional arousal and valence interactively affect access to visual awareness, only arousal exerts an independent control of such access.
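
A minimal sketch of the central comparison, under the assumption that each observer contributes one mean dominant-phase duration per pair member, is a paired t-test across observers; the numbers below are invented stand-ins for the reported data.

import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(2)
n_observers = 12
# Mean dominant-phase duration (s) for the more- vs. less-arousing member of
# the iso-valence pairs, one value per observer (simulated).
more_arousing = rng.normal(loc=3.4, scale=0.6, size=n_observers)
less_arousing = rng.normal(loc=2.9, scale=0.6, size=n_observers)

t_stat, p_val = ttest_rel(more_arousing, less_arousing)
print(f"t({n_observers - 1}) = {t_stat:.2f}, p = {p_val:.3f}")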

ARTICLE UPDATE - A common anterior insula representation of disgust observation, experience and imagination shows divergent functional connectivity pathways.

Jabbi M, Bastiaansen J, Keysers C.

PLoS ONE

Similar brain regions are involved when we imagine, observe and execute an action. Is the same true for emotions? Here, the same subjects were scanned while they (a) experience, (b) view someone else experiencing and (c) imagine experiencing gustatory emotions (through script-driven imagery). Capitalizing on the fact that disgust is repeatedly inducible within the scanner environment, we scanned the same participants while they (a) viewed actors tasting the content of a cup and looking disgusted, (b) tasted unpleasant bitter liquids to induce disgust, and (c) read and imagined scenarios involving disgust and their neutral counterparts. To reduce habituation, we inter-mixed trials of positive emotions in all three scanning experiments. We found voxels in the anterior insula and adjacent frontal operculum to be involved in all three modalities of disgust, suggesting that simulation in the context of social perception and mental imagery of disgust share a common neural substrate. Using effective connectivity, however, this shared region was found to be embedded in distinct functional circuits during the three modalities, suggesting why observing, imagining and experiencing an emotion feels so different.

Wednesday, August 13, 2008

ARTICLE UPDATE - The human amygdala is sensitive to the valence of pictures and sounds irrespective of arousal: an fMRI study

Silke Anders, Falk Eippert, Nikolaus Weiskopf and Ralf Veit

Social Cognitive and Affective Neuroscience, in press

With the advent of studies showing that amygdala responses are not limited to fear-related or highly unpleasant stimuli, studies began to focus on stimulus valence and stimulus-related arousal as predictors of amygdala activity. Recent studies in the chemosensory domain found amygdala activity to increase with the intensity of negative and positive chemosensory stimuli. This has led to the proposal that amygdala activity might be an indicator of emotional arousal, at least in the chemosensory domain. The present study investigated amygdala activity in response to visual and auditory stimuli. By selecting stimuli based on individual valence and arousal ratings, we were able to dissociate stimulus valence and stimulus-related arousal, both on the verbal and the peripheral physiological level. We found that the amygdala was sensitive to stimulus valence even when arousal was controlled for, and that increased amygdala activity was better explained by valence than by arousal. The proposed difference in the relation between amygdala activity and stimulus-related arousal between the chemosensory and the audiovisual domain is discussed in terms of the amygdala's embedding within these sensory systems and the processes by which emotional meaning is derived.

Saturday, August 09, 2008

ARTICLE UPDATE - Emotional experience modulates brain activity during fixation periods between tasks.

Pitroda S, Angstadt M, McCloskey MS, Coccaro EF, Phan KL.

Neuroscience Letters, in press

Functional imaging studies have begun to identify a set of brain regions whose activity is greater during 'rest' (e.g., fixation) states than during cognitive tasks. It has been posited that these regions constitute a network that supports the brain's default mode, which is temporarily suspended during specific goal-directed behaviors. Exogenous tasks that require cognitive effort are thought to command reallocation of resources away from the brain's default state. However, it remains unknown if brain activity during fixation periods between active task periods is influenced by previous task-related emotional content. We examined brain activity during periods of FIXATION (viewing and rating gray-scale images) interspersed among periods of viewing and rating complex images ('PICTURE') with positive, negative, and neutral affective content. We show that a selected group of brain regions (PCC, precuneus, IPL, vACC) do exhibit activity that is greater during FIXATION than during PICTURE periods; these regions have previously been implicated in the "default brain network". In addition, we report that activity within precuneus and IPL in the FIXATION period is attenuated by the preceding processing of images with positive and negative emotional content, relative to non-emotional content. These data suggest that the activity within regions implicated in the default network is modulated by the presence of environmental stimuli with motivational salience and, thus, adds to our understanding of brain function during periods of low cognitive, emotional, or sensory demand.

ARTICLE UPDATE - Functional neuroimaging of reward processing and decision-making: A review of aberrant motivational and affective processing in addiction.

Diekhof EK, Falkai P, Gruber O.

Brain Research Reviews, in press

The adequate integration of reward- and decision-related information provided by the environment is critical for behavioral success and subjective well being in everyday life. Functional neuroimaging research has already presented a comprehensive picture on affective and motivational processing in the healthy human brain and has recently also turned its interest to the assessment of impaired brain function in psychiatric patients. This article presents an overview on neuroimaging studies dealing with reward processing and decision-making by combining most recent findings from fundamental and clinical research. It provides an outline on the neural mechanisms guiding context-adequate reward processing and decision-making processes in the healthy brain, and also addresses pathophysiological alterations in the brain's reward system that have been observed in substance abuse and mood disorders, two highly prevalent classes of psychiatric disorders. The overall goal is to critically evaluate the specificity of neurophysiological alterations identified in these psychiatric disorders and associated symptoms, and to make suggestions concerning future research.

Saturday, August 02, 2008

ARTICLE UPDATE - The selective processing of emotional visual stimuli while detecting auditory targets: An ERP analysis.

Schupp HT, Stockburger J, Bublatzky F, Junghöfer M, Weike AI, Hamm AO.

Brain Research, in press

Event-related potential studies revealed an early posterior negativity (EPN) for emotional compared to neutral pictures. Exploring the emotion-attention relationship, a previous study observed that a primary visual discrimination task interfered with the emotional modulation of the EPN component. To specify the locus of interference, the present study assessed the fate of selective visual emotion processing while attention is directed towards the auditory modality. While simply viewing a rapid and continuous stream of pleasant, neutral, and unpleasant pictures in one experimental condition, processing demands of a concurrent auditory target discrimination task were systematically varied in three further experimental conditions. Participants successfully performed the auditory task as revealed by behavioral performance and selected event-related potential components. Replicating previous results, emotional pictures were associated with a larger posterior negativity compared to neutral pictures. Of main interest, increasing demands of the auditory task did not modulate the selective processing of emotional visual stimuli. With regard to the locus of interference, selective emotion processing as indexed by the EPN does not seem to reflect shared processing resources of visual and auditory modality.

ARTICLE UPDATE - Communicating emotion: Linking affective prosody and word meaning.

Nygaard LC, Queen JS.

Journal of Experimental Psychology: Human Perception & Performance, 34, 1017-1030

The present study investigated the role of emotional tone of voice in the perception of spoken words. Listeners were presented with words that had either a happy, sad, or neutral meaning. Each word was spoken in a tone of voice (happy, sad, or neutral) that was congruent, incongruent, or neutral with respect to affective meaning, and naming latencies were collected. Across experiments, tone of voice was either blocked or mixed with respect to emotional meaning. The results suggest that emotional tone of voice facilitated linguistic processing of emotional words in an emotion-congruent fashion. These findings suggest that information about emotional tone is used in the processing of linguistic content influencing the recognition and naming of spoken words in an emotion-congruent manner.

Saturday, July 26, 2008

ARTICLE UPDATE - Neural processing of vocal emotion and identity.

Spreckelmeyer KN, Kutas M, Urbach T, Altenmüller E, Münte TF.

Brain & Cognition, in press

The voice is a marker of a person's identity which allows individual recognition even if the person is not in sight. Listening to a voice also affords inferences about the speaker's emotional state. Both these types of personal information are encoded in characteristic acoustic feature patterns analyzed within the auditory cortex. In the present study 16 volunteers listened to pairs of non-verbal voice stimuli with happy or sad valence in two different task conditions while event-related brain potentials (ERPs) were recorded. In an emotion matching task, participants indicated whether the expressed emotion of a target voice was congruent or incongruent with that of a (preceding) prime voice. In an identity matching task, participants indicated whether or not the prime and target voice belonged to the same person. Effects based on emotion expressed occurred earlier than those based on voice identity. Specifically, P2 amplitudes (approximately 200 ms) were reduced for happy voices when primed by happy voices. Identity match effects, by contrast, did not start until around 300 ms. These results show a task-specific, emotion-based influence on the early stages of auditory sensory processing.

Friday, July 18, 2008

ARTICLE UPDATE - Discriminating between changes in bias and changes in accuracy for recognition memory of emotional stimuli.

Grider RC, Malmberg KJ.

Memory & Cognition, 36, 933-946

A debate has emerged as to whether recognition of emotional stimuli is more accurate or more biased than recognition of nonemotional stimuli. Teasing apart changes in accuracy versus changes in bias requires a measurement model. However, different models have been adopted by different researchers, and this has contributed to the current debate. In this article, different measurement models are discussed, and the signal detection model that is most appropriate for recognition is adopted to investigate the effects of valence and arousal on recognition memory performance, using receiver operating characteristic analyses. In addition, complementary two-alternative forced choice experiments were conducted in order to generalize the empirical findings and interpret them under a relatively relaxed set of measurement assumptions. Across all experiments, accuracy was greater for highly valenced stimuli and stimuli with high arousal value. In addition, a bias to endorse positively valenced stimuli was observed. These results are discussed within an adaptive memory framework that assumes that emotion plays an important role in the allocation of attentional resources.
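
To make the accuracy-versus-bias distinction concrete, here is a hedged sketch of the standard equal-variance signal-detection estimates computed from old/new recognition counts; the hit and false-alarm numbers and the log-linear correction are illustrative choices, not the authors' data or exact model.

from scipy.stats import norm

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    # Equal-variance signal-detection estimates with a log-linear correction
    # to avoid infinite z-scores at rates of 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa            # accuracy: separation of old/new distributions
    criterion = -(z_hit + z_fa) / 2   # bias: negative values indicate liberal responding
    return d_prime, criterion

# Hypothetical counts: items with both high hits and high false alarms
# (a liberal bias) versus items recognized more conservatively.
print(dprime_and_criterion(42, 8, 18, 32))
print(dprime_and_criterion(38, 12, 8, 42))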

ARTICLE UPDATE - Emotional states influence the neural processing of affective language.

Pratt NL, Kelly SD.

Social Neuroscience, 3, 1-9

The present study investigated whether emotional states influence the neural processing of language. Event-related potentials recorded the brain's response to positively and negatively valenced words (e.g., love vs. death) while participants were directly induced into positive and negative moods. ERP electrodes in frontal scalp regions of the brain distinguished positive and negative words around 400 ms poststimulus. The amplitude of this negative waveform showed a larger negativity for positive words compared to negative words in the frontal electrode region when participants were in a positive, but not negative, mood. These findings build on previous research by demonstrating that people process affective language differently when in positive and negative moods, and lend support to recent views that emotion and cognition interact during language comprehension.

ARTICLE UPDATE - Asymmetrical frontal ERPs, emotion, and behavioral approach/inhibition sensitivity.

Peterson CK, Gable P, Harmon-Jones E.

Social Neuroscience, 3, 113-124

The present study sought to extend past research on frontal brain asymmetry and individual differences by examining relationships of individual differences in behavioral inhibition/approach system (BIS/BAS) sensitivity with asymmetrical frontal event-related brain responses to startle probes presented during viewing of affective pictures. One hundred and ten participants were shown unpleasant, neutral, and pleasant affective pictures, and presented startle probes during picture presentations. Individual differences in BIS sensitivity related to relatively greater right frontal N100 amplitude to startle probes presented during pleasant and unpleasant pictures, whereas individual differences in BAS sensitivity related to reduced left frontal P300 amplitude to startle probes presented during pleasant pictures. The results of this study suggest that BIS sensitivity is related to greater relative right frontal cortical activity during affective states, while BAS sensitivity is related to greater relative left frontal cortical activity during appetitive states.

ARTICLE UPDATE - Friend or foe? Brain systems involved in the perception of dynamic signals of menacing and friendly social approaches.

Carter EJ, Pelphrey KA.

Social Neuroscience, 3, 151-163

During every social approach, humans must assess each other's intentions. Facial expressions provide cues to assist in these assessments via associations with emotion, the likelihood of affiliation, and personality. In this functional magnetic resonance imaging (fMRI) study, participants viewed animated male characters approaching them in a hallway and making either a happy or an angry facial expression. An expected increase in amygdala and superior temporal sulcus activation to the expression of anger was found. Notably, two other social brain regions also had an increased hemodynamic response to anger relative to happiness, including the lateral fusiform gyrus and a region centered in the middle temporal gyrus. Other brain regions showed little differentiation or an increased level of activity to the happy stimuli. These findings provide insight into the brain mechanisms involved in reading the intentions of other human beings in an overtly social context. In particular, they demonstrate brain regions sensitive to social signals of dominance and affiliation.

Thursday, July 10, 2008

ARTICLE UPDATE - Distinguishing expected negative outcomes from preparatory control in the human orbitofrontal cortex.

Ursu S, Clark KA, Stenger VA, Carter CS.

Brain Research, in press

The human orbitofrontal cortex (OFC) plays a critical role in adapting behavior according to the context provided by expected outcomes of actions. However, several aspects of this function are still poorly understood. In particular, it is unclear to what degree any subdivisions of the OFC are specifically engaged when negatively valenced outcomes are expected, and to what extent such areas might be involved in preparatory active control of behavior. We examined these issues in two complementary functional magnetic resonance imaging (fMRI) studies in which we simultaneously and independently manipulated monetary incentives for correct performance, and demands for active preparation of cognitive control. In both experiments, preparation for performance was associated with lateral PFC activity in response to high incentives, regardless of their valence, as well as in response to increased task demands. In contrast, areas of the OFC centered around the lateral orbital sulcus responded maximally to negatively perceived prospects, even when such prospects were associated with decreases in preparatory cognitive control. These results provide direct support for theoretical models which posit that the OFC contributes to behavioral regulation by representing the value of anticipated outcomes, but does not implement active control aimed at avoiding or pursuing outcomes. Furthermore, they provide additional converging evidence that the lateral OFC is involved in representing specifically the affective impact of anticipated negative outcomes.

ARTICLE UPDATE - The effect of appraisal level on processing of emotional prosody in meaningless speech.

Bach DR, Grandjean D, Sander D, Herdener M, Strik WK, Seifritz E.

Neuroimage, in press

In visual perception of emotional stimuli, low- and high-level appraisal processes have been found to engage different neural structures. Beyond emotional facial expression, emotional prosody is an important auditory cue for social interaction. Neuroimaging studies have proposed a network for emotional prosody processing that involves a right temporal input region and explicit evaluation in bilateral prefrontal areas. However, the comparison of different appraisal levels has so far relied upon using linguistic instructions during low-level processing, which might confound effects of processing level and linguistic task. In order to circumvent this problem, we examined processing of emotional prosody in meaningless speech during gender labelling (implicit, low-level appraisal) and emotion labelling (explicit, high-level appraisal). While bilateral amygdala, left superior temporal sulcus and right parietal areas showed stronger blood oxygen level-dependent (BOLD) responses during implicit processing, areas with stronger BOLD responses during explicit processing included the left inferior frontal gyrus, bilateral parietal, anterior cingulate and supplemental motor cortex. Emotional versus neutral prosody evoked BOLD responses in right superior temporal gyrus, bilateral anterior cingulate, left inferior frontal gyrus, insula and bilateral putamen. Basal ganglia and right anterior cingulate responses to emotional versus neutral prosody were particularly pronounced during explicit processing. These results are in line with an amygdala-prefrontal-cingulate network controlling different appraisal levels, and suggest a specific role of the left inferior frontal gyrus in explicit evaluation of emotional prosody. In addition to brain areas commonly related to prosody processing, our results suggest specific functions of anterior cingulate and basal ganglia in detecting emotional prosody, particularly when explicit identification is necessary.

ARTICLE UPDATE - Regulating the expectation of reward via cognitive strategies.

Delgado MR, Gillis MM, Phelps EA.

Nature Neuroscience, in press

Previous emotion regulation research has been successful in altering aversive emotional reactions. It is unclear, however, whether such strategies can also efficiently regulate expectations of reward arising from conditioned stimuli, which can at times be maladaptive (for example, drug cravings). Using a monetary reward-conditioning procedure with cognitive strategies, we observed attenuation in both the physiological (skin conductance) and neural correlates (striatum) of reward expectation as participants engaged in emotion regulation.

ARTICLE UPDATE - Individual differences in disgust sensitivity modulate neural responses to aversive/disgusting stimuli.

Mataix-Cols D, An SK, Lawrence NS, Caseras X, Speckens A, Giampietro V, Brammer MJ, Phillips ML.

European Journal of Neuroscience, 27, 3050-3058.

Little is known about how individual differences in trait disgust sensitivity modulate the neural responses to disgusting stimuli in the brain. Thirty-seven adult healthy volunteers completed the Disgust Scale (DS) and viewed alternating blocks of disgusting and neutral pictures from the International Affective Picture System while undergoing fMRI scanning. DS scores correlated positively with activations in brain regions previously associated with disgust (anterior insula, ventrolateral prefrontal cortex-temporal pole, putamen-globus pallidus, dorsal anterior cingulate, and visual cortex) and negatively with brain regions involved in the regulation of emotions (dorsolateral and rostral prefrontal cortices). The results were not confounded by biological sex, anxiety or depression scores, which were statistically controlled for. Disgust sensitivity, a behavioral trait that is normally distributed in the general population, predicts the magnitude of the individual's neural responses to a broad range of disgusting stimuli. The results have implications for disgust-related psychiatric disorders.

ARTICLE UPDATE - How emotion affects older adults' memories for event details.

Kensinger EA.

Memory, in press

As adults age, they tend to have problems remembering the details of events and the contexts in which events occurred. This review presents evidence that emotion can enhance older adults' abilities to remember episodic detail. Older adults are more likely to remember affective details of an event (e.g., whether something was good or bad, or how an event made them feel) than they are to remember non-affective details, and they remember more details of emotional events than of non-emotional ones. Moreover, in some instances, emotion appears to narrow the age gap in memory performance. It may be that memory for affective context, or for emotional events, relies on cognitive and neural processes that are relatively preserved in older adults.

Sunday, June 29, 2008

ARTICLE UPDATE - Functional grouping and cortical-subcortical interactions in emotion: A meta-analysis of neuroimaging studies.

Kober H, Barrett LF, Joseph J, Bliss-Moreau E, Lindquist K, Wager TD.

Neuroimage, in press

We performed an updated quantitative meta-analysis of 162 neuroimaging studies of emotion using a novel multi-level kernel-based approach, focusing on locating brain regions consistently activated in emotional tasks and their functional organization into distributed functional groups, independent of semantically defined emotion category labels (e.g., "anger," "fear"). Such brain-based analyses are critical if our ways of labeling emotions are to be evaluated and revised based on consistency with brain data. Consistent activations were limited to specific cortical sub-regions, including multiple functional areas within medial, orbital, and inferior lateral frontal cortices. Consistent with a wealth of animal literature, multiple subcortical activations were identified, including amygdala, ventral striatum, thalamus, hypothalamus, and periaqueductal gray. We used multivariate parcellation and clustering techniques to identify groups of co-activated brain regions across studies. These analyses identified six distributed functional groups, including medial and lateral frontal groups, two posterior cortical groups, and paralimbic and core limbic/brainstem groups. These functional groups provide information on potential organization of brain regions into large-scale networks. Specific follow-up analyses focused on amygdala, periaqueductal gray (PAG), and hypothalamic (Hy) activations, and identified frontal cortical areas co-activated with these core limbic structures. While multiple areas of frontal cortex co-activated with amygdala sub-regions, a specific region of dorsomedial prefrontal cortex (dmPFC, Brodmann's Area 9/32) was the only area co-activated with both PAG and Hy. Subsequent mediation analyses were consistent with a pathway from dmPFC through PAG to Hy. These results suggest that medial frontal areas are more closely associated with core limbic activation than their lateral counterparts, and that dmPFC may play a particularly important role in the cognitive generation of emotional states.
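
The reported dmPFC-PAG-Hy pathway is a mediation claim, which can be sketched with a standard single-mediator regression model. The code below simulates activation values and applies a Sobel test; it is a schematic of regression-based mediation under assumed data, not the multi-level co-activation analysis used in the paper.

    # Illustrative sketch of a single-mediator model (dmPFC -> PAG -> Hy) on simulated values.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 162                                   # number of studies in the meta-analysis
    dmpfc = rng.normal(size=n)                # hypothetical dmPFC activation (X)
    pag = 0.5 * dmpfc + rng.normal(size=n)    # hypothetical PAG activation (mediator M)
    hy = 0.4 * pag + rng.normal(size=n)       # hypothetical hypothalamic activation (Y)

    # Path a: M regressed on X.
    fit_a = sm.OLS(pag, sm.add_constant(dmpfc)).fit()
    # Path b (and direct effect): Y regressed on M and X together.
    fit_b = sm.OLS(hy, sm.add_constant(np.column_stack([pag, dmpfc]))).fit()

    a, sa = fit_a.params[1], fit_a.bse[1]
    b, sb = fit_b.params[1], fit_b.bse[1]
    sobel_z = (a * b) / np.sqrt(b**2 * sa**2 + a**2 * sb**2)
    print(f"indirect effect a*b = {a*b:.3f}, Sobel z = {sobel_z:.2f}")

Bootstrap confidence intervals are often preferred to the Sobel test in practice; the Sobel version is used here only because it keeps the sketch short.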

ARTICLE UPDATE - The role of the orbitofrontal cortex in the pursuit of happiness and more specific rewards.

Burke KA, Franz TM, Miller DN, Schoenbaum G.

Nature, in press

Cues that reliably predict rewards trigger the thoughts and emotions normally evoked by those rewards. Humans and other animals will work, often quite hard, for these cues. This is termed conditioned reinforcement. The ability to use conditioned reinforcers to guide our behaviour is normally beneficial; however, it can go awry. For example, corporate icons, such as McDonald's Golden Arches, influence consumer behaviour in powerful and sometimes surprising ways, and drug-associated cues trigger relapse to drug seeking in addicts and animals exposed to addictive drugs, even after abstinence or extinction. Yet, despite their prevalence, it is not known how conditioned reinforcers control human or other animal behaviour. One possibility is that they act by evoking representations of the specific rewards they predict; alternatively, they could control behaviour directly by activating emotions that are independent of any specific reward. In other words, the Golden Arches may drive business because they evoke thoughts of hamburgers and fries, or instead, may be effective because they also evoke feelings of hunger or happiness. Moreover, different brain circuits could support conditioned reinforcement mediated by thoughts of specific outcomes versus more general affective information. Here we have attempted to address these questions in rats. Rats were trained to learn that different cues predicted different rewards, using specialized conditioning procedures that controlled whether the cues evoked thoughts of specific outcomes or general affective representations common to different outcomes. Subsequently, these rats were given the opportunity to press levers to obtain short and otherwise unrewarded presentations of these cues. We found that rats were willing to work for cues that evoked either outcome-specific or general affective representations. Furthermore, the orbitofrontal cortex, a prefrontal region important for adaptive decision-making, was critical for the former but not for the latter form of conditioned reinforcement.

ARTICLE UPDATE - A comparison of two lists providing emotional norms for English words (ANEW and the DAL).

Whissell C.

Psychological Reports, 102, 597-600

Although different in terms of purpose, word-selection procedures, and rating scales, both the ANEW (n = 1034) and DAL (n = 8742) lists, which have 633 words in common, provide normative emotional ratings for English words. This research compared the lists and cross-validated the two main lexical dimensions of affect. Parallel representatives of the two dimensions (Valence and Pleasantness, Arousal and Activation) were correlated across lists (rs = .86, .63). In tune with their separate purposes, the ANEW list, which was designed to describe emotional words, included more rare words, while the DAL, which was designed for natural language applications, included more common ones. The Valence-Activation scatterplot for ANEW was C-shaped and included fewer Arousing words of medium Valence, such as "awake," "debate," and "proves," while the DAL included fewer less common words descriptive of emotion such as "maniac," "corrupt," and "lavish." In view of these differences, list similarities strongly support the generalizability of the two main lexical dimensions of affect.
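
A minimal sketch of the cross-list comparison, assuming tiny made-up excerpts of the two norm lists: merge on the shared words and correlate the parallel dimensions. The actual lists, the 633-word overlap, and the reported correlations (rs = .86, .63) come from the article, not from this toy example.

    # Illustrative sketch with invented mini-lists standing in for ANEW and DAL.
    import pandas as pd

    anew = pd.DataFrame({
        "word": ["love", "death", "chair", "maniac"],
        "valence": [8.7, 1.6, 5.1, 2.9],          # hypothetical ANEW-style ratings
        "arousal": [6.4, 4.6, 3.2, 5.4],
    })
    dal = pd.DataFrame({
        "word": ["love", "death", "chair", "awake"],
        "pleasantness": [2.9, 1.2, 1.9, 2.1],      # hypothetical DAL-style ratings
        "activation": [2.6, 1.8, 1.4, 2.3],
    })

    shared = anew.merge(dal, on="word")            # the real lists share 633 words
    print("shared words:", len(shared))
    print("valence vs. pleasantness: r =", round(shared["valence"].corr(shared["pleasantness"]), 2))
    print("arousal vs. activation:   r =", round(shared["arousal"].corr(shared["activation"]), 2))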

ARTICLE UPDATE - Preferences for emotional information in older and younger adults: A meta-analysis of memory and attention tasks.

Murphy NA, Isaacowitz DM.

Psychology and Aging, 23, 263-286

The authors conducted a meta-analysis to determine the magnitude of older and younger adults' preferences for emotional stimuli in studies of attention and memory. Analyses involved 1,085 older adults from 37 independent samples and 3,150 younger adults from 86 independent samples. Both age groups exhibited small to medium emotion salience effects (i.e., preference for emotionally valenced stimuli over neutral stimuli) as well as positivity preferences (i.e., preference for positively valenced stimuli over neutral stimuli) and negativity preferences (i.e., preference for negatively valenced stimuli over neutral stimuli). There were few age differences overall. Type of measurement appeared to influence the magnitude of effects: recognition studies indicated significant age effects, with older adults showing smaller emotion salience and negativity preference effects than younger adults.
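
For readers unfamiliar with how such meta-analytic estimates are combined, here is a small Python sketch of an inverse-variance random-effects mean (DerSimonian-Laird) over hypothetical effect sizes. It illustrates the general machinery only; the authors' moderator analyses and actual effect sizes are not reproduced.

    # Illustrative DerSimonian-Laird random-effects mean over invented effect sizes.
    import numpy as np

    d = np.array([0.35, 0.20, 0.50, 0.10, 0.45])     # hypothetical emotion-salience effects (Cohen's d)
    v = np.array([0.02, 0.03, 0.04, 0.02, 0.05])     # hypothetical sampling variances

    w = 1.0 / v                                       # fixed-effect weights
    q = np.sum(w * (d - np.sum(w * d) / np.sum(w))**2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(d) - 1)) / c)           # between-study variance estimate

    w_star = 1.0 / (v + tau2)                         # random-effects weights
    mean_d = np.sum(w_star * d) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    print(f"random-effects mean d = {mean_d:.2f} "
          f"(95% CI {mean_d - 1.96*se:.2f} to {mean_d + 1.96*se:.2f})")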

ARTICLE UPDATE - Effects of semantic relatedness on recall of stimuli preceding emotional oddballs.

Smith RM, Beversdorf DQ.

Journal of the International Neuropsychological Society, 14, 620-628.

Semantic and episodic memory networks function as highly interconnected systems, both relying on the hippocampal/medial temporal lobe complex (HC/MTL). Episodic memory encoding triggers the retrieval of semantic information, serving to incorporate contextual relationships between the newly acquired memory and existing semantic representations. While emotional material augments episodic memory encoding at the time of stimulus presentation, interactions between emotion and semantic memory that contribute to subsequent episodic recall are not well understood. Using a modified oddball task, we examined the modulatory effects of negative emotion on semantic interactions with episodic memory by measuring the free-recall of serially presented neutral or negative words varying in semantic relatedness. We found increased free-recall for words related to and preceding emotionally negative oddballs, suggesting that negative emotion can indirectly facilitate episodic free-recall by enhancing semantic contributions during encoding. Our findings demonstrate the ability of emotion and semantic memory to interact to mutually enhance free-recall.

Sunday, June 22, 2008

ARTICLE UPDATE - Mirror neuron activation is associated with facial emotion processing.

Enticott PG, Johnston PJ, Herring SE, Hoy KE, Fitzgerald PB.

Neuropsychologia, in press

Theoretical accounts suggest that mirror neurons play a crucial role in social cognition. The current study used transcranial magnetic stimulation (TMS) to investigate the association between mirror neuron activation and facial emotion processing, a fundamental aspect of social cognition, among healthy adults (n=20). Facial emotion processing of static (but not dynamic) images correlated significantly with an enhanced motor response, proposed to reflect mirror neuron activation. These correlations did not appear to reflect general facial processing or pattern recognition, and provide support to current theoretical accounts linking the mirror neuron system to aspects of social cognition. We discuss the mechanism by which mirror neurons might facilitate facial emotion recognition.

Saturday, June 14, 2008

ARTICLE UPDATE - Electrocortical and electrodermal responses covary as a function of emotional arousal: A single-trial analysis.

Keil A, Smith JC, Wangelin BC, Sabatinelli D, Bradley MM, Lang PJ.

Psychophysiology, in press

Electrophysiological studies of human visual perception typically involve averaging across trials distributed over time during an experimental session. Using an oscillatory presentation, in which affective or neutral pictures were presented for 6 s, flickering on and off at a rate of 10 Hz, the present study examined single trials of steady-state visual evoked potentials. Moving window averaging and subsequent Fourier analysis at the stimulation frequency yielded spectral amplitude measures of electrocortical activity. Cronbach's alpha reached values >.79 across electrodes. Single-trial electrocortical activation was significantly related to the size of the skin conductance response recorded during affective picture viewing. These results suggest that individual trials of steady-state potentials may yield reliable indices of electrocortical activity in visual cortex and that amplitude modulation of these indices varies with emotional engagement.
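
A simplified sketch of the single-trial logic, with simulated data: estimate each trial's spectral amplitude at the 10 Hz driving frequency and correlate it with a skin conductance measure. The sketch applies a plain FFT per trial rather than the moving-window averaging described above, and every value is made up.

    # Illustrative sketch (simulated EEG): single-trial ssVEP amplitude at 10 Hz vs. SCR.
    import numpy as np
    from scipy.stats import pearsonr

    fs = 500                                          # assumed EEG sampling rate (Hz)
    f_drive, dur = 10.0, 6.0                          # flicker rate and trial length from the abstract
    t = np.arange(0, dur, 1 / fs)
    rng = np.random.default_rng(2)

    def trial_amplitude(trial):
        """Spectral amplitude at the driving frequency from one trial's FFT."""
        spectrum = np.abs(np.fft.rfft(trial)) * 2 / trial.size
        freqs = np.fft.rfftfreq(trial.size, 1 / fs)
        return spectrum[np.argmin(np.abs(freqs - f_drive))]

    # Simulate 40 trials whose 10 Hz response scales with an "engagement" variable,
    # plus a skin conductance response driven by the same variable.
    engagement = rng.uniform(0.5, 2.0, 40)
    trials = [a * np.sin(2 * np.pi * f_drive * t) + rng.normal(0, 1, t.size) for a in engagement]
    scr = engagement + rng.normal(0, 0.3, 40)

    ssvep = np.array([trial_amplitude(tr) for tr in trials])
    r, p = pearsonr(ssvep, scr)
    print(f"single-trial ssVEP amplitude vs. SCR: r = {r:.2f}")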

ARTICLE UPDATE - Stimulus-driven and strategic neural responses to fearful and happy facial expressions in humans.

Williams MA, McGlone F, Abbott DF, Mattingley JB.

European Journal of Neuroscience, in press

The human amygdala responds selectively to consciously and unconsciously perceived facial expressions, particularly those that convey potential threat such as fear and anger. In many social situations, multiple faces with varying expressions confront observers, yet little is known about the neural mechanisms involved in encoding several faces simultaneously. Here we used event-related fMRI to measure neural activity in pre-defined regions of interest as participants searched multi-face arrays for a designated target expression (fearful or happy). We conducted separate analyses to examine activations associated with each of the four multi-face arrays independent of target expression (stimulus-driven effects), and activations arising from the search for each of the target expressions, independent of the display type (strategic effects). Comparisons across display types, reflecting stimulus-driven influences on visual search, revealed activity in the amygdala and superior temporal sulcus (STS). By contrast, strategic demands of the task did not modulate activity in either the amygdala or STS. These results imply an interactive threat-detection system involving several neural regions. Crucially, activity in the amygdala increased significantly when participants correctly detected the target expression, compared with trials in which the identical target was missed, suggesting that the amygdala has a limited capacity for extracting affective facial expressions.

ARTICLE UPDATE - Sequential modulations of valence processing in the emotional Stroop task.

Kunde W, Mauer N.

Experimental Psychology, 55, 151-156

This study investigated trial-to-trial modulations of the processing of irrelevant valence information. Participants (N = 126) responded to the frame color of pictures with positive, neutral, or negative affective content--a procedure known as an emotional Stroop task (EST). As is typically found, positive and negative pictures delayed responses as compared to neutral pictures. However, the type and extent of this valence-based interference depended on the irrelevant picture valence in the preceding trial. Whereas preceding exposure to negative valence prompted interference from positive and negative pictures, such interference was removed after neutral trials. Following positive pictures, interference from negative but not from positive pictures was observed. We suggest that these sequential modulations reflect automatic self-regulatory selection processes that help to keep the balance between attending to task-relevant information and task-irrelevant information that signals important changes in the environment.
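
The sequential-modulation analysis boils down to crossing current picture valence with the valence of the preceding trial. Below is a minimal Python sketch with simulated response times; column names and values are assumptions, not the authors' data.

    # Illustrative sketch: mean RT by current and preceding picture valence.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    valences = ["negative", "neutral", "positive"]
    trials = pd.DataFrame({
        "valence": rng.choice(valences, size=600),
        "rt": rng.normal(550, 60, size=600),          # hypothetical frame-color RTs (ms)
    })
    trials["previous_valence"] = trials["valence"].shift(1)

    # Cell means for the current-by-previous valence design (first trial has no predecessor).
    cell_means = (trials.dropna()
                  .groupby(["previous_valence", "valence"])["rt"]
                  .mean()
                  .unstack())
    print(cell_means.round(1))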

ARTICLE UPDATE - The Montreal Affective Voices: a validated set of nonverbal affect bursts for research on auditory affective processing.

Belin P, Fillion-Bilodeau S, Gosselin F.

Behavior Research Methods, 40, 531-539

The Montreal Affective Voices consist of 90 nonverbal affect bursts corresponding to the emotions of anger, disgust, fear, pain, sadness, surprise, happiness, and pleasure (plus a neutral expression), recorded by 10 different actors (5 of them male and 5 female). Ratings of valence, arousal, and intensity for eight emotions were collected for each vocalization from 30 participants. Analyses revealed high recognition accuracies for most of the emotional categories (mean of 68%). They also revealed significant effects of both the actors' and the participants' gender: The highest hit rates (75%) were obtained for female participants rating female vocalizations, and the lowest hit rates (60%) for male participants rating male vocalizations. Interestingly, the mixed situations--that is, male participants rating female vocalizations or female participants rating male vocalizations--yielded similar, intermediate ratings. The Montreal Affective Voices are available for download at vnl.psy.gla.ac.uk/ (Resources section).
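
A small sketch of the kind of breakdown reported above, using simulated trial-level ratings: recognition accuracy split by participant gender and actor gender. The numbers are invented; only the structure of the analysis is illustrated.

    # Illustrative sketch: hit rates by participant gender and actor gender (simulated).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(4)
    n = 1000
    data = pd.DataFrame({
        "actor_gender": rng.choice(["female", "male"], n),
        "participant_gender": rng.choice(["female", "male"], n),
        "correct": rng.random(n) < 0.68,              # hypothetical accuracy near the reported mean
    })
    hit_rates = data.groupby(["participant_gender", "actor_gender"])["correct"].mean().unstack()
    print((hit_rates * 100).round(1))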

ARTICLE UPDATE - Unpacking the cognitive architecture of emotion processes.

Grandjean D, Scherer KR.

Emotion, 8, 341-351.

The results of 2 electroencephalographic studies confirm Component Process Model (CPM) predictions that different appraisal checks have specific brain state correlates, occur rapidly in a brief time window after stimulation, and produce results that occur in sequential rather than parallel fashion. The data are compatible with the assumption that early checks (novelty and intrinsic pleasantness) occur in an automatic, unconscious mode of processing, whereas later checks, specifically goal conduciveness, require more extensive, effortful, and controlled processing. Overall, this work, combined with growing evidence for the CPM's response patterning predictions concerning autonomic physiological signatures, facial muscle movements, and vocalization changes, suggests that this model provides an appropriate basis for the unpacking of the cognitive architecture of emotion and its computational modeling.

ARTICLE UPDATE - Decoding of affective facial expressions in the context of emotional situations.

Sommer M, Döhnel K, Meinhardt J, Hajak G.

Neuropsychologia, in press

The ability to recognize other persons' affective states and to link them with aspects of the current situation arises early in development and is a precursor of a Theory of Mind (ToM). Until now, studies have investigated either the processing of affective faces or of affective pictures. In the present study, we aimed to create a scenario closer to everyday situations. We employed fMRI and a picture-matching task to explore the neural correlates associated with integrating and decoding facial affective expressions in the context of affective situations. In the emotion condition, participants judged an emotional facial expression with respect to the content of an emotional picture. In the two other conditions, participants indicated colour matches against the background of either affective or scrambled pictures. In contrast to colour matching on scrambled pictures, colour matching on emotional pictures resulted in longer reaction times and increased activation of the bilateral fusiform and occipital gyri. These results indicate that, although the emotional background was task irrelevant, participants may have attended to it. The emotion task was associated with longer reaction times and with activation of the bilateral fusiform and occipital gyri; additionally, emotion attribution induced left amygdala activity. Possibly, attention processes and amygdala projections modulated the activation found in the occipital and fusiform areas. Furthermore, the involvement of the amygdala in the ToM precursor ability to link facial expressions with an emotional situation may indicate that the amygdala contributes to the development of stable ToM abilities.

ARTICLE UPDATE - Emotion, decision making, and the amygdala.

Seymour B, Dolan R.

Neuron, 58, 662-671

Emotion plays a critical role in many contemporary accounts of decision making, but exactly what underlies its influence and how this is mediated in the brain remain far from clear. Here, we review behavioral studies that suggest that Pavlovian processes can exert an important influence over choice and may account for many effects that have traditionally been attributed to emotion. We illustrate how recent experiments cast light on the underlying structure of Pavlovian control and argue that generally this influence makes good computational sense. Corresponding neuroscientific data from both animals and humans implicate a central role for the amygdala through interactions with other brain areas. This yields a neurobiological account of emotion in which it may operate, often covertly, to optimize rather than corrupt economic choice.

ARTICLE UPDATE - Affective learning enhances visual detection and responses in primary visual cortex.

Padmala S, Pessoa L.

Journal of Neuroscience, 28, 6202-6210

The affective significance of a visual item is thought to lead to enhanced visual processing. However, the precise link between enhanced visual perception of emotion-laden items and increased visual responses remains poorly understood. To investigate this link, we acquired functional magnetic resonance imaging (fMRI) data while participants performed a challenging visual detection task. Grating stimuli were physically identical and differed only as a function of their previous exposure history; CS+ stimuli were initially paired with shock, whereas CS- stimuli were not. Behaviorally, subjects were both faster and more accurate during CS+ relative to CS- target detection. These behavioral results were paralleled by increases in fMRI responses across early, retinotopically organized visual cortex, which was mapped in a separate fMRI session. Logistic regression analyses revealed that trial-by-trial fluctuations in fMRI responses were closely linked to trial type, such that fMRI signal strength reliably predicted the probability of a hit trial across retinotopically organized visual cortex, including area V1. For instance, during the CS+ condition, a 0.5% signal change increased the probability of a hit from chance to 67.3-73.5% in V1-V4 (the highest increase was observed in area V1). Furthermore, across participants, differential fMRI responses to hits versus correct rejects were correlated with behavioral performance. Our findings provide a close link between increased activation in early visual cortex and improved behavioral performance as a function of the affective significance of an item.
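
The trial-by-trial logistic regression can be sketched as follows, with simulated hits and percent signal change standing in for the real V1 data; the fitted slope then converts a given signal change into a predicted hit probability. This is an assumed, simplified version of the analysis, not the authors' code.

    # Illustrative sketch: logistic regression of detection outcome on % signal change.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 200
    signal = rng.normal(0.4, 0.3, n)                  # hypothetical V1 % signal change per trial
    # Simulate hits whose probability rises with signal strength (CS+ trials assumed).
    p_hit = 1 / (1 + np.exp(-(-1.0 + 4.0 * signal)))
    hit = rng.random(n) < p_hit

    model = sm.Logit(hit.astype(float), sm.add_constant(signal)).fit(disp=0)
    b0, b1 = model.params
    for delta in (0.0, 0.5):
        prob = 1 / (1 + np.exp(-(b0 + b1 * delta)))
        print(f"signal change {delta:.1f}%: predicted P(hit) = {prob:.2f}")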

Saturday, May 31, 2008

ARTICLE UPDATE - Affective primes suppress attention bias to threat in socially anxious individuals.

Helfinstein SM, White LK, Bar-Haim Y, Fox NA.

Behaviour Research and Therapy, in press

Anxious individuals show an attention bias towards threatening information. However, under conditions of sustained environmental threat this otherwise-present attention bias disappears. It remains unclear whether this suppression of attention bias can be caused by a transient activation of the fear system. In the present experiment, high socially anxious and low socially anxious individuals (HSA group, n=12; LSA group, n=12) performed a modified dot-probe task in which they were shown either a neutral or a socially threatening prime word prior to each trial. EEG was recorded, and ERP components elicited by the prime and face displays were computed. HSA individuals showed an attention bias to threat after a neutral prime, but no attention bias after a threatening prime, demonstrating that suppression of attention bias can occur after a transient activation of the fear system. LSA individuals showed the opposite pattern: no evidence of a bias to threat with neutral primes, but induction of an attention bias to threat following threatening primes. ERP results suggested differential processing of the prime and face displays by HSA and LSA individuals. However, no group-by-prime interaction was found for any of the ERP components.
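
For reference, the behavioral attention-bias index in a dot-probe task is typically the incongruent-minus-congruent response time difference, computed separately per prime type. A minimal sketch with simulated data follows; column names and values are assumptions.

    # Illustrative sketch: dot-probe attention bias score by prime type (simulated RTs).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(6)
    trials = pd.DataFrame({
        "prime": rng.choice(["neutral", "threat"], 400),
        "congruence": rng.choice(["congruent", "incongruent"], 400),
        "rt": rng.normal(600, 80, 400),               # hypothetical probe RTs (ms)
    })
    means = trials.groupby(["prime", "congruence"])["rt"].mean().unstack()
    bias = means["incongruent"] - means["congruent"]  # positive = bias toward threat
    print(bias.round(1))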

ARTICLE UPDATE - When do motor behaviors (mis)match affective stimuli? An evaluative coding view of approach and avoidance reactions.

Eder AB, Rothermund K.

Journal of Experimental Psychology: General, 137, 262-281

Affective-mapping effects between affective stimuli and lever movements are critically dependent upon the evaluative meaning of the response labels that are used in the task instructions. In Experiments 1 and 2, affective-mapping effects predicted by specific-muscle-activation and distance-regulation accounts were replicated when the standard response labels towards and away were used, but were reversed when identical lever movements were labeled downwards and upwards. In Experiment 3, affective-mapping effects were produced with affectively labeled right and left lever movements that are intrinsically unrelated to approach and avoidance. Experiments 4 and 5 revealed that affective-mapping effects are not mediated by memory retrieval processes and depend on the execution of affectively coded responses. The results support the assumption that evaluative implications of action instructions assign affective codes to motor responses at a representational level, and that these codes interact with stimulus evaluations at the response selection stage.

ARTICLE UPDATE - Time course of the involvement of the right anterior superior temporal gyrus and the right fronto-parietal operculum in emotional prosody perception.

Hoekert M, Bais L, Kahn RS, Aleman A.

PLoS One, 3, e2244

In verbal communication, not only the meaning of the words but also the tone of voice (prosody) conveys crucial information about the emotional state and intentions of others. In various studies, right frontal and right temporal regions have been found to play a role in emotional prosody perception. Here, we used triple-pulse repetitive transcranial magnetic stimulation (rTMS) to shed light on the precise time course of involvement of the right anterior superior temporal gyrus and the right fronto-parietal operculum. We hypothesized that information would be processed in the right anterior superior temporal gyrus before being processed in the right fronto-parietal operculum. Right-handed healthy subjects performed an emotional prosody task. During listening to each sentence, a triplet of TMS pulses was applied to one of the regions at one of six time points (400-1900 ms). Results showed a significant main effect of Time for the right anterior superior temporal gyrus and the right fronto-parietal operculum. The largest interference was observed half-way through the sentence. This effect was stronger for withdrawal emotions than for the approach emotion. A further experiment that included an active control condition, TMS over the EEG site POz (midline parietal-occipital junction), revealed stronger effects at the fronto-parietal operculum and anterior superior temporal gyrus relative to the active control condition. No evidence was found for sequential processing of emotional prosodic information from the right anterior superior temporal gyrus to the right fronto-parietal operculum; rather, the results suggest more parallel processing. Our results suggest that both the right fronto-parietal operculum and the right anterior superior temporal gyrus are critical for emotional prosody perception at a relatively late time period after sentence onset. This may reflect the fact that emotional cues can still be ambiguous at the beginning of a sentence but become more apparent half-way through.

ARTICLE UPDATE - Audio-visual integration of emotion expression.

Collignon O, Girard S, Gosselin F, Roy S, Saint-Amour D, Lassonde M, Lepore F.

Brain Research, in press

Although emotions are usually recognized by combining facial and vocal expressions, the multisensory nature of affect perception has scarcely been investigated. In the present study, we report three experiments on the multisensory perception of emotions using newly validated sets of dynamic visual and non-linguistic vocal clips of affect expressions. In Experiment 1, participants were required to categorise fear and disgust expressions displayed auditorily, visually, or as congruent or incongruent audio-visual stimuli. Results showed faster and more accurate categorisation in the bimodal congruent situation than in the unimodal conditions. In the incongruent situation, participants preferentially categorised the affective expression based on the visual modality, demonstrating a visual dominance in emotional processing. However, when the reliability of the visual stimuli was diminished, participants categorised incongruent bimodal stimuli preferentially via the auditory modality. These results demonstrate that visual dominance in affect perception does not occur in a rigid manner, but follows flexible, situation-dependent rules. In Experiment 2, we asked participants to pay attention to only one sensory modality at a time in order to test the putatively mandatory nature of multisensory affective interactions. We observed that even when asked to ignore concurrent sensory information, participants were significantly affected by the irrelevant information while processing the target, especially when the target modality was less reliable. Altogether, these findings indicate that the perception of emotion expressions is a robustly multisensory process that follows rules previously observed in other perceptual domains.

Friday, May 09, 2008

ARTICLE UPDATE - Affect, attention, or anticipatory arousal? Human blink startle modulation in forward and backward affective conditioning.

Mallan KM, Lipp OV, Libera M.

International Journal of Psychophysiology, in press

Affect modulates the blink startle reflex in the picture-viewing paradigm; however, the process responsible for reflex modulation during conditional stimuli (CSs) that have acquired valence through affective conditioning remains unclear. In Experiment 1, neutral shapes (CSs) and valenced or neutral pictures (USs) were paired in a forward (CS-->US) manner. Pleasantness ratings supported affective learning of positive and negative valence. Post-acquisition, blink reflexes were larger during the pleasant and unpleasant CSs than during the neutral CS. Attention or anticipatory arousal, rather than affect, were suggested as the sources of startle modulation. Experiment 2 confirmed that affective learning in the picture-picture paradigm was not affected by whether the CS preceded the US. Pleasantness ratings and affective priming revealed similar extents of affective learning following forward, backward, or simultaneous pairings of CSs and USs. Experiment 3 utilized a backward conditioning procedure (US-->CS) to minimize effects of US anticipation. Again, blink reflexes were larger during CSs paired with valenced USs regardless of US valence, implicating attention rather than anticipatory arousal or affect as the process modulating startle in this paradigm.

ARTICLE UPDATE - Emotion and attention in visual word processing-An ERP study.

Kissler J, Herbert C, Winkler I, Junghofer M.

Biological Psychology, in press

Emotional words are preferentially processed during silent reading. Here, we investigate to what extent different components of the visual evoked potential, namely the P1, N1, the early posterior negativity (EPN, around 250 ms after word onset) as well as the late positive complex (LPC, around 500 ms) respond differentially to emotional words and whether this response depends on the availability of attentional resources. Subjects viewed random sequences of pleasant, neutral and unpleasant adjectives and nouns. They were first instructed to simply read the words and then to count either adjectives or nouns. No consistent effects emerged for the P1 and N1. However, during both reading and counting the EPN was enhanced for emotionally arousing words (pleasant and unpleasant), regardless of whether the word belonged to a target or a non-target category. A task effect on the EPN was restricted to adjectives, but the effect did not interact with emotional content. The later centro-parietal LPC (450-650 ms) showed a large enhancement for the attended word class. A small and topographically distinct emotion-LPC effect was found specifically in response to pleasant words, both during silent reading and the active task. Thus, emotional word content is processed effortlessly and automatically and is not subject to interference from a primary grammatical decision task. The results are in line with other reports of early automatic semantic processing as reflected by posterior negativities in the ERP around 250 ms after word onset. Implications for models of emotion-attention interactions in the brain are discussed.
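
A schematic of the time-window analysis used for components like the EPN and LPC: average each condition waveform within a fixed latency window. The sketch below uses random stand-in waveforms and an assumed 250-300 ms EPN window, so the printed numbers are meaningless; only the windowing logic is illustrated.

    # Illustrative sketch: mean amplitude per word category in EPN and LPC windows.
    import numpy as np

    fs = 500                                          # assumed sampling rate (Hz)
    t = np.arange(-0.1, 0.8, 1 / fs)                  # epoch from -100 to 800 ms around word onset
    rng = np.random.default_rng(7)

    def window_mean(erp, start, stop):
        """Mean amplitude between start and stop (seconds) relative to word onset."""
        return erp[(t >= start) & (t < stop)].mean()

    categories = ["pleasant", "neutral", "unpleasant"]
    erps = {c: rng.normal(0, 1, t.size) for c in categories}   # stand-in condition averages (microvolts)

    for c in categories:
        epn = window_mean(erps[c], 0.25, 0.30)        # assumed EPN window
        lpc = window_mean(erps[c], 0.45, 0.65)        # LPC window from the abstract
        print(f"{c}: EPN = {epn:.2f} uV, LPC = {lpc:.2f} uV")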

ARTICLE UPDATE - Early emotion word processing: Evidence from event-related potentials.

Scott GG, O'Donnell PJ, Leuthold H, Sereno SC.

Biological Psychology, in press

Behavioral and electrophysiological responses were monitored to 80 controlled sets of emotionally positive, negative, and neutral words presented randomly in a lexical decision paradigm. Half of the words were low frequency and half were high frequency. Behavioral results showed significant effects of frequency and emotion as well as an interaction. Prior research has demonstrated sensitivity to lexical processing in the N1 component of the event-related brain potential (ERP). In this study, the N1 (135-180 ms) showed a significant emotion by frequency interaction. The P1 window (80-120 ms) preceding the N1 as well as post-N1 time windows, including the Early Posterior Negativity (200-300 ms) and P300 (300-450 ms), were examined. The ERP data suggest an early identification of the emotional tone of words leading to differential processing. Specifically, high frequency negative words seem to attract additional cognitive resources. The overall pattern of results is consistent with a time line of word recognition in which semantic analysis, including the evaluation of emotional quality, occurs at an early, lexical stage of processing.

ARTICLE UPDATE - Not all emotions are created equal: The negativity bias in social-emotional development.

Vaish A, Grossmann T, Woodward A

Psychological Bulletin, 134, 383-403

There is ample empirical evidence for an asymmetry in the way that adults use positive versus negative information to make sense of their world; specifically, across an array of psychological situations and tasks, adults display a negativity bias, or the propensity to attend to, learn from, and use negative information far more than positive information. This bias is argued to serve critical evolutionarily adaptive functions, but its developmental presence and ontogenetic emergence have never been seriously considered. The authors argue for the existence of the negativity bias in early development and that it is evident especially in research on infant social referencing but also in other developmental domains. They discuss ontogenetic mechanisms underlying the emergence of this bias and explore not only its evolutionary but also its developmental functions and consequences. Throughout, the authors suggest ways to further examine the negativity bias in infants and older children, and they make testable predictions that would help clarify the nature of the negativity bias during early development.

ARTICLE UPDATE - Response bias in "remembering" emotional stimuli: A new perspective on age differences.

Kapucu A, Rotello CM, Ready RE, Seidl KN.

Journal of Experimental Psychology: Learning, Memory and Cognition, 34, 703-711

Older adults sometimes show a recall advantage for emotionally positive, rather than neutral or negative, stimuli (S. T. Charles, M. Mather, & L. L. Carstensen, 2003). In contrast, younger adults respond "old" and "remember" more often to negative materials in recognition tests. For younger adults, both effects are due to response bias changes rather than to enhanced memory accuracy (S. Dougal & C. M. Rotello, 2007). We presented older and younger adults with emotional and neutral stimuli in a remember-know paradigm. Signal-detection and model-based analyses showed that memory accuracy did not differ for the neutral, negative, and positive stimuli, and that "remember" responses did not reflect the use of recollection. However, both age groups showed large and significant response bias effects of emotion: Younger adults tended to say "old" and "remember" more often in response to negative words than to positive and neutral words, whereas older adults responded "old" and "remember" more often to both positive and negative words than to neutral stimuli.

ARTICLE UPDATE - Conscious emotional experience emerges as a function of multilevel, appraisal-driven response synchronization.

Grandjean D, Sander D, Scherer KR.

Consciousness and Cognition, in press

In this paper we discuss the issue of the processes potentially underlying the emergence of emotional consciousness in the light of theoretical considerations and empirical evidence. First, we argue that componential emotion models, and specifically the Component Process Model (CPM), may be better able to account for the emergence of feelings than basic emotion or dimensional models. Second, we advance the hypothesis that consciousness of emotional reactions emerges when lower levels of processing are not sufficient to cope with the event and regulate the emotional process, particularly when the degree of synchronization between the components reaches a critical level and duration. Third, we review recent neuroscience evidence that bolsters our claim of the central importance of the synchronization of neuronal assemblies at different levels of processing.