Kober H, Barrett LF, Joseph J, Bliss-Moreau E, Lindquist K, Wager TD.
Neuroimage, in press
We performed an updated quantitative meta-analysis of 162 neuroimaging studies of emotion using a novel multi-level kernel-based approach, focusing on locating brain regions consistently activated in emotional tasks and their functional organization into distributed functional groups, independent of semantically defined emotion category labels (e.g., "anger," "fear"). Such brain-based analyses are critical if our ways of labeling emotions are to be evaluated and revised based on consistency with brain data. Consistent activations were limited to specific cortical sub-regions, including multiple functional areas within medial, orbital, and inferior lateral frontal cortices. Consistent with a wealth of animal literature, multiple subcortical activations were identified, including amygdala, ventral striatum, thalamus, hypothalamus, and periaqueductal gray. We used multivariate parcellation and clustering techniques to identify groups of co-activated brain regions across studies. These analyses identified six distributed functional groups, including medial and lateral frontal groups, two posterior cortical groups, and paralimbic and core limbic/brainstem groups. These functional groups provide information on potential organization of brain regions into large-scale networks. Specific follow-up analyses focused on amygdala, periaqueductal gray (PAG), and hypothalamic (Hy) activations, and identified frontal cortical areas co-activated with these core limbic structures. While multiple areas of frontal cortex co-activated with amygdala sub-regions, a specific region of dorsomedial prefrontal cortex (dmPFC, Brodmann's Area 9/32) was the only area co-activated with both PAG and Hy. Subsequent mediation analyses were consistent with a pathway from dmPFC through PAG to Hy. These results suggest that medial frontal areas are more closely associated with core limbic activation than their lateral counterparts, and that dmPFC may play a particularly important role in the cognitive generation of emotional states.
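As a rough illustration of the mediation logic mentioned above (dmPFC through PAG to Hy), here is a minimal product-of-coefficients sketch on simulated, single-level data; the variable names, sample size, and single-level regressions are assumptions for illustration only, not the paper's multi-level, study-wise procedure.

```python
# Hypothetical sketch of product-of-coefficients mediation: X = dmPFC, M = PAG, Y = Hy.
# Simulated data; not the paper's analysis.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 162                                              # e.g., one value per study (assumed)
dmpfc = rng.normal(size=n)                           # X: dmPFC activation
pag = 0.5 * dmpfc + rng.normal(size=n)               # M: PAG activation
hy = 0.4 * pag + 0.1 * dmpfc + rng.normal(size=n)    # Y: hypothalamic activation

# Path a: X -> M
a_fit = sm.OLS(pag, sm.add_constant(dmpfc)).fit()
a, se_a = a_fit.params[1], a_fit.bse[1]

# Path b: M -> Y, controlling for X
b_fit = sm.OLS(hy, sm.add_constant(np.column_stack([pag, dmpfc]))).fit()
b, se_b = b_fit.params[1], b_fit.bse[1]

# Indirect (mediated) effect and Sobel z
ab = a * b
sobel_z = ab / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
print(f"indirect effect a*b = {ab:.3f}, Sobel z = {sobel_z:.2f}")
```

A significant a*b product on real data is what would be taken as evidence consistent with a dmPFC-PAG-Hy pathway; the paper's version additionally handles study-level nesting.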
This blog keeps you up to date with the latest emotion-related research. Feel free to browse and contribute.
Sunday, June 29, 2008
ARTICLE UPDATE - The role of the orbitofrontal cortex in the pursuit of happiness and more specific rewards.
Burke KA, Franz TM, Miller DN, Schoenbaum G.
Nature, in press
Cues that reliably predict rewards trigger the thoughts and emotions normally evoked by those rewards. Humans and other animals will work, often quite hard, for these cues. This is termed conditioned reinforcement. The ability to use conditioned reinforcers to guide our behaviour is normally beneficial; however, it can go awry. For example, corporate icons, such as McDonald's Golden Arches, influence consumer behaviour in powerful and sometimes surprising ways, and drug-associated cues trigger relapse to drug seeking in addicts and animals exposed to addictive drugs, even after abstinence or extinction. Yet, despite their prevalence, it is not known how conditioned reinforcers control human or other animal behaviour. One possibility is that they act through the use of the specific rewards they predict; alternatively, they could control behaviour directly by activating emotions that are independent of any specific reward. In other words, the Golden Arches may drive business because they evoke thoughts of hamburgers and fries, or instead, may be effective because they also evoke feelings of hunger or happiness. Moreover, different brain circuits could support conditioned reinforcement mediated by thoughts of specific outcomes versus more general affective information. Here we have attempted to address these questions in rats. Rats were trained to learn that different cues predicted different rewards using specialized conditioning procedures that controlled whether the cues evoked thoughts of specific outcomes or general affective representations common to different outcomes. Subsequently, these rats were given the opportunity to press levers to obtain short and otherwise unrewarded presentations of these cues. We found that rats were willing to work for cues that evoked either outcome-specific or general affective representations. Furthermore, the orbitofrontal cortex, a prefrontal region important for adaptive decision-making, was critical for the former but not for the latter form of conditioned reinforcement.
ARTICLE UPDATE - A comparison of two lists providing emotional norms for English words (ANEW and the DAL).
Whissell C.
Psychological Reports, 102, 597-600
Although different in terms of purpose, word-selection procedures, and rating scales, both the ANEW (n = 1034) and DAL (n = 8742) lists, which have 633 words in common, provide normative emotional ratings for English words. This research compared the lists and cross-validated the two main lexical dimensions of affect. Parallel representatives of the two dimensions (Valence and Pleasantness, Arousal and Activation) were correlated across lists (rs = .86, .63). In tune with their separate purposes, the ANEW list, which was designed to describe emotional words, included more rare words, while the DAL, which was designed for natural language applications, included more common ones. The Valence-Activation scatterplot for ANEW was C-shaped and included fewer Arousing words of medium Valence, such as "awake," "debate," and "proves," while the DAL included fewer less common words descriptive of emotion such as "maniac," "corrupt," and "lavish." In view of these differences, list similarities strongly support the generalizability of the two main lexical dimensions of affect.
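For anyone wanting to run this kind of cross-list comparison on their own data, a rough pandas sketch follows; the file names and column labels are assumptions and do not reflect the actual ANEW or DAL distribution formats.

```python
# Correlate parallel affect dimensions across two word norms sharing items
# (e.g., ANEW Valence vs. DAL Pleasantness). File names/columns are assumed.
import pandas as pd

anew = pd.read_csv("anew.csv")   # assumed columns: word, valence, arousal
dal = pd.read_csv("dal.csv")     # assumed columns: word, pleasantness, activation

common = anew.merge(dal, on="word")   # in principle, the 633 shared words
r_val = common["valence"].corr(common["pleasantness"])   # reported r ~ .86
r_aro = common["arousal"].corr(common["activation"])     # reported r ~ .63
print(f"valence-pleasantness r = {r_val:.2f}, arousal-activation r = {r_aro:.2f}")
```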
ARTICLE UPDATE - Preferences for emotional information in older and younger adults: A meta-analysis of memory and attention tasks.
Murphy NA, Isaacowitz DM.
Psychology and Aging, 23, 263-286
The authors conducted a meta-analysis to determine the magnitude of older and younger adults' preferences for emotional stimuli in studies of attention and memory. Analyses involved 1,085 older adults from 37 independent samples and 3,150 younger adults from 86 independent samples. Both age groups exhibited small to medium emotion salience effects (i.e., preference for emotionally valenced stimuli over neutral stimuli) as well as positivity preferences (i.e., preference for positively valenced stimuli over neutral stimuli) and negativity preferences (i.e., preference for negatively valenced stimuli over neutral stimuli). There were few age differences overall. Type of measurement appeared to influence the magnitude of effects; recognition studies indicated significant age effects, where older adults showed smaller effects for emotion salience and negativity preferences than younger adults.
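To make "small to medium" concrete, here is a toy illustration (not the authors' exact procedure) of the quantity typically aggregated in such a meta-analysis: a standardized mean difference per sample, combined with inverse-variance weights. All summary statistics below are invented.

```python
# Toy meta-analytic aggregation of emotion-vs-neutral preference effects.
import numpy as np

def cohens_d(m_emo, m_neu, sd_emo, sd_neu, n_emo, n_neu):
    """Standardized mean difference with a pooled standard deviation."""
    sp = np.sqrt(((n_emo - 1) * sd_emo**2 + (n_neu - 1) * sd_neu**2) / (n_emo + n_neu - 2))
    return (m_emo - m_neu) / sp

def var_of_d(d, n1, n2):
    """Approximate sampling variance of Cohen's d."""
    return (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))

# Hypothetical per-sample summaries (means, SDs, ns are made up)
d = np.array([cohens_d(0.62, 0.50, 0.20, 0.22, 30, 30),
              cohens_d(0.58, 0.49, 0.25, 0.24, 25, 25)])
v = np.array([var_of_d(d[0], 30, 30), var_of_d(d[1], 25, 25)])
w = 1.0 / v
print(f"inverse-variance weighted mean d = {np.sum(w * d) / np.sum(w):.2f}")
```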
ARTICLE UPDATE - Effects of semantic relatedness on recall of stimuli preceding emotional oddballs.
Smith RM, Beversdorf DQ.
Journal of the International Neuropsychological Society, 14, 620-628.
Semantic and episodic memory networks function as highly interconnected systems, both relying on the hippocampal/medial temporal lobe complex (HC/MTL). Episodic memory encoding triggers the retrieval of semantic information, serving to incorporate contextual relationships between the newly acquired memory and existing semantic representations. While emotional material augments episodic memory encoding at the time of stimulus presentation, interactions between emotion and semantic memory that contribute to subsequent episodic recall are not well understood. Using a modified oddball task, we examined the modulatory effects of negative emotion on semantic interactions with episodic memory by measuring the free-recall of serially presented neutral or negative words varying in semantic relatedness. We found increased free-recall for words related to and preceding emotionally negative oddballs, suggesting that negative emotion can indirectly facilitate episodic free-recall by enhancing semantic contributions during encoding. Our findings demonstrate the ability of emotion and semantic memory to interact to mutually enhance free-recall.
Sunday, June 22, 2008
ARTICLE UPDATE - Mirror neuron activation is associated with facial emotion processing.
Enticott PG, Johnston PJ, Herring SE, Hoy KE, Fitzgerald PB.
Neuropsychologia, in press
Theoretical accounts suggest that mirror neurons play a crucial role in social cognition. The current study used transcranial magnetic stimulation (TMS) to investigate the association between mirror neuron activation and facial emotion processing, a fundamental aspect of social cognition, among healthy adults (n=20). Facial emotion processing of static (but not dynamic) images correlated significantly with an enhanced motor response, proposed to reflect mirror neuron activation. These correlations did not appear to reflect general facial processing or pattern recognition, and they provide support for current theoretical accounts linking the mirror neuron system to aspects of social cognition. We discuss the mechanism by which mirror neurons might facilitate facial emotion recognition.
Saturday, June 14, 2008
ARTICLE UPDATE - Electrocortical and electrodermal responses covary as a function of emotional arousal: A single-trial analysis.
Keil A, Smith JC, Wangelin BC, Sabatinelli D, Bradley MM, Lang PJ.
Psychophysiology, in press
Electrophysiological studies of human visual perception typically involve averaging across trials distributed over time during an experimental session. Using an oscillatory presentation, in which affective or neutral pictures were presented for 6 s, flickering on and off at a rate of 10 Hz, the present study examined single trials of steady-state visual evoked potentials. Moving window averaging and subsequent Fourier analysis at the stimulation frequency yielded spectral amplitude measures of electrocortical activity. Cronbach's alpha reached values >.79 across electrodes. Single-trial electrocortical activation was significantly related to the size of the skin conductance response recorded during affective picture viewing. These results suggest that individual trials of steady-state potentials may yield reliable indices of electrocortical activity in visual cortex and that amplitude modulation of these indices varies with emotional engagement.
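Below is a minimal sketch of this kind of single-trial ssVEP amplitude estimation under assumed parameters (sampling rate, window length), run on one simulated trial; the study's actual windowing, sensor selection, and artifact handling are more involved.

```python
# Moving-window average of an ssVEP trial, then FFT amplitude at the 10 Hz drive.
import numpy as np

fs = 500                        # sampling rate in Hz (assumed)
t = np.arange(0, 6, 1 / fs)     # one 6-s trial
trial = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # fake single-channel EEG

win = int(0.4 * fs)             # 400-ms windows = exactly 4 cycles at 10 Hz (assumed)
starts = np.arange(0, trial.size - win + 1, win)
avg_window = np.mean([trial[s:s + win] for s in starts], axis=0)     # moving-window average

spectrum = np.abs(np.fft.rfft(avg_window)) / (win / 2)   # amplitude spectrum
freqs = np.fft.rfftfreq(win, d=1 / fs)
amp_10hz = spectrum[np.argmin(np.abs(freqs - 10.0))]
print(f"single-trial 10 Hz amplitude ~ {amp_10hz:.2f}")
```

Because each window spans an integer number of 10 Hz cycles, the windows stay phase-aligned and the stimulus-locked response survives averaging while noise is attenuated.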
ARTICLE UPDATE - Stimulus-driven and strategic neural responses to fearful and happy facial expressions in humans.
Williams MA, McGlone F, Abbott DF, Mattingley JB.
European Journal of Neuroscience, in press
The human amygdala responds selectively to consciously and unconsciously perceived facial expressions, particularly those that convey potential threat, such as fear and anger. In many social situations, multiple faces with varying expressions confront observers, yet little is known about the neural mechanisms involved in encoding several faces simultaneously. Here we used event-related fMRI to measure neural activity in pre-defined regions of interest as participants searched multi-face arrays for a designated target expression (fearful or happy). We conducted separate analyses to examine activations associated with each of the four multi-face arrays independent of target expression (stimulus-driven effects), and activations arising from the search for each of the target expressions, independent of the display type (strategic effects). Comparisons across display types, reflecting stimulus-driven influences on visual search, revealed activity in the amygdala and superior temporal sulcus (STS). By contrast, strategic demands of the task did not modulate activity in either the amygdala or STS. These results imply an interactive threat-detection system involving several neural regions. Crucially, activity in the amygdala increased significantly when participants correctly detected the target expression, compared with trials in which the identical target was missed, suggesting that the amygdala has a limited capacity for extracting affective facial expressions.
ARTICLE UPDATE - Sequential modulations of valence processing in the emotional Stroop task.
Kunde W, Mauer N.
Experimental Psychology, 55, 151-156
This study investigated trial-to-trial modulations of the processing of irrelevant valence information. Participants (N = 126) responded to the frame color of pictures with positive, neutral, or negative affective content--a procedure known as an emotional Stroop task (EST). As is typically found, positive and negative pictures delayed responses as compared to neutral pictures. However, the type and extent of this valence-based interference depended on the irrelevant picture valence in the preceding trial. Whereas preceding exposure to negative valence prompted interference from positive and negative pictures, such interference was removed after neutral trials. Following positive pictures, interference from negative but not from positive pictures was observed. We suggest that these sequential modulations reflect automatic self-regulatory selection processes that help to keep the balance between attending to task-relevant information and task-irrelevant information that signals important changes in the environment.
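For readers who want to compute such sequential modulations themselves, a rough pandas sketch follows; the trial-level file, column names, and valence labels are assumptions rather than the authors' actual data format.

```python
# Emotional-Stroop interference conditioned on the previous trial's irrelevant valence.
import pandas as pd

df = pd.read_csv("est_trials.csv")   # assumed columns: subject, rt, valence, correct
df = df[df["correct"] == 1].copy()
# Previous trial's picture valence, within subject (block boundaries ignored here)
df["prev_valence"] = df.groupby("subject")["valence"].shift(1)

# Mean RT for each current-by-previous valence cell (collapsed across subjects for simplicity)
cell_means = (df.dropna(subset=["prev_valence"])
                .groupby(["prev_valence", "valence"])["rt"]
                .mean()
                .unstack("valence"))

# Interference = emotional minus neutral RT, separately after each prior valence
interference = cell_means[["negative", "positive"]].sub(cell_means["neutral"], axis=0)
print(interference)
```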
ARTICLE UPDATE - The Montreal Affective Voices: a validated set of nonverbal affect bursts for research on auditory affective processing.
Belin P, Fillion-Bilodeau S, Gosselin F.
Behavior Research Methods, 40, 531-539
The Montreal Affective Voices consist of 90 nonverbal affect bursts corresponding to the emotions of anger, disgust, fear, pain, sadness, surprise, happiness, and pleasure (plus a neutral expression), recorded by 10 different actors (5 of them male and 5 female). Ratings of valence, arousal, and intensity for eight emotions were collected for each vocalization from 30 participants. Analyses revealed high recognition accuracies for most of the emotional categories (mean of 68%). They also revealed significant effects of both the actors' and the participants' gender: The highest hit rates (75%) were obtained for female participants rating female vocalizations, and the lowest hit rates (60%) for male participants rating male vocalizations. Interestingly, the mixed situations--that is, male participants rating female vocalizations or female participants rating male vocalizations--yielded similar, intermediate ratings. The Montreal Affective Voices are available for download at vnl.psy.gla.ac.uk/ (Resources section).
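Here is a small illustrative computation (not taken from the paper) of hit rates broken down by rater and actor gender, assuming a hypothetical long-format ratings table.

```python
# Hit rates by rater gender x actor gender from a hypothetical ratings file.
import pandas as pd

ratings = pd.read_csv("mav_ratings.csv")   # assumed columns: rater_gender,
                                           # actor_gender, intended, chosen
ratings["hit"] = ratings["intended"] == ratings["chosen"]

hit_rates = (ratings.groupby(["rater_gender", "actor_gender"])["hit"]
                     .mean()
                     .unstack("actor_gender") * 100)
print(hit_rates.round(1))   # pattern in the paper: female/female highest, male/male lowest
```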
ARTICLE UPDATE - Unpacking the cognitive architecture of emotion processes.
Grandjean D, Scherer KR.
Emotion, 8, 341-351.
The results of 2 electroencephalographic studies confirm Component Process Model (CPM) predictions that different appraisal checks have specific brain state correlates, occur rapidly in a brief time window after stimulation, and produce results that occur in sequential rather than parallel fashion. The data are compatible with the assumption that early checks (novelty and intrinsic pleasantness) occur in an automatic, unconscious mode of processing, whereas later checks, specifically goal conduciveness, require more extensive, effortful, and controlled processing. Overall, this work, combined with growing evidence for the CPM's response patterning predictions concerning autonomic physiological signatures, facial muscle movements, and vocalization changes, suggests that this model provides an appropriate basis for the unpacking of the cognitive architecture of emotion and its computational modeling.
ARTICLE UPDATE - Decoding of affective facial expressions in the context of emotional situations.
Sommer M, Döhnel K, Meinhardt J, Hajak G.
Neuropsychologia, in press
The ability to recognize other persons' affective states and to link these with aspects of the current situation arises early in development and is a precursor function of a Theory of Mind (ToM). Until now, studies have investigated either the processing of affective faces or of affective pictures. In the present study, we tried to create a scenario more similar to everyday situations. We employed fMRI and used a picture matching task to explore the neural correlates associated with the integration and decoding of facial affective expressions in the context of affective situations. In the emotion condition, the participants judged an emotional facial expression with respect to the content of an emotional picture. In the two other conditions, participants indicated colour matches on the background of either affective or scrambled pictures. In contrast to colour matching on scrambled pictures, colour matching on emotional pictures resulted in longer reaction times and increased activation of the bilateral fusiform and occipital gyrus. These results indicate that participants may have attended to the emotional background of the pictures even though it was task-irrelevant. The emotion task was associated with longer reaction times and with activation of the bilateral fusiform and occipital gyrus. Additionally, emotion attribution induced left amygdala activity. Possibly, attention processes and amygdala projections modulated the activation found in the occipital and fusiform areas. Furthermore, the involvement of the amygdala in the ToM precursor ability to link facial expressions with an emotional situation may indicate that the amygdala is involved in the development of stable ToM abilities.
ARTICLE UPDATE - Emotion, decision making, and the amygdala.
Seymour B, Dolan R.
Neuron, 58, 662-671
Emotion plays a critical role in many contemporary accounts of decision making, but exactly what underlies its influence and how this is mediated in the brain remain far from clear. Here, we review behavioral studies that suggest that Pavlovian processes can exert an important influence over choice and may account for many effects that have traditionally been attributed to emotion. We illustrate how recent experiments cast light on the underlying structure of Pavlovian control and argue that generally this influence makes good computational sense. Corresponding neuroscientific data from both animals and humans implicate a central role for the amygdala through interactions with other brain areas. This yields a neurobiological account of emotion in which it may operate, often covertly, to optimize rather than corrupt economic choice.
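One common way such Pavlovian influences on choice are formalized is sketched generically below (this is not a model proposed in the review): stimulus values learned with a Rescorla-Wagner rule additively bias instrumental action propensities. All parameters and the task contingency are invented.

```python
# Generic sketch: a learned Pavlovian cue value biases instrumental choice.
import numpy as np

alpha, beta, pav_weight = 0.1, 3.0, 0.5   # learning rate, inverse temperature, bias weight (assumed)
q = np.zeros(2)        # instrumental values: action 0 = approach, action 1 = withhold
v = 0.0                # Pavlovian value of the current cue

rng = np.random.default_rng(1)
for trial in range(200):
    # Pavlovian value boosts the propensity to approach, regardless of instrumental payoff
    propensities = q + np.array([pav_weight * v, 0.0])
    p_approach = np.exp(beta * propensities[0]) / np.sum(np.exp(beta * propensities))
    action = 0 if rng.random() < p_approach else 1

    reward = 1.0 if (action == 0 and rng.random() < 0.7) else 0.0  # toy contingency
    q[action] += alpha * (reward - q[action])   # instrumental (action-value) update
    v += alpha * (reward - v)                   # Pavlovian (cue-value) update; simplified here

print(f"P(approach) after learning = {p_approach:.2f}")
```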
ARTICLE UPDATE - Affective learning enhances visual detection and responses in primary visual cortex.
Padmala S, Pessoa L.
Journal of Neuroscience, 28, 6202-6210
The affective significance of a visual item is thought to lead to enhanced visual processing. However, the precise link between enhanced visual perception of emotion-laden items and increased visual responses remains poorly understood. To investigate this link, we acquired functional magnetic resonance imaging (fMRI) data while participants performed a challenging visual detection task. Grating stimuli were physically identical and differed only as a function of their previous exposure history; CS+ stimuli were initially paired with shock, whereas CS- stimuli were not. Behaviorally, subjects were both faster and more accurate during CS+ relative to CS- target detection. These behavioral results were paralleled by increases in fMRI responses across early, retinotopically organized visual cortex, which was mapped in a separate fMRI session. Logistic regression analyses revealed that trial-by-trial fluctuations in fMRI responses were closely linked to trial type, such that fMRI signal strength reliably predicted the probability of a hit trial across retinotopically organized visual cortex, including area V1. For instance, during the CS+ condition, a 0.5% signal change increased the probability of a hit from chance to 67.3-73.5% in V1-V4 (the highest increase was observed in area V1). Furthermore, across participants, differential fMRI responses to hits versus correct rejects were correlated with behavioral performance. Our findings provide a close link between increased activation in early visual cortex and improved behavioral performance as a function of the affective significance of an item.
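Below is a minimal sketch of the trial-by-trial logistic-regression idea on simulated data: predicting hit versus miss from single-trial percent signal change, then reading off how a 0.5% increase shifts the predicted hit probability. The generative model and numbers are invented and will not reproduce the reported values.

```python
# Trial-wise logistic regression: single-trial fMRI signal predicting hit vs. miss.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
signal = rng.normal(0.5, 0.3, size=300)              # % signal change per trial (simulated)
p_hit_true = 1 / (1 + np.exp(-(-1.0 + 2.0 * signal)))  # toy generative model
hit = rng.binomial(1, p_hit_true)                    # 1 = hit, 0 = miss

model = sm.Logit(hit, sm.add_constant(signal)).fit(disp=0)
b0, b1 = model.params

def p_hit(x):
    """Predicted hit probability at a given % signal change."""
    return 1 / (1 + np.exp(-(b0 + b1 * x)))

# How much does a 0.5% increase in signal raise the predicted hit probability?
print(f"P(hit) at 0.0%: {p_hit(0.0):.2f}, at 0.5%: {p_hit(0.5):.2f}")
```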