
Showing papers by "Raymond J. Dolan published in 2012"


Journal ArticleDOI
TL;DR: The psychological and neural underpinnings of metacognitive accuracy are reviewed, and a neural synthesis in which dorsolateral and anterior prefrontal cortical subregions interact with interoceptive cortices (cingulate and insula) to promote accurate judgements of performance is proposed.
Abstract: Ability in various cognitive domains is often assessed by measuring task performance, such as the accuracy of a perceptual categorization. A similar analysis can be applied to metacognitive reports about a task to quantify the degree to which an individual is aware of his or her success or failure. Here, we review the psychological and neural underpinnings of metacognitive accuracy, drawing on research in memory and decision-making. These data show that metacognitive accuracy is dissociable from task performance and varies across individuals. Convergent evidence indicates that the function of the rostral and dorsal aspect of the lateral prefrontal cortex (PFC) is important for the accuracy of retrospective judgements of performance. In contrast, prospective judgements of performance may depend upon medial PFC. We close with a discussion of how metacognitive processes relate to concepts of cognitive control, and propose a neural synthesis in which dorsolateral and anterior prefrontal cortical subregions interact with interoceptive cortices (cingulate and insula) to promote accurate judgements of performance.

538 citations
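The metacognitive-accuracy measure discussed above can be made concrete. One common way to quantify how well confidence tracks performance, offered here purely as an illustration rather than as the specific measure used in these studies, is the type-2 ROC area: the probability that a randomly chosen correct trial carries higher confidence than a randomly chosen error trial. A minimal Python sketch with simulated data:

```python
import numpy as np

def type2_auroc(correct, confidence):
    """Area under the type-2 ROC: how well trial-by-trial confidence
    discriminates the observer's own correct from incorrect trials."""
    correct = np.asarray(correct, dtype=bool)
    confidence = np.asarray(confidence, dtype=float)
    conf_correct = confidence[correct]     # confidence on correct trials
    conf_error = confidence[~correct]      # confidence on error trials
    greater = (conf_correct[:, None] > conf_error[None, :]).mean()
    ties = (conf_correct[:, None] == conf_error[None, :]).mean()
    return greater + 0.5 * ties            # 0.5 = no insight, 1.0 = perfect insight

# Simulated observer: 75% correct, confidence only loosely tracks accuracy.
rng = np.random.default_rng(0)
correct = rng.random(200) < 0.75
confidence = np.where(correct, rng.normal(0.6, 0.2, 200), rng.normal(0.5, 0.2, 200))
print(type2_auroc(correct, confidence))
```

Because this measure conditions on the observer's own accuracy, it can dissociate metacognitive accuracy from task performance, which is the dissociation the review emphasizes.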


Journal ArticleDOI
TL;DR: It is shown that activity in right rostrolateral prefrontal cortex (rlPFC) satisfies three constraints for a role in metacognitive aspects of decision-making, and these findings are discussed in a theoretical framework where rlPFC re-represents object-level decision uncertainty to facilitate metacognitive report.
Abstract: Neuroscience has made considerable progress in understanding the neural substrates supporting cognitive performance in a number of domains, including memory, perception, and decision making. In contrast, how the human brain generates metacognitive awareness of task performance remains unclear. Here, we address this question by asking participants to perform perceptual decisions while providing concurrent metacognitive reports during fMRI scanning. We show that activity in right rostrolateral prefrontal cortex (rlPFC) satisfies three constraints for a role in metacognitive aspects of decision-making. Right rlPFC showed greater activity during self-report compared to a matched control condition, activity in this region correlated with reported confidence, and the strength of the relationship between activity and confidence predicted metacognitive ability across individuals. In addition, functional connectivity between right rlPFC and both contralateral PFC and visual cortex increased during metacognitive reports. We discuss these findings in a theoretical framework where rlPFC re-represents object-level decision uncertainty to facilitate metacognitive report.

323 citations


Journal ArticleDOI
TL;DR: The behavioral and computational data showed that instrumental learning is contingent on overcoming inherent and plastic Pavlovian biases, while the neuronal data showed that this learning is linked to unique patterns of brain activity in regions implicated in action and inhibition, respectively.

318 citations
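One standard way to formalize the interplay described in this TL;DR, offered as an illustrative sketch rather than the exact model fitted in the paper, is to let the propensity for an active ("go") response combine an instrumental value with a Pavlovian term proportional to the state's appetitive or aversive value:

```python
import numpy as np

def go_probability(q_go, q_nogo, v_state, pi=0.5, beta=3.0):
    """Probability of emitting a 'go' response when its action weight mixes
    instrumental value (q_go) with a Pavlovian bias (pi * v_state).
    pi and beta are illustrative free parameters."""
    w_go = q_go + pi * v_state   # Pavlovian bias favours action in appetitive states
    w_nogo = q_nogo
    return 1.0 / (1.0 + np.exp(-beta * (w_go - w_nogo)))

# Equal instrumental values, but an aversive context (v_state < 0) suppresses
# 'go' responding; instrumental learning must overcome this bias.
print(go_probability(q_go=0.2, q_nogo=0.2, v_state=-1.0))
print(go_probability(q_go=0.2, q_nogo=0.2, v_state=+1.0))
```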


Journal ArticleDOI
TL;DR: This paper focuses on the consequences of changing tonic levels of dopamine firing using simulations of cued sequential movements and uses these simulations to demonstrate how a single functional role for dopamine at the synaptic level can manifest in different ways at the behavioural level.
Abstract: The role of dopamine in behaviour and decision-making is often cast in terms of reinforcement learning and optimal decision theory. Here, we present an alternative view that frames the physiology of dopamine in terms of Bayes-optimal behaviour. In this account, dopamine controls the precision or salience of (external or internal) cues that engender action. In other words, dopamine balances bottom-up sensory information and top-down prior beliefs when making hierarchical inferences (predictions) about cues that have affordance. In this paper, we focus on the consequences of changing tonic levels of dopamine firing using simulations of cued sequential movements. Crucially, the predictions driving movements are based upon a hierarchical generative model that infers the context in which movements are made. This means that we can confuse agents by changing the context (order) in which cues are presented. These simulations provide a (Bayes-optimal) model of contextual uncertainty and set switching that can be quantified in terms of behavioural and electrophysiological responses. Furthermore, one can simulate dopaminergic lesions (by changing the precision of prediction errors) to produce pathological behaviours that are reminiscent of those seen in neurological disorders such as Parkinson's disease. We use these simulations to demonstrate how a single functional role for dopamine at the synaptic level can manifest in different ways at the behavioural level.

308 citations
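In predictive-coding readings of this proposal, dopamine sets the precision (inverse variance) with which sensory evidence is weighted against prior beliefs. The toy Gaussian update below is only a schematic of that single idea, not the hierarchical generative model actually simulated in the paper:

```python
def precision_weighted_update(prior_mean, sensory_input, precision_sensory, precision_prior):
    """One Bayesian (Gaussian) belief update: the posterior mean is a
    precision-weighted compromise between prior belief and sensory input."""
    gain = precision_sensory / (precision_sensory + precision_prior)
    return prior_mean + gain * (sensory_input - prior_mean)

# High sensory precision (the putative high-dopamine regime): beliefs track the cue.
print(precision_weighted_update(0.0, 1.0, precision_sensory=10.0, precision_prior=1.0))
# Low sensory precision (the putative low-dopamine regime): prior beliefs dominate.
print(precision_weighted_update(0.0, 1.0, precision_sensory=0.1, precision_prior=1.0))
```

The same change in a single gain parameter produces very different behaviour, which is the general point the simulations make at the synaptic-to-behavioural level.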


Journal ArticleDOI
TL;DR: It is concluded that the currently available evidence indicates that distinct neural encoding (including summary statistic-type representations) of uncertainty occurs in distinct neural systems.
Abstract: Decision making is influenced by uncertainty, which arises from internal and external noise. A fundamental question is how uncertainty is encoded in the brain and how it influences behaviour. How we estimate uncertainty matters for decision neuroscience and has wide-ranging implications in basic and clinical neuroscience, from computational models of optimality to ideas on psychopathological disorders including anxiety, depression and schizophrenia, yet empirical research addressing this question has been based on divergent theoretical assumptions. Here, we integrate several theoretical concepts about uncertainty into a hierarchical decision-making framework. We conclude that the currently available evidence indicates that distinct neural encoding of uncertainty (including summary statistic-type representations) occurs in distinct neural systems.

278 citations


Journal ArticleDOI
TL;DR: Using behavioral and neuroimaging analyses of a minimax decision task, it is found that the computational processes underlying forward planning are expressed in the anterior caudate nucleus as values of individual branching steps in a decision tree.
Abstract: Investigations of the underlying mechanisms of choice in humans have focused on learning from prediction errors, leaving the computational structure of value based planning comparatively underexplored. Using behavioral and neuroimaging analyses of a minimax decision task, we found that the computational processes underlying forward planning are expressed in the anterior caudate nucleus as values of individual branching steps in a decision tree. In contrast, values represented in the putamen pertain solely to values learned during extensive training. During actual choice, both striatal areas showed a functional coupling to ventromedial prefrontal cortex, consistent with this region acting as a value comparator. Our findings point toward an architecture of choice in which segregated value systems operate in parallel in the striatum for planning and extensively trained choices, with medial prefrontal cortex integrating their outputs.

271 citations
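The notion of "values of individual branching steps in a decision tree" can be illustrated with a generic minimax backup over a toy tree; this is a textbook construction under invented payoffs, not the authors' task or analysis:

```python
def minimax_value(node, maximizing=True):
    """Backward induction over a small game tree: take the max over children
    at the agent's nodes and the min at the opponent's nodes. A node is either
    a terminal payoff (number) or a list of child nodes."""
    if isinstance(node, (int, float)):            # terminal outcome
        return node
    child_values = [minimax_value(child, not maximizing) for child in node]
    return max(child_values) if maximizing else min(child_values)

# Two branches available to the agent; the opponent then minimizes.
tree = [[3, 5],   # branch A: opponent will concede at best 3
        [2, 9]]   # branch B: opponent will concede at best 2
print([minimax_value(b, maximizing=False) for b in tree])  # value of each branching step
print(minimax_value(tree, maximizing=True))                # value of the best plan
```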


Journal ArticleDOI
09 Aug 2012-Neuron
TL;DR: The effect of a dopamine manipulation on the degree to which either system contributes to instrumental behavior in a two-stage Markov decision task, which has been shown to discriminate model-free from model-based control, is investigated.

271 citations
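In hybrid accounts of such two-stage tasks, first-stage choices are driven by a weighted mixture of model-free (cached) and model-based (planned) values, and the mixture weight is the usual index of model-based control. A schematic sketch with made-up numbers, not the fitted model from the paper:

```python
import numpy as np

def model_based_q(transitions, q_stage2):
    """Model-based first-stage values: expected best second-stage value
    under known transition probabilities P(state2 | action1)."""
    return np.asarray(transitions) @ np.max(np.asarray(q_stage2), axis=1)

def hybrid_q(q_mf, q_mb, w=0.5):
    """Mixture of model-free and model-based values; w = degree of model-based control."""
    return w * np.asarray(q_mb) + (1 - w) * np.asarray(q_mf)

transitions = [[0.7, 0.3],   # first-stage action 0 usually leads to state 0
               [0.3, 0.7]]   # first-stage action 1 usually leads to state 1
q_stage2 = [[0.6, 0.2],      # second-stage action values in state 0
            [0.1, 0.8]]      # second-stage action values in state 1
q_mf = [0.4, 0.3]            # cached model-free values for the two first-stage actions
print(hybrid_q(q_mf, model_based_q(transitions, q_stage2), w=0.8))
```

On this framing, asking which system a dopamine manipulation favours amounts to asking how it shifts the weight w.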


Journal ArticleDOI
TL;DR: This introductory article to a Theme Issue on metacognition reviews recent and rapidly progressing developments from neuroscience, cognitive psychology, computer science and philosophy of mind, and proposes a framework in which level of representation, order of behaviour and access consciousness are orthogonal dimensions of the conceptual landscape.
Abstract: Many complex systems maintain a self-referential check and balance. In animals, such reflective monitoring and control processes have been grouped under the rubric of metacognition. In this introductory article to a Theme Issue on metacognition, we review recent and rapidly progressing developments from neuroscience, cognitive psychology, computer science and philosophy of mind. While each of these areas is represented in detail by individual contributions to the volume, we take this opportunity to draw links between disciplines, and highlight areas where further integration is needed. Specifically, we cover the definition, measurement, neurobiology and possible functions of metacognition, and assess the relationship between metacognition and consciousness. We propose a framework in which level of representation, order of behaviour and access consciousness are orthogonal dimensions of the conceptual landscape.

236 citations


Journal ArticleDOI
TL;DR: Decreased brain serotonin, induced by acute dietary tryptophan depletion, selectively impaired both behavioral and neural representations of reward outcome value, and hence the effective exchange rate by which rewards and punishments were compared.
Abstract: Establishing a function for the neuromodulator serotonin in human decision-making has proved remarkably difficult because of its complex role in reward and punishment processing. In a novel choice task where actions led concurrently and independently to the stochastic delivery of both money and pain, we studied the impact of decreased brain serotonin induced by acute dietary tryptophan depletion. Depletion selectively impaired both behavioral and neural representations of reward outcome value, and hence the effective exchange rate by which rewards and punishments were compared. This effect was computationally and anatomically distinct from a separate effect on increasing outcome-independent choice perseveration. Our results provide evidence for a surprising role for serotonin in reward processing, while illustrating its complex and multifarious effects.

216 citations
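The "effective exchange rate" in the entry above can be written down directly: when every action yields both money and pain, options can be scored by a single utility in which pain is converted into monetary units. This is a schematic formulation for illustration, not the computational model fitted in the study:

```python
def action_utility(expected_money, expected_pain, exchange_rate=1.0):
    """Net value of an action delivering both reward and punishment; the
    exchange rate converts units of pain into units of money."""
    return expected_money - exchange_rate * expected_pain

# A lower exchange rate makes the same pain weigh less against the same money,
# changing which of two mixed outcomes looks preferable.
print(action_utility(2.0, 1.5, exchange_rate=1.5))   # pain-heavy option penalized
print(action_utility(2.0, 1.5, exchange_rate=0.5))   # same option now looks attractive
```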


Journal ArticleDOI
20 Sep 2012-Neuron
TL;DR: It is shown that the brain models the values and choices of others even when these values are currently irrelevant, and that neural computations underlying self-referential and social inference are tied together.

194 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that administration of a drug that enhances dopaminergic function (dihydroxy-L-phenylalanine; L-DOPA) increases optimism bias.

Journal ArticleDOI
TL;DR: fMRI in an elderly population, in which there is a loss of dopamine neurons as part of normal aging, shows a lasting memory improvement even for weakly encoded events, suggesting a role for dopamine in human episodic memory consolidation, albeit operating within a narrow dose range.
Abstract: Activation of the hippocampus is required to encode memories for new events (or episodes). Observations from animal studies suggest that, for these memories to persist beyond 4-6 h, a release of dopamine generated by strong hippocampal activation is needed. This predicts that dopaminergic enhancement should improve human episodic memory persistence also for events encoded with weak hippocampal activation. Here, using pharmacological functional MRI (fMRI) in an elderly population in which there is a loss of dopamine neurons as part of normal aging, we show this very effect. The dopamine precursor levodopa led to a dose-dependent (inverted-U-shaped dose-response relationship) persistent episodic memory benefit for images of scenes when tested after 6 h, independent of whether encoding-related hippocampal fMRI activity was weak or strong. This lasting improvement even for weakly encoded events supports a role for dopamine in human episodic memory consolidation, albeit operating within a narrow dose range.

Journal ArticleDOI
TL;DR: This work selectively improved people’s tendency to incorporate bad news into their beliefs by disrupting the function of the left (but not right) inferior frontal gyrus using transcranial magnetic stimulation, thereby eliminating the engrained “good news/bad news effect.”
Abstract: Humans form beliefs asymmetrically; we tend to discount bad news but embrace good news. This reduced impact of unfavorable information on belief updating may have important societal implications, including the generation of financial market bubbles, ill preparedness in the face of natural disasters, and overly aggressive medical decisions. Here, we selectively improved people's tendency to incorporate bad news into their beliefs by disrupting the function of the left (but not right) inferior frontal gyrus using transcranial magnetic stimulation, thereby eliminating the engrained "good news/bad news effect." Our results provide an instance of how selective disruption of regional human brain function paradoxically enhances the ability to incorporate unfavorable information into beliefs of vulnerability.

01 Jan 2012
TL;DR: It is shown that administration of a drug that enhances dopaminergic function (dihydroxy-L-phenylalanine; L-DOPA) increases an optimism bias, providing the first evidence that the neuromodulator dopamine impacts on belief formation by reducing negative expectations regarding the future.
Abstract: When predicting financial profits [1], relationship outcomes [2], longevity [3], or professional success [4], people habitually underestimate the likelihood of future negative events (for review see [5]). This well-known bias, termed unrealistic optimism [6], is observed across age [7], culture [8], and species [9] and has a significant societal impact on domains ranging from financial markets to health and well-being. However, it is unknown how neuromodulatory systems impact on the generation of optimistically biased beliefs. This question assumes great importance in light of evidence that common neuropsychiatric disorders, such as depression, are characterized by pessimism [10, 11]. Here, we show that administration of a drug that enhances dopaminergic function (dihydroxy-L-phenylalanine; L-DOPA) increases an optimism bias. This effect is due to L-DOPA impairing the ability to update belief in response to undesirable information about the future. These findings provide the first evidence that the neuromodulator dopamine impacts on belief formation by reducing negative expectations regarding the future.
Results: Humans are optimistically biased when making predictions about the future, habitually underestimating the likelihood of negative events [1–8]. This bias is related to a striking asymmetry whereby people update their beliefs more in response to information that is better than expected compared to information that is worse than expected [12, 13]. Selective updating is mediated by regions of the frontal cortex that track errors in estimation when these call for a positive update but show a relative failure to code for errors that might induce a negative update [12]. An unresolved question is whether neuromodulators associated with generating expectations of future outcomes influence this process. A prominent candidate is the monoamine dopamine, a neuromodulator suggested to provide a teaching signal that indexes when predictions fail to align with outcomes [14, 15]. In Parkinson's disease, drugs enhancing dopaminergic function (e.g., dihydroxy-L-phenylalanine; L-DOPA) influence learning of positive and negative outcomes in an asymmetric manner, enhancing the former and impairing the latter [16]. Dopamine effects on learning have been extensively studied in the context of model-free reinforcement learning [14–16]. However, dopamine also impacts on domains as diverse as working memory, episodic memory, and reversal learning [17, 18]. Given this set of findings [12, 16], we hypothesize that enhancing dopamine function will influence how healthy individuals incorporate information about probabilities of future life events in an asymmetric manner, increasing an optimism bias.
To test whether an optimism bias is modulated by dopamine, participants completed a belief update task [12] on two separate days, one week apart (Figure 1), in a double-blind placebo-controlled pharmacological intervention study. On one of the days, participants received placebo and on the other they received L-DOPA (150 mg), in a counterbalanced order (n = 21). The task was identical on both days except for the fact that different stimuli were used on each day (lists were counterbalanced). At each session, participants provided estimates of their likelihood of experiencing 40 different types of adverse life events (e.g., Alzheimer's disease, robbery; see the List of Stimuli in the Supplemental Experimental Procedures available online) adapted from a previous study [12]. After each trial, they were presented with an actuarial average probability of that event occurring to a person from the same sociocultural environment. We then assessed whether participants used this information to update their predictions by subsequently asking them to again estimate their likelihoods for the same 40 events in a second session, taking place ~15 min after the first session. They also completed a memory test for the information presented and rated all stimuli on different subjective scales (for a full description, see Supplemental Experimental Procedures). To test whether effects might be observed when manipulating another neuromodulator implicated in learning about reward and punishment, we administered the serotonergic reuptake inhibitor citalopram (24 mg in oral drops, equivalent to 30 mg in tablets) to a second group of participants (n = 19). Serotonin neurotransmission is suggested to be involved in aversive processing and inhibition ([19, 20], but see [21]). However, the nature of its role in learning is less established than is the case for dopamine.
Optimism Bias Grows with Increased Dopamine Function: We found that enhancing participants' dopamine function increased their prediction bias in an optimistic direction. Specifically, for each participant on each trial, we subtracted the participant's estimation of how likely they were to encounter the negative event from the average probability of encountering that event (i.e., estimation error = estimation − probability presented). If the average estimation error was negative, then this indicated that participants tended to underestimate
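The estimation-error logic spelled out at the end of the excerpt above lends itself to a compact illustration. The sketch below computes trial-wise estimation errors and compares updates after "good news" (the presented base rate is lower than feared) with updates after "bad news"; the numbers are invented and the scoring is only one plausible reading of the described procedure:

```python
import numpy as np

def update_asymmetry(first_estimates, presented_probs, second_estimates):
    """Mean belief update after good vs. bad news.
    Estimation error = own estimate - presented base rate (as defined above);
    a positive error means the participant had overestimated the risk (good news)."""
    first = np.asarray(first_estimates, dtype=float)
    base = np.asarray(presented_probs, dtype=float)
    second = np.asarray(second_estimates, dtype=float)
    errors = first - base
    update = first - second                            # positive = risk estimate revised downward
    good_news_update = update[errors > 0].mean()
    bad_news_update = (-update[errors < 0]).mean()     # adaptive update raises the estimate
    return good_news_update, bad_news_update

# Invented likelihood estimates (%) for four adverse events across the two elicitations.
print(update_asymmetry(first_estimates=[40, 30, 10, 15],
                       presented_probs=[25, 20, 25, 30],
                       second_estimates=[30, 24, 12, 18]))
```

An optimism bias of the kind described corresponds to the first number exceeding the second; the paper's claim is that L-DOPA widens this gap by blunting updates after undesirable information.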

Journal ArticleDOI
01 Jun 2012-Brain
TL;DR: It is found that dopamine modulation of nucleus accumbens and ventromedial prefrontal cortex exerts a specific effect on choice behaviour distinct from pure learning, and that dopamine plays a key role in performance that may be distinct from its role in learning.
Abstract: The role dopamine plays in decision-making has important theoretical, empirical and clinical implications. Here, we examined its precise contribution by exploiting the lesion deficit model afforded by Parkinson’s disease. We studied patients in a two-stage reinforcement learning task, while they were ON and OFF dopamine replacement medication. Contrary to expectation, we found that dopaminergic drug state (ON or OFF) did not impact learning. Instead, the critical factor was drug state during the performance phase, with patients ON medication choosing correctly significantly more frequently than those OFF medication. This effect was independent of drug state during initial learning and appears to reflect a facilitation of generalization for learnt information. This inference is bolstered by our observation that neural activity in nucleus accumbens and ventromedial prefrontal cortex, measured during simultaneously acquired functional magnetic resonance imaging, represented learnt stimulus values during performance. This effect was expressed solely during the ON state with activity in these regions correlating with better performance. Our data indicate that dopamine modulation of nucleus accumbens and ventromedial prefrontal cortex exerts a specific effect on choice behaviour distinct from pure learning. The findings are in keeping with the substantial other evidence that certain aspects of learning are unaffected by dopamine lesions or depletion, and that dopamine plays a key role in performance that may be distinct from its role in learning.

Journal ArticleDOI
TL;DR: These findings show that cholinergic neuromodulation enhances attentional selection via an impact on oscillatory synchrony in visual cortex, at low rather than high frequencies, consistent with proposals that lower-frequency oscillations are generated by feedback pathways within visual cortex.

Journal ArticleDOI
TL;DR: It is shown that dopamine enhanced the neural representation of rewarding actions, without significantly affecting the representation of reward value as such, which highlights a key role for dopamine in the generation of appetitively motivated actions.
Abstract: Dopamine is widely observed to signal anticipation of future rewards and thus thought to be a key contributor to affectively charged decision making. However, the experiments supporting this view have not dissociated rewards from the actions that lead to, or are occasioned by, them. Here, we manipulated dopamine pharmacologically and examined the effect on a task that explicitly dissociates action and reward value. We show that dopamine enhanced the neural representation of rewarding actions, without significantly affecting the representation of reward value as such. Thus, increasing dopamine levels with levodopa selectively boosted striatal and substantia nigra/ventral tegmental representations associated with actions leading to reward, but not with actions leading to the avoidance of punishment. These findings highlight a key role for dopamine in the generation of appetitively motivated actions.

Journal ArticleDOI
TL;DR: As noted in the paper's acknowledgements, Montague was supported by National Institute on Drug Abuse grant no. R01DA011723-11.

Journal ArticleDOI
TL;DR: The data support the idea that an expedited evaluation of sensory input is best explained by an architecture that involves a subcortical path to the amygdala: neuronal responses elicited by salient information were better explained when a subcortical pathway was included.

Journal ArticleDOI
TL;DR: The data suggest that MTL novelty signals are interpreted in terms of their reward-predicting properties in the mOFC, which biases striatal reward responses; the striatum, together with the SN/VTA, then regulates MTL-dependent long-term memory formation and contextual exploration bonus signals in the hippocampus.
Abstract: Medial temporal lobe (MTL) dependent long-term memory for novel events is modulated by a circuitry that also responds to reward and includes the ventral striatum, dopaminergic midbrain, and medial orbitofrontal cortex (mOFC). This common neural network may reflect a functional link between novelty and reward whereby novelty motivates exploration in the search for rewards; a link also termed novelty "exploration bonus." We used fMRI in a scene encoding paradigm to investigate the interaction between novelty and reward with a focus on neural signals akin to an exploration bonus. As expected, reward related long-term memory for the scenes (after 24 hours) strongly correlated with activity of MTL, ventral striatum, and substantia nigra/ventral tegmental area (SN/VTA). Furthermore, the hippocampus showed a main effect of novelty, the striatum showed a main effect of reward, and the mOFC signalled both novelty and reward. An interaction between novelty and reward akin to an exploration bonus was found in the hippocampus. These data suggest that MTL novelty signals are interpreted in terms of their reward-predicting properties in the mOFC, which biases striatal reward responses. The striatum together with the SN/VTA then regulates MTL-dependent long-term memory formation and contextual exploration bonus signals in the hippocampus.

Journal ArticleDOI
TL;DR: It is found that choices altered preferences both immediately after being made and after the delay, providing evidence that making a decision can lead to enduring change in preferences.
Abstract: The idea that decisions alter preferences has had a considerable influence on the field of psychology and underpins cognitive dissonance theory. Yet it is unknown whether choice-induced changes in preferences are long lasting or are transient manifestations seen in the immediate aftermath of decisions. In the research reported here, we investigated whether these changes in preferences are fleeting or stable. Participants rated vacation destinations before making hypothetical choices between destinations, immediately afterward, and 2.5 to 3 years later. We found that choices altered preferences both immediately after being made and after the delay. These changes could not be accounted for by participants' preexisting preferences, and they occurred only when participants made the choices themselves. Our findings provide evidence that making a decision can lead to enduring change in preferences.

Journal ArticleDOI
TL;DR: The findings show that the biological control of social behaviour is dynamically regulated not only by modulators promoting, but also by those diminishing a propensity to collaborate.
Abstract: Collaboration can provide benefits to the individual and the group across a variety of contexts. Even in simple perceptual tasks, the aggregation of individuals' personal information can enable enhanced group decision-making. However, in certain circumstances such collaboration can worsen performance, or even expose an individual to exploitation in economic tasks, and therefore a balance needs to be struck between a collaborative and a more egocentric disposition. Neurohumoral agents such as oxytocin are known to promote collaborative behaviours in economic tasks, but whether there are opponent agents, and whether these might even affect information aggregation without an economic component, is unknown. Here, we show that an androgen hormone, testosterone, acts as such an agent. Testosterone causally disrupted collaborative decision-making in a perceptual decision task, markedly reducing performance benefit individuals accrued from collaboration while leaving individual decision-making ability unaffected. This effect emerged because testosterone engendered more egocentric choices, manifest in an overweighting of one's own relative to others' judgements during joint decision-making. Our findings show that the biological control of social behaviour is dynamically regulated not only by modulators promoting, but also by those diminishing a propensity to collaborate.

01 Jan 2012
TL;DR: It is found that the subcortical pathway was most important early in stimulus processing, with the largest effect evident in the context of fearful faces.
Abstract: The amygdala plays a central role in evaluating the behavioral importance of sensory information. Anatomical subcortical pathways provide direct input to the amygdala from early sensory systems and may support an adaptively valuable rapid appraisal of salient information [1–3]. However, the functional significance of these subcortical inputs remains controversial [4]. We recorded magnetoencephalographic activity evoked by tones in the context of emotionally valent faces and tested two competing biologically motivated dynamic causal models [5, 6] against these data: the dual and cortical models. The dual model comprised two parallel (cortical and subcortical) routes to the amygdala, whereas the cortical model excluded the subcortical path. We found that neuronal responses elicited by salient information were better explained when a subcortical pathway was included. In keeping with its putative functional role of rapid stimulus appraisal, the subcortical pathway was most important early in stimulus processing. However, contrary to what is often assumed, its action was not limited to the context of fear, pointing to a more widespread information-processing role. Thus, our data support the idea that an expedited evaluation of sensory input is best explained by an architecture that involves a subcortical path to the amygdala.
Results: Our goal was to assess the explanatory power of a fast subcortical route in salient information processing. We first investigated whether brain responses elicited by a salient context, such as unpredictable information under threat, were better modeled with or without a subcortical "low route." We hypothesized that early evoked responses would be better explained by the dual-route model and predicted that a subcortical pathway would play a more significant role in early, rather than later, time epochs. The critical factor in such a model is rapidity of processing, and this mandates a methodology with adequate temporal resolution. Thus, we used computational modeling to compare models, with and without the subcortical pathway, and evaluated their predictions in terms of how well they explained evoked magnetoencephalographic (MEG) data. In addition, we asked whether the functional role of the subcortical pathway depends on stimulus predictability and emotional context. This provides an opportunity to address an unresolved and controversial question as to the degree to which subcortical processing promotes expeditious evaluation of biological significance in sensory information.
Surprise-Evoked Fields Are Enhanced in a Fearful Context: We presented participants with a sequence of predictable and surprising pure tone sounds. Subjects simultaneously performed a gender discrimination task on visually presented faces with neutral, happy, or fearful expressions (Figure 1). Responses to predictable, or standard, sounds were similar in all three contexts. However, the strength of the fields evoked by oddballs, or surprising events, increased with the emotional salience of facial expressions. This gradient was particularly evident in the period of 100–150 ms poststimulus, with the largest effect being evident in the context of fearful faces, consistent with previous studies [7] (Figure 2A).
Enhanced Early Amygdala Activity with a Subcortical Pathway: We estimated the activity at each source included in two competing dynamic causal models (DCMs) [5] for oddballs under fear (Figure 2C and 2D). The cortical model (C) included a cortical pathway only, which tests a hypothesis that information about auditory objects reaches the amygdala after being processed by the auditory thalamus (MGB) and primary auditory cortex (A1). On the other hand, a dual-route, or cortical and subcortical, model (CS) included a cortical and subcortical pathway, expressing a hypothesis that information reaches the amygdala both directly through a thalamic projection and indirectly through a cortical route (Figure 2D). Activity in A1 as estimated by both models was similar. Crucially, we found that the dual-route model could recover early amygdala activity (peaking at ~50 ms and ~100 ms). Conversely, the absence of the subcortical pathway linking MGB to AMY caused early (<100 ms) amygdala activity to disappear. The cortical model could only recover late amygdala activity (peaking at about 150 ms) (Figure 2C). This dissociation supports the role of a subcortical pathway in conveying rapid information to the amygdala.
Time-Specific Role of the Subcortical Pathway: Neuroanatomical tracings in the rat demonstrate the existence of two parallel processing pathways involving a thalamo-cortico-amygdala and a direct thalamo-amygdala pathway [8]. There is also evidence that auditory inputs can access the basolateral amygdala from both the auditory thalamus and the cortex [9–11]. Crucially, direct subcortical connections between the auditory thalamus and the amygdala are alone sufficient for some forms of fear conditioning [12–14]. On this basis it is argued that a subcortical pathway plays an important role in adaptive behavior. Indeed, the ability to rapidly process behaviorally relevant information represents a biological advantage in a potentially dangerous environment. Hence, a fast route that bypasses cortical processing is central to the dual-route hypothesis [2]. Motivated by this and the source analysis described above (Figure 2C), we asked whether the relevance of the subcortical pathway was dependent on time. We hypothesized that the functional role of the sub-

Journal ArticleDOI
TL;DR: A recently developed multivariate decoding method is applied to high-resolution fMRI data from subjects performing an instrumental learning task, yielding strong evidence that both action-specific and action-independent value signals are represented in a distributed fashion.
Abstract: Estimating the value of potential actions is crucial for learning and adaptive behavior. We know little about how the human brain represents action-specific value outside of motor areas. This is, in part, due to a difficulty in detecting the neural correlates of value using conventional (region of interest) functional magnetic resonance imaging (fMRI) analyses, due to a potential distributed representation of value. We address this limitation by applying a recently developed multivariate decoding method to high-resolution fMRI data in subjects performing an instrumental learning task. We found evidence for action-specific value signals in circumscribed regions, specifically ventromedial prefrontal cortex, putamen, thalamus, and insula cortex. In contrast, action-independent value signals were more widely represented across a large set of brain areas. Using multivariate Bayesian model comparison, we formally tested whether value-specific responses are spatially distributed or coherent. We found strong evidence that both action-specific and action-independent value signals are represented in a distributed fashion. Our results suggest that a surprisingly large number of classical reward-related areas contain distributed representations of action-specific values, representations that are likely to mediate between reward and adaptive behavior.
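The logic of pattern-based (multivariate) value decoding can be illustrated with generic tools; the snippet below uses scikit-learn ridge regression on simulated voxel patterns and is not the multivariate Bayesian method used in the paper:

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

# Simulate trials in which a value signal is spread weakly across many voxels.
rng = np.random.default_rng(1)
n_trials, n_voxels = 120, 50
values = rng.uniform(0, 1, n_trials)            # hypothetical trial-wise action values
weights = rng.normal(0, 1, n_voxels)            # distributed encoding weights
patterns = np.outer(values, weights) + rng.normal(0, 1.0, (n_trials, n_voxels))

# Ask whether held-out trials' values can be predicted from the multi-voxel pattern.
decoder = RidgeCV(alphas=[0.1, 1.0, 10.0])
scores = cross_val_score(decoder, patterns, values, cv=5, scoring="r2")
print(scores.mean())   # cross-validated R^2 > 0 indicates decodable value information
```

The point of such decoders matches the motivation given in the abstract: information that is weak or distributed at the level of individual voxels can still be read out from the joint pattern, where a conventional region-of-interest analysis may fail.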

Journal ArticleDOI
TL;DR: It is highlighted that a reduced (top-down) influence of the MTL on ipsilateral language regions is accompanied by enhanced reciprocal coupling in the undamaged hemisphere, providing a first demonstration of "connectional diaschisis."
Abstract: Accumulating evidence suggests a role for the medial temporal lobe (MTL) in working memory (WM). However, little is known concerning its functional interactions with other cortical regions in the distributed neural network subserving WM. To reveal these, we availed of subjects with MTL damage and characterized changes in effective connectivity while subjects engaged in a WM task. Specifically, we compared dynamic causal models, extracted from magnetoencephalographic recordings during verbal WM encoding, in temporal lobe epilepsy patients (with left hippocampal sclerosis) and controls. Bayesian model comparison indicated that the best model (across subjects) evidenced bilateral, forward, and backward connections, coupling inferior temporal cortex (ITC), inferior frontal cortex (IFC), and MTL. MTL damage weakened backward connections from left MTL to left ITC, a decrease accompanied by strengthening of (bidirectional) connections between IFC and MTL in the contralesional hemisphere. These findings provide novel evidence concerning functional interactions between nodes of this fundamental cognitive network and shed light on how these interactions are modified as a result of focal damage to MTL. The findings highlight that a reduced (top-down) influence of the MTL on ipsilateral language regions is accompanied by enhanced reciprocal coupling in the undamaged hemisphere, providing a first demonstration of "connectional diaschisis."

Journal ArticleDOI
TL;DR: It is reported that the tendency to shift reported desire for objects toward values expressed by other people has an anatomical correlate in the human brain, in the form of a linear relationship with grey matter volume in a region of lateral orbitofrontal cortex (lOFC).

Journal ArticleDOI
TL;DR: This work demonstrates that reward prediction errors in the human striatum are expressed according to an adaptive coding scheme and shows that adaptive coding is gated by changes in effective connectivity between the striatum and other reward-sensitive regions, namely the midbrain and the medial prefrontal cortex.
Abstract: To efficiently represent all of the possible rewards in the world, dopaminergic midbrain neurons dynamically adapt their coding range to the momentarily available rewards. Specifically, these neurons increase their activity for an outcome that is better than expected and decrease it for an outcome worse than expected, independent of the absolute reward magnitude. Although this adaptive coding is well documented, it remains unknown how this rescaling is implemented. To investigate the adaptive coding of prediction errors and its underlying rescaling process, we used human functional magnetic resonance imaging (fMRI) in combination with a reward prediction task that involved different reward magnitudes. We demonstrate that reward prediction errors in the human striatum are expressed according to an adaptive coding scheme. Strikingly, we show that adaptive coding is gated by changes in effective connectivity between the striatum and other reward-sensitive regions, namely the midbrain and the medial prefrontal cortex. Our results provide evidence that striatal prediction errors are normalized by a magnitude-dependent alteration in the interregional connectivity within the brain's reward system.
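A minimal way to express the adaptive (range-normalized) coding scheme described above, as a schematic rather than the fitted neural model:

```python
def adaptive_prediction_error(outcome, expected, reward_range):
    """Range adaptation: the raw prediction error is rescaled by the spread of
    rewards currently on offer, so a fixed neural coding range can serve both
    small-magnitude and large-magnitude reward contexts."""
    return (outcome - expected) / reward_range

# The same relative surprise maps onto the same scaled signal in a small (0-1)
# and a large (0-10) reward context.
print(adaptive_prediction_error(outcome=1.0, expected=0.5, reward_range=1.0))
print(adaptive_prediction_error(outcome=10.0, expected=5.0, reward_range=10.0))
```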

Journal ArticleDOI
TL;DR: The magnetic signature of prediction errors in the human brain was identified for the first time; it emerged approximately 320 ms after an outcome and was expressed as an interaction between outcome valence and probability.

Journal ArticleDOI
TL;DR: It is suggested that, in the choice process, risk and loss can independently engage approach–avoidance mechanisms, which provides a novel explanation for how risk influences action selection and accounts both for classically described choice behavior and for behavioral patterns not predicted by existing theory.
Abstract: Value-based choices are influenced both by risk in potential outcomes and by whether outcomes reflect potential gains or losses. These variables are held to be related in a specific fashion, manifest in risk aversion for gains and risk seeking for losses. Instead, we hypothesized that there are independent impacts of risk and loss on choice such that, depending on context, subjects can show either risk aversion for gains and risk seeking for losses or the exact opposite. We demonstrate this independence in a gambling task, by selectively reversing a loss-induced effect (causing more gambling for gains than losses and the reverse) while leaving risk aversion unaffected. Consistent with these dissociable behavioral impacts of risk and loss, fMRI data revealed dissociable neural correlates of these variables, with parietal cortex tracking risk and orbitofrontal cortex and striatum tracking loss. Based on our neural data, we hypothesized that risk and loss influence action selection through approach–avoidance mechanisms, a hypothesis supported in an experiment in which we show valence and risk-dependent reaction time effects in line with this putative mechanism. We suggest that in the choice process risk and loss can independently engage approach–avoidance mechanisms. This can provide a novel explanation for how risk influences action selection and explains both classically described choice behavior as well as behavioral patterns not predicted by existing theory.
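The two variables dissociated in this study are often operationalized with simple summary statistics: expected value for valence-related quantities and outcome variance for risk. The sketch below is only that textbook operationalization, applied to invented gambles, not the study's design:

```python
import numpy as np

def gamble_stats(outcomes, probs):
    """Expected value and risk (outcome variance) of a simple gamble."""
    outcomes = np.asarray(outcomes, dtype=float)
    probs = np.asarray(probs, dtype=float)
    ev = np.dot(probs, outcomes)
    risk = np.dot(probs, (outcomes - ev) ** 2)
    return ev, risk

# A gain-frame and a loss-frame gamble with identical risk but opposite valence,
# the kind of decomposition that lets risk and loss effects be examined separately.
print(gamble_stats([0, 10], [0.5, 0.5]))
print(gamble_stats([0, -10], [0.5, 0.5]))
```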

Journal ArticleDOI
15 Oct 2012-PLOS ONE
TL;DR: In this paper, the authors test the hypothesis that movement time in Parkinson's disease can be modulated by the specific nature of the motivational salience of possible action-outcomes.
Abstract: Bradykinesia is a cardinal feature of Parkinson's disease (PD). Despite its disabling impact, the precise cause of this symptom remains elusive. Recent thinking suggests that bradykinesia may be more than simply a manifestation of motor slowness, and may in part reflect a specific deficit in the operation of motivational vigour in the striatum. In this paper we test the hypothesis that movement time in PD can be modulated by the specific nature of the motivational salience of possible action-outcomes.