Expected value, reward outcome, and temporal difference error representations in a probabilistic decision task
Rolls, Edmund T., McCabe, C. and Redoute, J. (2008) Expected value, reward outcome, and temporal difference error representations in a probabilistic decision task. Cerebral Cortex, Vol. 18 (No. 3), pp. 652-663. ISSN 1047-3211.
Official URL: http://dx.doi.org/10.1093/cercor/bhm097
In probabilistic decision tasks, an expected value (EV) of a choice is calculated, and after the choice has been made, this can be updated based on a temporal difference (TD) prediction error between the EV and the reward magnitude (RM) obtained. The EV is measured as the probability of obtaining a reward × RM. To understand the contribution of different brain areas to these decision-making processes, functional magnetic resonance imaging activations related to EV versus RM (or outcome) were measured in a probabilistic decision task. Activations in the medial orbitofrontal cortex were correlated with both RM and EV, and a conjunction analysis confirmed that these correlations extended toward the pregenual cingulate cortex. From these representations, TD reward prediction errors could be produced. Activations in areas that receive from the orbitofrontal cortex, including the ventral striatum, midbrain, and inferior frontal gyrus, were correlated with the TD error. Activations in the anterior insula were correlated negatively with EV, occurring when low reward outcomes were expected, and also with the uncertainty of the reward, implicating this region in two basic and crucial decision-making parameters: low expected outcomes and uncertainty.
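The relationship between EV, reward outcome, and the TD prediction error described in the abstract can be sketched as follows. This is a minimal illustration of the standard TD/Rescorla-Wagner formulation, not the paper's analysis code; the learning rate `alpha` and the update rule are illustrative assumptions.

```python
def expected_value(p_reward, reward_magnitude):
    """EV = probability of obtaining a reward x reward magnitude (RM)."""
    return p_reward * reward_magnitude

def td_error(reward_obtained, ev):
    """TD prediction error: obtained reward minus expected value."""
    return reward_obtained - ev

def update_ev(ev, delta, alpha=0.1):
    """Update the EV from the TD error (illustrative learning rate)."""
    return ev + alpha * delta

# Example trial: reward of magnitude 1.0 delivered with probability 0.5.
ev = expected_value(0.5, 1.0)    # EV = 0.5
delta = td_error(1.0, ev)        # reward delivered, so delta = +0.5
ev_new = update_ev(ev, delta)    # EV nudged toward the outcome: 0.55
```

On a trial where the reward is omitted, `td_error(0.0, ev)` is negative and the EV is revised downward, which is the sign-dependent error signal attributed above to the ventral striatum, midbrain, and inferior frontal gyrus.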
Item Type: Journal Article
Subjects: Q Science > QA Mathematics > QA76 Electronic computers. Computer science. Computer software
Divisions: Faculty of Science > Computer Science
Journal or Publication Title: Cerebral Cortex
Publisher: Oxford University Press
Page Range: pp. 652-663
Access rights to Published version: Restricted or Subscription Access