Abstract
Primates make decisions visually by shifting their view from one object to the next, comparing values between objects, and choosing the best reward, even before acting. Here, we show that when monkeys make value-guided choices, amygdala neurons encode their decisions in an abstract, purely internal representation defined by the monkey’s current view but not by specific object or reward properties. Across amygdala subdivisions, recorded activity patterns evolved gradually from an object-specific value code to a transient, object-independent code in which currently viewed and last-viewed objects competed to reflect the emerging view-based choice. Using neural-network modeling, we identified a sequence of computations by which amygdala neurons implemented view-based decision making and eventually recovered the chosen object’s identity when the monkeys acted on their choice. These findings reveal a neural mechanism in the amygdala that derives object choices from abstract, view-based computations, suggesting an efficient solution for decision problems with many objects.
Fabian Grabenhorst, Adrián Ponce-Alvarez, Alexandra Battaglia-Mayer, Gustavo Deco and Wolfram Schultz. A view-based decision mechanism for rewards in the primate amygdala. Neuron, September 2023. [LINK]
Speaker: Qiyue Zhang
Time: 9:00 am, 2023/10/16
Location: CIBR A622