A Computational Account of Real-World Attentional Allocation Based on Visual Gain Fields

bioRxiv (2023)

Abstract
Coordination of goal-directed behaviour depends on the brain’s ability to recover the locations of relevant objects in the world. In humans, the visual system encodes the spatial organisation of sensory inputs, but neurons in early visual areas map objects according to their retinal positions, rather than where they are in the world. How the brain computes world-referenced spatial information across eye movements has been widely researched and debated. Here we tested whether shifts of covert attention are sufficiently precise in space and time to track an object’s real-world location across eye movements. We found that observers’ attentional selectivity is remarkably precise, and is barely perturbed by the execution of saccades. Inspired by recent neurophysiological discoveries, we developed an observer model that rapidly estimates the real-world locations of objects and allocates attention within this reference frame. The model recapitulates the human data and provides a parsimonious explanation for previously reported phenomena in which observers allocate attention to task-irrelevant locations across eye movements. Our findings reveal that visual attention operates in real-world coordinates, which can be computed rapidly at the earliest stages of cortical processing.

Competing Interest Statement

The authors have declared no competing interest.
Keywords
attentional allocation, visual, real-world