EconPapers

Multiple spatial frames for immersive working memory

Dejan Draschkow, Anna C. Nobre and Freek van Ede
Additional contact information
Dejan Draschkow: University of Oxford
Anna C. Nobre: University of Oxford
Freek van Ede: University of Oxford

Nature Human Behaviour, 2022, vol. 6, issue 4, 536-544

Abstract: As we move around, relevant information that disappears from sight can still be held in working memory to serve upcoming behaviour. How we maintain and select visual information as we move through the environment remains poorly understood because most laboratory tasks of working memory rely on removing visual material while participants remain still. We used virtual reality to study visual working memory following self-movement in immersive environments. Directional biases in gaze revealed the recruitment of more than one spatial frame for maintaining and selecting memoranda following self-movement. The findings bring the important realization that multiple spatial frames support working memory in natural behaviour. The results also illustrate how virtual reality can be a critical experimental tool to characterize this core memory system.

Date: 2022

Downloads: (external link)
https://www.nature.com/articles/s41562-021-01245-y Abstract (text/html)
Access to the full text of the articles in this series is restricted.



Persistent link: https://EconPapers.repec.org/RePEc:nat:nathum:v:6:y:2022:i:4:d:10.1038_s41562-021-01245-y

Ordering information: This journal article can be ordered from
https://www.nature.com/nathumbehav/

DOI: 10.1038/s41562-021-01245-y


Nature Human Behaviour is currently edited by Stavroula Kousta

More articles in Nature Human Behaviour from Nature
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-03-19
Handle: RePEc:nat:nathum:v:6:y:2022:i:4:d:10.1038_s41562-021-01245-y