Independent working memory resources for egocentric and allocentric spatial information
David Aagten-Murphy and Paul M. Bays
PLOS Computational Biology, 2019, vol. 15, issue 2, 1-20
Abstract:
Visuospatial working memory enables us to maintain access to visual information for processing even when a stimulus is no longer present, whether due to occlusion, our own movements, or the transience of the stimulus itself. Here we show that, when localizing remembered stimuli, the precision of spatial recall does not rely solely on memory for individual stimuli, but additionally depends on the relative distances between stimuli and visual landmarks in the surroundings. Across three separate experiments, we consistently observed a spatially selective improvement in the precision of recall for items located near a persistent landmark. While the results did not require that the landmark be visible throughout the memory delay period, it was essential that it was visible both during encoding and at response. We present a simple model that accurately captures human performance by treating relative (allocentric) spatial information as an independent localization estimate which degrades with distance and is optimally integrated with egocentric spatial information. Critically, allocentric information was encoded without cost to egocentric estimation, demonstrating independent storage of the two sources of information. Finally, when egocentric and allocentric estimates were put in conflict, the model successfully predicted the resulting localization errors. We suggest that the relative distance between stimuli represents an additional, independent spatial cue for memory recall. This cue information is likely to be critical for spatial localization in natural settings, which contain an abundance of visual landmarks.

Author summary:
Human capacity to maintain spatial information over brief interruptions is strongly limited. However, while studies of visual working memory typically examine recall in sparse displays, consisting only of the stimuli to be remembered, natural scenes are commonly filled with other objects that, although not required to be remembered, may nevertheless influence subsequent localization. We demonstrate that memory for spatial location depends on independent stores for egocentric (relative to the observer) and allocentric (relative to other stimuli) information about object position. Both types of spatial representation become increasingly imprecise as the number of objects in memory increases. However, even when visual landmarks are present and allocentric information is encoded, there is no change in egocentric precision. This suggests that encoding additional allocentric spatial information does not compete for working memory resources with egocentric spatial information. Additionally, the fidelity of allocentric position information diminished rapidly with distance, resulting in a spatially specific advantage for recall of objects in the vicinity of stable landmarks. The effect of a landmark on recall matches that of an ideal observer who optimally combines egocentric and allocentric cues. This work provides a new experimental and theoretical framework for the investigation of spatial memory mechanisms.
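The optimal integration described in the abstract corresponds to standard reliability-weighted (inverse-variance) cue combination. The Python sketch below is only an illustration of that general idea, not the authors' fitted model: the function names, parameter values, and the linear growth of allocentric noise with landmark distance are assumptions introduced here for clarity.

```python
import numpy as np

def integrate_estimates(x_ego, sigma_ego, x_allo, sigma_allo):
    """Combine two independent position estimates by inverse-variance weighting,
    the standard form of optimal (maximum-likelihood) cue integration."""
    w_ego = 1.0 / sigma_ego ** 2
    w_allo = 1.0 / sigma_allo ** 2
    x_combined = (w_ego * x_ego + w_allo * x_allo) / (w_ego + w_allo)
    sigma_combined = np.sqrt(1.0 / (w_ego + w_allo))  # never less precise than either cue alone
    return x_combined, sigma_combined

# Hypothetical example: allocentric noise grows with distance to the landmark,
# so the landmark benefit is spatially selective (largest for nearby items).
sigma_ego = 1.0                                  # egocentric recall noise (arbitrary units)
distance_to_landmark = 2.0                       # item-landmark distance
sigma_allo = 0.5 + 0.4 * distance_to_landmark    # illustrative distance-dependent degradation

x, sigma = integrate_estimates(x_ego=3.1, sigma_ego=sigma_ego,
                               x_allo=2.8, sigma_allo=sigma_allo)
print(f"combined estimate: {x:.2f}, combined SD: {sigma:.2f}")
```

With these assumed values, the combined standard deviation is smaller than either the egocentric or the allocentric noise alone, which is the signature of optimal integration; as the item-landmark distance grows, the allocentric weight shrinks and recall precision approaches the purely egocentric level.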
Date: 2019
Downloads:
https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1006563 (text/html)
https://journals.plos.org/ploscompbiol/article/fil ... 06563&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pcbi00:1006563
DOI: 10.1371/journal.pcbi.1006563