Incremental accumulation of linguistic context in artificial and biological neural networks
Refael Tikochinski,
Ariel Goldstein,
Yoav Meiri,
Uri Hasson and
Roi Reichart
Additional contact information
Refael Tikochinski: Technion - Israel Institute of Technology
Ariel Goldstein: The Hebrew University of Jerusalem
Yoav Meiri: Technion - Israel Institute of Technology
Uri Hasson: Princeton University
Roi Reichart: Technion - Israel Institute of Technology
Nature Communications, 2025, vol. 16, issue 1, 1-11
Abstract:
Large Language Models (LLMs) have shown success in predicting neural signals associated with narrative processing, but their approach to integrating context over large timescales differs fundamentally from that of the human brain. In this study, we show how the brain, unlike LLMs that process large text windows in parallel, integrates short-term and long-term contextual information through an incremental mechanism. Using fMRI data from 219 participants listening to spoken narratives, we first demonstrate that LLMs predict brain activity effectively only when using short contextual windows of up to a few dozen words. Next, we introduce an alternative LLM-based incremental-context model that combines incoming short-term context with an aggregated, dynamically updated summary of prior context. This model significantly enhances the prediction of neural activity in higher-order regions involved in long-timescale processing. Our findings reveal how the brain’s hierarchical temporal processing mechanisms enable the flexible integration of information over time, providing valuable insights for both cognitive neuroscience and AI development.
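The incremental-context idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the `summarize` function here is a hypothetical stand-in (simple truncation of the aggregated context), where a real system would use an LLM to produce an abstractive summary. The point it demonstrates is the control flow: each prediction step conditions on a short-term window plus a running, dynamically updated summary, rather than on one large parallel context window.

```python
def summarize(summary, new_words, max_len=20):
    # Hypothetical compression step: keep only the most recent max_len words
    # of the aggregated context. A real model would abstract, not truncate.
    combined = summary + new_words
    return combined[-max_len:]

def incremental_context(words, window=10):
    """Yield the context available at each step: running summary + short window."""
    summary = []
    for i in range(0, len(words), window):
        chunk = words[i:i + window]          # incoming short-term context
        yield summary + chunk                # condition on summary + window
        summary = summarize(summary, chunk)  # update the aggregated summary

text = ("the brain integrates short term and long term context "
        "through an incremental mechanism rather than a single large window").split()
contexts = list(incremental_context(text, window=5))
```

The first step sees only the short window (the summary is empty); later steps see the window plus whatever the summary update has retained, so long-range information persists in compressed form.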
Date: 2025
Downloads: https://www.nature.com/articles/s41467-025-56162-9 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-56162-9
Ordering information: this journal article can be ordered from https://www.nature.com/ncomms/
DOI: 10.1038/s41467-025-56162-9
Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie