Ubiquitous memory augmentation via mobile multimodal embedding system
Dongqi Cai,
Shangguang Wang,
Chen Peng,
Zeling Zhang,
Zhenyan Lu,
Tao Qi,
Nicholas D. Lane and
Mengwei Xu
Additional contact information
Dongqi Cai: Beijing University of Posts and Telecommunications
Shangguang Wang: Beijing University of Posts and Telecommunications
Chen Peng: Beijing University of Posts and Telecommunications
Zeling Zhang: Beijing University of Posts and Telecommunications
Zhenyan Lu: Beijing University of Posts and Telecommunications
Tao Qi: Beijing University of Posts and Telecommunications
Nicholas D. Lane: University of Cambridge
Mengwei Xu: Beijing University of Posts and Telecommunications
Nature Communications, 2025, vol. 16, issue 1, 1-12
Abstract:
Forgetting is inevitable in human memory. Recently, multimodal embedding models have been proposed to vectorize multimodal reality into a unified embedding space. Once generated, these embeddings allow mobile users to quickly retrieve relevant information, effectively augmenting their memory. However, as the model's capacity increases, so does its resource consumption; the resulting slow throughput and heavy computational requirements hinder deployment on mobile devices. In this paper, we present Reminisce, an efficient on-device multimodal embedding system that enables high-throughput embedding and precise retrieval on resource-constrained mobile devices. The core design draws inspiration from the memory functions of the human brain: coarse-grained embeddings identify likely candidates, which are then refined through query-driven fine-grained retrieval. A series of algorithm-hardware orchestrated optimizations automatically navigates this process and strengthens the embedding quality. Experiments show that Reminisce provides high-quality embedding representations at high throughput while operating silently in the background with negligible memory usage and reduced energy consumption.
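The abstract outlines a two-stage, coarse-to-fine retrieval design: cheap coarse embeddings shortlist candidate items, and only that shortlist is re-examined with a query-driven fine-grained pass. The Python sketch below illustrates this general idea only; the function names, dimensions, and random stand-in vectors are assumptions for illustration and do not reflect the authors' actual models or implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus of N captured "memory" items. In a system like Reminisce these
# would be photos, audio clips, screenshots, etc.; here random vectors stand
# in for their embeddings (illustrative assumption).
N, COARSE_DIM, FINE_DIM = 10_000, 64, 512

# Stage 1 index: low-cost, low-dimensional embeddings precomputed in the
# background for every item (the coarse pass).
coarse_index = rng.standard_normal((N, COARSE_DIM)).astype(np.float32)
coarse_index /= np.linalg.norm(coarse_index, axis=1, keepdims=True)


def coarse_candidates(query_coarse: np.ndarray, k: int = 50) -> np.ndarray:
    """Return indices of the k items whose coarse embeddings best match the query."""
    q = query_coarse / np.linalg.norm(query_coarse)
    scores = coarse_index @ q
    return np.argpartition(-scores, k)[:k]


def fine_scores(item_ids: np.ndarray, query_fine: np.ndarray) -> np.ndarray:
    """Query-driven fine pass: embed only the shortlisted items at full fidelity.
    A real system would run the large multimodal encoder on just these items;
    here random vectors are used so the sketch is self-contained."""
    fine = rng.standard_normal((len(item_ids), FINE_DIM)).astype(np.float32)
    fine /= np.linalg.norm(fine, axis=1, keepdims=True)
    return fine @ (query_fine / np.linalg.norm(query_fine))


def retrieve(query_coarse, query_fine, k_coarse: int = 50, k_final: int = 5):
    """Coarse shortlist, then fine-grained re-ranking of the shortlist only."""
    candidates = coarse_candidates(query_coarse, k_coarse)
    order = np.argsort(-fine_scores(candidates, query_fine))[:k_final]
    return candidates[order]


# Example query: in practice both query embeddings would come from encoding
# the user's text or image query.
top = retrieve(rng.standard_normal(COARSE_DIM), rng.standard_normal(FINE_DIM))
print("Top matches:", top)
```

The point of the split is resource economy on-device: the expensive encoder touches only the handful of shortlisted items per query, rather than the whole corpus at capture time.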
Date: 2025
Downloads: https://www.nature.com/articles/s41467-025-60802-5 (abstract, text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-60802-5
Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/
DOI: 10.1038/s41467-025-60802-5