EconPapers    

Foundation neural-network quantum states as a unified Ansatz for multiple Hamiltonians

Riccardo Rende, Luciano Loris Viteritti, Federico Becca, Antonello Scardicchio, Alessandro Laio and Giuseppe Carleo
Additional contact information
Riccardo Rende: International School for Advanced Studies (SISSA)
Luciano Loris Viteritti: École Polytechnique Fédérale de Lausanne (EPFL)
Federico Becca: Università di Trieste
Antonello Scardicchio: The Abdus Salam ICTP
Alessandro Laio: International School for Advanced Studies (SISSA)
Giuseppe Carleo: École Polytechnique Fédérale de Lausanne (EPFL)

Nature Communications, 2025, vol. 16, issue 1, 1-12

Abstract: Foundation models are highly versatile neural-network architectures capable of processing different data types, such as text and images, and generalizing across various tasks like classification and generation. Inspired by this success, we propose Foundation Neural-Network Quantum States (FNQS) as an integrated paradigm for studying quantum many-body systems. FNQS leverage key principles of foundation models to define variational wave functions based on a single, versatile architecture that processes multimodal inputs, including spin configurations and Hamiltonian physical couplings. Unlike specialized architectures tailored for individual Hamiltonians, FNQS can generalize to physical Hamiltonians beyond those encountered during training, offering a unified framework adaptable to various quantum systems and tasks. FNQS enable the efficient estimation of quantities that are traditionally challenging or computationally intensive to calculate using conventional methods, particularly disorder-averaged observables. Furthermore, the fidelity susceptibility can be easily obtained to uncover quantum phase transitions without prior knowledge of order parameters. These pretrained models can be efficiently fine-tuned for specific quantum systems. The architectures trained in this paper are publicly available at https://huggingface.co/nqs-models, along with examples for implementing these neural networks in NetKet.
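The abstract's central idea is that a single set of variational parameters defines amplitudes for a whole family of Hamiltonians, because the network takes both the spin configuration and the physical couplings as input. The toy sketch below illustrates only that structural point; the paper's actual architecture is far richer (a transformer trained as a variational Monte Carlo ansatz), and the shallow network, dimensions, and parameter names here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_psi(spins, couplings, params):
    """Toy FNQS-style log-amplitude: spins and Hamiltonian couplings are
    embedded into a shared hidden layer, so one parameter set serves
    many Hamiltonians (hypothetical minimal ansatz, not the paper's)."""
    W_s, W_c, v = params
    hidden = np.tanh(spins @ W_s + couplings @ W_c)  # joint multimodal features
    return hidden @ v  # scalar log-amplitude

n_spins, n_couplings, n_hidden = 8, 3, 16
params = (
    rng.normal(size=(n_spins, n_hidden)) / np.sqrt(n_spins),
    rng.normal(size=(n_couplings, n_hidden)) / np.sqrt(n_couplings),
    rng.normal(size=n_hidden) / np.sqrt(n_hidden),
)

spins = rng.choice([-1.0, 1.0], size=n_spins)

# The same network, with the same parameters, evaluates amplitudes for
# two different points in coupling space:
for J in ([1.0, 0.5, 0.0], [1.0, 0.5, 2.0]):
    print(log_psi(spins, np.array(J), params))
```

Because the couplings enter as an ordinary input, sweeping them (e.g. across a suspected phase transition) requires no retraining, which is what makes disorder averages and fidelity-susceptibility scans cheap in this paradigm.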

Date: 2025

Downloads: (external link)
https://www.nature.com/articles/s41467-025-62098-x Abstract (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-62098-x

Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/

DOI: 10.1038/s41467-025-62098-x

Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie

More articles in Nature Communications from Nature
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-08-07
Handle: RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-62098-x