Generalization in quantum machine learning from few training data

Matthias C. Caro, Hsin-Yuan Huang, M. Cerezo, Kunal Sharma, Andrew Sornborger, Lukasz Cincio and Patrick J. Coles
Additional contact information
Matthias C. Caro: Technical University of Munich
Hsin-Yuan Huang: Institute for Quantum Information and Matter, Caltech
M. Cerezo: Information Sciences, Los Alamos National Laboratory
Kunal Sharma: University of Maryland
Andrew Sornborger: Information Sciences, Los Alamos National Laboratory
Lukasz Cincio: Theoretical Division, Los Alamos National Laboratory
Patrick J. Coles: Theoretical Division, Los Alamos National Laboratory

Nature Communications, 2022, vol. 13, issue 1, 1-11

Abstract: Modern quantum machine learning (QML) methods involve variationally optimizing a parameterized quantum circuit on a training data set, and subsequently making predictions on a testing data set (i.e., generalizing). In this work, we provide a comprehensive study of generalization performance in QML after training on a limited number N of training data points. We show that the generalization error of a quantum machine learning model with T trainable gates scales at worst as $$\sqrt{T/N}$$. When only K ≪ T gates have undergone substantial change in the optimization process, we prove that the generalization error improves to $$\sqrt{K/N}$$. Our results imply that the compiling of unitaries into a polynomial number of native gates, a crucial application for the quantum computing industry that typically uses exponential-size training data, can be sped up significantly. We also show that classification of quantum states across a phase transition with a quantum convolutional neural network requires only a very small training data set. Other potential applications include learning quantum error correcting codes or quantum dynamical simulation. Our work injects new hope into the field of QML, as good generalization is guaranteed from few training data.
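For orientation, the abstract's scaling results can be written out explicitly. Writing gen(N) for the expected gap between test and training cost after optimization (notation introduced here for illustration, not taken from the paper itself), the two bounds read $$\mathrm{gen}(N)\in O\big(\sqrt{T/N}\big)$$ in general, and $$\mathrm{gen}(N)\in O\big(\sqrt{K/N}\big)$$ when only K ≪ T gates change appreciably during training. Rearranging the first bound, a target generalization error ε is reached with roughly N ≳ T/ε² training points, so the amount of data required grows only linearly in the number of trainable gates.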

Date: 2022
References: complete reference list available from CitEc
Citations: 5 (tracked in EconPapers)

Downloads: (external link)
https://www.nature.com/articles/s41467-022-32550-3 Abstract (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-022-32550-3

Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/

DOI: 10.1038/s41467-022-32550-3

Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie

More articles in Nature Communications from Nature
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Handle: RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-022-32550-3