Brain-inspired global-local learning incorporated with neuromorphic computing

Yujie Wu, Rong Zhao, Jun Zhu, Feng Chen, Mingkun Xu, Guoqi Li, Sen Song, Lei Deng, Guanrui Wang, Hao Zheng, Songchen Ma, Jing Pei, Youhui Zhang, Mingguo Zhao and Luping Shi
Additional contact information
Yujie Wu: Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Tsinghua University
Rong Zhao: Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Tsinghua University
Jun Zhu: Tsinghua University
Feng Chen: Tsinghua University
Mingkun Xu: Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Tsinghua University
Guoqi Li: Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Tsinghua University
Sen Song: Tsinghua University
Lei Deng: Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Tsinghua University
Guanrui Wang: Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Tsinghua University
Hao Zheng: Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Tsinghua University
Songchen Ma: Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Tsinghua University
Jing Pei: Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Tsinghua University
Youhui Zhang: Tsinghua University
Mingguo Zhao: Tsinghua University
Luping Shi: Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Tsinghua University

Nature Communications, 2022, vol. 13, issue 1, 1-14

Abstract: There are two principal approaches to learning in artificial intelligence: error-driven global learning and neuroscience-oriented local learning. Integrating them into one network may provide complementary learning capabilities for versatile learning scenarios. At the same time, neuromorphic computing holds great promise but still needs plenty of useful algorithms and algorithm-hardware co-designs to fully exploit its advantages. Here, we present a neuromorphic global-local synergic learning model by introducing a brain-inspired meta-learning paradigm and a differentiable spiking model that incorporates neuronal dynamics and synaptic plasticity. The model can meta-learn local plasticity and receive top-down supervision information for multiscale learning. We demonstrate its advantages on multiple tasks, including few-shot learning, continual learning, and fault-tolerance learning in neuromorphic vision sensors, where it achieves significantly higher performance than single-learning methods. We further implement the model on the Tianjic neuromorphic platform by exploiting algorithm-hardware co-designs and show that it can fully utilize the neuromorphic many-core architecture to develop a hybrid computation paradigm.
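The core idea the abstract describes, combining a global, error-driven update with a local plasticity term whose contribution is meta-learned, can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the paper's implementation: it uses a rate (tanh) neuron rather than a spiking model, a plain delta rule for the global signal, a Hebbian co-activity term for the local signal, and a fixed mixing coefficient `alpha` standing in for the meta-learned plasticity.

```python
import numpy as np

# Hypothetical sketch of a combined global-local weight update.
# Global term: error-driven delta rule (top-down supervision).
# Local term: Hebbian co-activity, computable from pre/post activity alone.
# In the paper the balance is meta-learned; here alpha is fixed.

rng = np.random.default_rng(0)
n_in, n_out = 8, 4
W = rng.normal(scale=0.1, size=(n_out, n_in))

def step(W, x, target, lr=0.1, alpha=0.01):
    """One combined update: W <- W + lr * (global + alpha * local)."""
    y = np.tanh(W @ x)                           # simple neuronal nonlinearity
    err = target - y                             # global, top-down error signal
    global_term = np.outer(err * (1 - y**2), x)  # gradient of squared error
    local_term = np.outer(y, x)                  # purely local Hebbian term
    return W + lr * (global_term + alpha * local_term)

x = rng.normal(size=n_in)
target = np.array([1.0, -1.0, 0.5, 0.0])
errs = []
for _ in range(200):
    W = step(W, x, target)
    errs.append(np.linalg.norm(target - np.tanh(W @ x)))

print("initial error: %.3f, final error: %.3f" % (errs[0], errs[-1]))
```

With a small `alpha`, the global term dominates and the output is driven toward the target, while the local term nudges weights toward co-active pre/post pairs; in the actual model the local rule's parameters would themselves be optimized in an outer meta-learning loop.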

Date: 2022
References: View complete reference list from CitEc
Citations: View citations in EconPapers (2)

Downloads: (external link)
https://www.nature.com/articles/s41467-021-27653-2 Abstract (text/html)



Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-021-27653-2

Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/

DOI: 10.1038/s41467-021-27653-2


Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie

More articles in Nature Communications from Nature
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-03-19
Handle: RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-021-27653-2