Multi-neuron connection using multi-terminal floating-gate memristor for unsupervised learning
Ui Yeon Won,
Quoc An Vu,
Sung Bum Park,
Mi Hyang Park,
Do Van Dam,
Hyun Jun Park,
Heejun Yang,
Young Hee Lee () and
Woo Jong Yu ()
Additional contact information
Ui Yeon Won: Sungkyunkwan University
Quoc An Vu: Sungkyunkwan University
Sung Bum Park: Sungkyunkwan University
Mi Hyang Park: Sungkyunkwan University
Do Van Dam: Sungkyunkwan University
Hyun Jun Park: Mobile Communication Business, Samsung Electronics
Heejun Yang: Korea Advanced Institute of Science and Technology
Young Hee Lee: Sungkyunkwan University
Woo Jong Yu: Sungkyunkwan University
Nature Communications, 2023, vol. 14, issue 1, 1-11
Abstract:
Multi-terminal memristors and memtransistors (MT-MEMs) have successfully performed complex functions of heterosynaptic plasticity in synapses. However, these MT-MEMs lack the ability to emulate the membrane potential of a neuron in multiple neuronal connections. Here, we demonstrate multi-neuron connection using a multi-terminal floating-gate memristor (MT-FGMEM). The variable Fermi level (EF) in graphene allows charging and discharging of the MT-FGMEM using horizontally distant multiple electrodes. Our MT-FGMEM demonstrates a high on/off ratio over 10^5 at 1000 s retention, approximately 10,000 times higher than other MT-MEMs. The linear relation between the current (ID) and the floating-gate potential (VFG) in the triode region of the MT-FGMEM allows for accurate spike integration at the neuron membrane. The MT-FGMEM fully mimics the temporal and spatial summation of multi-neuron connections based on leaky-integrate-and-fire (LIF) functionality. Our artificial neuron (150 pJ) significantly reduces the energy consumption, by 100,000 times, compared to conventional neurons based on silicon integrated circuits (11.7 μJ). By integrating neurons and synapses using MT-FGMEMs, spiking neurosynaptic training and classification of directional lines, as performed in visual area one (V1), are successfully emulated based on the neuron's LIF and the synapse's spike-timing-dependent plasticity (STDP) functions. Simulation of unsupervised learning based on our artificial neuron and synapse achieves a learning accuracy of 83.08% on the unlabeled MNIST handwritten dataset.
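The leaky-integrate-and-fire behavior the abstract describes — input spikes summing on a membrane that leaks toward rest and fires on crossing a threshold — can be illustrated with a minimal numerical sketch. This is a generic textbook LIF model, not the paper's device equations; all parameter values (tau, v_th, the input burst) are arbitrary assumptions chosen for illustration.

```python
# Illustrative leaky-integrate-and-fire (LIF) neuron. Parameters are
# arbitrary assumptions for demonstration, not values from the paper.

def lif_step(v, i_in, tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """One Euler step of the LIF membrane equation dv/dt = (-v + i_in)/tau.

    Returns the updated membrane potential and whether a spike fired.
    """
    v = v + dt * (-v + i_in) / tau   # leaky integration of input current
    if v >= v_th:                    # threshold crossed: spike and reset
        return v_reset, True
    return v, False

# Temporal summation: closely spaced inputs accumulate on the membrane
# faster than the leak can drain them, eventually triggering a spike.
v, spikes = 0.0, []
inputs = [0.0] * 5 + [5.0] * 10 + [0.0] * 5   # a burst of input current
for i_in in inputs:
    v, fired = lif_step(v, i_in)
    spikes.append(fired)

print(any(spikes))   # the sustained burst drives the neuron to fire
```

With the burst removed (all-zero input), the membrane never reaches threshold and no spike is emitted, mirroring the leak term that distinguishes LIF from a pure integrator.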
Date: 2023
Downloads: (external link)
https://www.nature.com/articles/s41467-023-38667-3 Abstract (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:14:y:2023:i:1:d:10.1038_s41467-023-38667-3
Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/
DOI: 10.1038/s41467-023-38667-3
Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie