EconPapers    

Memristor-based feature learning for pattern classification

Tuo Shi, Lili Gao, Yang Tian, Shuangzhu Tang, Jinchang Liu, Yiqi Li, Ruixi Zhou, Shiyu Cui, Hui Zhang, Yu Li, Zuheng Wu, Xumeng Zhang, Taihao Li, Xiaobing Yan and Qi Liu
Additional contact information
Tuo Shi: Zhejiang Laboratory
Lili Gao: Zhejiang Laboratory
Yang Tian: Zhejiang Laboratory
Shuangzhu Tang: Zhejiang Laboratory
Jinchang Liu: Zhejiang Laboratory
Yiqi Li: Zhejiang Laboratory
Ruixi Zhou: Zhejiang Laboratory
Shiyu Cui: Zhejiang Laboratory
Hui Zhang: Zhejiang Laboratory
Yu Li: Fudan University
Zuheng Wu: Anhui University
Xumeng Zhang: Fudan University
Taihao Li: Zhejiang Laboratory
Xiaobing Yan: Hebei University
Qi Liu: Fudan University

Nature Communications, 2025, vol. 16, issue 1, 1-13

Abstract: Inspired by biological processes, feature learning techniques such as deep learning have achieved great success in various fields. However, since biological organs may operate differently from semiconductor devices, deep models usually require dedicated hardware and are computationally complex. High energy consumption has made the growth of deep models unsustainable. We present an approach that implements feature learning directly in semiconductor physics to minimize the disparity between model and hardware. Following this approach, we propose a feature learning technique based on memristor drift-diffusion kinetics, which leverages the dynamic response of a single memristor to learn features. Compared with deep models, the kinetics-based network reduces model parameters and computational operations by up to 2 and 4 orders of magnitude, respectively. We experimentally implement the proposed network on 180 nm memristor chips for pattern classification tasks of various dimensionalities. Compared with memristor-based deep learning hardware, the memristor kinetics-based hardware further reduces energy and area consumption significantly. We propose that innovations in hardware physics could offer an intriguing solution for intelligent models by balancing model complexity and performance.
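The abstract's core idea — that a single memristor's drift-diffusion dynamics can encode temporal features of an input pattern in its internal state — can be illustrated with a toy model. The update rule, parameter values, and function name below are illustrative assumptions (a generic drift-plus-relaxation sketch), not the device model used in the paper:

```python
def memristor_feature(pulses, k_drift=0.05, tau=20.0, w0=0.2):
    """Toy drift-diffusion state update for a single memristor.

    Each input pulse drives ion drift (a state increase scaled by the
    current conductance), while diffusion relaxes the state back toward
    its rest value w0. The final state therefore depends on *when*
    pulses arrive, not just how many there are, so it acts as a
    temporal feature of the input pattern.
    """
    w = w0
    for v in pulses:
        g = 0.1 + 0.9 * w            # normalized conductance in [0.1, 1]
        drift = k_drift * v * g      # voltage-driven ion drift
        diffusion = (w - w0) / tau   # spontaneous relaxation toward rest
        w = min(max(w + drift - diffusion, 0.0), 1.0)
    return w

# Two patterns with the same number of pulses but different timing:
pattern_a = [1.0] * 10 + [0.0] * 10   # pulses early, then rest
pattern_b = [0.0] * 10 + [1.0] * 10   # rest, then pulses late
feat_a = memristor_feature(pattern_a)
feat_b = memristor_feature(pattern_b)
```

Because the early-pulse pattern relaxes afterward while the late-pulse pattern ends at its peak, the two inputs map to distinct state values — the kind of input-dependent dynamic response a downstream classifier can read out as a feature.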

Date: 2025

Downloads (external link): https://www.nature.com/articles/s41467-025-56286-y (text/html)


Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-56286-y

Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/

DOI: 10.1038/s41467-025-56286-y


Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie

More articles in Nature Communications from Nature
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

Page updated 2025-03-19
Handle: RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-56286-y