Bringing uncertainty quantification to the extreme-edge with memristor-based Bayesian neural networks

Djohan Bonnet, Tifenn Hirtzlin, Atreya Majumdar, Thomas Dalgaty, Eduardo Esmanhotto, Valentina Meli, Niccolo Castellani, Simon Martin, Jean-François Nodin, Guillaume Bourgeois, Jean-Michel Portal, Damien Querlioz and Elisa Vianello
Additional contact information
Djohan Bonnet: Université Grenoble Alpes, CEA, LETI
Tifenn Hirtzlin: Université Grenoble Alpes, CEA, LETI
Atreya Majumdar: Université Paris-Saclay, CNRS, Centre de Nanosciences et de Nanotechnologies
Thomas Dalgaty: Université Grenoble Alpes, CEA, LIST
Eduardo Esmanhotto: Université Grenoble Alpes, CEA, LETI
Valentina Meli: Université Grenoble Alpes, CEA, LETI
Niccolo Castellani: Université Grenoble Alpes, CEA, LETI
Simon Martin: Université Grenoble Alpes, CEA, LETI
Jean-François Nodin: Université Grenoble Alpes, CEA, LETI
Guillaume Bourgeois: Université Grenoble Alpes, CEA, LETI
Jean-Michel Portal: Aix-Marseille Université, CNRS, Institut Matériaux Microélectronique Nanosciences de Provence
Damien Querlioz: Université Paris-Saclay, CNRS, Centre de Nanosciences et de Nanotechnologies
Elisa Vianello: Université Grenoble Alpes, CEA, LETI

Nature Communications, 2023, vol. 14, issue 1, 1-13

Abstract: Safety-critical sensory applications, like medical diagnosis, demand accurate decisions from limited, noisy data. Bayesian neural networks excel at such tasks, offering predictive uncertainty assessment. However, because of their probabilistic nature, they are computationally intensive. An innovative solution utilizes memristors’ inherent probabilistic nature to implement Bayesian neural networks. However, when using memristors, statistical effects follow the laws of device physics, whereas in Bayesian neural networks, those effects can take arbitrary shapes. This work overcomes this difficulty by adopting variational inference training augmented by a “technological loss” that incorporates memristor physics. This technique enabled programming a Bayesian neural network on 75 crossbar arrays of 1,024 memristors, incorporating CMOS periphery for in-memory computing. The experimental neural network classified heartbeats with high accuracy and estimated the certainty of its predictions. The results reveal orders-of-magnitude improvement in inference energy efficiency compared to a microcontroller or an embedded graphics processing unit performing the same task.
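
The “technological loss” idea described in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical Python (PyTorch) example, not the authors' code: mean-field variational inference for a single Bayesian linear layer, with an extra penalty that nudges each weight's relative spread sigma/|mu| into a window assumed to be realizable by memristor programming noise. The window bounds, loss weights, and layer sizes are illustrative assumptions only.

# Minimal sketch (assumed, not from the paper): variational inference with an
# added "technological" penalty standing in for memristor device constraints.
import torch
import torch.nn.functional as F

class BayesianLinear(torch.nn.Module):
    def __init__(self, n_in, n_out):
        super().__init__()
        self.mu = torch.nn.Parameter(0.1 * torch.randn(n_out, n_in))
        self.rho = torch.nn.Parameter(-3.0 * torch.ones(n_out, n_in))  # sigma = softplus(rho)

    def forward(self, x):
        sigma = F.softplus(self.rho)
        # Reparameterization trick: sample one set of weights per forward pass.
        w = self.mu + sigma * torch.randn_like(sigma)
        return x @ w.t()

    def kl(self):
        # Closed-form KL divergence to a standard-normal prior.
        sigma = F.softplus(self.rho)
        return (0.5 * (self.mu**2 + sigma**2) - torch.log(sigma) - 0.5).sum()

    def technological_loss(self, lo=0.05, hi=0.5):
        # Hypothetical device constraint: keep sigma/|mu| inside [lo, hi],
        # a stand-in for the spread that memristor programming noise can realize.
        ratio = F.softplus(self.rho) / (self.mu.abs() + 1e-6)
        return (F.relu(lo - ratio) + F.relu(ratio - hi)).sum()

layer = BayesianLinear(16, 2)
opt = torch.optim.Adam(layer.parameters(), lr=1e-2)
x, y = torch.randn(64, 16), torch.randint(0, 2, (64,))  # toy data
for _ in range(200):
    opt.zero_grad()
    logits = layer(x)
    loss = (F.cross_entropy(logits, y)
            + 1e-3 * layer.kl()
            + 1e-2 * layer.technological_loss())
    loss.backward()
    opt.step()

In the paper, the constraint is derived from the measured physics of memristor programming; the relu-based window above merely stands in for such a device model to show where the extra loss term enters the training objective.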

Date: 2023
Citations: View citations in EconPapers (1)

Downloads: (external link)
https://www.nature.com/articles/s41467-023-43317-9 Abstract (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:14:y:2023:i:1:d:10.1038_s41467-023-43317-9

Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/

DOI: 10.1038/s41467-023-43317-9

Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie

More articles in Nature Communications from Nature
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Handle: RePEc:nat:natcom:v:14:y:2023:i:1:d:10.1038_s41467-023-43317-9