Tree-based machine learning performed in-memory with memristive analog CAM
Giacomo Pedretti,
Catherine E. Graves,
Sergey Serebryakov,
Ruibin Mao,
Xia Sheng,
Martin Foltin,
Can Li and
John Paul Strachan
Additional contact information
Giacomo Pedretti: Hewlett Packard Labs, Hewlett Packard Enterprise
Catherine E. Graves: Hewlett Packard Labs, Hewlett Packard Enterprise
Sergey Serebryakov: Hewlett Packard Labs, Hewlett Packard Enterprise
Ruibin Mao: The University of Hong Kong
Xia Sheng: Hewlett Packard Labs, Hewlett Packard Enterprise
Martin Foltin: Hewlett Packard Labs, Hewlett Packard Enterprise
Can Li: Hewlett Packard Labs, Hewlett Packard Enterprise
John Paul Strachan: Forschungszentrum Jülich GmbH
Nature Communications, 2021, vol. 12, issue 1, 1-10
Abstract:
Tree-based machine learning techniques, such as Decision Trees and Random Forests, are top performers in several domains as they do well with limited training datasets and offer improved interpretability compared to Deep Neural Networks (DNNs). However, these models are difficult to optimize for fast inference at scale without accuracy loss in von Neumann architectures due to non-uniform memory access patterns. Recently, we proposed a novel analog content addressable memory (CAM) based on emerging memristor devices for fast look-up table operations. Here, we propose for the first time to use the analog CAM as an in-memory computational primitive to accelerate tree-based model inference. We demonstrate an efficient mapping algorithm leveraging the new analog CAM capabilities such that each root-to-leaf path of a Decision Tree is programmed into a row. This new in-memory compute concept enables few-cycle model inference, dramatically increasing throughput by 10³× over conventional approaches.
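To illustrate the mapping idea described in the abstract (each root-to-leaf path of a Decision Tree programmed into one CAM row that stores an acceptable range per feature), below is a minimal software sketch. It is not the authors' implementation: the use of scikit-learn, NumPy, the iris dataset, and the helper names tree_to_cam_rows and cam_match are illustrative assumptions, and the "match" step only emulates in software what the memristive analog CAM performs in parallel in hardware.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

def tree_to_cam_rows(tree, n_features):
    """Return (rows, labels): one [low, high] range per feature per root-to-leaf path."""
    t = tree.tree_
    rows, labels = [], []

    def walk(node, low, high):
        if t.children_left[node] == -1:          # leaf reached: emit one CAM row
            rows.append(np.stack([low, high], axis=1))
            labels.append(int(np.argmax(t.value[node])))
            return
        f, thr = t.feature[node], t.threshold[node]
        hi = high.copy(); hi[f] = min(hi[f], thr)  # left branch: x[f] <= thr
        walk(t.children_left[node], low.copy(), hi)
        lo = low.copy(); lo[f] = max(lo[f], thr)   # right branch: x[f] > thr
        walk(t.children_right[node], lo, high.copy())

    walk(0, np.full(n_features, -np.inf), np.full(n_features, np.inf))
    return np.array(rows), np.array(labels)

def cam_match(rows, labels, x):
    """Analog-CAM-style lookup: the row whose ranges all contain x returns its label."""
    hits = np.all((rows[:, :, 0] <= x) & (x <= rows[:, :, 1]), axis=1)
    return labels[np.argmax(hits)]

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
rows, labels = tree_to_cam_rows(clf, X.shape[1])
print(rows.shape)                                          # (n_leaves, n_features, 2)
print(cam_match(rows, labels, X[0]), clf.predict(X[:1])[0])  # the two predictions agree

In this sketch every leaf becomes one row of per-feature intervals, so a single parallel range lookup replaces the sequential branch comparisons of a conventional tree traversal, which is the source of the claimed few-cycle inference.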
Date: 2021
Downloads:
https://www.nature.com/articles/s41467-021-25873-0 Abstract (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:12:y:2021:i:1:d:10.1038_s41467-021-25873-0
Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/
DOI: 10.1038/s41467-021-25873-0