Photon-efficient camera with in-sensor computing
Yanqiu Guan,
Haochen Li,
Yi Zhang,
Yuchen Qiu,
Labao Zhang,
Xiangyang Ji,
Hao Wang,
Qi Chen,
Liang Ma,
Xiaohan Wang,
Zhuolin Yang,
Xuecou Tu,
Qingyuan Zhao,
Xiaoqing Jia,
Jian Chen,
Lin Kang and
Peiheng Wu
Additional contact information
Yanqiu Guan: Nanjing University
Haochen Li: Nanjing University
Yi Zhang: Tsinghua University
Yuchen Qiu: Nanjing University
Labao Zhang: Nanjing University
Xiangyang Ji: Tsinghua University
Hao Wang: Nanjing University
Qi Chen: Nanjing University
Liang Ma: Nanjing University
Xiaohan Wang: Nanjing University
Zhuolin Yang: Nanjing University
Xuecou Tu: Nanjing University
Qingyuan Zhao: Nanjing University
Xiaoqing Jia: Nanjing University
Jian Chen: Nanjing University
Lin Kang: Nanjing University
Peiheng Wu: Nanjing University
Nature Communications, 2025, vol. 16, issue 1, 1-8
Abstract:
Image sensors with internal computing capabilities fuse sensing and computing to significantly reduce the power consumption and latency of machine vision tasks. Linear photodetectors such as 2D semiconductors with tunable electrical and optical properties enable in-sensor computing for multiple functions. In-sensor computing at the single-photon level is much more plausible but has not yet been achieved. Here, we demonstrate a photon-efficient camera with in-sensor computing based on a superconducting nanowire array detector with four programmable dimensions including photon count rate, response time, pulse amplitude, and spectral responsivity. At the same time, the sensor features saturated (100%) quantum efficiency in the range of 405–1550 nm. Benefiting from the multidimensional modulation and ultra-high sensitivity, a classification accuracy of 92.22% for three letters is achieved with only 0.12 photons per pixel per pattern. Furthermore, image preprocessing and spectral classification are demonstrated. Photon-efficient in-sensor computing is beneficial for vision tasks in extremely low-light environments such as covert imaging, biological imaging and space exploration. The single-photon image sensor can be scaled up to construct more complex neural networks, enabling more complex real-time vision tasks with high sensitivity.
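To make the in-sensor computing idea concrete, the sketch below simulates how a single-photon detector array with programmable per-pixel weights could act as the linear layer of a classifier under a photon budget of roughly 0.12 photons per pixel per pattern. This is purely an illustration under assumed parameters, not the authors' implementation: the array size, letter patterns, training rule, and all variable names are hypothetical, and the sketch is not expected to reproduce the reported 92.22% accuracy.

# Illustrative sketch (assumptions, not the paper's implementation):
# a single-photon sensor array whose programmable per-pixel response
# acts as the weights of a linear classifier in the photon-starved regime.
import numpy as np

rng = np.random.default_rng(0)
n_pixels = 64             # hypothetical 8x8 detector array
n_classes = 3             # e.g. three letter patterns, as in the paper
photons_per_pixel = 0.12  # mean photon budget per pixel per pattern

# Hypothetical binary letter patterns (1 = bright pixel), stand-ins only.
patterns = rng.integers(0, 2, size=(n_classes, n_pixels)).astype(float)

# Programmable per-pixel weights (e.g. responsivity / count-rate modulation),
# trained offline here with a simple perceptron-style rule.
W = rng.normal(0.0, 0.1, size=(n_classes, n_pixels))

def sense(pattern):
    """Poisson photon detection: each pixel reports an integer photon count."""
    return rng.poisson(pattern * photons_per_pixel)

def in_sensor_forward(counts):
    """Weighted sum performed 'in sensor': counts scaled by programmed weights."""
    return W @ counts

# Offline training of the programmable weights (illustrative only).
lr = 0.05
for _ in range(20000):
    label = rng.integers(n_classes)
    counts = sense(patterns[label])
    pred = int(np.argmax(in_sensor_forward(counts)))
    if pred != label:
        W[label] += lr * counts
        W[pred] -= lr * counts

# Evaluate classification accuracy under the same photon-starved conditions.
trials = 5000
correct = sum(
    int(np.argmax(in_sensor_forward(sense(patterns[(lbl := rng.integers(n_classes))])))) == lbl
    for _ in range(trials)
)
print(f"accuracy: {correct / trials:.2%}")

The design point the sketch tries to capture is that the weighting happens at detection time, so only the few accumulated counts (and not full images) need to be read out for classification.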
Date: 2025
Downloads: https://www.nature.com/articles/s41467-025-58501-2 Abstract (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-58501-2
Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/
DOI: 10.1038/s41467-025-58501-2