Fast, efficient, and accurate neuro-imaging denoising via supervised deep learning
Shivesh Chaudhary,
Sihoon Moon and
Hang Lu
Additional contact information
Shivesh Chaudhary: Georgia Institute of Technology
Sihoon Moon: Georgia Institute of Technology
Hang Lu: Georgia Institute of Technology
Nature Communications, 2022, vol. 13, issue 1, 1-16
Abstract:
Volumetric functional imaging is widely used for recording neuron activities in vivo, but there exist tradeoffs between the quality of the extracted calcium traces, imaging speed, and laser power. While deep-learning methods have recently been applied to denoise images, their applications to downstream analyses, such as recovering high-SNR calcium traces, have been limited. Further, these methods require temporally-sequential pre-registered data acquired at ultrafast rates. Here, we demonstrate a supervised deep-denoising method to circumvent these tradeoffs for several applications, including whole-brain imaging, large-field-of-view imaging in freely moving animals, and recovering complex neurite structures in C. elegans. Our framework has a 30× smaller memory footprint and is fast in training and inference (50–70 ms); it is highly accurate and generalizable, and, further, is trained with only small, non-temporally-sequential, independently-acquired training datasets (∼500 pairs of images). We envision that the framework will enable the faster and longer-term imaging experiments necessary to study neuronal mechanisms of many behaviors.
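The abstract describes supervised denoising trained on a small set of paired low-SNR/high-SNR images rather than on long temporally-sequential recordings. As a rough illustration only, and not the authors' published architecture or code, the following minimal PyTorch sketch shows what paired-image supervised denoising training of this general kind can look like; the SmallDenoiser network, tensor shapes, loss, and hyperparameters are hypothetical placeholders.

import torch
import torch.nn as nn

class SmallDenoiser(nn.Module):
    """Small convolutional denoiser (placeholder; the paper's network differs)."""
    def __init__(self, channels=1, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, channels, 3, padding=1),
        )

    def forward(self, x):
        # Predict the residual and add it back, a common choice for denoisers.
        return x + self.net(x)

def train(model, pairs, epochs=10, lr=1e-3, device="cpu"):
    """pairs: iterable of (noisy, clean) image batches shaped (N, C, H, W)."""
    model.to(device)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.L1Loss()
    for _ in range(epochs):
        for noisy, clean in pairs:
            noisy, clean = noisy.to(device), clean.to(device)
            opt.zero_grad()
            loss = loss_fn(model(noisy), clean)
            loss.backward()
            opt.step()
    return model

if __name__ == "__main__":
    # Placeholder random tensors standing in for registered low/high-SNR image pairs
    # (the paper uses ~500 independently acquired pairs).
    data = [(torch.rand(4, 1, 64, 64), torch.rand(4, 1, 64, 64)) for _ in range(8)]
    model = train(SmallDenoiser(), data, epochs=1)
    with torch.no_grad():
        denoised = model(torch.rand(1, 1, 64, 64))  # inference on a new frame

The key point the sketch illustrates is that training needs only independently-acquired image pairs, not an ultrafast, pre-registered time series; once trained, the network is applied frame-by-frame at inference.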
Date: 2022
Downloads: https://www.nature.com/articles/s41467-022-32886-w (abstract, text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-022-32886-w
Ordering information: This journal article can be ordered from https://www.nature.com/ncomms/
DOI: 10.1038/s41467-022-32886-w
Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie