High-resolution neural recordings improve the accuracy of speech decoding
Suseendrakumar Duraivel,
Shervin Rahimpour,
Chia-Han Chiang,
Michael Trumpis,
Charles Wang,
Katrina Barth,
Stephen C. Harward,
Shivanand P. Lad,
Allan H. Friedman,
Derek G. Southwell,
Saurabh R. Sinha,
Jonathan Viventi and
Gregory B. Cogan
Additional contact information
Suseendrakumar Duraivel: Duke University
Shervin Rahimpour: Duke School of Medicine
Chia-Han Chiang: Duke University
Michael Trumpis: Duke University
Charles Wang: Duke University
Katrina Barth: Duke University
Stephen C. Harward: Duke School of Medicine
Shivanand P. Lad: Duke School of Medicine
Allan H. Friedman: Duke School of Medicine
Derek G. Southwell: Duke University
Saurabh R. Sinha: University of Pennsylvania
Jonathan Viventi: Duke University
Gregory B. Cogan: Duke University
Nature Communications, 2023, vol. 14, issue 1, 1-16
Abstract:
Patients suffering from debilitating neurodegenerative diseases often lose the ability to communicate, detrimentally affecting their quality of life. One solution to restore communication is to decode signals directly from the brain to enable neural speech prostheses. However, decoding has been limited by coarse neural recordings which inadequately capture the rich spatio-temporal structure of human brain signals. To resolve this limitation, we performed high-resolution, micro-electrocorticographic (µECoG) neural recordings during intra-operative speech production. We obtained neural signals with 57× higher spatial resolution and 48% higher signal-to-noise ratio compared to macro-ECoG and SEEG. This increased signal quality improved decoding by 35% compared to standard intracranial signals. Accurate decoding was dependent on the high spatial resolution of the neural interface. Non-linear decoding models designed to utilize enhanced spatio-temporal neural information produced better results than linear techniques. We show that high-density µECoG can enable high-quality speech decoding for future neural speech prostheses.
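The abstract's central claim — that accurate decoding depends on the spatial resolution of the neural interface — can be illustrated with a toy simulation. The sketch below is purely hypothetical and is not the paper's data, electrode geometry, or decoder: it generates synthetic trials on an assumed 8×8 "electrode grid" in which class-discriminative activity lives in a fine checkerboard pattern, then shows that a simple nearest-centroid decoder succeeds on the full-resolution grid but falls to chance once channels are pooled into coarse 2×2 blocks (simulating a macro-scale recording).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: 8x8 electrode grid, two "speech" classes. The class signal
# is a fine +1/-1 checkerboard, visible only at high spatial resolution.
grid = 8
n_trials = 200
checker = np.indices((grid, grid)).sum(axis=0) % 2 * 2.0 - 1.0  # +1/-1 pattern
labels = rng.integers(0, 2, n_trials)
signs = labels * 2 - 1  # map class 0/1 to -1/+1
trials = signs[:, None, None] * checker + rng.normal(0.0, 1.0, (n_trials, grid, grid))

def pool2x2(x):
    """Simulate a coarse (macro-scale) grid by averaging 2x2 electrode blocks."""
    n = len(x)
    return x.reshape(n, grid // 2, 2, grid // 2, 2).mean(axis=(2, 4))

def nearest_centroid_accuracy(feats, labels):
    """Train on even trials, test on odd trials, classify by nearest class mean."""
    X = feats.reshape(len(feats), -1)
    train, test = X[::2], X[1::2]
    ytr, yte = labels[::2], labels[1::2]
    c0, c1 = train[ytr == 0].mean(0), train[ytr == 1].mean(0)
    pred = (np.linalg.norm(test - c1, axis=1)
            < np.linalg.norm(test - c0, axis=1)).astype(int)
    return (pred == yte).mean()

acc_hi = nearest_centroid_accuracy(trials, labels)          # full resolution
acc_lo = nearest_centroid_accuracy(pool2x2(trials), labels)  # pooled channels
print(f"high-res accuracy: {acc_hi:.2f}, pooled accuracy: {acc_lo:.2f}")
```

Because each 2×2 block of the checkerboard averages exactly to zero, pooling destroys the discriminative pattern while leaving only noise, so the pooled decoder performs near chance — a simplified analogue of the resolution dependence the abstract describes.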
Date: 2023
Downloads: https://www.nature.com/articles/s41467-023-42555-1 (abstract, text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:14:y:2023:i:1:d:10.1038_s41467-023-42555-1
Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/
DOI: 10.1038/s41467-023-42555-1
Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.