Speech synthesis from neural decoding of spoken sentences
Gopala K. Anumanchipalli,
Josh Chartier and
Edward F. Chang
Additional contact information
Gopala K. Anumanchipalli: University of California San Francisco
Josh Chartier: University of California San Francisco
Edward F. Chang: University of California San Francisco
Nature, 2019, vol. 568, issue 7753, 493-498
Abstract:
Technology that translates neural activity into speech would be transformative for people who are unable to communicate as a result of neurological impairments. Decoding speech from neural activity is challenging because speaking requires very precise and rapid multi-dimensional control of vocal tract articulators. Here we designed a neural decoder that explicitly leverages kinematic and sound representations encoded in human cortical activity to synthesize audible speech. Recurrent neural networks first decoded directly recorded cortical activity into representations of articulatory movement, and then transformed these representations into speech acoustics. In closed vocabulary tests, listeners could readily identify and transcribe speech synthesized from cortical activity. Intermediate articulatory dynamics enhanced performance even with limited data. Decoded articulatory representations were highly conserved across speakers, enabling a component of the decoder to be transferrable across participants. Furthermore, the decoder could synthesize speech when a participant silently mimed sentences. These findings advance the clinical viability of using speech neuroprosthetic technology to restore spoken communication.
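Illustrative sketch: the abstract describes a two-stage pipeline in which recurrent networks first decode cortical activity into articulatory kinematics and then transform those kinematics into speech acoustics. The Python/PyTorch sketch below shows one way such a cascade could be wired. The class names, layer counts, and feature dimensions (electrode count, kinematic and acoustic feature sizes) are assumptions for illustration only, not the authors' published implementation.

# Minimal sketch of a two-stage recurrent decoder in the spirit of the paper:
# stage 1 maps cortical features (e.g., per-electrode high-gamma amplitudes)
# to articulatory kinematic features; stage 2 maps kinematics to acoustic
# features that a vocoder could render as audio. All sizes are placeholders.
import torch
import torch.nn as nn

class ArticulatoryDecoder(nn.Module):
    """Stage 1: cortical activity -> articulatory kinematics (illustrative)."""
    def __init__(self, n_electrodes=256, n_kinematic=33, hidden=100):
        super().__init__()
        self.rnn = nn.LSTM(n_electrodes, hidden, num_layers=2,
                           batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_kinematic)

    def forward(self, ecog):              # ecog: (batch, time, n_electrodes)
        h, _ = self.rnn(ecog)
        return self.out(h)                # (batch, time, n_kinematic)

class AcousticDecoder(nn.Module):
    """Stage 2: articulatory kinematics -> acoustic features (illustrative)."""
    def __init__(self, n_kinematic=33, n_acoustic=32, hidden=100):
        super().__init__()
        self.rnn = nn.LSTM(n_kinematic, hidden, num_layers=2,
                           batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_acoustic)

    def forward(self, kin):               # kin: (batch, time, n_kinematic)
        h, _ = self.rnn(kin)
        return self.out(h)                # (batch, time, n_acoustic)

# Example forward pass on dummy data (1 trial, 500 time steps).
stage1, stage2 = ArticulatoryDecoder(), AcousticDecoder()
ecog = torch.randn(1, 500, 256)
acoustics = stage2(stage1(ecog))          # feed decoded kinematics to stage 2
print(acoustics.shape)                    # torch.Size([1, 500, 32])

The two-stage design mirrors the abstract's key claim: an intermediate articulatory representation sits between neural activity and sound, which is what the authors report improves performance with limited data and transfers across participants.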
Date: 2019
Citations: 11 (in EconPapers)
Downloads: https://www.nature.com/articles/s41586-019-1119-1 (abstract, text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:nat:nature:v:568:y:2019:i:7753:d:10.1038_s41586-019-1119-1
Ordering information: this journal article can be ordered from https://www.nature.com/
DOI: 10.1038/s41586-019-1119-1
Nature is currently edited by Magdalena Skipper