Video-based AI for beat-to-beat assessment of cardiac function

David Ouyang, Bryan He, Amirata Ghorbani, Neal Yuan, Joseph Ebinger, Curtis P. Langlotz, Paul A. Heidenreich, Robert A. Harrington, David H. Liang, Euan A. Ashley and James Y. Zou
Additional contact information
David Ouyang: Stanford University
Bryan He: Stanford University
Amirata Ghorbani: Stanford University
Neal Yuan: Cedars-Sinai Medical Center
Joseph Ebinger: Cedars-Sinai Medical Center
Curtis P. Langlotz: Stanford University
Paul A. Heidenreich: Stanford University
Robert A. Harrington: Stanford University
David H. Liang: Stanford University
Euan A. Ashley: Stanford University
James Y. Zou: Stanford University

Nature, 2020, vol. 580, issue 7802, 252-256

Abstract: Accurate assessment of cardiac function is crucial for the diagnosis of cardiovascular disease [1], screening for cardiotoxicity [2] and decisions regarding the clinical management of patients with a critical illness [3]. However, human assessment of cardiac function focuses on a limited sampling of cardiac cycles and has considerable inter-observer variability despite years of training [4,5]. Here, to overcome this challenge, we present a video-based deep learning algorithm, EchoNet-Dynamic, that surpasses the performance of human experts in the critical tasks of segmenting the left ventricle, estimating ejection fraction and assessing cardiomyopathy. Trained on echocardiogram videos, our model accurately segments the left ventricle with a Dice similarity coefficient of 0.92, predicts ejection fraction with a mean absolute error of 4.1% and reliably classifies heart failure with reduced ejection fraction (area under the curve of 0.97). In an external dataset from another healthcare system, EchoNet-Dynamic predicts the ejection fraction with a mean absolute error of 6.0% and classifies heart failure with reduced ejection fraction with an area under the curve of 0.96. Prospective evaluation with repeated human measurements confirms that the model has variance that is comparable to or less than that of human experts. By leveraging information across multiple cardiac cycles, our model can rapidly identify subtle changes in ejection fraction, is more reproducible than human evaluation and lays the foundation for precise diagnosis of cardiovascular disease in real time. As a resource to promote further innovation, we also make publicly available a large dataset of 10,030 annotated echocardiogram videos.
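The headline numbers in the abstract are standard evaluation metrics: the Dice similarity coefficient for segmentation overlap, mean absolute error (MAE) for ejection-fraction regression and area under the ROC curve (AUC) for classifying heart failure with reduced ejection fraction. The sketch below shows how such metrics are typically computed in Python with NumPy and scikit-learn; the arrays, variable names and the 50% ejection-fraction cutoff are illustrative assumptions, not taken from the paper or the authors' released code.

import numpy as np
from sklearn.metrics import mean_absolute_error, roc_auc_score

def dice_coefficient(pred_mask, true_mask):
    # Dice similarity coefficient: 2*|A intersect B| / (|A| + |B|) for binary masks.
    intersection = np.logical_and(pred_mask, true_mask).sum()
    return 2.0 * intersection / (pred_mask.sum() + true_mask.sum())

# Randomly generated stand-ins for model output and expert annotations.
rng = np.random.default_rng(0)
pred_mask = rng.random((112, 112)) > 0.5            # predicted left-ventricle mask
true_mask = rng.random((112, 112)) > 0.5            # expert-traced mask

true_ef = rng.uniform(10.0, 80.0, size=200)         # "ground truth" ejection fraction (%)
pred_ef = true_ef + rng.normal(0.0, 4.0, size=200)  # model estimates with some error

print("Dice:", dice_coefficient(pred_mask, true_mask))
print("MAE (%):", mean_absolute_error(true_ef, pred_ef))

# Heart failure with reduced ejection fraction is defined by a low EF; 50% is a
# commonly used cutoff, assumed here for illustration. A lower predicted EF should
# score higher for the positive class, hence the negated prediction.
hfref_label = (true_ef < 50.0).astype(int)
print("AUC:", roc_auc_score(hfref_label, -pred_ef))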

Date: 2020
Citations: 4 (tracked in EconPapers)

Downloads (external link): https://www.nature.com/articles/s41586-020-2145-8 (abstract, text/html)
Access to the full text of the articles in this series is restricted.

Persistent link: https://EconPapers.repec.org/RePEc:nat:nature:v:580:y:2020:i:7802:d:10.1038_s41586-020-2145-8

Ordering information: This journal article can be ordered from
https://www.nature.com/

DOI: 10.1038/s41586-020-2145-8

Nature is currently edited by Magdalena Skipper

Bibliographic data for this series is maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Handle: RePEc:nat:nature:v:580:y:2020:i:7802:d:10.1038_s41586-020-2145-8