Deep neural networks enable quantitative movement analysis using single-camera videos
Łukasz Kidziński,
Bryan Yang,
Jennifer L. Hicks,
Apoorva Rajagopal,
Scott L. Delp and
Michael H. Schwartz
Author affiliations:
Łukasz Kidziński: Department of Bioengineering, Stanford University
Bryan Yang: Department of Bioengineering, Stanford University
Jennifer L. Hicks: Department of Bioengineering, Stanford University
Apoorva Rajagopal: Department of Bioengineering, Stanford University
Scott L. Delp: Department of Bioengineering, Stanford University
Michael H. Schwartz: Center for Gait and Motion Analysis, Gillette Children’s Specialty Healthcare
Nature Communications, 2020, vol. 11, issue 1, 1-10
Abstract:
Many neurological and musculoskeletal diseases impair movement, which limits people’s function and social participation. Quantitative assessment of motion is critical to medical decision-making but is currently possible only with expensive motion capture systems and highly trained personnel. Here, we present a method for predicting clinically relevant motion parameters from an ordinary video of a patient. Our machine learning models predict parameters including walking speed (r = 0.73), cadence (r = 0.79), knee flexion angle at maximum extension (r = 0.83), and the Gait Deviation Index (GDI), a comprehensive metric of gait impairment (r = 0.75). These correlation values approach the theoretical limits for accuracy imposed by natural variability in these metrics within our patient population. Our methods for quantifying gait pathology with commodity cameras increase access to quantitative motion analysis in clinics and at home and enable researchers to conduct large-scale studies of neurological and musculoskeletal disorders.
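The accuracy figures quoted in the abstract are correlation coefficients (r) between the video-based model predictions and ground truth from a motion capture laboratory. The following is a minimal, hypothetical sketch of how such an evaluation could be computed; the variable names, simulated data, and parameter list are illustrative assumptions, not the authors' code or data.

```python
# Hypothetical evaluation sketch: compare video-based predictions of gait
# parameters against motion-capture ground truth using Pearson correlation,
# the convention behind the r values quoted in the abstract (e.g., r = 0.73
# for walking speed). All data below is simulated for illustration only.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_visits = 200  # one row per patient visit (placeholder count)

# Placeholder ground truth: in the actual study these values would come
# from an optical motion capture system in a gait laboratory.
ground_truth = {
    "walking_speed_m_s": rng.normal(0.9, 0.25, n_visits),
    "cadence_steps_min": rng.normal(105, 15, n_visits),
    "gdi": rng.normal(75, 12, n_visits),
}

# Simulated model output: truth plus noise, standing in for the neural
# network's predictions from single-camera video.
predictions = {
    name: values + rng.normal(0, 0.5 * values.std(), n_visits)
    for name, values in ground_truth.items()
}

for name in ground_truth:
    r, p = pearsonr(predictions[name], ground_truth[name])
    print(f"{name}: r = {r:.2f} (p = {p:.1e})")
```

Note that, as the abstract states, these gait metrics vary naturally between visits within the patient population, so even a perfect predictor could not reach r = 1 against a single motion-capture session; this is the sense in which the reported correlations approach their theoretical limits.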
Date: 2020
Downloads: https://www.nature.com/articles/s41467-020-17807-z (abstract, text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:11:y:2020:i:1:d:10.1038_s41467-020-17807-z
Ordering information: This journal article can be ordered from https://www.nature.com/ncomms/
DOI: 10.1038/s41467-020-17807-z
Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie
Bibliographic data for this series is maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.