Interpretable video-based tracking and quantification of parkinsonism clinical motor states
Abstract Quantification of motor symptom progression in Parkinson’s disease (PD) patients is crucial for assessing disease progression and for optimizing therapeutic interventions, such as dopaminergic medications and deep brain stimulation. Cumulative and heuristic clinical experience has identified various clinical signs associated with PD severity, but these are neither objectively quantifiable nor robustly validated. Video-based objective symptom quantification enabled by machine learning (ML) introduces a potential solution. However, video-based diagnostic tools often have implementation challenges due to expensive and inaccessible technology, and typical “black-box” ML implementations are not tailored to be clinically interpretable. Here, we address these needs by releasing a comprehensive kinematic dataset and developing an interpretable video-based framework that predicts high versus low PD motor symptom severity according to MDS-UPDRS Part III metrics. This data-driven approach validated and robustly quantified canonical movement features and identified new clinical insights, not previously appreciated as related to clinical severity, including pinkie finger movements and lower limb and axial features of gait. Our framework is enabled by retrospective, single-view, seconds-long videos recorded on consumer-grade devices such as smartphones, tablets, and digital cameras, thereby eliminating the requirement for specialized equipment. Following interpretable ML principles, our framework enforces robustness and interpretability by integrating (1) automatic, data-driven kinematic metric evaluation guided by pre-defined digital features of movement, (2) combination of bi-domain (body and hand) kinematic features, and (3) sparsity-inducing and stability-driven ML analysis with simple-to-interpret models. These elements ensure that the proposed framework quantifies clinically meaningful motor features useful for both ML predictions and clinical analysis.
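As a rough illustration of the abstract's third ingredient (sparsity-inducing, simple-to-interpret modeling over video-derived kinematic features), the short Python sketch below is a hypothetical example, not the authors' released code or data: the `finger_tap_features` helper, the synthetic trajectories and labels, and the choice of an L1-penalized scikit-learn logistic regression are all assumptions made for demonstration.

```python
# Illustrative sketch only -- not the code or dataset released with this article.
# It mimics the style of analysis the abstract describes: hand-crafted kinematic
# features from pose keypoints, then a sparsity-inducing, simple-to-interpret
# classifier for high vs. low MDS-UPDRS Part III severity.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def finger_tap_features(index_tip_xy, thumb_tip_xy, fps=30.0):
    """Toy kinematic metrics from 2D fingertip trajectories (T x 2 arrays)."""
    aperture = np.linalg.norm(index_tip_xy - thumb_tip_xy, axis=1)  # finger-thumb distance per frame
    velocity = np.diff(aperture) * fps                              # opening/closing speed
    return np.array([
        aperture.mean(),          # mean tap amplitude
        aperture.std(),           # amplitude variability (decrement/hesitation proxy)
        np.abs(velocity).mean(),  # mean tap speed
    ])

# Example: features from a synthetic 2-second fingertip trajectory at 30 fps.
t = np.linspace(0, 2, 60)
index_tip = np.c_[np.zeros_like(t), 0.1 * np.abs(np.sin(2 * np.pi * 2 * t))]
thumb_tip = np.zeros((60, 2))
print("Finger-tap features:", finger_tap_features(index_tip, thumb_tip))

# Placeholder data: one row of body + hand kinematic features per video clip,
# with binary labels (1 = high severity). Real rows would come from pose tracking.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 12))
y = rng.integers(0, 2, size=80)

# L1 regularization drives most coefficients to zero, so the surviving features
# remain individually inspectable.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
print("Selected feature indices:", np.flatnonzero(clf.fit(X, y).coef_))
```

Because the L1 penalty zeroes out most coefficients, the handful of features that survive can be inspected directly, which is the flavor of clinical interpretability the abstract emphasizes.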
| Main Authors: | Daniel Deng, Jill L. Ostrem, Vy Nguyen, Daniel D. Cummins, Julia Sun, Anupam Pathak, Simon Little, Reza Abbasi-Asl |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2024-06-01 |
| Series: | npj Parkinson's Disease |
| Online Access: | https://doi.org/10.1038/s41531-024-00742-x |
| Field | Value |
|---|---|
| _version_ | 1849335621745639424 |
| author | Daniel Deng, Jill L. Ostrem, Vy Nguyen, Daniel D. Cummins, Julia Sun, Anupam Pathak, Simon Little, Reza Abbasi-Asl |
| author_sort | Daniel Deng |
| collection | DOAJ |
| description | Abstract Quantification of motor symptom progression in Parkinson’s disease (PD) patients is crucial for assessing disease progression and for optimizing therapeutic interventions, such as dopaminergic medications and deep brain stimulation. Cumulative and heuristic clinical experience has identified various clinical signs associated with PD severity, but these are neither objectively quantifiable nor robustly validated. Video-based objective symptom quantification enabled by machine learning (ML) introduces a potential solution. However, video-based diagnostic tools often have implementation challenges due to expensive and inaccessible technology, and typical “black-box” ML implementations are not tailored to be clinically interpretable. Here, we address these needs by releasing a comprehensive kinematic dataset and developing an interpretable video-based framework that predicts high versus low PD motor symptom severity according to MDS-UPDRS Part III metrics. This data-driven approach validated and robustly quantified canonical movement features and identified new clinical insights, not previously appreciated as related to clinical severity, including pinkie finger movements and lower limb and axial features of gait. Our framework is enabled by retrospective, single-view, seconds-long videos recorded on consumer-grade devices such as smartphones, tablets, and digital cameras, thereby eliminating the requirement for specialized equipment. Following interpretable ML principles, our framework enforces robustness and interpretability by integrating (1) automatic, data-driven kinematic metric evaluation guided by pre-defined digital features of movement, (2) combination of bi-domain (body and hand) kinematic features, and (3) sparsity-inducing and stability-driven ML analysis with simple-to-interpret models. These elements ensure that the proposed framework quantifies clinically meaningful motor features useful for both ML predictions and clinical analysis. |
| format | Article |
| id | doaj-art-9d5a0933f3b245d29fce6faf6eee9585 |
| institution | Kabale University |
| issn | 2373-8057 |
| language | English |
| publishDate | 2024-06-01 |
| publisher | Nature Portfolio |
| record_format | Article |
| series | npj Parkinson's Disease |
| spelling | doaj-art-9d5a0933f3b245d29fce6faf6eee9585; 2025-08-20T03:45:12Z; eng; Nature Portfolio; npj Parkinson's Disease; 2373-8057; 2024-06-01; vol. 10, iss. 1; 10.1038/s41531-024-00742-x; Interpretable video-based tracking and quantification of parkinsonism clinical motor states; Daniel Deng, Jill L. Ostrem, Vy Nguyen, Daniel D. Cummins, Julia Sun, Simon Little, Reza Abbasi-Asl (Department of Neurology, University of California, San Francisco); Anupam Pathak (Google Inc., Mountain View); (abstract as above); https://doi.org/10.1038/s41531-024-00742-x |
| title | Interpretable video-based tracking and quantification of parkinsonism clinical motor states |
| url | https://doi.org/10.1038/s41531-024-00742-x |