WearMoCap: multimodal pose tracking for ubiquitous robot control using a smartwatch
We present WearMoCap, an open-source library that tracks the human pose from smartwatch sensor data and leverages pose predictions for ubiquitous robot control. WearMoCap operates in three modes: 1) a Watch Only mode, which uses a smartwatch only; 2) a novel Upper Arm mode, which utilizes a smartphone strapped onto the upper arm; and 3) a Pocket mode, which determines body orientation from a smartphone in any pocket.
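The abstract reports a Root Mean Squared prediction error of 6.79 cm for wrist position estimates. A minimal sketch of how such a metric is typically computed over 3D wrist positions; the function name and toy data are illustrative assumptions, not taken from the WearMoCap codebase:

```python
import numpy as np

def wrist_rmse(predicted, ground_truth):
    """Root Mean Squared prediction error between predicted and
    ground-truth 3D wrist positions, in the units of the inputs.

    predicted, ground_truth: array-likes of shape (N, 3).
    """
    pred = np.asarray(predicted, dtype=float)
    gt = np.asarray(ground_truth, dtype=float)
    # Euclidean error per frame, then root-mean-square over all frames
    errors = np.linalg.norm(pred - gt, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))

# Toy example with made-up positions (cm): per-frame errors are 5 and 0,
# so the RMSE is sqrt((25 + 0) / 2) ≈ 3.54
pred = [[10.0, 0.0, 0.0], [0.0, 5.0, 0.0]]
gt = [[13.0, 4.0, 0.0], [0.0, 5.0, 0.0]]
print(wrist_rmse(pred, gt))
```

Squaring before averaging means the metric penalizes occasional large deviations more heavily than a plain mean error, which is why it is a common choice for comparing trackers against a gold-standard motion capture system.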
Main Authors: | Fabian C. Weigend, Neelesh Kumar, Oya Aran, Heni Ben Amor |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2025-01-01 |
Series: | Frontiers in Robotics and AI |
Subjects: | motion capture; human-robot interaction; teleoperation; smartwatch; wearables; drone control |
Online Access: | https://www.frontiersin.org/articles/10.3389/frobt.2024.1478016/full |
_version_ | 1841561201246470144 |
---|---|
author | Fabian C. Weigend; Neelesh Kumar; Oya Aran; Heni Ben Amor |
author_facet | Fabian C. Weigend; Neelesh Kumar; Oya Aran; Heni Ben Amor |
author_sort | Fabian C. Weigend |
collection | DOAJ |
description | We present WearMoCap, an open-source library that tracks the human pose from smartwatch sensor data and leverages pose predictions for ubiquitous robot control. WearMoCap operates in three modes: 1) a Watch Only mode, which uses a smartwatch only; 2) a novel Upper Arm mode, which utilizes a smartphone strapped onto the upper arm; and 3) a Pocket mode, which determines body orientation from a smartphone in any pocket. We evaluate all modes on large-scale datasets consisting of recordings from up to 8 human subjects using a range of consumer-grade devices. Further, we discuss real-robot applications of the underlying works and evaluate WearMoCap in handover and teleoperation tasks, achieving performance within 2 cm of the accuracy of the gold-standard motion capture system. Our Upper Arm mode provides the most accurate wrist position estimates, with a Root Mean Squared prediction error of 6.79 cm. To enable evaluation of WearMoCap in more scenarios and investigation of strategies to mitigate sensor drift, we publish the WearMoCap system with thorough documentation as open source. The system is designed to foster future research in smartwatch-based motion capture for robotics applications where ubiquity matters. www.github.com/wearable-motion-capture. |
format | Article |
id | doaj-art-953bae6ed60341a3ae6a8c7e4f97101f |
institution | Kabale University |
issn | 2296-9144 |
language | English |
publishDate | 2025-01-01 |
publisher | Frontiers Media S.A. |
record_format | Article |
series | Frontiers in Robotics and AI |
spelling | Record ID: doaj-art-953bae6ed60341a3ae6a8c7e4f97101f; harvested 2025-01-03T05:10:14Z; language: eng; publisher: Frontiers Media S.A.; series: Frontiers in Robotics and AI; ISSN: 2296-9144; published: 2025-01-01, volume 11; DOI: 10.3389/frobt.2024.1478016; article ID: 1478016. Title: WearMoCap: multimodal pose tracking for ubiquitous robot control using a smartwatch. Authors and affiliations: Fabian C. Weigend (Interactive Robotics Laboratory, School of Computing and Augmented Intelligence (SCAI), Arizona State University (ASU), Tempe, AZ, United States); Neelesh Kumar (Corporate Functions-R&D, Procter and Gamble, Mason, OH, United States); Oya Aran (Corporate Functions-R&D, Procter and Gamble, Mason, OH, United States); Heni Ben Amor (Interactive Robotics Laboratory, SCAI, Arizona State University, Tempe, AZ, United States). Abstract: as given in the description field. Code: www.github.com/wearable-motion-capture. Full text: https://www.frontiersin.org/articles/10.3389/frobt.2024.1478016/full. Keywords: motion capture; human-robot interaction; teleoperation; smartwatch; wearables; drone control |
spellingShingle | Fabian C. Weigend; Neelesh Kumar; Oya Aran; Heni Ben Amor; WearMoCap: multimodal pose tracking for ubiquitous robot control using a smartwatch; Frontiers in Robotics and AI; motion capture; human-robot interaction; teleoperation; smartwatch; wearables; drone control |
title | WearMoCap: multimodal pose tracking for ubiquitous robot control using a smartwatch |
title_full | WearMoCap: multimodal pose tracking for ubiquitous robot control using a smartwatch |
title_fullStr | WearMoCap: multimodal pose tracking for ubiquitous robot control using a smartwatch |
title_full_unstemmed | WearMoCap: multimodal pose tracking for ubiquitous robot control using a smartwatch |
title_short | WearMoCap: multimodal pose tracking for ubiquitous robot control using a smartwatch |
title_sort | wearmocap multimodal pose tracking for ubiquitous robot control using a smartwatch |
topic | motion capture; human-robot interaction; teleoperation; smartwatch; wearables; drone control |
url | https://www.frontiersin.org/articles/10.3389/frobt.2024.1478016/full |
work_keys_str_mv | AT fabiancweigend wearmocapmultimodalposetrackingforubiquitousrobotcontrolusingasmartwatch AT neeleshkumar wearmocapmultimodalposetrackingforubiquitousrobotcontrolusingasmartwatch AT oyaaran wearmocapmultimodalposetrackingforubiquitousrobotcontrolusingasmartwatch AT henibenamor wearmocapmultimodalposetrackingforubiquitousrobotcontrolusingasmartwatch |