Semi-Autonomous Continuous Robotic Arm Control Using an Augmented Reality Brain-Computer Interface

Noninvasive augmented-reality (AR) brain-computer interfaces (BCIs) that use steady-state visually evoked potentials (SSVEPs) typically adopt a fully autonomous goal-selection framework to control a robot, where automation is used to compensate for the low information transfer rate of the BCI. This scheme improves task performance, but users may prefer direct control (DC) of robot motion...


Bibliographic Details
Main Authors: Kirill Kokorin, Syeda R. Zehra, Jing Mu, Peter Yoo, David B. Grayden, Sam E. John
Format: Article
Language:English
Published: IEEE 2024-01-01
Series:IEEE Transactions on Neural Systems and Rehabilitation Engineering
Subjects:
Online Access:https://ieeexplore.ieee.org/document/10755142/
_version_ 1846160739140632576
author Kirill Kokorin
Syeda R. Zehra
Jing Mu
Peter Yoo
David B. Grayden
Sam E. John
author_facet Kirill Kokorin
Syeda R. Zehra
Jing Mu
Peter Yoo
David B. Grayden
Sam E. John
author_sort Kirill Kokorin
collection DOAJ
description Noninvasive augmented-reality (AR) brain-computer interfaces (BCIs) that use steady-state visually evoked potentials (SSVEPs) typically adopt a fully autonomous goal-selection framework to control a robot, where automation is used to compensate for the low information transfer rate of the BCI. This scheme improves task performance, but users may prefer direct control (DC) of robot motion. To provide users with a balance of autonomous assistance and manual control, we developed a shared control (SC) system for continuous control of robot translation using an SSVEP AR-BCI, which we tested in a 3D reaching task. The SC system used the BCI input and robot sensor data to continuously predict which object the user wanted to reach, generated an assistance signal, and regulated the level of assistance based on prediction confidence. Eighteen healthy participants took part in our study, and each completed 24 reaching trials using DC and SC. Compared to DC, SC significantly improved (paired two-tailed t-test, Holm-corrected α < 0.05) mean task success rate (p < 0.0001, μ = 36.1%, 95% CI [25.3%, 46.9%]), normalised reaching trajectory length (p < 0.0001, μ = -26.8%, 95% CI [-36.0%, -17.7%]), and participant workload (p = 0.02, μ = -11.6, 95% CI [-21.1, -2.0]) measured with the NASA Task Load Index. Therefore, users of SC can control the robot effectively while experiencing increased agency. Our system can personalise assistive technology by providing users with the ability to select their preferred level of autonomous assistance.
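The confidence-regulated blending described in the abstract can be sketched briefly. The sketch below is a minimal illustration under assumed conventions, not the authors' implementation: the goal-inference heuristic (softmax over cosine alignment), the linear blending rule, and all names (infer_goal_confidence, blend_command, speed) are placeholders introduced here for demonstration.

# Illustrative sketch only (not from the paper): confidence-weighted blending of a
# user-commanded direction with autonomous assistance toward a predicted goal.
import numpy as np

def infer_goal_confidence(robot_pos, user_dir, goals, sharpness=5.0):
    # Score each candidate object by how well the user's commanded direction
    # points toward it, then normalise the scores into a probability-like
    # confidence distribution (softmax over cosine alignment).
    scores = []
    for goal in goals:
        to_goal = goal - robot_pos
        to_goal = to_goal / (np.linalg.norm(to_goal) + 1e-9)
        scores.append(np.dot(user_dir, to_goal))
    weights = np.exp(sharpness * np.array(scores))
    return weights / weights.sum()

def blend_command(robot_pos, user_dir, goals, speed=0.05):
    # Blend the decoded user direction with an assistance vector toward the
    # most likely goal; the confidence of that prediction sets the blend ratio.
    confidence = infer_goal_confidence(robot_pos, user_dir, goals)
    best = int(np.argmax(confidence))
    assist_dir = goals[best] - robot_pos
    assist_dir = assist_dir / (np.linalg.norm(assist_dir) + 1e-9)
    alpha = confidence[best]            # low confidence -> mostly direct control
    v = (1.0 - alpha) * user_dir + alpha * assist_dir
    return speed * v / (np.linalg.norm(v) + 1e-9)   # commanded translation step

# Example: the user commands +x while two candidate objects sit in front of the arm.
goals = [np.array([0.4, 0.1, 0.2]), np.array([0.1, 0.4, 0.2])]
print(blend_command(np.array([0.0, 0.0, 0.2]), np.array([1.0, 0.0, 0.0]), goals))

The design point the sketch captures: when goal prediction confidence is low, the output is dominated by the user's decoded direction (close to direct control); as confidence rises, the assistance vector toward the predicted object takes over, which is how shared control can shorten trajectories without removing the user from the loop.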
format Article
id doaj-art-15b6f41cf09349d48e17e221c528887d
institution Kabale University
issn 1534-4320
1558-0210
language English
publishDate 2024-01-01
publisher IEEE
record_format Article
series IEEE Transactions on Neural Systems and Rehabilitation Engineering
spelling doaj-art-15b6f41cf09349d48e17e221c528887d (indexed 2024-11-22T00:00:07Z)
English. IEEE. IEEE Transactions on Neural Systems and Rehabilitation Engineering, ISSN 1534-4320, 1558-0210, vol. 32, pp. 4098-4108, 2024-01-01. DOI: 10.1109/TNSRE.2024.3500217. IEEE document 10755142.
Semi-Autonomous Continuous Robotic Arm Control Using an Augmented Reality Brain-Computer Interface
Kirill Kokorin (https://orcid.org/0000-0002-9003-6917), Syeda R. Zehra (https://orcid.org/0000-0003-2568-2653), Jing Mu (https://orcid.org/0000-0002-3289-2002), Peter Yoo, David B. Grayden (https://orcid.org/0000-0002-5497-7234), Sam E. John (https://orcid.org/0000-0003-3780-2210)
Affiliations: Department of Biomedical Engineering and the Graeme Clark Institute, The University of Melbourne, Melbourne, Australia (Kokorin, Zehra, Mu, Grayden, John); Synchron Inc., New York, NY, USA (Yoo)
Abstract and subject keywords as given in the description and topic fields of this record.
Online access: https://ieeexplore.ieee.org/document/10755142/
spellingShingle Kirill Kokorin
Syeda R. Zehra
Jing Mu
Peter Yoo
David B. Grayden
Sam E. John
Semi-Autonomous Continuous Robotic Arm Control Using an Augmented Reality Brain-Computer Interface
IEEE Transactions on Neural Systems and Rehabilitation Engineering
Shared control
brain-computer/machine interface (BCI/BMI)
augmented reality (AR)
steady-state visually evoked potential (SSVEP)
assistive robot
title Semi-Autonomous Continuous Robotic Arm Control Using an Augmented Reality Brain-Computer Interface
title_full Semi-Autonomous Continuous Robotic Arm Control Using an Augmented Reality Brain-Computer Interface
title_fullStr Semi-Autonomous Continuous Robotic Arm Control Using an Augmented Reality Brain-Computer Interface
title_full_unstemmed Semi-Autonomous Continuous Robotic Arm Control Using an Augmented Reality Brain-Computer Interface
title_short Semi-Autonomous Continuous Robotic Arm Control Using an Augmented Reality Brain-Computer Interface
title_sort semi autonomous continuous robotic arm control using an augmented reality brain computer interface
topic Shared control
brain-computer/machine interface (BCI/BMI)
augmented reality (AR)
steady-state visually evoked potential (SSVEP)
assistive robot
url https://ieeexplore.ieee.org/document/10755142/
work_keys_str_mv AT kirillkokorin semiautonomouscontinuousroboticarmcontrolusinganaugmentedrealitybraincomputerinterface
AT syedarzehra semiautonomouscontinuousroboticarmcontrolusinganaugmentedrealitybraincomputerinterface
AT jingmu semiautonomouscontinuousroboticarmcontrolusinganaugmentedrealitybraincomputerinterface
AT peteryoo semiautonomouscontinuousroboticarmcontrolusinganaugmentedrealitybraincomputerinterface
AT davidbgrayden semiautonomouscontinuousroboticarmcontrolusinganaugmentedrealitybraincomputerinterface
AT samejohn semiautonomouscontinuousroboticarmcontrolusinganaugmentedrealitybraincomputerinterface