Research on Multimodal Control Method for Prosthetic Hands Based on Visuo-Tactile and Arm Motion Measurement

Bibliographic Details
Main Authors: Jianwei Cui, Bingyan Yan
Format: Article
Language:English
Published: MDPI AG 2024-12-01
Series:Biomimetics
Subjects: intention recognition; human–machine interaction; 2D Lidar; environmental perception; prosthetic hand
Online Access:https://www.mdpi.com/2313-7673/9/12/775
author Jianwei Cui
Bingyan Yan
collection DOAJ
description Restoring hand function with a robotic manipulator is a research hotspot in robotics. In this paper, we propose a multimodal perception and control method for a prosthetic hand that assists disabled users. Human hand movement can be divided into two parts: coordinating the posture of the fingers, and coordinating the timing of grasping and releasing objects. We therefore first built a finger-mounted visual device around a pinhole camera and pre-classified object shapes with YOLOv8. We then proposed a pipeline that filters multi-frame synthesized point clouds from a miniature 2D Lidar, clusters objects with the DBSCAN algorithm, and matches their contours with the DTW algorithm to identify the cross-sectional shape and size of the grasped part of the object, which drives the selection of the hand's grasping gesture. Finally, to control the timing of grasping and releasing, we proposed a fusion algorithm that combines upper-limb motion state, hand position, and lesser-toe haptic feedback, keeping a human in the loop of the grasping process. The device does not contact the skin and causes no discomfort, and the completion rate in the grasping experiments reached 91.63%, indicating that the proposed control method is feasible and applicable.
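The clustering-and-matching stage the abstract describes (DBSCAN to group 2D Lidar points into candidate objects, then DTW to compare a cluster's profile against stored shape templates) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the point data, `eps`/`min_pts` parameters, and function names are all assumptions.

```python
# Minimal sketch of the DBSCAN + DTW stage described in the abstract.
# All data and parameters are illustrative, not from the paper.
import math

def dbscan(points, eps, min_pts):
    """Label each 2D point with a cluster id; -1 marks noise."""
    labels = [None] * len(points)

    def neighbors(i):
        return [j for j, q in enumerate(points)
                if math.dist(points[i], q) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1              # noise (may become a border point later)
            continue
        cluster += 1                    # point i is a core point: new cluster
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster     # reclaim noise as a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            more = neighbors(j)
            if len(more) >= min_pts:
                queue.extend(more)      # j is a core point: keep expanding
    return labels

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1D shape profiles."""
    inf = float("inf")
    d = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
    d[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[len(a)][len(b)]

# Two well-separated point groups plus one stray point stand in for a scan.
pts = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1), (20, 20)]
labels = dbscan(pts, eps=0.5, min_pts=2)

# A cluster's radial profile could then be matched against a shape template:
score = dtw_distance([1.0, 1.1, 1.0], [1.0, 1.05, 1.1, 1.0])
```

In this sketch the two dense groups become clusters 0 and 1 and the stray point is labeled noise; a lower DTW score would indicate a closer match to a stored cross-section template.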
id doaj-art-aabd2d7495c8429a9aba918b72702240
institution Kabale University
issn 2313-7673
doi 10.3390/biomimetics9120775
affiliation Institute of Instrument Science and Engineering, Southeast University, Nanjing 210096, China (Jianwei Cui; Bingyan Yan)
title Research on Multimodal Control Method for Prosthetic Hands Based on Visuo-Tactile and Arm Motion Measurement
topic intention recognition
human–machine interaction
2D Lidar
environmental perception
prosthetic hand
url https://www.mdpi.com/2313-7673/9/12/775