A smart walking stick with voice guidance in an African language for visually impaired persons

Bibliographic Details
Main Authors: Abisola Olayiwola, Wasilat Olayode, Temiloluwa Akintayo, Ajibola Oyedeji, Dare Olayiwola, Martins Osifeko, Olatilewa Abolade
Format: Article
Language: English
Published: SpringerOpen 2025-08-01
Series: Journal of Electrical Systems and Information Technology
Online Access: https://doi.org/10.1186/s43067-025-00254-5
Description
Summary: The inability of visually impaired individuals to navigate their environment independently can lead to a loss of independence and quality of life. Existing solutions do not address the specific needs of individuals who speak Yorùbá; there is therefore a need for a smart walking stick that can detect obstacles and communicate in Yorùbá. The development process begins with the creation of an object detection dataset featuring images of common obstacles annotated in Yorùbá, ensuring cultural and linguistic relevance. A Convolutional Neural Network is then trained on this dataset to achieve precise obstacle detection and classification. The model is subsequently deployed to Render's cloud server to leverage advanced computational resources for efficient processing, and the final stage integrates the trained model with the ESP32. The model achieved accuracy, precision, recall, and F1-score of 0.8969, 0.9110, 0.9915, and 0.8969 in obstacle recognition, and the device offers about 6.23 h of continuous use on a full battery charge. This work demonstrates the viability of integrating cloud-based machine learning into assistive devices for visually impaired users. The study has the potential to significantly impact the lives of visually impaired individuals, contribute to the advancement of assistive technology, and promote cultural inclusivity, while also providing opportunities for language learning and engagement.
ISSN:2314-7172
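
Note: The summary above reports accuracy, precision, recall, and F1-score for obstacle recognition. As a reference for readers of this record, the short Python sketch below shows how those four scores are conventionally derived from confusion-matrix counts for a binary obstacle/no-obstacle decision; the counts used here are hypothetical and do not reproduce the authors' results or use their model or dataset.

# Illustrative sketch only: standard definitions of the four reported metrics.
def classification_metrics(tp: int, fp: int, fn: int, tn: int):
    """Return (accuracy, precision, recall, f1) from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

if __name__ == "__main__":
    # Hypothetical counts chosen only to demonstrate the calculation.
    acc, prec, rec, f1 = classification_metrics(tp=110, fp=10, fn=2, tn=8)
    print(f"accuracy={acc:.4f} precision={prec:.4f} recall={rec:.4f} f1={f1:.4f}")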