RF-AIRCGR: Lightweight Convolutional Neural Network-Based RFID Chinese Character Gesture Recognition Research

Bibliographic Details
Main Authors: Yajun Zhang, Congcong Wang, Feng Li, Weiqian Yu, Yuankang Wang, Jingying Chen
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10802886/
Description
Summary: Gesture recognition serves as a foundation for Human-Computer Interaction (HCI). Although Radio Frequency Identification (RFID) is gaining popularity due to its advantages (non-invasive, low-cost, and lightweight), most existing research has addressed only the recognition of simple sign language gestures or body movements; recognition of fine-grained gestures remains a significant gap. In this paper, we propose RF-AIRCGR, a fine-grained hand gesture recognition system for Chinese characters. It enables information input and querying through gestures in contactless scenarios, which is of great significance for both medical and educational applications. The system has three main advantages. First, its tag matrix and dual-antenna layout capture fine-grained gesture data for handwritten Chinese characters. Second, it uses a variance-based sliding-window method to segment continuous gesture actions. Third, the phase signals of Chinese character gestures are transformed into feature images using the Markov Transition Field. After a series of preprocessing steps, the improved C-AlexNet model is used for training and evaluation. Experimental results show that RF-AIRCGR achieves average recognition accuracies of 97.85% for new users and 97.15% for new scenarios, validating the accuracy and robustness of the system in recognizing Chinese character gestures.
ISSN: 2169-3536
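
A minimal illustrative sketch (not the authors' implementation) of the first two processing steps described in the abstract: variance-based sliding-window segmentation of the continuous phase stream, followed by Markov Transition Field (MTF) encoding of each gesture segment into a 2-D feature image. It is written in Python/NumPy; the window length, hop size, threshold factor, bin count, and function names are hypothetical placeholders rather than values taken from the paper.

# Sketch only: placeholder parameters, not the RF-AIRCGR source code.
import numpy as np


def segment_by_variance(phase, win=50, hop=10, k=3.0):
    """Return (start, end) sample indices of high-variance (gesture) regions.

    A window whose variance is well above the stream's noise floor is treated
    as gesture motion; adjacent active windows are merged into one segment.
    """
    variances = np.array([phase[i:i + win].var()
                          for i in range(0, len(phase) - win + 1, hop)])
    threshold = k * np.median(variances)           # crude noise-floor estimate
    active = variances > threshold
    segments, start = [], None
    for idx, flag in enumerate(active):
        if flag and start is None:
            start = idx * hop                      # segment begins
        elif not flag and start is not None:
            segments.append((start, (idx - 1) * hop + win))
            start = None
    if start is not None:                          # gesture runs to the end
        segments.append((start, len(phase)))
    return segments


def markov_transition_field(x, n_bins=8):
    """Encode a 1-D series x as an N x N Markov Transition Field image.

    M[i, j] is the probability, estimated from consecutive samples of x, of
    moving from the quantile bin of x[i] to the quantile bin of x[j].
    """
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
    bins = np.digitize(x, edges)                   # quantile bin per sample, 0..n_bins-1
    W = np.zeros((n_bins, n_bins))
    for a, b in zip(bins[:-1], bins[1:]):          # count consecutive-sample transitions
        W[a, b] += 1.0
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1.0)  # row-normalize, avoid 0/0
    return W[np.ix_(bins, bins)]                   # M[i, j] = W[bin(x_i), bin(x_j)]


# Toy usage: a synthetic phase stream with one "gesture" burst in the middle.
rng = np.random.default_rng(seed=0)
phase = np.concatenate([
    rng.normal(0.0, 0.01, 400),                                             # idle
    np.sin(np.linspace(0, 12 * np.pi, 300)) + rng.normal(0.0, 0.01, 300),   # gesture
    rng.normal(0.0, 0.01, 400),                                             # idle
])
for s, e in segment_by_variance(phase):
    img = markov_transition_field(phase[s:e])
    print(f"segment [{s}:{e}] -> MTF image of shape {img.shape}")

In a full pipeline, each MTF image would then be resized to the fixed input resolution expected by the C-AlexNet classifier, for example by average pooling (the pyts library's MarkovTransitionField class exposes an image_size parameter for this purpose).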