Exploring Gait Recognition in Wild Nighttime Scenes


Bibliographic Details
Main Authors: Haotian Li, Wenjuan Gong, Yutong Li, Yikai Wu, Kechen Li, Jordi Gonzàlez
Format: Article
Language: English
Published: MDPI AG 2025-01-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/15/1/350
Description
Summary: Currently, gait recognition research is gradually expanding from ideal indoor environments to real-world outdoor scenarios. However, recognition scenarios in practical applications are often more complex than those considered in existing studies. For instance, real-world scenarios present multiple influencing factors, such as viewpoint variations and diverse carried items. Notably, many gait recognition tasks occur under low-light conditions at night. At present, research on gait recognition in nocturnal environments is relatively limited, and effective methods for nighttime gait recognition are lacking. To address this gap, this study extends gait recognition research to outdoor nighttime environments and introduces the first wild gait dataset encompassing both daytime and nighttime data, named Gait Recognition of Day and Night (GaitDN). Furthermore, to tackle the challenges posed by low-light conditions and other influencing factors in outdoor nighttime gait recognition, we propose a novel pose-based gait recognition framework called GaitSAT. This framework models the intrinsic correlations of human joints by integrating self-attention and graph convolution modules. We conduct a comprehensive evaluation of the proposed method and existing approaches using both the GaitDN dataset and other available datasets. The proposed GaitSAT achieves state-of-the-art performance on the OUMVLP, GREW, Gait3D, and GaitDN datasets, with Rank-1 accuracies of 60.77%, 57.37%, 22.90%, and 86.24%, respectively. Experimental results demonstrate that GaitSAT achieves higher accuracy and superior generalization capabilities compared to state-of-the-art pose-based methods.
ISSN: 2076-3417
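
The abstract's core idea of modeling joint correlations by combining graph convolution (local skeletal structure) with self-attention (global joint-to-joint dependencies) can be illustrated with a minimal sketch. This is not the authors' GaitSAT implementation; the joint count, feature dimension, adjacency matrix, and fusion-by-addition are all illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_conv(X, A, W):
    # Degree-normalized neighbor aggregation followed by a linear
    # projection: each joint mixes features from skeleton neighbors.
    D = A.sum(axis=1, keepdims=True)  # node degrees (self-loops keep D > 0)
    return (A / D) @ X @ W

def self_attention(X, Wq, Wk, Wv):
    # Scaled dot-product attention across joints, letting every joint
    # attend to every other joint regardless of skeletal distance.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = softmax(Q @ K.T / np.sqrt(K.shape[-1]))
    return scores @ V

rng = np.random.default_rng(0)
J, C = 17, 8                      # 17 COCO-style joints, 8-dim features (assumed)
X = rng.standard_normal((J, C))   # per-joint pose features for one frame
A = np.eye(J)                     # self-loops; a real model uses the skeleton graph
A[0, 1] = A[1, 0] = 1             # example bone: joint 0 <-> joint 1
W = rng.standard_normal((C, C))
Wq, Wk, Wv = (rng.standard_normal((C, C)) for _ in range(3))

# Fuse local (graph) and global (attention) joint correlations.
out = graph_conv(X, A, W) + self_attention(X, Wq, Wk, Wv)
print(out.shape)
```

In a full recognition pipeline, blocks like this would be stacked over a sequence of pose frames and pooled into a gait embedding for retrieval; the simple additive fusion here stands in for whatever integration the paper actually uses.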