Data Brief. 2026 Apr 17;66:112761. doi: 10.1016/j.dib.2026.112761. eCollection 2026 Jun.
ABSTRACT
Recognizing fine-grained hand and arm gestures, especially those that occur naturally in daily activities, remains a challenge in wearable-based human activity recognition. This dataset supports fine-grained gesture recognition using wearable inertial sensors, with a focus on distinguishing subtle daily activities such as liquid ingestion from similar upper-body gestures. Fifty volunteers participated in controlled recording sessions, each performing a set of predefined gestures, including answering a phone call, scratching the head, adjusting glasses, passing the hand over the face, holding the chin, and stretching the arms behind the neck. Data were collected from a WT901BLECL5 sensor placed on the dominant wrist, capturing tri-axial accelerometer and gyroscope readings at 200 Hz. Real-time annotation was performed via a custom mobile application synchronized with sensor acquisition. The dataset is provided as CSV files, structured both by segmented gesture occurrences and by continuous recordings, with each sample labeled using standardized gesture identifiers. This structure facilitates straightforward reuse for machine learning tasks such as gesture classification, activity recognition, and sequence modeling. The dataset is expected to support the development of robust models for real-world wearable gesture recognition applications.
PMID:42088299 | PMC:PMC13136770 | DOI:10.1016/j.dib.2026.112761
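
The continuous-recording layout described above (tri-axial accelerometer and gyroscope sampled at 200 Hz, with a gesture label per sample) lends itself to sliding-window segmentation for classification. The sketch below illustrates that preprocessing step; the window length, overlap, and channel ordering are assumptions for illustration, not specifications from the dataset documentation, and synthetic arrays stand in for the actual CSV contents.

```python
import numpy as np

FS = 200          # sampling rate in Hz, per the dataset description
WIN = 2 * FS      # 2-second analysis window (assumed)
STEP = FS         # 50% overlap between windows (assumed)

def sliding_windows(signal, labels, win=WIN, step=STEP):
    """Segment a continuous recording into fixed-length windows,
    labeling each window with its most frequent gesture identifier."""
    windows, window_labels = [], []
    for start in range(0, len(signal) - win + 1, step):
        seg = signal[start:start + win]
        seg_labels = labels[start:start + win]
        values, counts = np.unique(seg_labels, return_counts=True)
        windows.append(seg)
        window_labels.append(values[np.argmax(counts)])
    return np.stack(windows), np.array(window_labels)

# Synthetic stand-in for one 10-second continuous recording:
# 6 channels (ax, ay, az, gx, gy, gz) and per-sample gesture IDs.
rng = np.random.default_rng(0)
signal = rng.standard_normal((10 * FS, 6))
labels = np.repeat([0, 3, 0], [800, 600, 600])

X, y = sliding_windows(signal, labels)
print(X.shape, y.shape)   # one (win, 6) window per row of X
```

The resulting `(n_windows, 400, 6)` array is a typical input shape for sequence models or feature extractors in gesture classification pipelines.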

