Deep Learning for Recognition of Object Manipulation Hand Gestures using Wearable Sensor Data

Abstract

In this paper, we present a technique for recognising hand gestures using data obtained from wearable sensors. A neural network takes three input streams (accelerometer, EMG, and joint angle data) from sensors affixed to the user's arm and estimates the gesture the user is performing. Our proposed method processes each of the three input streams separately through multiple convolutional and LSTM layers before concatenating the resulting features and passing them through three fully connected layers. The model was trained on a subset of DB2 Exercise 3 from the Ninapro dataset, comprising data collected from 40 participants performing 23 gestures, and achieved a classification accuracy of 82.11%. As this falls below the classification accuracy reported in the established literature on similar data, we aim to improve performance by training and validating the model on a larger quantity of data.
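The abstract names the architecture but not the implementation framework or layer sizes. The following PyTorch sketch shows one plausible realisation of the described pipeline (per-stream convolution-plus-LSTM branches, feature concatenation, and a three-layer fully connected head); the channel counts, branch widths, and kernel sizes are illustrative assumptions, not the authors' configuration.

import torch
import torch.nn as nn

class StreamBranch(nn.Module):
    """Per-stream encoder: temporal convolutions followed by an LSTM."""
    def __init__(self, in_channels, conv_channels=64, lstm_hidden=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(conv_channels, conv_channels, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(conv_channels, lstm_hidden, batch_first=True)

    def forward(self, x):             # x: (batch, channels, time)
        x = self.conv(x)              # (batch, conv_channels, time)
        x = x.permute(0, 2, 1)        # (batch, time, conv_channels) for the LSTM
        _, (h_n, _) = self.lstm(x)    # h_n: (1, batch, lstm_hidden)
        return h_n.squeeze(0)         # (batch, lstm_hidden)

class GestureNet(nn.Module):
    """Three sensor branches concatenated into a three-layer classifier head.

    Default channel counts are placeholders; the paper does not state how
    many accelerometer, EMG, or joint-angle channels were used.
    """
    def __init__(self, acc_ch=3, emg_ch=12, angle_ch=22, n_classes=23):
        super().__init__()
        self.acc = StreamBranch(acc_ch)
        self.emg = StreamBranch(emg_ch)
        self.angle = StreamBranch(angle_ch)
        self.head = nn.Sequential(
            nn.Linear(3 * 128, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, n_classes),   # logits over the 23 gestures
        )

    def forward(self, acc, emg, angle):
        feats = torch.cat([self.acc(acc), self.emg(emg), self.angle(angle)], dim=1)
        return self.head(feats)

# Usage: a batch of 8 windows, 200 time samples per window.
model = GestureNet()
logits = model(torch.randn(8, 3, 200), torch.randn(8, 12, 200), torch.randn(8, 22, 200))
print(logits.shape)  # torch.Size([8, 23])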
Original language: English
Title of host publication: IEEE RO-MAN
Publication status: Acceptance date - 9 Jun 2025
Event: IEEE International Conference on Robot and Human Interactive Communication - Eindhoven University of Technology, Eindhoven, Netherlands
Duration: 25 Aug 2025 - 29 Aug 2025
Conference number: 34

Conference

Conference: IEEE International Conference on Robot and Human Interactive Communication
Abbreviated title: IEEE RO-MAN 2025
Country/Territory: Netherlands
City: Eindhoven
Period: 25/08/25 - 29/08/25
