Dataset for "DiGeTac Unit for Multimodal Communication in Human-Robot Interaction"



The dataset is divided into two main folders containing hand gesture and touch data. The Gesture_Data folder contains data collected while performing four hand gestures: up, down, left, and right. It also includes error data recording mistakes made during these gestures. The Touch_Data folder contains IMU data collected from the tactile sensor while force was applied at four different contact locations, plus an idle case in which no contact was made with the sensor. The aim of this data collection is to demonstrate that the proposed sensing module can establish reliable multimodal communication channels between humans and industrial robots, thereby enhancing interaction efficiency and user experience in automated environments.
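The folder structure described above can be navigated programmatically. The following is a minimal sketch assuming a `<Modality_Folder>/<label>/<file>` layout; the per-gesture subfolder names follow the four gestures listed in the summary, while the touch contact-location folder names (`contact_1` … `contact_4`, `idle`) are hypothetical placeholders, since the actual naming is not specified here.

```python
from pathlib import PurePosixPath

# Gesture classes as described in the dataset summary (plus the error data).
GESTURE_LABELS = {"up", "down", "left", "right", "error"}
# Touch classes: four contact locations plus idle. Folder names are
# hypothetical; check the actual dataset for the exact naming.
TOUCH_LABELS = {"contact_1", "contact_2", "contact_3", "contact_4", "idle"}


def classify(relative_path: str) -> tuple[str, str]:
    """Map a relative file path to (modality, label), assuming a
    <Modality_Folder>/<label>/<file> directory layout (hypothetical)."""
    parts = PurePosixPath(relative_path).parts
    if len(parts) < 3:
        raise ValueError(f"Expected <folder>/<label>/<file>, got: {relative_path}")
    modality_folder, label = parts[0], parts[1]
    if modality_folder == "Gesture_Data" and label in GESTURE_LABELS:
        return ("gesture", label)
    if modality_folder == "Touch_Data" and label in TOUCH_LABELS:
        return ("touch", label)
    raise ValueError(f"Unrecognized path: {relative_path}")
```

For example, `classify("Gesture_Data/up/sample_01.csv")` would return `("gesture", "up")` under this assumed layout.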
Date made available: 22 Apr 2024
Publisher: University of Bath
