
Abstract

In this paper, we present a multimodal sensor interface that is capable of recognizing hand gestures for human-robot interaction. The proposed system is composed of an array of proximity and gesture sensors mounted on a 3D printed bracelet. The gesture sensors are employed to collect data from four hand gesture movements (up, down, left and right) performed by a human at a predefined distance from the sensorised bracelet. The hand gesture movements are classified using Artificial Neural Networks. The proposed approach is validated with systematic experiments in offline and real-time modes. First, in offline mode, the recognition accuracy for the four hand gesture movements reached a mean of 97.86%. Second, the trained model was used for classification in real-time and achieved a mean recognition accuracy of 97.7%. The output from the hand gesture recognised in real-time mode was used to control the movement of a Universal Robot (UR3) arm in the CoppeliaSim simulation environment. Overall, the results from the experiments show that multimodal sensors, together with computational intelligence methods, have the potential to enable intuitive and safe human-robot interaction.
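As a rough illustration of the pipeline described in the abstract, the sketch below trains a small feed-forward neural network to classify four gesture labels from flattened sensor windows. The sensor count, window length, network size, and synthetic training data are assumptions for illustration only, not details taken from the paper; the actual system uses recordings from the proximity and gesture sensor array on the bracelet, and the predicted label would drive the UR3 arm in CoppeliaSim.

```python
# Minimal sketch (not the authors' implementation): classify four hand gestures
# from proximity/gesture sensor windows with a small neural network.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

GESTURES = ["up", "down", "left", "right"]
N_SENSORS = 4        # assumed number of sensors on the bracelet
WINDOW = 10          # assumed number of samples per gesture window

# Synthetic placeholder data standing in for recorded sensor windows;
# each sample is a flattened (N_SENSORS x WINDOW) reading.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, N_SENSORS * WINDOW))
y = rng.integers(0, len(GESTURES), size=400)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Small feed-forward network standing in for the paper's ANN classifier.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
print("offline accuracy:", clf.score(X_test, y_test))

# In real-time mode the predicted label would be mapped to a UR3 motion
# command in CoppeliaSim; here we only print the recognised label.
sample = X_test[:1]
print("recognised gesture:", GESTURES[int(clf.predict(sample)[0])])
```

With real sensor recordings in place of the synthetic arrays, the same train/predict loop gives the offline and real-time classification stages described in the abstract.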
Original language: English
Title of host publication: IEEE International Conference on Multisensor Fusion and Integration (MFI 2020)
Publisher: IEEE
Number of pages: 6
DOIs
Publication status: Published - 2020

Fingerprint

Research topics of 'Towards an intuitive human-robot interaction based on hand gesture recognition and proximity sensors'.
