Multimodal Sensing Module for Human-Robot Collaboration

  • Gorkem Anil Al

Student thesis: Doctoral Thesis › Doctor of Engineering (EngD)

Abstract

The current market demands reduced lead times and mass customisation, which forces production systems to become more adaptive to product variation, shorter product life cycles and smaller batch sizes. A fully robotic manufacturing system is an expensive way to meet these demands, because adapting robots to a continually changing work environment requires many sensing devices to increase the robots’ perception capabilities, and using the extensive datasets generated by these devices for machine learning methods is also demanding. Hence, Human-Robot Collaboration (HRC) is becoming a popular concept that brings together the decision-making capability of humans and the high repeatability, speed and accuracy of robots for flexible manufacturing. In the HRC concept, humans and robots share the same work environment and tasks. However, human safety must be maintained with safety devices and methods during collaborative work, in case of harmful collisions with robots. Interaction channels should also be implemented between humans and robots for effective communication, making the collaboration more intuitive and natural.

In this thesis, a novel multimodal sensing module is developed for safe interaction between humans and robots in industrial applications. The sensing module allows interaction and detection of human presence without restricting human motion in a collaborative workspace. It consists of distance, gesture and tactile sensors, and its capabilities are presented through two safety strategies, pre-collision and post-collision, and two interaction methods, contactless gesture interaction and touch-based interaction. The distance and gesture sensors are used with an Artificial Neural Network for contactless gesture interaction to convey commands to a robot: based on the information acquired from the gesture sensor, the robot performs predefined tasks. The distance sensor also implements a pre-collision safety method by dynamically changing the robot’s speed according to the measured distance between the human and the robot.
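As a rough illustration of this pre-collision strategy, the Python sketch below maps a measured human-robot distance to a speed fraction. The stop and full-speed thresholds, the linear ramp and the function name are illustrative assumptions, not values or interfaces taken from the thesis.

    def speed_scale(distance_m: float,
                    stop_dist: float = 0.3,
                    full_speed_dist: float = 1.5) -> float:
        """Map a human-robot distance (metres) to a speed fraction in [0, 1].

        Below stop_dist the robot halts; beyond full_speed_dist it runs at
        full speed; in between, speed ramps up linearly. All thresholds are
        illustrative assumptions, not values from the thesis.
        """
        if distance_m <= stop_dist:
            return 0.0
        if distance_m >= full_speed_dist:
            return 1.0
        return (distance_m - stop_dist) / (full_speed_dist - stop_dist)

In a control loop, the robot’s commanded velocity would be multiplied by this fraction at each sensor update, so the robot slows smoothly as the human approaches and stops before contact.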
A custom-made tactile sensor in the sensing module provides post-collision safety and touch-based interaction. The tactile sensor is designed by covering a chip composed of an accelerometer, a gyroscope and a barometric pressure sensor with a rectangular piece of silicone. The post-collision method stops the robot’s motion when the pressure output changes after an unintended collision.
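A minimal sketch of such a post-collision monitor, assuming a simple threshold on the deviation of the barometric pressure from a slowly adapting baseline, is given below; the threshold and smoothing factor are illustrative, not parameters from the thesis.

    class CollisionMonitor:
        """Flag a collision when the tactile sensor's barometric pressure
        deviates sharply from its slowly adapting baseline. The threshold
        and smoothing factor are illustrative assumptions."""

        def __init__(self, threshold_pa: float = 50.0, alpha: float = 0.01):
            self.threshold_pa = threshold_pa  # deviation that counts as a hit
            self.alpha = alpha                # baseline smoothing factor
            self.baseline = None              # running resting-pressure estimate

        def update(self, pressure_pa: float) -> bool:
            """Return True if this pressure sample indicates a collision."""
            if self.baseline is None:
                self.baseline = pressure_pa
                return False
            collided = abs(pressure_pa - self.baseline) > self.threshold_pa
            if not collided:
                # Track slow drift (temperature, posture) while at rest.
                self.baseline += self.alpha * (pressure_pa - self.baseline)
            return collided

On a True return, the controller would issue an immediate stop command to the robot.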
Another capability of the tactile sensor is recognition of the applied contact location from accelerometer and gyroscope data with a Convolutional Neural Network; this capability is validated through robot position control experiments using touch sensing, and a sketch of such a classifier follows this paragraph. The described safety and interaction methods are validated in a collaborative assembly task in which the human performs a customised assembly process with the help of the robot. The presented sensing module is compact and wearable, and offers several sensing features in one unit. Compared with other wearable multimodal devices, it provides safety strategies and interaction methods that can be used together for complex tasks. Moreover, the multimodal sensing module can be used in both industrial and healthcare robotic applications.
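As an indication of how such a contact-location classifier might be structured, the sketch below applies a small 1-D Convolutional Neural Network to fixed-length windows of the six IMU channels (three-axis accelerometer plus three-axis gyroscope) and predicts one of several contact locations. The window length, layer sizes and number of classes are assumptions for illustration; the abstract does not specify the architecture.

    import torch
    import torch.nn as nn

    class ContactLocationCNN(nn.Module):
        """1-D CNN over IMU windows: 6 channels (3-axis accelerometer +
        3-axis gyroscope) x `window` samples -> one of `n_locations`
        contact locations. All sizes are illustrative assumptions."""

        def __init__(self, n_locations: int = 4, window: int = 128):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(6, 32, kernel_size=5, padding=2), nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
                nn.MaxPool1d(2),
            )
            # Two pooling stages shrink the window by a factor of 4.
            self.classifier = nn.Linear(64 * (window // 4), n_locations)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, 6, window)
            return self.classifier(self.features(x).flatten(1))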
Date of Award: 24 Feb 2024
Original language: English
Awarding Institution:
  • University of Bath
Supervisors: Uriel Martinez Hernandez & Pedro Estrela
