Abstract
Human-robot collaboration in manufacturing environments lacks adaptability to different tasks and environments, as well as fluency of interaction with workers. This work develops a cognitive architecture for assembly robots that predicts when collaborative actions are required. The cognitive architecture provides reliable perception and reasoning methods for increased human-robot fluency, allowing the user to focus on the required task. The system has three layers: the perception layer determines the current task state, while the memory layer keeps track of task details, current predictions and past episodes. The control layer predicts future collaborative actions and passes commands to the robot at the required time to reduce user idle time. The system uses convolutional neural network action recognition on inertial measurement unit data, together with vision-based recognition, for environment perception. The cognitive architecture is validated with experiments in offline and real-time modes. In offline mode, two action recognition methods are compared: a single classifier for all actions achieves 81% accuracy, while a separate classifier per action achieves 74%. In real-time mode, a UR3 cobot is used for a collaborative assembly task, showing an increase in the proportion of time the user is active.
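The abstract mentions convolutional neural network action recognition over inertial measurement unit (IMU) data. The sketch below is a minimal illustration of that general idea, not the authors' implementation: a small 1D CNN (PyTorch) that classifies fixed-length IMU windows into assembly actions. The channel count, window length, number of action classes and layer sizes are all illustrative assumptions.

```python
# Minimal sketch of 1D CNN action recognition over IMU windows.
# All sizes and class counts are assumptions for illustration only.

import torch
import torch.nn as nn

N_CHANNELS = 6      # e.g. 3-axis accelerometer + 3-axis gyroscope (assumed)
WINDOW_LEN = 128    # samples per classification window (assumed)
N_ACTIONS = 5       # number of assembly actions to recognise (assumed)

class IMUActionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * (WINDOW_LEN // 4), 64),
            nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, x):
        # x has shape (batch, channels, window_length)
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = IMUActionCNN()
    window = torch.randn(1, N_CHANNELS, WINDOW_LEN)  # one dummy IMU window
    logits = model(window)
    print("predicted action index:", logits.argmax(dim=1).item())
```

In a setup like the one described, the predicted action label would feed the perception layer, with the memory and control layers using the sequence of recognised actions to anticipate when the robot's collaborative action is needed.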
Original language | English |
---|---|
Title of host publication | 2021 20th International Conference on Advanced Robotics, ICAR 2021 |
Place of Publication | U.S.A. |
Publisher | IEEE |
Pages | 1024-1029 |
Number of pages | 6 |
ISBN (Electronic) | 9781665436847 |
Publication status | Published - 5 Jan 2022 |
Event | 20th International Conference on Advanced Robotics, ICAR 2021 - Ljubljana, Slovenia. Duration: 6 Dec 2021 → 10 Dec 2021 |
Publication series
Name | 2021 20th International Conference on Advanced Robotics, ICAR 2021 |
---|---|
Conference
Conference | 20th International Conference on Advanced Robotics, ICAR 2021 |
---|---|
Country/Territory | Slovenia |
City | Ljubljana |
Period | 6/12/21 → 10/12/21 |
Bibliographical note
Funding Information: This work was supported by The Engineering and Physical Sciences Research Council (EPSRC) and the Royal Society Research Grants for the 'Touching and feeling the immersive world' project (RGS/R2/192346). James and Uriel are with the inte-R-action lab, the Centre for Autonomous Robotics (CENTAUR) and the Department of Electronic & Electrical Engineering, University of Bath, UK (jjm53, u.martinez)@bath.ac.uk
ASJC Scopus subject areas
- Artificial Intelligence
- Human-Computer Interaction
- Software