Abstract
Recent improvements in ultrasound imaging enable new opportunities for hand pose detection using wearable devices. Despite being non-invasive, harmless, and capable of imaging internal body structures, ultrasound imaging has remained under-explored in the HCI community, with applications including smart-watch interaction, prosthesis control, and instrument tuition. In this paper, we compare the performance of different forearm mounting positions for a wearable ultrasonographic device; location plays a fundamental role in ergonomics and performance because the anatomical features imaged differ between positions. We also investigate the performance decrease caused by cross-session position shifts and develop a technique to compensate for this misalignment. Our gesture recognition algorithm combines image processing and neural networks to classify 10 discrete hand gestures, involving flexion and extension of the digits, with an accuracy above 98%. The same approach can also continuously track the flexion of individual digits with less than 5% NRMSE, and can differentiate between digit flexion at different joints.
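The abstract describes classifying feature vectors derived from ultrasound frames into 10 gesture classes. As a minimal illustrative sketch only (the paper's actual network architecture, feature extraction, and dimensions are not reproduced here; all dimensions and data below are hypothetical), a softmax classifier trained on synthetic, well-separated feature vectors stands in for the neural-network stage:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 64-dimensional feature vectors (e.g. intensity
# statistics from ultrasound image patches) for 10 gesture classes.
n_classes, n_features, n_per_class = 10, 64, 20
means = rng.normal(scale=5.0, size=(n_classes, n_features))
X = np.vstack([m + rng.normal(scale=0.5, size=(n_per_class, n_features))
               for m in means])
y = np.repeat(np.arange(n_classes), n_per_class)

# Softmax (multinomial logistic) classifier trained by full-batch
# gradient descent -- a simple stand-in, not the authors' network.
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
for _ in range(300):
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    grad = p.copy()
    grad[np.arange(len(y)), y] -= 1.0            # dL/dlogits
    W -= 0.01 * X.T @ grad / len(y)
    b -= 0.01 * grad.mean(axis=0)

pred = np.argmax(X @ W + b, axis=1)
accuracy = (pred == y).mean()
```

Because the synthetic class means are far apart relative to the noise, even this linear model separates the 10 classes; the paper's reported 98% accuracy on real ultrasound data, by contrast, relies on its image-processing and neural-network pipeline.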
Original language | English |
---|---|
Title of host publication | CHI '17: Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems |
Subtitle of host publication | Explore, Innovate, Inspire |
Publisher | Association for Computing Machinery |
Pages | 1923-1934 |
Number of pages | 12 |
ISBN (Electronic) | 9781450346559 |
DOIs | |
Publication status | Published - 2 May 2017 |
Event | 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017 - Denver, United States. Duration: 6 May 2017 → 11 May 2017 |
Publication series
Name | Conference on Human Factors in Computing Systems - Proceedings |
---|---|
Volume | 2017-May |
Conference
Conference | 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017 |
---|---|
Country/Territory | United States |
City | Denver |
Period | 6/05/17 → 11/05/17 |
Funding
This research was supported by EPSRC Doctoral Training funding through grant EP/M507994/1. We especially thank members of BIRCH within the UH Bristol NHS Foundation Trust. We also thank Nvidia for facilitating our research by providing hardware and Matthew Sutton for assistance with the video and pictures.
Keywords
- Computer vision
- Gesture recognition
- Interactive ultrasound imaging
- Machine learning
ASJC Scopus subject areas
- Software
- Human-Computer Interaction
- Computer Graphics and Computer-Aided Design