EchoFlex: Hand gesture recognition using ultrasound imaging

Jess McIntosh, Asier Marzo, Mike Fraser, Carol Phillips

Research output: Chapter in a published conference proceeding

87 Citations (SciVal)

Abstract

Recent improvements in ultrasound imaging enable new opportunities for hand pose detection using wearable devices. Ultrasound imaging has remained under-explored in the HCI community despite being non-invasive, harmless and capable of imaging internal body parts, with applications including smart-watch interaction, prosthesis control and instrument tuition. In this paper, we compare the performance of different forearm mounting positions for a wearable ultrasonographic device. Location plays a fundamental role in ergonomics and performance since the anatomical features differ among positions. We also investigate the performance decrease due to cross-session position shifts and develop a technique to compensate for this misalignment. Our gesture recognition algorithm combines image processing and neural networks to classify the flexion and extension of 10 discrete hand gestures with an accuracy above 98%. Furthermore, this approach can continuously track individual digit flexion with less than 5% NRMSE, and also differentiate between digit flexion at different joints.
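The continuous tracking result above is reported as NRMSE (normalised root-mean-square error). As a minimal sketch of how that metric is computed, assuming normalisation by the range of the ground-truth signal (a common convention; the paper's exact normaliser is not stated in this abstract):

```python
import numpy as np

def nrmse(predicted, actual):
    """Root-mean-square error normalised by the range of the ground truth.

    A value below 0.05 (5%) corresponds to the digit-flexion tracking
    error reported in the abstract.
    """
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    rmse = np.sqrt(np.mean((predicted - actual) ** 2))
    return rmse / (actual.max() - actual.min())

# Hypothetical example: flexion angles (degrees) for one digit over time
actual = np.array([0.0, 15.0, 35.0, 60.0, 90.0])
predicted = np.array([1.0, 14.0, 37.0, 58.0, 88.0])
print(f"NRMSE: {nrmse(predicted, actual):.3f}")  # → NRMSE: 0.019
```

Range normalisation makes the error scale-independent, so the same threshold applies whether flexion is measured in degrees or as a 0–1 fraction.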

Original language: English
Title of host publication: CHI '17: Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems
Subtitle of host publication: Explore, Innovate, Inspire
Publisher: Association for Computing Machinery
Pages: 1923-1934
Number of pages: 12
ISBN (Electronic): 9781450346559
DOIs
Publication status: Published - 2 May 2017
Event: 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017 - Denver, United States
Duration: 6 May 2017 - 11 May 2017

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings
Volume: 2017-May

Conference

Conference: 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017
Country/Territory: United States
City: Denver
Period: 6/05/17 - 11/05/17

Funding

This research was supported by EPSRC Doctoral Training funding through grant EP/M507994/1. We especially thank members of BIRCH within the UH Bristol NHS Foundation Trust. We also thank Nvidia for facilitating our research by providing hardware and Matthew Sutton for assistance with the video and pictures.

Keywords

  • Computer vision
  • Gesture recognition
  • Interactive ultrasound imaging
  • Machine learning

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design
