TapGazer: Text Entry with finger tapping and gaze-directed word selection

Zhenyi He, Christof Lutteroth, Ken Perlin

Research output: Chapter in a published conference proceeding


Abstract

While using VR, efficient text entry is a challenge: users cannot easily locate standard physical keyboards, and keys are often out of reach, e.g. when standing. We present TapGazer, a text entry system where users type by tapping their fingers in place. Users can tap anywhere as long as the identity of each tapping finger can be detected with sensors. Ambiguity between different possible input words is resolved by selecting target words with gaze. If gaze tracking is unavailable, ambiguity is resolved by selecting target words with additional taps. We evaluated TapGazer for seated and standing VR: seated novice users using touchpads as tap surfaces reached 44.81 words per minute (WPM), 79.17% of their QWERTY typing speed. Standing novice users tapped on their thighs with touch-sensitive gloves, reaching 45.26 WPM (71.91%). We analyze TapGazer with a theoretical performance model and discuss its potential for text input in future AR scenarios.
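The abstract describes a T9-like disambiguation scheme: each finger tap narrows the word to a group of letters, so one tap sequence can match several candidate words, and gaze (or extra taps) picks the target. The sketch below illustrates that idea under assumed names and an assumed finger-to-letter grouping (eight typing fingers covering QWERTY columns); it is not TapGazer's actual mapping or code.

```python
# Illustrative sketch of tap-sequence disambiguation. The finger-to-letter
# grouping and all names here are assumptions for demonstration only.

# Assumed grouping: 8 typing fingers, each covering a set of QWERTY letters.
FINGER_GROUPS = ["qaz", "wsx", "edc", "rfvtgb", "yhnujm", "ik", "ol", "p"]
FINGER_OF_LETTER = {
    ch: finger
    for finger, letters in enumerate(FINGER_GROUPS)
    for ch in letters
}

def tap_code(word):
    """Map a word to its finger-tap sequence under the assumed grouping."""
    return tuple(FINGER_OF_LETTER[ch] for ch in word.lower())

def candidates(tap_sequence, lexicon):
    """All lexicon words whose tap code matches the typed sequence.

    When more than one word matches, a system like the one described
    would let the user resolve the ambiguity by gaze or extra taps.
    """
    return [w for w in lexicon if tap_code(w) == tuple(tap_sequence)]

lexicon = ["can", "dam", "cam", "ban", "the"]
# "can", "dam", and "cam" share the same tap code under this grouping,
# so they would all be offered as gaze-selectable candidates.
print(candidates(tap_code("can"), lexicon))
```

Running the example shows the collision set for the tap sequence of "can"; in the paper's design, the user would then look at the intended word to commit it.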

Original language: English
Title of host publication: CHI 2022 - Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450391573
DOIs
Publication status: Published - 29 Apr 2022

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings

Keywords

  • Eye Tracking
  • Input Techniques
  • Text Entry
  • Typing
  • Virtual Reality

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design
  • Software
