Abstract
Eye gaze is a fast and ergonomic modality for pointing but limited in precision and accuracy. In this work, we introduce BimodalGaze, a novel technique for seamless head-based refinement of a gaze cursor. The technique leverages eye-head coordination insights to separate natural from gestural head movement. This allows users to quickly shift their gaze to targets over larger fields of view with naturally combined eye-head movement, and to refine the cursor position with gestural head movement. In contrast to an existing baseline, head refinement is invoked automatically, and only if a target is not already acquired by the initial gaze shift. Study results show that users reliably achieve fine-grained target selection, but we observed a higher rate of initial selection errors affecting overall performance. An in-depth analysis of user performance provides insight into the classification of natural versus gestural head movement, for improvement of BimodalGaze and other potential applications.
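The paper does not publish its classifier here, but the core idea of separating natural from gestural head movement can be illustrated with a minimal sketch. One plausible signal, drawn from eye-head coordination research, is the vestibulo-ocular reflex: during a natural gaze shift the eyes and head rotate in the same direction, whereas during deliberate head refinement the eyes counter-rotate to hold gaze on a point. The function name, signature, and thresholds below are hypothetical, not the authors' implementation.

```python
# Hedged sketch of one plausible natural-vs-gestural classifier, based on
# eye-head coordination. This is NOT the classifier from the paper; the
# threshold and axis simplification are illustrative assumptions.

def classify_head_movement(head_velocity: float, eye_in_head_velocity: float,
                           head_threshold: float = 5.0) -> str:
    """Classify a head-movement sample as 'still', 'natural', or 'gestural'.

    Velocities are signed angular velocities (deg/s) along a single axis.
    """
    if abs(head_velocity) < head_threshold:
        return "still"
    # Eyes rotating with the head: combined eye-head gaze shift (natural).
    if head_velocity * eye_in_head_velocity > 0:
        return "natural"
    # Eyes counter-rotating (vestibulo-ocular reflex) while the head moves:
    # gaze is held in place, so the head motion is a refinement gesture.
    return "gestural"

print(classify_head_movement(20.0, 18.0))   # eyes follow the head
print(classify_head_movement(20.0, -19.0))  # eyes counter-rotate
print(classify_head_movement(1.0, 0.5))     # head essentially at rest
```

In a technique like BimodalGaze, only samples classified as gestural would drive cursor refinement, which is how head refinement can be invoked automatically rather than via an explicit mode switch.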
| Original language | English |
|---|---|
| Title of host publication | Proceedings ETRA 2020 Full Papers - ACM Symposium on Eye Tracking Research and Applications |
| Publisher | Association for Computing Machinery |
| Pages | 1-9 |
| Number of pages | 9 |
| DOIs | 10.1145/3379155.3391312 |
| Publication status | Published - 2 Jun 2020 |
Bibliographical note
© ACM, 2020. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ETRA '20 Full Papers: ACM Symposium on Eye Tracking Research and Applications, 2020. http://doi.acm.org/10.1145/3379155.3391312

Keywords
- Eye tracking
- Gaze interaction
- Refinement
- Eye-head coordination
- Virtual reality