Abstract
In this study, we establish a much-needed baseline for evaluating eye tracking interactions using an eye-tracking-enabled Meta Quest 2 VR headset with 30 participants. Each participant worked through 1098 targets under multiple conditions representative of AR/VR targeting and selection tasks, including both traditional standards and formats more aligned with AR/VR interactions today. We used circular white world-locked targets and an eye tracking system with sub-1-degree mean accuracy error running at approximately 90 Hz. In a targeting and button-press selection task, we deliberately compared completely unadjusted, cursor-less eye tracking with controller and head tracking, both of which had cursors. Across all inputs, we presented targets in a configuration similar to the ISO 9241-9 reciprocal selection task and in another format with targets distributed more evenly near the center. Targets were laid out either flat on a plane or tangent to a sphere and rotated toward the user. Although we intended this to be a baseline study, unmodified eye tracking, without any form of cursor or feedback, outperformed head tracking by 27.9% in throughput and performed comparably to the controller (a 5.63% decrease). Eye tracking also received better subjective ratings than head tracking for Ease of Use, Adoption, and Fatigue (improvements of 66.4%, 89.8%, and 116.1%, respectively) and ratings similar to the controller's (reductions of 4.2%, 8.9%, and 5.2%, respectively). Eye tracking had a higher miss percentage than the controller and head tracking (17.3% vs. 4.7% vs. 7.2%, respectively). Collectively, the results of this baseline study are a strong indicator that eye tracking, even with only minor, sensible interaction design modifications, has tremendous potential to reshape interaction in next-generation AR/VR head-mounted displays.
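The throughput comparisons above follow the ISO 9241-9 / Fitts' law convention of reporting bits per second. The abstract does not spell out the exact computation used in the paper, so the sketch below is only a minimal illustration of the standard effective-throughput formula; the function name and sample values are hypothetical.

```python
import math
import statistics

def effective_throughput(effective_distances, endpoint_deviations, movement_times):
    """ISO 9241-9 style effective throughput in bits/s (illustrative sketch)."""
    # Effective width from the spread of selection endpoints along the task axis.
    w_e = 4.133 * statistics.stdev(endpoint_deviations)
    # Effective index of difficulty from the mean movement distance.
    id_e = math.log2(statistics.mean(effective_distances) / w_e + 1)
    # Throughput: bits of difficulty completed per second of movement time.
    return id_e / statistics.mean(movement_times)

# Example with made-up per-trial values (distances/deviations in metres, times in seconds).
tp = effective_throughput(
    effective_distances=[0.42, 0.40, 0.43],
    endpoint_deviations=[0.004, -0.006, 0.002],
    movement_times=[0.61, 0.58, 0.66],
)
```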
| Original language | English |
| --- | --- |
| Pages (from-to) | 2269-2279 |
| Number of pages | 11 |
| Journal | IEEE Transactions on Visualization and Computer Graphics |
| Volume | 29 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 22 Feb 2023 |
Bibliographical note
Funding Information: The authors wish to thank Joseph Zhang, Ken Koh, Duane Sawyer, Joel Shook, Carmen Wang, and Carina Thiemann for their contributions to this study.
Keywords
- 3D user interaction
- Eye tracking
- Gaze targeting
- Human factors and ergonomics
- Input devices
- User experience
ASJC Scopus subject areas
- Software
- Signal Processing
- Computer Vision and Pattern Recognition
- Computer Graphics and Computer-Aided Design
Fingerprint
Dive into the research topics of 'Leveling the Playing Field: A Comparative Reevaluation of Unmodified Eye Tracking as an Input and Interaction Modality for VR'. Together they form a unique fingerprint.
Projects (1 active)
- Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA) - 2.0
Campbell, N. (PI), Cosker, D. (PI), Bilzon, J. (CoI), Campbell, N. (CoI), Cazzola, D. (CoI), Colyer, S. (CoI), Cosker, D. (CoI), Lutteroth, C. (CoI), McGuigan, P. (CoI), O'Neill, E. (CoI), Petrini, K. (CoI), Proulx, M. (CoI) & Yang, Y. (CoI)
Engineering and Physical Sciences Research Council
1/11/20 → 31/10/25
Project: Research council