Abstract
Can eye tracking enable VR users to target and select elements on par with, or better than, controller- or head-based targeting? We explored four visual feedback methods (none, cursor, outline, and resize) for gaze targeting with a button press for selection, using an ecologically valid representation of commercially available user interfaces with a body-locked, grid-based design. Forty participants interacted with a 5x5 grid of square elements subtending 3, 6, or 9 degrees of visual angle. If the participant's gaze was outside the grid boundary at button press, the last targeted element was selected; no other performance-enhancing algorithms were employed. We also assessed signal quality requirements by degrading accuracy with a fixed 1.5-degree offset. Participants completed 36 blocks, in each targeting and selecting 15 successive elements. We found that gaze targeting, with appropriate feedback methods and a button press, can perform on par with, or better than, the controller in cases intended to replicate targeting and selection in world- or body-locked paradigms in AR/VR. We anticipate that design improvements or algorithmic mitigations can improve this significantly.
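The out-of-boundary fallback described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the grid indexing, function names, and gaze representation are all assumptions.

```python
GRID_SIZE = 5  # 5x5 square element grid, as in the study


def element_at(gaze_row, gaze_col):
    """Return the (row, col) of the gazed element, or None if the
    gaze point lies outside the grid boundary."""
    if 0 <= gaze_row < GRID_SIZE and 0 <= gaze_col < GRID_SIZE:
        return (gaze_row, gaze_col)
    return None


def select_on_button_press(gaze_row, gaze_col, last_targeted):
    """On button press, select the currently gazed element if the gaze
    is inside the grid; otherwise fall back to the last targeted one."""
    current = element_at(gaze_row, gaze_col)
    return current if current is not None else last_targeted
```

For example, with the gaze at row 6 (outside the grid) and a last targeted element of (4, 2), `select_on_button_press(6, 2, (4, 2))` returns `(4, 2)`.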
| Field | Value |
|---|---|
| Original language | English |
| Journal | International Journal of Human-Computer Interaction |
| Early online date | 5 Feb 2025 |
| DOIs | |
| Publication status | E-pub ahead of print - 5 Feb 2025 |
Acknowledgements
The authors wish to thank Joseph Zhang, Ken Koh, Duane Sawyer, Joel Shook, Carmen Wang, and Carina Thiemann for their contributions to this study.
Keywords
- 3D user interaction
- Eye tracking
- gaze targeting
- human factors and ergonomics
- input devices
- user experience
ASJC Scopus subject areas
- Human Factors and Ergonomics
- Human-Computer Interaction
- Computer Science Applications