Abstract
The eyes are coupled in their gaze function and are therefore usually treated as a single input channel, which limits the range of possible interactions. However, people are able to open and close one eye while still gazing with the other. We introduce Gaze+Hold as an eyes-only technique that builds on this ability to leverage the eyes as separate input channels, with one eye modulating the state of interaction while the other provides continuous input. Gaze+Hold enables direct manipulation beyond pointing, which we explore through the design of Gaze+Hold techniques for a range of user interface tasks. In a user study, we evaluated performance, usability, and users' spontaneous choice of eye for modulation of input. The results show that users are effective with Gaze+Hold. The choice of dominant versus non-dominant eye had no effect on performance, perceived usability, or workload. This is significant for the utility of Gaze+Hold, as it affords flexibility in mapping either eye in different configurations.
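To illustrate the interaction principle described in the abstract, the following is a minimal sketch of a Gaze+Hold-style state machine, assuming a binocular eye tracker that reports per-eye openness and a gaze point on each frame. The names (`GazeSample`, `GazeHold`) and the drag mapping are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a Gaze+Hold-style state machine (illustrative, not the paper's code).
# Assumes a binocular eye tracker that reports, per frame, whether each eyelid is open
# and the gaze point of the eye(s) still tracking; GazeSample is a hypothetical type.

from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class GazeSample:
    left_open: bool             # True if the left eyelid is open
    right_open: bool            # True if the right eyelid is open
    gaze: Tuple[float, float]   # gaze point (x, y) from the open/tracking eye


class GazeHold:
    """Closing one eye engages a 'hold'; the open eye's gaze provides continuous input."""

    def __init__(self) -> None:
        self.holding = False
        self.drag_origin: Optional[Tuple[float, float]] = None

    def update(self, s: GazeSample) -> Optional[Tuple[float, float]]:
        one_eye_closed = s.left_open != s.right_open  # exactly one eye is closed
        if one_eye_closed and not self.holding:
            self.holding = True                # closing either eye starts the hold
            self.drag_origin = s.gaze
        elif not one_eye_closed and self.holding:
            self.holding = False               # reopening the eye releases the hold
            self.drag_origin = None
        if self.holding and self.drag_origin is not None:
            dx = s.gaze[0] - self.drag_origin[0]
            dy = s.gaze[1] - self.drag_origin[1]
            return (dx, dy)                    # continuous displacement while holding
        return None
```

In this sketch either eye can serve as the modulating eye, and the displacement of the open eye's gaze from where the hold began could, for example, drive a drag operation.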
Original language | English |
---|---|
Title of host publication | Proceedings ETRA 2021 Full Papers - ACM Symposium on Eye Tracking Research and Applications |
Publisher | Association for Computing Machinery |
Publication status | Published - 25 May 2021 |
Event | ACM Symposium on Eye Tracking Research & Applications: Bridging Communities - Duration: 24 May 2021 → 27 May 2021 |
Conference
Conference | ACM Symposium on Eye Tracking Research & Applications |
---|---|
Abbreviated title | ETRA 2021 |
Period | 24/05/21 → 27/05/21 |