A Taxonomy and Systematic Review of Gaze Interactions for 2D Displays: Promising Techniques and Opportunities

Asma Shakil, Christof Lutteroth, Gerald Weber

Research output: Contribution to journal › Article › peer-review

Abstract

Gaze input offers strong potential for creating intuitive and engaging user interfaces, but remains constrained by inherent limitations in accuracy and precision. Although extensive research has explored gaze-based interaction over the past three decades, a systematic framework that fully captures the diversity of gaze interaction techniques is still lacking. To address this gap, we present a novel two-dimensional taxonomy that classifies gaze interactions by (1) the type of input, distinguishing between gaze-only and gaze-assisted modalities, and (2) the type of target, differentiating between those requiring absolute gaze coordinates and thus higher accuracy, and those using relative coordinates, which tolerate lower accuracy. Our taxonomy explicitly captures the required input accuracy and interface constraints of each technique, providing clearer guidance for designers of gaze-based interfaces. We apply this taxonomy to review and classify 125 studies of active gaze interactions on 2D displays. The findings highlight promising techniques and identify research opportunities to advance gaze interaction design.

Original language: English
Article number: 308
Pages (from-to): 1-37
Journal: ACM Computing Surveys
Volume: 57
Issue number: 12
DOIs
Publication status: Published - 11 Jul 2025

Keywords

  • 2D displays
  • Eye-tracking
  • gaze
  • gaze-assisted
  • gaze-only
  • interaction
  • literature review
  • selection
  • survey

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
