Multi-user Gaze-based Interaction Techniques on Collaborative Touchscreens

Ken Pfeuffer, Jason Alexander, Hans Gellersen

Research output: Chapter in a published conference proceeding



Eye-gaze is a technology for implicit, fast, and hands-free input across a variety of use cases, yet the majority of techniques focus on single-user contexts. In this work, we present an exploration of gaze techniques for users interacting together on the same surface. We explore interaction concepts that exploit two states in an interactive system: 1) users visually attending to the same object in the UI, or 2) users focusing on separate targets. With the increasing availability of eye-tracking, interfaces can exploit these states, for example, to dynamically personalise content on the UI for each user, and to provide a merged or compromised view of an object when both users' gaze falls upon it. These concepts are explored with a prototype horizontal interface that tracks the gaze of two users facing each other. We build three applications that illustrate different mappings of gaze to multi-user support: an indoor map with gaze-highlighted information, an interactive tree-of-life visualisation that dynamically expands on users' gaze, and a worldmap application with gaze-aware fisheye zooming. We conclude with insights from a public deployment of this system, pointing toward the engaging and seamless ways in which eye-based input integrates into collaborative interaction.
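The core two-state distinction the abstract describes, both users attending to the same UI object versus attending to separate targets, can be sketched as a simple gaze hit-test and comparison. This is a minimal illustrative sketch, not the paper's implementation; the `Rect`, `hit_test`, and `gaze_state` names and the rectangle-based hit-testing are assumptions for the example.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass(frozen=True)
class Rect:
    """Axis-aligned bounds of a UI object (hypothetical representation)."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def hit_test(objects: Dict[str, Rect], gaze: Tuple[float, float]) -> Optional[str]:
    """Return the id of the UI object the gaze point falls on, if any."""
    for obj_id, rect in objects.items():
        if rect.contains(*gaze):
            return obj_id
    return None

def gaze_state(objects: Dict[str, Rect],
               gaze_a: Tuple[float, float],
               gaze_b: Tuple[float, float]) -> str:
    """Classify the two-user gaze configuration:
    'shared'   -- both users attend to the same object (e.g. show a merged view)
    'separate' -- users attend to different objects (e.g. personalise per user)
    'none'     -- at least one user's gaze is not on any object
    """
    a = hit_test(objects, gaze_a)
    b = hit_test(objects, gaze_b)
    if a is None or b is None:
        return "none"
    return "shared" if a == b else "separate"
```

A real system would additionally smooth noisy gaze samples and debounce state changes before switching views, but the state logic itself reduces to this comparison.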
Original language: English
Title of host publication: ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications
Publisher: Association for Computing Machinery
Number of pages: 7
Volume: May 2021
Publication status: Published - 31 May 2021


