Gaze-shifting: direct-indirect input with pen and touch modulated by gaze

Ken Pfeuffer, Jason Alexander, Ming Ki Chong, Yanxia Zhang, Hans Gellersen

Research output: Chapter in a published conference proceeding

Abstract

Modalities such as pen and touch are associated with direct input but can also be used for indirect input. We propose to combine the two modes for direct-indirect input modulated by gaze. We introduce gaze-shifting as a novel mechanism for switching the input mode based on the alignment of manual input and the user's visual attention. Input in the user's area of attention results in direct manipulation whereas input offset from the user's gaze is redirected to the visual target. The technique is generic and can be used in the same manner with different input modalities. We show how gaze-shifting enables novel direct-indirect techniques with pen, touch, and combinations of pen and touch input.
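
The mode switch described in the abstract reduces to a single routing decision: if the manual input point is aligned with the gaze point, handle it directly in place; otherwise redirect it to the gaze target. The sketch below is a hypothetical illustration of that decision only; the alignment radius, names, and data types are assumptions and not the authors' implementation.

```python
# Hypothetical sketch of the gaze-shifting mode switch described in the
# abstract. All names, types, and the proximity threshold are illustrative
# assumptions, not the published system.

from dataclasses import dataclass
import math

# Radius (in pixels) within which manual input counts as aligned with gaze.
GAZE_ALIGNMENT_RADIUS_PX = 150.0

@dataclass
class Point:
    x: float
    y: float

def distance(a: Point, b: Point) -> float:
    return math.hypot(a.x - b.x, a.y - b.y)

def handle_input(manual_pos: Point, gaze_pos: Point) -> tuple[str, Point]:
    """Route pen or touch input directly or indirectly based on gaze alignment.

    Returns the chosen mode and the position where the input takes effect.
    """
    if distance(manual_pos, gaze_pos) <= GAZE_ALIGNMENT_RADIUS_PX:
        # Input lands where the user is looking: direct manipulation in place.
        return "direct", manual_pos
    # Input is offset from the gaze point: redirect it to the visual target.
    return "indirect", gaze_pos

# Example: a pen stroke far from the gaze point is redirected to the gaze target.
mode, effective_pos = handle_input(Point(100, 900), Point(820, 140))
print(mode, effective_pos)  # -> indirect Point(x=820, y=140)
```
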
Original language: English
Title of host publication: UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology
Place of publication: New York, New York
Publisher: Association for Computing Machinery
Pages: 373-383
Number of pages: 11
ISBN (Print): 9781450337793
DOIs: 10.1145/2807442.2807460
Publication status: Published - 11 Nov 2015

Bibliographical note

©ACM, 2015. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology http://dx.doi.org/10.1145/2807442.2807460

Keywords

  • eye tracking
  • gaze
  • indirect input
  • pen
  • direct input
  • touch
