Eye Drop: an interaction concept for gaze-supported point-to-point content transfer

Jayson Turner, Andreas Bulling, Jason Alexander, Hans Gellersen

Research output: Chapter in a published conference proceeding


Abstract

Shared displays in our environment contain content we desire, and we often acquire that content for a specific purpose, e.g., a phone number to place a call. We have developed Eye Drop, a content transfer concept that provides techniques for fluid content acquisition and transfer from shared displays, and for local positioning on personal devices, using gaze combined with manual input. The eyes naturally focus on content we desire; our techniques use gaze to point remotely, removing the need for explicit pointing by the user. A manual trigger from a personal device confirms selection, and transfer is performed using gaze or manual input to smoothly transition content to a specific location on the personal device. This work demonstrates how these techniques can be applied to acquire content and apply actions to it through a natural sequence of interaction. We present a proof-of-concept prototype with five implemented application scenarios.
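The interaction sequence the abstract outlines (gaze rests on content on the shared display, a manual trigger on the personal device confirms the selection, and gaze or manual input then places the content locally) can be sketched as a small state machine. The following Python sketch is illustrative only, not the authors' implementation; the EyeDropSession class, the Phase states, and the event handler names are hypothetical.

    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import Optional, Tuple


    class Phase(Enum):
        IDLE = auto()          # no content targeted yet
        TARGETED = auto()      # gaze is resting on content on the shared display
        TRANSFERRING = auto()  # manual trigger pressed, content in transit
        PLACED = auto()        # content positioned on the personal device


    @dataclass
    class EyeDropSession:
        """Hypothetical model of one Eye Drop acquisition-and-transfer sequence."""
        phase: Phase = Phase.IDLE
        content: Optional[str] = None

        def on_gaze(self, target: Optional[str]) -> None:
            # Gaze points remotely: looking at content implicitly targets it,
            # looking away clears the target.
            if self.phase in (Phase.IDLE, Phase.TARGETED):
                self.content = target
                self.phase = Phase.TARGETED if target else Phase.IDLE

        def on_manual_trigger(self) -> None:
            # A manual trigger on the personal device confirms the selection.
            if self.phase is Phase.TARGETED and self.content is not None:
                self.phase = Phase.TRANSFERRING

        def on_drop(self, position: Tuple[int, int]) -> Optional[str]:
            # Gaze or touch positions the content at a specific local location.
            if self.phase is Phase.TRANSFERRING:
                self.phase = Phase.PLACED
                print(f"placed {self.content!r} at {position}")
                return self.content
            return None


    if __name__ == "__main__":
        session = EyeDropSession()
        session.on_gaze("phone number on shared display")  # look at remote content
        session.on_manual_trigger()                         # confirm on the phone
        session.on_drop((120, 480))                         # drop into a local field

In this reading, gaze alone never commits an action; the manual trigger and the final drop are the explicit steps, which matches the division of labour between implicit gaze pointing and explicit manual confirmation described in the abstract.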
Original language: English
Title of host publication: MUM '13 Proceedings of the 12th International Conference on Mobile and Ubiquitous Multimedia
Publisher: Association for Computing Machinery
ISBN (Print): 9781450326483
DOIs
Publication status: Published - 2 Dec 2013

Bibliographical note

12th International Conference on Mobile and Ubiquitous Multimedia; conference date: 02-12-2013 through 05-12-2013
