VideoHandles: Replicating gestures to search through action-camera video

Jarrod Knibbe, Sue Ann Seah, Mike Fraser

Research output: Chapter or section in a book/report/conference proceeding › Chapter in a published conference proceeding


Abstract

We present VideoHandles, a novel interaction technique to support rapid review of wearable video camera data by re-performing gestures as a search query. The availability of wearable video capture devices has led to a significant increase in activity logging across a range of domains. However, searching through and reviewing footage for data curation can be a laborious and painstaking process. In this paper we showcase the use of gestures as search queries to support review and navigation of video data. By exploring example self-captured footage across a range of activities, we propose two video data navigation styles using gestures: prospective gesture tagging and retrospective gesture searching. We describe the interaction design and motivation of VideoHandles, and report the results of a pilot study.
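The abstract leaves the matching machinery unspecified. Purely as an illustrative sketch, not the authors' implementation, the Python below assumes a per-frame 2-D hand-position trace has already been extracted from the footage, and ranks sliding windows of that trace by dynamic-time-warping (DTW) distance to the re-performed query gesture. The function names, window length, and stride are hypothetical.

```python
# Hypothetical sketch of retrospective gesture searching (not the paper's
# implementation): slide a window over the video's hand-position trace and
# rank windows by DTW distance to the re-performed query gesture.
from math import hypot, inf


def dtw_distance(query, window):
    """DTW distance between two sequences of (x, y) points."""
    n, m = len(query), len(window)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = hypot(query[i - 1][0] - window[j - 1][0],
                      query[i - 1][1] - window[j - 1][1])
            # Extend the cheapest of the three alignment moves.
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]


def search_gesture(query, trace, window_len=None, stride=5, top_k=5):
    """Return (distance, start_frame) for the top-k best-matching windows.

    `trace` holds one (x, y) hand position per video frame; the window
    length defaults to the query length (an assumption for illustration).
    """
    if window_len is None:
        window_len = len(query)
    scored = []
    for start in range(0, len(trace) - window_len + 1, stride):
        scored.append((dtw_distance(query, trace[start:start + window_len]),
                       start))
    scored.sort()
    return scored[:top_k]
```

The top-ranked start frames would then surface as jump points in a review interface; hand tracking itself, and whichever distance measure VideoHandles actually uses, are outside the scope of this sketch.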

Original language: English
Title of host publication: SUI 2014 - Proceedings of the 2nd ACM Symposium on Spatial User Interaction
Publisher: Association for Computing Machinery
Pages: 50-53
Number of pages: 4
ISBN (Electronic): 9781450328203
Publication status: Published - 4 Oct 2014
Event: 2nd ACM Symposium on Spatial User Interaction, SUI 2014 - Honolulu, United States
Duration: 4 Oct 2014 - 5 Oct 2014

Publication series

Name: SUI 2014 - Proceedings of the 2nd ACM Symposium on Spatial User Interaction

Conference

Conference: 2nd ACM Symposium on Spatial User Interaction, SUI 2014
Country/Territory: United States
City: Honolulu
Period: 4/10/14 - 5/10/14

ASJC Scopus subject areas

  • Human-Computer Interaction

