Our understanding of human visual attention has greatly benefited from the wealth of visual search studies conducted over the past few decades. When observers are tasked with searching for a specific target embedded among a set of distractors, the time they take to find the target can reveal much about the sensory and cognitive processes involved. These experiments have typically been conducted on 2D displays under tightly controlled viewing conditions. Recently, however, there have been calls within the visual attention community to explore more ecologically valid means of data collection. Virtual reality (VR) is a promising methodological tool for such research: it offers improved visual realism and the possibility of participant interaction, while retaining much of the control afforded by a computerized, monitor-presented experiment. Here we present the Visual Search in Virtual Reality (VSVR) Toolbox. VSVR is a set of functions, scripts, and assets that can be combined within a visual scripting environment in the Unity game engine to design, replicate, and extend visual search experiments in VR. We demonstrate the utility of the toolbox with three experiments: a replication of feature search behavior, a demonstration of wide field-of-view visual search and eccentricity effects, and a replication of depth plane as a feature for search.

Original language: English
Article number: 19
Number of pages: 1
Journal: Journal of Vision
Issue number: 3
Publication status: Published - 28 Feb 2022

ASJC Scopus subject areas

  • Ophthalmology
  • Sensory Systems


Title: Contributed Session II: Visual Search in Virtual Reality (VSVR): A visual search toolbox for virtual reality
