Faculty Research & Creative Activity

Document Type

Article

Publication Date

May 2012

Abstract

In this paper, we describe an experiment designed to evaluate the effectiveness of three interfaces for surveillance or remote control using live 360-degree video feeds from a person or vehicle in the field. Video feeds are simulated using a game engine. While locating targets within a 3D terrain using a 2D 360-degree interface, participants indicated perceived egocentric directions to targets and later placed targets on an overhead view of the terrain. Interfaces were compared based on target-finding and map-placement performance. Results suggest that 1) non-seamless interfaces with visual boundaries facilitate spatial understanding, 2) correct perception of self-to-object relationships is not correlated with understanding of object-to-object relationships within the environment, and 3) increased video game experience corresponds with better spatial understanding of an environment observed in 360 degrees. This work can assist researchers of panoramic video systems in evaluating the optimal interface for observation and teleoperation of remote systems.
