Abstract
In this paper, we describe an experiment designed to evaluate the effectiveness of three interfaces for surveillance or remote control using live 360-degree video feeds from a person or vehicle in the field. Video feeds are simulated using a game engine. While locating targets within a 3D terrain using a 2D 360-degree interface, participants indicated perceived egocentric directions to targets and later placed targets on an overhead view of the terrain. Interfaces were compared based on target-finding and map-placement performance. Results suggest that 1) non-seamless interfaces with visual boundaries facilitate spatial understanding, 2) correct perception of self-to-object relationships is not correlated with understanding of object-to-object relationships within the environment, and 3) increased video game experience corresponds with better spatial understanding of an environment observed in 360 degrees. This work can assist researchers of panoramic video systems in evaluating the optimal interface for observation and teleoperation of remote systems.
| Original language | American English |
| --- | --- |
| State | Published - May 2012 |
| Event | Proceedings of the 2012 Annual Conference on Human Factors in Computing Systems - Duration: May 1 2012 → … |
Conference
| Conference | Proceedings of the 2012 Annual Conference on Human Factors in Computing Systems |
| --- | --- |
| Period | 5/1/12 → … |
Keywords
- 360-degree view
- panorama
- virtual navigation
- spatial cognition
Disciplines
- Technology and Innovation