Updating visual direction in real and virtual scenes.

J Vuong, L C Pickup, A Glennerster

School of Psychology and CLS, University of Reading, United Kingdom
Contact: j.vuong@pgr.reading.ac.uk

As humans move from one location to another, the visual direction of objects around them changes continuously. We investigated the information people require to update visual direction accurately. Participants viewed a real or virtual scene containing a prominent target, then walked to a second location in the room (or, in one virtual-reality condition, were teleported there). From there, they pointed back to the target. One virtual scene closely mimicked the real scene, while in a second, sparse condition the target and the corners of the room were replaced by very long, thin poles in an otherwise black scene. In this case there was no ground plane, so information about the distance of the poles could be derived only from the changing angle between them as the observer moved. We found that the richness of the scene made a negligible difference to pointing precision. On the other hand, visual information presented during walking improved pointing precision even when the target was not visible during this phase. These data will help constrain models of how humans point at invisible targets from unvisited locations, a task that currently presents a challenge to view-based models of spatial representation.
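As an illustration of the geometry available in the sparse condition: with no ground plane, the distance of a thin vertical pole can in principle be recovered by triangulation from the change in its visual direction as the observer translates. The sketch below is illustrative only (the function name and example values are assumptions, not taken from the study); bearings are measured in radians from the walking direction.

```python
import math

def pole_distance(baseline, theta1, theta2):
    """Triangulate the distance (from the first viewpoint) to a vertical
    pole, given bearings theta1 and theta2 at two viewpoints separated
    by `baseline` metres along the walking direction. Only the angular
    change (motion parallax) carries distance information, as in the
    sparse-scene condition. Standard sine-rule triangulation."""
    parallax = theta2 - theta1            # change in visual direction
    return baseline * math.sin(theta2) / math.sin(parallax)

# Example: a pole at (0, 10) m; the observer walks 2 m along the x-axis.
theta1 = math.atan2(10.0, 0.0)            # bearing from (0, 0)
theta2 = math.atan2(10.0, -2.0)           # bearing from (2, 0)
print(round(pole_distance(2.0, theta1, theta2), 3))  # → 10.0
```

Note that as the baseline shrinks, the parallax angle shrinks with it, so distance estimates from a short walk are increasingly noise-sensitive, which is one reason pointing precision in such sparse scenes is of interest.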
