Differential modulation of visually evoked postural responses by real and virtual foreground objects.

G Meyer1, F Shao1, M White2, T Robotham3

1Experimental Psychology, University of Liverpool, United Kingdom
2Flight Science and Technology, University of Liverpool, United Kingdom
3School of Engineering, Auckland University of Technology, New Zealand

Contact: georg@liv.ac.uk

Externally generated visual signals that are consistent with self-motion cause corresponding visually evoked postural responses (VEPR). These VEPR are not simple responses to optokinetic stimulation but are modulated by the configuration of the environment. We measured VEPR in a virtual environment in which the visual background moved in either the lateral or the anterior-posterior direction. We show that: 1) VEPR for lateral visual motion are modulated by the presence of foreground objects, which can be haptic, visual, or auditory; 2) real objects and their virtual-reality equivalents have different effects on VEPR; 3) VEPR for anterior-posterior motion are not modulated by the presence or reality of foreground reference signals. We conclude that automatic postural responses to laterally moving visual stimuli are strongly influenced by the configuration and interpretation of the environment and draw on multisensory representations. Different postural responses were observed for real and virtual visual reference objects. On the basis that VEPR in high-fidelity virtual environments should mimic those seen in real situations, we propose to use the observed effect as a robust objective test of presence and fidelity in VR.