Multisensory mechanisms for perceptual disambiguation. A classification image study on the stream-bounce illusion

C V Parise1, M Ernst2

1Max Planck Institute and Bielefeld University, Germany
2Cognitive Neurosciences, Bielefeld University, Germany

Contact: cesare.parise@uni-bielefeld.de

Sensory information is inherently ambiguous, and observers must resolve such ambiguity to infer the actual state of the world. Here, we take the stream-bounce illusion as a tool to investigate disambiguation from a cue-integration perspective, and explore how humans gather and combine sensory information to resolve ambiguity. In a classification task, we presented two bars moving in opposite directions along the same trajectory, meeting at the centre. Observers classified such ambiguous displays as streaming or bouncing. Stimuli were embedded in audiovisual noise to estimate the perceptual templates used for the classification. These templates, the classification images, describe the spatiotemporal noise properties selectively associated with either percept. Results demonstrate that audiovisual noise strongly biased perception. Computationally, observers’ performance is well explained by a simple model involving a matching stage, where the sensory signals are cross-correlated with the internal templates, and an integration stage, where the matching estimates are linearly combined. These results reveal analogous integration principles for categorical stimulus properties (stream/bounce decisions) and continuous estimates (object size, position…). Finally, the time-course of the templates reveals that most of the decisional weight is assigned to information gathered before the crossing of the stimuli, thus highlighting the predictive nature of perceptual disambiguation.
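The two-stage model described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: the template shapes, noise level, and integration weights are all invented for the example. Each modality's noisy signal is cross-correlated with an internal template for each percept (matching stage), and the per-modality match scores are linearly combined (integration stage) to pick the winning percept.

```python
import numpy as np

rng = np.random.default_rng(0)

def match(signal, template):
    # Matching stage: cross-correlate the noisy sensory signal with an
    # internal template and take the peak correlation as the evidence.
    return np.max(np.correlate(signal, template, mode="full"))

def classify(visual, audio, templates, w_v=0.5, w_a=0.5):
    # Integration stage: linearly combine visual and auditory matching
    # estimates (weights w_v, w_a are arbitrary here) per percept,
    # then report the percept with the highest combined score.
    scores = {}
    for percept, (t_vis, t_aud) in templates.items():
        scores[percept] = w_v * match(visual, t_vis) + w_a * match(audio, t_aud)
    return max(scores, key=scores.get)

# Hypothetical templates, for illustration only: a smooth visual profile
# for "stream", a rectified one for "bounce", silence vs. a click at the
# crossing for the auditory channel.
t = np.linspace(0, 1, 100)
stream_v = np.sin(2 * np.pi * t)
bounce_v = np.abs(np.sin(2 * np.pi * t))
stream_a = np.zeros_like(t)
bounce_a = np.exp(-((t - 0.5) ** 2) / 0.001)  # brief sound at crossing

templates = {"stream": (stream_v, stream_a),
             "bounce": (bounce_v, bounce_a)}

# A noisy display whose audio contains a click at the crossing:
visual = bounce_v + 0.2 * rng.standard_normal(t.size)
audio = bounce_a + 0.2 * rng.standard_normal(t.size)
print(classify(visual, audio, templates))
```

In this toy version the click at the crossing tips the combined evidence toward "bounce", mirroring the finding that audiovisual noise can bias the stream/bounce decision.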
