Motion extrapolation of auditory–visual targets

Sophie Wuerger, Georg Meyer, Markus Hofbauer, Christoph Zetzsche, Kerstin Schill
Information Fusion, Volume 11, Issue 1, January 2010


Many tasks involve the precise estimation of the speed and position of moving objects, for instance to catch or avoid objects that we encounter in our environment. Many of these objects are characterised by signal representations in more than one modality, such as hearing and vision. The aim of this study was to investigate the extent to which the simultaneous presentation of auditory and visual signals enhances the estimation of motion speed and instantaneous position. Observers are asked to estimate the instant when a moving object arrives at a target spatial position by pressing a response button. This task requires observers to estimate the speed of the moving object and to calibrate the timing of their manual response such that it coincides with the true arrival time of the moving object. When both visual and auditory motion signals are available, the variability in estimating the arrival time of the moving object is significantly reduced compared to the variability in the unimodal conditions. This reduction in variability is consistent with optimal integration of the auditory and visual speed signals. The average bias in the estimated arrival times depends on the motion speed: for medium speeds (17 deg/s) observers’ subjective arrival times are earlier than the true arrival times; for high speeds (47 deg/s) observers exhibit a (much smaller) bias in the other direction. This speed dependency suggests that the bias is due to an error in estimating the motion speeds rather than an error in calibrating the timing of the motor response. Finally, in this temporal localisation task, the bias and variability show similar patterns for motion defined by vision, audition, or both.
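The "optimal integration" claim refers to the standard maximum-likelihood cue-combination model, under which the bimodal estimate is an inverse-variance-weighted average of the unimodal estimates and its variance is always lower than either unimodal variance. A minimal sketch of that prediction (the function names and example variances are illustrative, not taken from the paper):

```python
def combined_variance(var_a, var_v):
    """Predicted variance of the bimodal estimate under optimal
    (inverse-variance-weighted) integration of two independent cues."""
    return (var_a * var_v) / (var_a + var_v)

def optimal_weights(var_a, var_v):
    """Weights on the auditory and visual estimates; the more
    reliable (lower-variance) cue receives the higher weight."""
    w_a = var_v / (var_a + var_v)
    return w_a, 1.0 - w_a

# Illustrative values: a noisy auditory cue and a reliable visual cue.
var_a, var_v = 4.0, 1.0
print(combined_variance(var_a, var_v))   # bimodal variance, below both unimodal variances
print(optimal_weights(var_a, var_v))     # vision dominates when it is the more reliable cue
```

The key signature of this model, which the study tests against the observed arrival-time variability, is that the bimodal variance falls below the smaller of the two unimodal variances.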
