Modelling time-order effects in haptic perception with a Bayesian dynamical framework
New dynamical framework reveals how prior expectations warp tactile judgments
A team of researchers from Argentina and Spain, led by Gastón Avetta, has developed a dynamical Bayesian framework to model time-order effects in haptic perception. Published on arXiv (2604.19662), the study addresses how perceptual judgments of sequential stimuli are systematically biased by prior expectations and temporal structure. In haptic discrimination tasks, these biases manifest as time-order asymmetries, in which the perceived difference between two stimuli depends on the order in which they are presented. The model formalizes perception as an inference process in which prior expectations are updated by incoming stimuli and propagate in time between observations.
Tested on psychophysical data from vibrotactile discrimination experiments with varying intensities, the model quantitatively reproduces both the direction and the magnitude of time-order effects across subjects, as well as inter-individual variability, using only a small number of parameters. The inferred parameters provide a compact description of perceptual biases in terms of prior expectations and noise characteristics. Beyond fitting data, the model induces a transformation of stimulus space, yielding a subject-dependent geometry of perceived stimuli in which perceptual judgments exhibit approximate symmetries absent in physical coordinates. These results suggest that temporal biases in perception stem from dynamical inference and impose non-trivial geometric constraints on perceptual representations.
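The inference scheme described above can be sketched as a minimal Gaussian observer model. This is an illustrative simplification, not the authors' implementation: the function names, the re-centring of the prior on the previous percept, and all parameter values are assumptions chosen to show how a propagating prior produces order-dependent judgments.

```python
import numpy as np

def bayes_estimate(measurement, mu_prior, var_prior, var_noise):
    """Posterior mean when a Gaussian prior is combined with a noisy measurement."""
    w = var_prior / (var_prior + var_noise)  # reliability weight on the measurement
    return mu_prior + w * (measurement - mu_prior)

def fraction_correct(s1, s2, mu0=50.0, var_prior=25.0, var_noise=100.0,
                     n_trials=20000, seed=0):
    """Fraction of two-interval trials where the stronger stimulus is correctly named.

    The first percept is shrunk toward the static expectation mu0; the prior
    for the second interval is then re-centred on the first percept, so the
    bias propagates in time between observations (a simplifying assumption).
    """
    rng = np.random.default_rng(seed)
    m1 = s1 + rng.normal(0.0, np.sqrt(var_noise), n_trials)  # noisy measurement of s1
    e1 = bayes_estimate(m1, mu0, var_prior, var_noise)       # percept of s1
    m2 = s2 + rng.normal(0.0, np.sqrt(var_noise), n_trials)  # noisy measurement of s2
    e2 = bayes_estimate(m2, e1, var_prior, var_noise)        # percept of s2, prior = e1
    return np.mean((e2 > e1) == (s2 > s1))

# The same physical pair, presented in the two orders, yields different
# performance: a time-order asymmetry induced purely by the inference dynamics.
p_low_high = fraction_correct(60.0, 80.0)  # weaker stimulus first
p_high_low = fraction_correct(80.0, 60.0)  # stronger stimulus first
```

Because both hypothetical stimuli lie above the prior mean, the first percept is pulled toward the expectation, which widens the perceived gap in one presentation order and narrows it in the other, a contraction-style account of time-order errors.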
- Dynamical Bayesian model uses few parameters to reproduce time-order asymmetries in haptic discrimination tasks
- Tested on vibrotactile data, it captures both direction/magnitude of biases and inter-individual variability across subjects
- Model transforms stimulus space into subject-dependent geometry with symmetries not present in physical coordinates
Why It Matters
This framework could inform the design of haptic interfaces and virtual-reality systems by accounting for perceptual biases in sequences of touch stimuli.