Anna Metzger
Justus-Liebig University Giessen, Germany / Bournemouth University, UK
Perceptual systems are limited by the constraints of sensory organs and by neural resources, implying that information has to be sampled and represented efficiently. We investigated haptic perception, from sampling with active movements of our hands to the encoding of material properties into perceptual representations. Haptic exploration is traditionally characterized as consisting of stereotypical, task-dependent movements. In a first study, we demonstrated that touch is also driven by stimulus properties. We used a Deep Neural Network (DNN) to predict where participants would touch a stimulus based on the local properties of its surface. From the model’s preferences we inferred that participants preferentially touch edge-like structures as well as vertical and horizontal patterns, which may be most informative about an object’s shape. In a second study, we showed that perceptual haptic representations of materials emerge from efficient encoding of the vibratory patterns elicited by natural interactions with materials, which constitute the very sensory input to the haptic system. We trained an unsupervised DNN (Autoencoder) to reconstruct such vibratory patterns. The learned compressed representation allows for the classification of material categories. More importantly, distances between these categories in the latent space resemble perceptual distances, suggesting similar representations. This similarity increases with the level of compression, suggesting that efficient encoding of the sensory signals plays a crucial role in learning perceptual representations. Finally, we showed remarkable similarities between the temporal tuning of the representation learned by the Autoencoder and the tuning of human tactile receptors, suggesting that our sensors have evolved to efficiently encode the sensory input elicited by natural textures.
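
The latent-space comparison in the second study can be illustrated with a minimal sketch, not the authors’ code: a small fully connected autoencoder compresses windows of a vibratory signal through a bottleneck, category centroids are computed in the latent space, and their pairwise distances are correlated with perceptual dissimilarities in the spirit of representational similarity analysis. The window length, bottleneck size, synthetic signals, and random “perceptual” distances below are all illustrative assumptions; the study used vibrations recorded during natural interactions with materials and human dissimilarity judgments.

```python
# Illustrative sketch only (not the authors' code): an autoencoder with a
# latent bottleneck is trained to reconstruct vibratory windows, and
# latent-space distances between material categories are compared with
# (here, placeholder) perceptual distances. All data are synthetic.

import numpy as np
import torch
import torch.nn as nn
from scipy.stats import spearmanr
from scipy.spatial.distance import pdist

torch.manual_seed(0)
rng = np.random.default_rng(0)

WINDOW = 256      # samples per vibratory window (assumed)
LATENT = 8        # bottleneck size; smaller = stronger compression
N_CATEGORIES = 5  # e.g. paper, fabric, wood, stone, metal (illustrative)

def make_windows(cat, n=200):
    """Synthetic stand-in for recorded vibrations: each category is a
    noisy oscillation with its own dominant frequency."""
    t = np.arange(WINDOW) / WINDOW
    freq = 5 + 7 * cat
    x = np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal((n, WINDOW))
    return x.astype(np.float32)

X = np.concatenate([make_windows(c) for c in range(N_CATEGORIES)])
y = np.repeat(np.arange(N_CATEGORIES), 200)

class Autoencoder(nn.Module):
    def __init__(self, d_in=WINDOW, d_lat=LATENT):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(d_in, 64), nn.ReLU(),
                                 nn.Linear(64, d_lat))
        self.dec = nn.Sequential(nn.Linear(d_lat, 64), nn.ReLU(),
                                 nn.Linear(64, d_in))

    def forward(self, x):
        z = self.enc(x)
        return self.dec(z), z

model = Autoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
data = torch.from_numpy(X)

# Train to reconstruct the vibratory windows through the bottleneck.
for epoch in range(200):
    opt.zero_grad()
    recon, _ = model(data)
    loss = nn.functional.mse_loss(recon, data)
    loss.backward()
    opt.step()

# Category centroids in latent space -> pairwise "model" distances.
with torch.no_grad():
    _, Z = model(data)
Z = Z.numpy()
centroids = np.stack([Z[y == c].mean(axis=0) for c in range(N_CATEGORIES)])
model_dists = pdist(centroids)

# Placeholder perceptual dissimilarities; in the study these would come
# from human judgments of the same material categories.
perceptual_dists = rng.random(len(model_dists))
rho, p = spearmanr(model_dists, perceptual_dists)
print(f"latent-perceptual correlation: rho={rho:.2f} (p={p:.3f})")
```

Rerunning such a sketch with different values of LATENT would correspond to the compression manipulation described above, where the abstract reports that the correspondence between latent and perceptual distances strengthens as compression increases.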