Program

Keynote Speakers

Multidisciplinary aspects will be covered, ranging from design, planning, and control to psychophysics, usability, and accessibility evaluation of technologies. The range of speakers reflects this multidisciplinarity.

Confirmed Speakers

Prof. Knut Drewing

Experimental Psychology "HapLab", Justus-Liebig-University Giessen

"Tactile, kinaesthetic, and motor information on exploratory movements"

Exploratory finger and hand movements are an integral part of haptic perception. They serve to gather sensory information, and information about the exploratory movements themselves therefore often becomes necessary for interpreting the stimulus. Representations of one's own movements can be derived from tactile signals, kinaesthetic signals, or efference copies of the motor commands. I will present experiments on movement and softness perception demonstrating that haptic perception and motor guidance are based on an interplay of all three sources of movement information.

Prof. Vincent Hayward

Sorbonne Université, Paris


"I feel it because my hand is there" or "my hand is there because I feel it"?

It is commonplace to think of haptic sensations as the result of the combination of proprioception and touch. The most common view is that proprioception places things in space and touch tells us what they are. This view is reflected in the many efforts that were, and still are, made to put tactile displays on force-feedback devices or on gloves. This seminar will describe two, or time permitting more, counter-examples that support the contrarian view that proprioception is subservient to touch.

Prof. Roberta Klatzky

Carnegie Mellon University


"Recursive Bayesian updating of stiffness under visual feedback delay"


The talk will be given by Prof. Klatzky; Roberta Klatzky and Bing Wu contributed equally to this work.

Stiffness perception relies on the combination of information about position and force. We have modeled this as an adaptive process that sequentially and recursively updates a stiffness estimate, using force and displacement information sampled over the spatial trajectory of exploration (Wu & Klatzky, 2018). The updating process is constrained to weight prior estimates according to their reliability and to discount sudden deviations. We will expand the model to integrate multiple exploratory cycles and visual estimates of displacement, and we will describe how updating responds when vision becomes unreliable because of feedback delay.
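The recursive, reliability-weighted updating described above can be sketched as a precision-weighted (Kalman-style) scalar update. The function name, numbers, and noise values below are illustrative assumptions, not taken from Wu & Klatzky's actual implementation:

```python
def update_stiffness(prior_mean, prior_var, sample, sample_var):
    """One recursive Bayesian step: fuse the running stiffness estimate
    with a new force/displacement sample, weighting each source by its
    reliability (inverse variance)."""
    gain = prior_var / (prior_var + sample_var)   # reliable priors -> low gain
    mean = prior_mean + gain * (sample - prior_mean)
    var = prior_var * sample_var / (prior_var + sample_var)
    return mean, var

# illustrative exploration: each stiffness sample is force / displacement
estimate, uncertainty = 1.0, 1.0                  # vague initial prior
for force, disp in [(2.1, 1.0), (4.2, 2.0), (6.0, 3.1)]:
    estimate, uncertainty = update_stiffness(estimate, uncertainty,
                                             force / disp, sample_var=0.25)
```

With each sample the uncertainty of the running estimate shrinks, so later deviant samples move the estimate less, which is one simple way the model's discounting of sudden deviations can arise.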

Prof. Lynette Jones


Department of Mechanical Engineering, Massachusetts Institute of Technology (MIT)


"Incongruities in the Control and Perception of Finger Forces"

The ability to control the forces generated by the hand is essential for most skilled activities, from grasping a small object between the index finger and thumb to turning a key in a lock. In these activities, finger forces are exquisitely adapted to the contact conditions between the hand and the object and vary as the weight, surface texture, or shape of the object changes. The neural signals that arise from cutaneous mechanoreceptors therefore have a dual role, in that they serve as inputs for the motor control system to enable effective manipulation and assist in determining the properties of objects held in the hand. Studies of the finger forces used during tactile and haptic exploration of objects indicate that they are typically small, less than 1 N, and are optimized for the property being perceived. When finger forces have to be explicitly controlled using only haptic feedback, performance is surprisingly poor, as reflected in a coefficient of variation (SD/M) of 13%. Interestingly, this index of control does not vary significantly among muscle groups, indicating that forces are controlled with the same relative degree of precision across the body. In this talk the fundamental differences between the proprioceptive and tactile sensory systems will be highlighted.
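For reference, the coefficient of variation quoted above is simply the sample standard deviation divided by the mean. A minimal computation with invented force data:

```python
import statistics

# hypothetical repeated attempts to produce a 1 N target force (values invented)
forces = [0.9, 1.0, 1.1]
cv = statistics.stdev(forces) / statistics.mean(forces)
print(f"coefficient of variation: {cv:.0%}")
```

A CV of 13%, as reported in the abstract, would mean the standard deviation of the produced forces is 13% of their mean, whatever the absolute force level.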

Dr. Lucile Dupin

Institut de Psychiatrie et Neurosciences de Paris


"Sensorimotor calibration of space through self-touch"

Self-touch is a quite specific sensorimotor spatial experience. It involves a unique contingency between the information concerning the moving body part and the tactile feedback stimulating the skin receptors of the touched body part. Ecologically, when sliding one finger along the forearm, for instance, the distance of the movement planned by the brain and the extent of the tactile feedback are similar, and the information about the distance is consequently redundant. But what happens when movement and tactile extent are no longer correlated: which information does the brain use to estimate movement or tactile extent?

Prof. Viviana Betti

Department of Psychology, Sapienza University of Rome


"States and transitions of natural hand movements"

Hand movement is dynamic by nature. In everyday activities, humans use a finite number of discrete postural configurations that are highly adapted to the physical properties of external objects, but how do certain hand states flow into others to create sophisticated manual behavior? Much like a writing system, manual behavior can be studied as a succession of simple elements, or motor primitives, whose combination and alternation over time form flexible and dexterous movement. We hypothesized that manual behavior emerges through a precise temporal structure and that not all transitions from one state to another are equally frequent. To address this question, we collected kinematic data from 36 participants as they performed reach-to-grasp movements aimed at preparing breakfast in a naturalistic setting. To identify a basis of the most recurrent and consistent movements over time, we developed a temporal-PCA-based algorithm. This allowed us to recover hand postures associated with reach-to-grasp movements and to describe manual behavior in terms of basic states and the transitions among them. Our results show that, even in a cohort of subjects performing movements in an unconstrained manner, a limited number of states recurrently flow into others with the highest frequency.
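The core idea of reducing high-dimensional hand kinematics to a few recurrent components can be illustrated with a generic PCA via singular value decomposition. The data below are synthetic, and this sketch is not the authors' temporal-PCA algorithm, only the standard decomposition it builds on:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic recording: 200 time samples of 15 joint angles,
# dominated by two underlying "postural" components plus noise
latent = rng.standard_normal((200, 2))
mixing = rng.standard_normal((2, 15))
X = latent @ mixing + 0.1 * rng.standard_normal((200, 15))

Xc = X - X.mean(axis=0)                 # centre each joint angle
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)         # variance explained per component
k = int(np.searchsorted(np.cumsum(explained), 0.90)) + 1
primitives = Vt[:k]                     # candidate postural components
scores = Xc @ primitives.T              # their time courses
```

Each row of `primitives` is a joint-angle pattern, and `scores` gives its evolution over time; a state/transition analysis like the one described above would then operate on these low-dimensional time courses.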


Prof. Sliman Bensmaia

Department of Organismal Biology & Anatomy, University of Chicago


"Neural computations that give rise to invariant sensory representations"

A major function of sensory processing is to achieve neural representations of objects that are stable across changes in context and perspective. Small changes in exploratory behavior can lead to large changes in signals at the sensory periphery, thus resulting in ambiguous neural representations of objects. Overcoming this ambiguity is a hallmark of human object recognition across sensory modalities. In this talk, I will discuss our efforts to understand how the perception of tactile texture remains stable across exploratory movements of the hand, including changes in scanning speed, despite the concomitant changes in afferent responses. First, I will show that texture perception is invariant to changes in speed whereas afferent responses are highly dependent thereon. Second, I will show that neuronal responses in somatosensory cortex are less dependent on speed than are their counterparts in the peripheral nerves. Finally, I will describe a computational model that reveals the neural computations that give rise to increasingly speed-independent representations of texture.

Prof. Alessandro Moscatelli

University of Tor Vergata, Rome


"Integration of Slip Motion and Kinaesthesia for the Control of Reaching Movements"

Hand reaching is a complex task that requires the integration of multiple sources of sensory information from muscles, joints, and the skin, together with an internal model of the motor command. Whenever we touch the surface of an object and slide our fingertip over it, texture orientation and slip motion provide information about the path length and the direction of the hand trajectory. In an ongoing study, we use an innovative haptic device to physically decouple slip motion and hand movement during a reaching task. Participants slid their fingertip on the lubricated surface of the device towards a visual target. The position of the contact plate was continuously updated with the hand position multiplied by a gain parameter. By changing the value of the gain across trials, it was possible to produce different combinations of slip motion and hand motion. We evaluated the systematic changes in the hand velocity profile and in the motion path depending on the slip motion stimuli. Our results support our previous findings that the perceived movement of the hand is a weighted average of cutaneous and kinaesthetic cues.
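The "weighted average" in the final sentence is commonly formalized as reliability (inverse-variance) weighting of the two cues. The sketch below uses invented numbers and an assumed reading of the gain manipulation (if the plate tracks the hand with gain g, the slip under the fingertip scales with 1 - g); it is an illustration, not the study's model:

```python
def slip_extent(hand_extent, gain):
    # plate position = gain * hand position, so the relative finger-plate
    # motion (slip) is (1 - gain) * hand motion; gain = 1 means no slip
    return (1.0 - gain) * hand_extent

def fused_movement(kin_est, kin_var, slip_est, slip_var):
    # reliability-weighted average: each cue weighted by its inverse variance
    w_kin = (1.0 / kin_var) / (1.0 / kin_var + 1.0 / slip_var)
    return w_kin * kin_est + (1.0 - w_kin) * slip_est

# invented example: kinaesthesia says 10 cm, the slip cue says 12 cm;
# the more reliable (lower-variance) kinaesthetic cue dominates
perceived = fused_movement(10.0, 1.0, 12.0, 4.0)
```

Under this scheme the fused estimate always lies between the two cues, pulled toward whichever is more reliable, which is the signature the abstract describes for perceived hand movement.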