Bora Celebi, Julian Kaduk, Müge Cavdan, Heiko Hamann & Knut Drewing
The human role in human–swarm interaction (HSI) shifts from controller to supervisor as robots become more autonomous and require efficient search strategies in complex visual environments. Previous research has shown that brief, spatially uninformative cues enhance search performance in laboratory environments (the “pip-and-pop” effect). Here we examined whether these effects can be effectively applied in HSI. To this end, we conducted two experiments using small mobile robots (Thymio II) to investigate the impact of auditory, tactile, and audiotactile cues on visual search performance and timing judgments. In the first experiment, 20 participants identified a stopped robot among moving robots. The results showed that all cue conditions significantly reduced reaction times (RTs) compared to the no-cue condition, suggesting that brief, spatially non-informative signals improve search performance by increasing the speed of sensory information accumulation. In the second experiment, 12 participants judged the duration of a robot’s stop with or without a preceding tactile cue. The findings indicate that tactile cues improve temporal sensitivity without affecting subjective duration judgments. These results highlight the potential of uni- and multisensory cues to enhance HSI performance by facilitating quicker and more accurate human responses, particularly in dynamic environments. The study extends the “pip-and-pop” effect to real-world scenarios, offering insights for designing HSI systems that allow users to interact with robotic swarms more naturally and efficiently.
Celebi, B., Kaduk, J., Cavdan, M., Hamann, H., & Drewing, K. (2025). Brief non-spatial signals facilitate visual search and temporal sensitivity in robot supervision. International Journal of Human-Computer Studies, 103643.
Didem Katircilar, Roland Bennewitz & Knut Drewing
Individuals with more elastic, more hydrated, or smaller fingers usually show better performance in several passive touch tasks. In active touch, people use different exploratory procedures when evaluating object properties and tune their exploratory parameters. For example, they indent stimuli to assess softness and optimize their peak forces to get relevant information. In this study, we aim to understand whether finger pad size, elasticity, and hydration affect individuals' force-tuning and discrimination performance in active softness perception. Participants performed two softness tasks in two different sessions. In one session, hyaluronic acid was applied to their finger pads to soften them; in the other, they received no treatment. We assessed individual elasticity and hydration values with a cutometer and a corneometer in each session, and measured finger pad size in three dimensions with a caliper. In each task, two pairs of stimuli were presented to the participants (Young's modulus: 41.5 vs. 45.0 kPa; 28.7 vs. 31.3 kPa), who chose the softer stimulus. In the restricted task, they could apply forces only up to 2 N, whereas there was no force limit in the unconstrained task. We found that participants with smaller finger pads exerted less force in the restricted task and participants with more hydrated and elastic fingers exerted less force in the unconstrained task. The force-tuning disappeared in the unconstrained task when the treatment was applied. These results indicate that people employ strategies according to their finger parameters and to the availability of cues, whereas adaptation to the treatment likely needs longer practice.
Katircilar, D., Bennewitz, R., & Drewing, K. (2025). A Role for Finger Properties in Exploration and Perception of Softness. IEEE Transactions on Haptics. https://doi.org/10.1109/TOH.2025.3582077
Maja Fehlberg, Dominik S. Schmidt, Sairam Saikumar, Müge Cavdan, Knut Drewing & Roland Bennewitz
Friction was studied for the human finger pad during the spreading of viscous liquid samples in circular motion on a solid substrate. The samples included both Newtonian and shear-thinning liquids with viscosities ranging between 0.83 mPa·s and 150 Pa·s. During active touch, participants applied varying normal forces and sliding speeds depending on the sample and individual behavior. Friction coefficients varied greatly between participants but fell on a single Stribeck curve when shear-thinning effects were accounted for in full-film lubrication. A comparison with the measured height variations during spreading demonstrates that the logarithm of the Hersey number is an instantaneous indicator of the film thickness in the full-film lubrication regime. Comparison of the measured friction coefficients with reported values of the perceived slipperiness for the same samples shows a close correspondence along the Stribeck curve.
Fehlberg, M., Schmidt, D. S., Saikumar, S., Ciamin, F., Aghababaei, R., & Reihsner, R. (2025). A touch of Stribeck – Finger-pad friction in viscous liquid spreading. Tribology Letters, 73, 91. https://doi.org/10.1007/s11249-025-02024-w
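For orientation, the Hersey number mentioned above combines viscosity, speed, and load into the single lubrication parameter along which the Stribeck curve is plotted. A minimal sketch in one common convention (conventions differ in whether normal load or contact pressure enters; the notation is generic, not the paper's):

```latex
% Hersey number and Stribeck curve, one common convention:
%   \eta : dynamic viscosity [Pa·s]
%   v    : sliding speed [m/s]
%   N    : normal load (or contact pressure in other conventions)
\mathrm{He} = \frac{\eta\, v}{N},
\qquad
\mu = f\!\left(\log \mathrm{He}\right) \quad \text{(Stribeck curve)}
```

In the full-film regime, a thicker liquid film accompanies a larger He, which is why log He can serve as an instantaneous indicator of film thickness.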
Nedim Goktepe, Müge Cavdan & Knut Drewing
Previous studies have successfully elicited a wide range of emotional responses by stimulating the hand region. The purpose of the current study was to test whether tactile stimuli applied to the torso could elicit similar emotional responses. To this end, we created 45 custom vibrotactile patterns that were presented through a vibrotactile vest to the front, back, and both sides of the torso. The patterns covered a wide range of physical variables such as amplitude, trajectory, and continuity. In an exploratory experiment, participants rated the arousal and valence of these patterns. Emotional responses differed between the patterns, and detailed analyses suggested that vibration amplitude and the body location where the vibrations were applied influenced both valence and arousal judgments. In a follow-up experiment, we systematically varied the amplitude and location of the vibrations. Our results showed that lower amplitudes were less arousing and more pleasant than higher amplitudes. Similarly, vibrations to the back of the torso were less arousing and more pleasant than those applied to the front or both sides of the torso, which can be explained by the lower sensitivity of the back. Taken together, we suggest that perceived intensity partially explains the emotionality of vibration patterns on the torso.
Goktepe, N., Cavdan, M., & Drewing, K. (2025). Touched by vibrations: Intensity modulates valence and arousal on the torso. IEEE Transactions on Haptics. https://doi.org/10.1109/TOH.2025.3576894
Nedim Goktepe, Müge Cavdan & Knut Drewing
Perceived time can shrink or expand for emotional stimuli. Converging evidence suggests that emotional time distortions are rooted in the emotional states of the timing agents, because emotional stimuli can influence the timing of simultaneous neutral events. As emotional states are transitory, we investigated whether time-modulating emotional states also influence the timing of subsequent neutral events. In each trial, we induced different valence and arousal levels by using affective vibrotactile patterns before participants judged the duration of neutral auditory tones. Compared to neutral patterns, affective patterns modulated participants’ time perception of the subsequent tones. We observed an interaction between arousal and valence: pleasant low-arousal patterns expanded the timing of subsequent neutral events more than unpleasant low-arousal patterns, while pleasant and unpleasant high-arousal patterns led to a similar temporal expansion. Our results indicate that the time-modulating effects of emotional stimuli are due to changed emotional states and likely influence time perception until the underlying state decays.
Goktepe, N., Cavdan, M., & Drewing, K. (2025). Emotional time lengthening carries over to subsequent neutral events. Acta Psychologica, 257, 105043.
Junyi Chen, Alap Kshirsagar, Frederik Heller, Mario Gómez Andreu, Boris Belousov, Tim Schneider, Lisa P. Y. Lin, Katja Doerschner, Knut Drewing & Jan Peters
Hardness is a key tactile property perceived by humans and robots. In this work, we investigate information-theoretic active sampling for efficient hardness classification using vision-based tactile sensors. We assess three probabilistic classifiers and two uncertainty-based sampling strategies on a robotic setup and a human-collected dataset. Results show that uncertainty-driven sampling outperforms random sampling in accuracy and stability. While human participants achieve 48.00% accuracy, our best method reaches 88.78% on the same objects, highlighting the effectiveness of vision-based tactile sensors for hardness classification.
Chen, J., Kshirsagar, A., Heller, F., Andreu, M. G., Belousov, B., Schneider, T., ... & Peters, J. Active Sampling for Hardness Classification with Vision-Based Tactile Sensors.
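To make the uncertainty-driven sampling idea concrete, here is a minimal sketch of greedy entropy-based active sampling (an illustration of the general technique, not the authors' implementation; the classifier, feature arrays, and budget are placeholders):

```python
import numpy as np

def predictive_entropy(probs):
    """Shannon entropy of each row of an (n, n_classes) probability array."""
    p = np.clip(probs, 1e-12, 1.0)
    return -np.sum(p * np.log(p), axis=-1)

def active_hardness_loop(clf, X_pool, y_pool, X_init, y_init, budget=20):
    """Greedy uncertainty sampling: query the pool item whose predicted
    class distribution has maximal entropy, retrain, repeat.

    clf must expose fit(X, y) and predict_proba(X), as scikit-learn
    probabilistic classifiers do. X_pool is an (n, d) NumPy array of
    tactile features; y_pool holds the corresponding hardness labels.
    """
    X_train, y_train = list(X_init), list(y_init)
    pool = list(range(len(X_pool)))
    for _ in range(budget):
        clf.fit(np.asarray(X_train), np.asarray(y_train))
        ent = predictive_entropy(clf.predict_proba(X_pool[pool]))
        idx = pool[int(np.argmax(ent))]   # most uncertain remaining sample
        X_train.append(X_pool[idx]); y_train.append(y_pool[idx])
        pool.remove(idx)                  # its label is now "collected"
    return clf
```

Random sampling corresponds to replacing the argmax with a random draw from the pool, which is the baseline reported above as less accurate and less stable.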
Bora Celebi, Müge Cavdan & Knut Drewing
Time perception is a fundamental aspect of human life and is influenced and regulated by cognitive and sensory processes. For instance, spatial attention has been found to modulate temporal judgments when resources are allocated to a specific stimulus location in vision and audition. However, it is unclear to what extent the attentional effects observed in vision and audition can be generalized to the tactile modality. Here, we study the effects of attentional cues on the time perception of tactile stimuli presented on the human torso. Across four experiments, we examined (1) the impact of visual versus tactile spatial cues, (2) the modulation of time perception by dynamic versus static tactile cues, (3) the role of spatial congruency between cue and target locations (front vs. back of the torso), and (4) the influence of cue-target intervals. Participants performed temporal bisection tasks, judging whether the vibrations following the cues were closer to short or long anchor durations. Tactile cues expanded the perceived duration of subsequent stimuli, with dynamic cues having a greater effect than static ones. While no congruency effects were observed for left and right torso locations, front-back congruency enhanced time expansion. The attentional effect peaked at a 100-ms cue-target interval. We conclude that the time-expanding effects of spatial attention extend to tactile stimuli on the human torso, given that the observed time expansion follows principles known from spatial attention.
Celebi, B., Cavdan, M., & Drewing, K. (2025). Spatial attention modulates time perception on the human torso. Attention, Perception, & Psychophysics. https://doi.org/10.3758/s13414-025-03025-6
Müge Cavdan & Knut Drewing
Research has shown that affective visual and auditory events (e.g., a crying baby) are perceived as lasting longer than neutral ones. However, the impact of affective haptic experiences on time perception has hardly been studied. This study investigates the influence of interacting with affective materials on time perception. We selected three materials that are known to evoke pleasant (velvet), unpleasant (sandpaper), and neutral (paper) affective responses. Participants completed a temporal bisection task to assess how each material influenced their perception of time. The task involved presenting the materials in time intervals from 1000 to 2200 ms in 200-ms increments. In each trial, a participant stroked one of the materials, with the duration delimited by two vibrotactile signals, and judged whether the duration felt closer to a previously learned short or long interval. As expected, velvet yielded lower bisection points than paper. Contrary to expectations, bisection points for sandpaper, despite it being an unpleasant material, did not significantly differ from those for the control material, paper. These findings suggest that while pleasant haptic material experiences can extend perceived time, unpleasant materials may not have an effect. This effect is partially consistent with the observed time lengthening during affective auditory and visual events.
M. Cavdan and K. Drewing, "Stretching Time With Velvet: How Affective Materials Shape our Perception of Time," in IEEE Transactions on Haptics, doi: 10.1109/TOH.2025.3545473.
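For context on the measure used in the timing studies above: the bisection point (BP) is commonly estimated by fitting a psychometric function to the proportion of "long" responses across test durations, for example a logistic (a standard convention; the exact fitting procedure in these papers may differ from this sketch):

```latex
% d = physical duration, BP = bisection point, \tau = slope parameter
P(\text{``long''} \mid d) = \frac{1}{1 + e^{-(d - \mathrm{BP})/\tau}},
\qquad
P(\text{``long''} \mid d = \mathrm{BP}) = 0.5
```

A lower BP means that shorter physical durations are already judged "long", i.e., subjective time expansion, which is why velvet's lower bisection points indicate lengthened perceived time.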
Lisa Pui Yee Lin, Knut Drewing & Katja Doerschner
Image motion contributes to the perception of visual material properties, and motion signals are generated during active exploration. However, little is known about how specific perceptual tasks influence the actions that generate these cues. In an experiment using virtual reality and real-time hand tracking, we investigated how the demands of perceptual tasks (e.g., judging gloss or lightness) shape exploratory behaviours. Participants either observed or actively explored objects varying in gloss and lightness while performing a matching task. We analysed how their exploration patterns varied based on the tasks. Using the same stimuli in both tasks, we found that participants explored objects more extensively when judging gloss than when judging lightness. These findings suggest a strategic prioritisation of relevant cues for gloss judgments, with participants using larger movements and object rotation to enhance viewing perspectives and highlight detection. Our findings show that exploration behaviours are task-dependent, with actions adapted to the demands of the perceptual task at hand.
Lin, L. P., Drewing, K., & Doerschner, K. (2025). Exploring For Gloss: Active Exploration in Visual Material Perception. bioRxiv preprint 2024.07.09.602662. https://doi.org/10.1101/2024.07.09.602662
Julian Kaduk, Müge Cavdan, Knut Drewing, Heiko Hamann
In robotics, understanding human interaction with autonomous systems is crucial for enhancing collaborative technologies. We focus on human-swarm interaction (HSI), exploring how differently sized groups of active robots affect operators’ cognitive and perceptual reactions over different durations. We analyze the impact of different numbers of active robots within a 15-robot swarm on operators’ time perception, emotional state, flow experience, and task difficulty perception. Our findings indicate that managing multiple active robots, compared to one active robot, significantly alters time perception and flow experience, leading to a faster passage of time and increased flow. More active robots and extended durations cause increased emotional arousal and perceived task difficulty, highlighting the interaction between the number of active robots and human cognitive processes. These insights inform the creation of intuitive human-swarm interfaces and aid in developing swarm robotic systems aligned with human cognitive structures, enhancing human-robot collaboration.
Kaduk, J., Cavdan, M., Drewing, K., & Hamann, H. (2024). From One to Many: How Active Robot Swarm Sizes Influence Human Cognitive Processes. arXiv preprint arXiv:2403.13541. https://doi.org/10.48550/arXiv.2403.13541
Müge Cavdan & Knut Drewing
When interacting with surfaces, humans perceive surface attributes, which are often accompanied by affective responses. Notably, rough materials tend to evoke unpleasant feelings, whereas some soft materials are frequently associated with pleasantness. While the literature has predominantly focused on the relationship between solid objects and pleasantness, our daily haptic interactions also include fluids. Here, our main objective was to explore the relationship between unpleasantness and perceived qualities of touched fluids. We created a stimulus set by varying fluid properties of real-life materials (e.g., diluting honey with water). Participants actively explored the materials without time or movement constraints. In a first presentation block, they rated the unpleasantness of the materials, while in a second block, they evaluated the materials based on seven sensory adjectives. Principal Component Analysis on the adjective ratings revealed the dimensions characterizing differences in sensory qualities of our materials: viscosity and slipperiness. Importantly, we observed a significant positive correlation between unpleasantness and viscosity, while no correlation was found for slipperiness. Specifically, materials perceived as more viscous felt more unpleasant, emphasizing the role of viscosity in affective responses during haptic exploration. Overall, the current study contributes to the broader understanding of unpleasantness by extending our knowledge beyond the traditionally studied solid materials.
Cavdan, M., Drewing, K. (2025). To Touch or Not to Touch: The Linkage Between Viscosity and Unpleasantness. In: Kajimoto, H., et al. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction. EuroHaptics 2024. Lecture Notes in Computer Science, vol 14769. Springer, Cham. https://doi.org/10.1007/978-3-031-70061-3_6
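A minimal sketch of this kind of analysis pipeline, assuming a materials × adjectives matrix of mean ratings plus a vector of unpleasantness ratings (variable names and preprocessing are illustrative, not the study's exact procedure):

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import pearsonr

def sensory_dimensions(ratings, unpleasantness, n_components=2):
    """PCA on z-scored adjective ratings, then correlate each material's
    score on the resulting dimensions with its unpleasantness rating.

    ratings: (n_materials, n_adjectives) array of mean ratings
    unpleasantness: (n_materials,) array of mean unpleasantness ratings
    """
    z = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0, ddof=1)
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(z)        # material scores per dimension
    for k in range(n_components):
        r, p = pearsonr(scores[:, k], unpleasantness)
        print(f"dimension {k + 1}: r = {r:.2f}, p = {p:.3f}")
    return pca.components_, scores
```

In the study's terms, the two retained components correspond to viscosity and slipperiness, with only the viscosity dimension correlating with unpleasantness.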
Lisa Pui Yee Lin, Alina Böhm, Boris Belousov, Alap Kshirsagar, Tim Schneider, Jan Peters, Katja Doerschner & Knut Drewing
The perception of material/object properties plays a fundamental role in our daily lives. Previous research has shown that individuals use distinct and consistent patterns of hand movements, known as exploratory procedures (EPs), to extract perceptual information relevant to specific material/object properties. Here, we investigated the variation in EP usage across different tasks involving objects that varied in task-relevant properties (shape or deformability) as well as in task-irrelevant properties (deformability or texture). Participants explored 1 reference object and 2 test objects with a single finger before selecting the test object that was most similar to the reference. We recorded their finger movements during explorations, and these movements were then categorised into different EPs. Our results show strong task-dependent usage of EPs, even when exploration was confined to a single finger. Furthermore, within a given task, EPs varied as a function of material/object properties unrelated to the primary task. These variations suggest that individuals flexibly adapt their exploration strategies to obtain consistent and relevant information.
Lin, L.P.Y. et al. (2025). Task-Adapted Single-Finger Explorations of Complex Objects. In: Kajimoto, H., et al. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction. EuroHaptics 2024. Lecture Notes in Computer Science, vol 14768. Springer, Cham. https://doi.org/10.1007/978-3-031-70058-3_11
Bora Celebi, Müge Cavdan & Knut Drewing
Signals from different senses are integrated into multisensory events or segregated according to their temporal and spatial relations. If signals are integrated, we perceive synchrony between them even in the presence of slight stimulus onset asynchronies (SOAs). The range of SOAs during which physically asynchronous signals are perceived as synchronous is called the temporal binding window (TBW). The TBW depends on various factors. Here we investigated how spatial congruency affects the width of the visuotactile TBW in a naturalistic setting, given that spatial congruency of signals in the single senses should promote multisensory integration and thereby binding. In a virtual reality (VR) environment, we presented visual and vibrotactile stimuli in different locations. Vibrotactile stimuli were presented on the participants’ hands or forearms, and visual stimuli were rendered in real time on virtual counterparts of the tracked hands or forearms. We varied SOAs between vision and touch and asked whether visual and tactile stimuli had occurred synchronously. Similar to what has been found in the audiovisual domain, the temporal binding window was wider when visual and tactile stimuli were spatially congruent, possibly due to enhanced multisensory integration. Thus, we extend the previous findings and conclusions on spatial congruency effects to visuotactile interactions in VR environments.
Celebi, B., Cavdan, M., Drewing, K. (2025). The Visuotactile Temporal Binding Window Widens with Spatial Congruency. In: Kajimoto, H., et al. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction. EuroHaptics 2024. Lecture Notes in Computer Science, vol 14769. Springer, Cham. https://doi.org/10.1007/978-3-031-70061-3_12
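The width of such a window is typically quantified by fitting a bell-shaped function to the proportion of "synchronous" responses across SOAs, for example a Gaussian; this is a common convention rather than necessarily the exact fit used here:

```latex
% SOA = stimulus onset asynchrony, \mu = point of subjective simultaneity,
% \sigma = window scale, a = amplitude
P(\text{sync} \mid \mathrm{SOA}) = a \exp\!\left(-\frac{(\mathrm{SOA} - \mu)^2}{2\sigma^2}\right)
```

The TBW is then read off as the range of SOAs for which P(sync) exceeds a criterion, so a larger σ corresponds to the wider window reported for spatially congruent stimuli.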
Michaela Jeschke, Anna Metzger & Knut Drewing
Haptic exploration is an inherently active process by which humans gather sensory information through physical contact with objects. It has been proposed that humans generally optimize their exploration behavior to improve perception. We hypothesized that the duration of haptic explorations is the result of an optimal interplay of sensory and predictive processes that also takes costs such as motor effort into account. We assessed exploration duration and task performance in a two-alternative forced-choice spatial frequency discrimination task under varying conditions of task demand and motor effort. We manipulated task demands by varying the discriminability of virtual grating stimuli and manipulated motor effort by implementing forces counteracting the participants’ movements while switching between stimuli. Participants were instructed to switch between stimuli after each swipe movement. Results revealed that higher task demands led to higher numbers of exploratory movements (i.e., longer exploration durations), likely reflecting a compensatory mechanism that enables participants to attain a certain level of task performance. However, this effect is reduced when motor effort is increased; while low and medium task demands yield similar numbers of movements regardless of the related motor effort, higher demands are not associated with increased numbers of movements when the required motor effort is high. In conclusion, the extent to which increased task demands are compensated for by extending an exploration seems to depend on the motor costs that the agent is confronted with.
Jeschke, M., Metzger, A., Drewing, K. (2025). Humans Terminate Their Haptic Explorations According to an Interplay of Task Demands and Motor Effort. In: Kajimoto, H., et al. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction. EuroHaptics 2024. Lecture Notes in Computer Science, vol 14768. Springer, Cham. https://doi.org/10.1007/978-3-031-70058-3_7
Didem Katircilar, Knut Drewing
People regularly use active touch to perform daily life tasks. Imagine choosing a comfortable pillow and how you would explore its softness. It is known that people tune their exploratory behavior to get the most relevant information. In the exploration process, prior information, which is available before we touch an object, is also used. For softness perception, object indentation plays a crucial role; indentation forces were higher when people implicitly expected to explore harder as compared to softer objects. This force-tuning improved perception and was observed when trials of the same softness level (hard or soft) were presented in longer blocks. However, it was not reported for predictable patterns in which hard and soft stimuli alternated on every trial or every other trial. Here, we investigated when and how implicit prior information about the softness level becomes accessible for successful force-tuning in softness discrimination. Participants were presented with hard and soft stimulus pairs in sequences of 2, 4, or 6 trials. In the predictable conditions, same-length sequences of hard and soft trials alternated constantly. In the unpredictable conditions, we presented sequences of lengths 2, 4, and 6 randomly. We analyzed initial peak indentation forces. Participants applied higher forces to harder stimuli in the predictable condition in longer sequences (4 and 6) as compared to the unpredictable condition and shorter sequences of 2. We interpret the findings in terms of an anticipatory and incremental mechanism of force-tuning, which needs to be triggered by an initial predictable stimulus.
Katircilar, D., Drewing, K. (2025). The Role of Implicit Prior Information in Haptic Perception of Softness. In: Kajimoto, H., et al. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction. EuroHaptics 2024. Lecture Notes in Computer Science, vol 14768. Springer, Cham. https://doi.org/10.1007/978-3-031-70058-3_13
Maja Fehlberg, Eva Monfort, Sairam Saikumar, Knut Drewing, Roland Bennewitz
Fingertip friction is a key component of tactile perception. In active tactile exploration, friction forces depend on the applied normal force and on the chosen sliding speed. We investigated whether humans perceive the speed dependence of friction for textured surfaces of materials which show either an increase or a decrease of the friction coefficient with speed. Participants perceived the decrease or increase when the relative difference in friction coefficient between fast and slow sliding speeds was more than 20%. The fraction of comparison judgments in agreement with the measured difference in friction coefficient did not depend on variations in the applied normal force. The results indicate a perceptual constancy for fingertip friction with respect to self-generated variations of sliding speed and applied normal force.
Fehlberg, M., Monfort, E., Saikumar, S., Drewing, K., & Bennewitz, R. (2024). Perceptual Constancy in the Speed Dependence of Friction During Active Tactile Exploration. IEEE Transactions on Haptics.
Knut Drewing
This chapter provides an initial overview of the state of knowledge on the joint processing of information from different senses in humans. It deals with processes of multisensory integration of redundant information and multisensory combination, the problem of assigning related information from different senses, mechanisms of matching between the senses, the role of attention and the neurophysiological principles of multisensory processing. Examples from ergonomics and clinical practice are used to illustrate the applicability of the findings.
Drewing, K. (2024). Multisensorische Informationsverarbeitung. In: Rieger, M., Müsseler, J. (eds) Allgemeine Psychologie. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-68476-4_4
Nedim Goktepe, Knut Drewing & Alexander C Schütz
Combining or integrating information from multiple senses often provides richer and more reliable estimates for the perception of objects and events. In daily life, sensory information from the same source is often in close spatiotemporal proximity. This can be an important determinant of whether and how multisensory signals are combined. Advanced technical display systems allow multisensory information to be presented in virtual environments. However, technical displays can lack the spatiotemporal fidelity of the real world due to rendering delays, and any spatiotemporal incongruency could alter how information is combined. In the current study we tested this by investigating if and how spatially and temporally discrepant tactile displacement cues can supplement imprecise visual displacement cues. Participants performed a visual displacement task with visual and tactile displacement cues under spatial and temporal incongruency conditions. We modelled how participants combined visual and tactile information in the visuotactile condition using their performance in the visual-only condition. We found that temporal incongruency led to an increase in tactile weights, although these were correlated with those in the congruent condition. In contrast, spatial incongruency led to individual differences in cue combination strategies. Our results illustrate the importance of spatiotemporal congruency for combining tactile and visual cues when making visual displacement judgments. Given the altered cue combination strategies and individual differences, we recommend that developers adopt individual spatiotemporal calibration procedures to improve the efficiency of sensory augmentation.
Goktepe, N., Drewing, K., & Schütz, A. C. (2024). Spatiotemporal congruency modulates weighting of visuotactile information in displacement judgments. IEEE Transactions on Haptics.
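Such weighting analyses typically start from the standard reliability-weighted (maximum-likelihood) cue-combination scheme, sketched below; the paper's actual model, which derives weights from single-cue performance, may deviate from this baseline:

```latex
% \hat{s}_v, \hat{s}_t = visual and tactile displacement estimates,
% \sigma_v^2, \sigma_t^2 = their single-cue variances
\hat{s}_{vt} = w_v \hat{s}_v + w_t \hat{s}_t,
\qquad
w_t = \frac{1/\sigma_t^{2}}{1/\sigma_v^{2} + 1/\sigma_t^{2}},
\qquad
w_v = 1 - w_t
```

Under this scheme the tactile weight rises as the visual estimate becomes less precise, which provides the baseline against which the reported increase in tactile weights under temporal incongruency can be interpreted.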
Michaela Jeschke, Aaron C. Zoeller & Knut Drewing
Humans can use prior information to optimize their haptic exploratory behavior. Here, we investigated the usage of visual priors, the mechanisms that enable their usage, and how the usage is affected by information quality. Participants explored different grating textures and discriminated their spatial frequency. Visual priors on texture orientation were given in each trial, with qualities randomly varying from high to no informational value. Adjustments of the initial exploratory movement direction orthogonal to the textures’ orientation served as an indicator of prior usage. Participants indeed used visual priors, the more so the higher the priors’ quality (Experiment 1). Higher task demands did not increase the direct usage of visual priors (Experiment 2), but possibly fostered the establishment of adjustment behavior. In Experiment 3, we decreased the proportion of high-quality priors presented during the session, hereby reducing the contingency between high-quality priors and haptic information. In consequence, even priors of high quality ceased to evoke movement adjustments. We conclude that the establishment of adjustment behavior results from a rather implicit contingency learning. Overall, it became evident that humans can autonomously learn to use rather abstract visual priors to optimize haptic exploration, with the learning process and direct usage substantially depending on the priors’ quality.
Jeschke, M., Zoeller, A.C. & Drewing, K. Humans flexibly use visual priors to optimize their haptic exploratory behavior. Sci Rep 14, 14906 (2024). https://doi.org/10.1038/s41598-024-65958-6
Knut Drewing
In everyday interaction we touch different materials, which we experience along a limited number of perceptual and emotional dimensions: for instance, a furry surface feels soft and pleasant, whereas sandpaper feels rough and unpleasant. In a previous study, younger adults manually explored a representative set of solid, fluid, and granular materials. Their ratings were made along six perceptual dimensions (roughness, fluidity, granularity, deformability, fibrousness, heaviness) and three emotional ones (valence, arousal, dominance). Perceptual and emotional dimensions were systematically correlated. Here, we wondered how this perceptuo-affective organization of touched materials depends on age, given that older adults show decline in haptic abilities, in particular in detail perception. 30 younger participants (~22 years, half female) and 15 older participants (~66 years) explored 25 materials using 18 perceptual and 9 emotional adjectives. We extracted 6 perceptual and 2 emotional dimensions. Older and younger adults showed similar dimensions. However, in younger participants roughness and granularity formed separate dimensions, whereas they collapsed into a single dimension in older people. Further, the age groups differed in the perception of roughness, granularity, and valence, and older people did not show the positive correlation between valence and granularity that younger people did. As expected, control analyses between young males and females did not reveal similar gender differences. Overall, the results demonstrate that older people organize and experience materials partly differently from younger people, which we attribute to sensory decline. However, other aspects of perceptual organization that also include fine perception are preserved into older age.
Drewing, K. (2024). Perceptuo-affective organization of touched materials in younger and older adults. PLoS ONE, 19(1), e0296633.
Müge Cavdan & Dicle Dövencioglu
Softness is a material property that plays an essential role in our daily interactions with objects. The sense of touch, or haptic sense, provides us with valuable information about the shapes, functions, and material properties of objects. We use the sense of haptic softness in a wide variety of situations, from assessing the ripeness of the fruit we eat to the suitability of the clothes we wear. Accordingly, describing an object as soft can encompass a wide range of materials including fabric, hand cream, sand, or cat hair. In the engineering literature, the perception of haptic softness is commonly defined by compliance: the degree to which an object can be physically deformed by external forces. Consequently, studies investigating different components of softness have equated softness with the compliance of elastic materials. However, recent studies have shown that the perception of softness in humans cannot be explained by a single dimension. Instead, there are multiple perceptual dimensions of softness, each associated with specific hand movements (exploratory procedures) that can be used to explore and evaluate the softness of different objects. These perceptual dimensions of softness include surface softness, fluidity (viscosity), granularity, and deformability. Furthermore, people adapt their hand gestures and haptic explorations depending on the characteristics of the touched object, the information they want to acquire, and the interaction between the object's properties and the desired information. These new developments can contribute not only to the understanding of the perception of softness but also to improving the grasping and exploration abilities of autonomous robots.
M. Cavdan and D. Dövencioğlu, “Dokunsal Yumuşaklık Algısına İlişkin Bir İnceleme” [A review of tactile softness perception], Ankara Üniversitesi Dil ve Tarih-Coğrafya Fakültesi Dergisi, 2023.
Michaela Jeschke, Knut Drewing, Elena Azañón
Touch is susceptible to various aftereffects. Recent findings on tactile distance perception demonstrate that when an area of the body is repeatedly touched at two points separated by a given distance, subsequently presented smaller distances are perceived as smaller and larger distances as larger. Here we investigate whether adaptation to a tactile distance transfers to the perception of coarse textures’ roughness. Additionally, we examine whether this transfer is orientation-specific, which is typical for low-level aftereffects. On each trial, the tip of the left index finger was adapted either (1) to a tactile two-point distance of 4 mm applied along the length of the finger, (2) to the same distance applied across the width of the finger, or (3) to single indentations. After adaptation to a two-point distance, participants systematically perceived subsequently presented gratings with smaller groove distances as less rough, but only when the orientation of the adapted distance matched that of the texture. This reflects an aftereffect transfer in the orientation-congruent condition only. The results suggest that the processing of distance between two points on the skin is involved in the computation of texture, and that texture is a basic somatosensory feature computed at relatively early stages of sensory processing.
M. Jeschke, K. Drewing and E. Azañón, "The Tactile Distance Aftereffect Transfers to Roughness Perception," 2023 IEEE World Haptics Conference (WHC), Delft, Netherlands, 2023, pp. 8-13, doi: 10.1109/WHC56415.2023.10224476.
Didem Katircilar, Knut Drewing
Haptic perception is inherently active. People utilize different exploratory strategies that affect their perception. For example, people perceive small shapes more precisely when the finger explores them laterally than anteroposteriorly, and they adjust their exploratory direction in a corresponding task to increase perceptual performance. Here, we investigated how the prescribed movement direction of the finger affects texture perception and associated exploratory movements. Texture perception is based on spatial cues from static touch and temporal cues from active movement. We used stimuli that maximized the relevance of movement-related temporal cues. The finger moved laterally or anteroposteriorly relative to the body, but always orthogonally to the texture orientation. In addition, one group of participants explored while wearing a glove that further reduced the availability of spatial cues; another group explored without a glove. Participants performed a two-interval forced-choice task, choosing in each trial the stimulus with the higher spatial frequency. Participants applied higher forces and stroked faster in the anteroposterior direction than in the lateral direction. Further, participants wearing gloves stroked the textures more slowly. Perceptual performance did not differ between conditions. We conclude that participants adapted their movement strategies to the respective exploratory constraints in ways that maintain good perception.
D. Katircilar and K. Drewing, "The Effects of Movement Direction and Glove on Spatial Frequency Discrimination in Oriented Textures," 2023 IEEE World Haptics Conference (WHC), Delft, Netherlands, 2023, pp. 313-318, doi: 10.1109/WHC56415.2023.10224465.
Bora Celebi, Müge Cavdan, Knut Drewing
The human torso encompasses a large haptic perceptual area where cutaneous feedback can be delivered. Haptic vests can provide different effects in order to present information or to enrich and even alter perception. Over the last decades, different haptic vest designs have been proposed for use as spatial navigation aids and in virtual environments. However, most designs so far have used only one type of stimulation. Here we present a haptic vest design that can provide feedback in three cutaneous modalities: vibrotactile stimulation, punctual force, and warmth. We conducted a first evaluation experiment to study the suitability of our design. We conclude that our design is suited to deliver multimodal feedback to the torso area. The incorporation of different haptic modalities in the vest allows the presentation of diverse perceptual effects that can be beneficial in altering human time perception.
B. Celebi, M. Cavdan and K. Drewing, "Design and Evaluation of a Multimodal Haptic Vest," 2023 IEEE World Haptics Conference (WHC), Delft, Netherlands, 2023, pp. 56-63, doi: 10.1109/WHC56415.2023.10224374.
Lisa Pui Yee Lin, Müge Cavdan, Katja Doerschner, Knut Drewing
Objects’ material properties are essential not only in how we use and interact with them but also in eliciting affective responses when in contact with the body. Such affective experiences are of particular interest because they likely strongly impact our daily interactions with materials. We examined whether exploration time and surface size influence affective responses to rough stimuli. Here, participants made pleasantness and arousal judgments after actively exploring sandpaper stimuli of different sizes and varying roughness levels under different time constraints. The findings confirm that increased surface roughness is associated with decreased perceived pleasantness; however, arousal did not systematically covary with roughness. We did not find an effect of exploration time on perceived pleasantness or arousal, but there were interactions between grit size and surface size. Overall, the direction of the effects of grit size on pleasantness was similar for both surface sizes. However, the slopes of the increase in pleasantness relative to grit size varied depending on surface size. Effects on arousal were unrelated and small. We suggest that exploration time had little influence on the perceived magnitude of affective reactions to roughness. However, surface size may influence not only perceived roughness but also the perceived pleasantness of rough stimuli.
L. P. Y. Lin, M. Cavdan, K. Doerschner and K. Drewing, "The Influence of Surface Roughness and Surface Size on Perceived Pleasantness," 2023 IEEE World Haptics Conference (WHC), Delft, Netherlands, 2023, pp. 417-424, doi: 10.1109/WHC56415.2023.10224384.
Müge Cavdan, Nedim Goktepe, Knut Drewing, Katja Doerschner
Softness is an important material property that can be judged directly, by interacting with an object, but also indirectly, by simply looking at an image of a material. The latter is likely possible by filling in relevant multisensory information from prior experiences with soft materials. Such experiences are thought to lead to associations that make up our representations of perceptual softness. Here, we investigate the structure of this representational space when activated by words, and compare it to haptic and visual perceptual spaces that we obtained in earlier work. To this end, we performed an online study in which people rated different sensory aspects of soft materials, presented as written names. We compared the results with previous studies in which identical ratings were made on the basis of visual and haptic information. Correlation and Procrustes analyses show that, overall, the representational spaces of verbally presented materials were similar to those obtained from haptic and visual experiments. However, a classifier analysis showed that verbal representations could be better predicted from those obtained from visual than from haptic experiments. In a second study we rule out that these larger discrepancies between verbal and haptic representations could be due to difficulties in material identification in haptic experiments. We discuss the results with respect to the recent idea that perceived softness is a multidimensional construct.
Cavdan, M., Goktepe, N., Drewing, K. et al. Assessing the representational structure of softness activated by words. Sci Rep 13, 8974 (2023). https://doi.org/10.1038/s41598-023-35169-6
Lucia Seminara, Strahinja Dosen, Fulvio Mastrogiovanni, Matteo Bianchi, Simon Watt, Philipp Beckerle, Thrishantha Nanayakkara, Knut Drewing, Alessandro Moscatelli, Roberta L. Klatzky, Gerald E. Loeb
Human manual dexterity relies critically on touch. Robotic and prosthetic hands are much less dexterous and make little use of the many tactile sensors now available. We propose a framework modeled on the hierarchical controllers of the sensorimotor nervous system to link sensing to action in human-in-the-loop, haptically enabled, artificial hands.
Seminara, L., Dosen, S., Mastrogiovanni, F., Bianchi, M., Watt, S., Beckerle, P., Nanayakkara, T., Drewing, K., Moscatelli, A., Klatzky, R., & Loeb, G. (2023). A hierarchical sensorimotor control framework for human-in-the-loop robotic hands. Science Robotics, 8, eadd5434. https://doi.org/10.1126/scirobotics.add5434
Temporal information plays a crucial role in human everyday life. Yet, perceived time is subject to distortions. Emotion, for instance, is a powerful time modulator in that emotional events are perceived as longer than neutral events of the same length. However, it is unknown how exposure to emotional stimuli influences the time perception of a simultaneous neutral tactile event. To fill this gap, we tested the effect of emotional auditory sounds on the perception of neutral vibrotactile feedback. We used neutral and emotional (i.e., pleasant-high arousal, pleasant-low arousal, unpleasant-high arousal, and unpleasant-low arousal) auditory stimuli from the International Affective Digitized Sounds (IADS) system. The tactile stimulus was a vibration at a fixed intensity, presented through a custom-made vibrotactile sleeve. Participants listened to auditory stimuli that were temporally coupled with vibrotactile stimulation for 2, 3, 4, or 5 s. Their task was to focus on the duration of the vibrotactile information and reproduce the elapsed time. We tested the effects of valence and arousal of the auditory stimuli on the perceived duration of the vibrotactile information. Simultaneously presented emotional auditory stimuli, in general, lengthened the perceived duration of the neutral vibrotactile information compared to neutral auditory stimuli. We conclude that emotional events influence the time perception of simultaneous neutral haptic events.
Cavdan, M., Celebi, B., & Drewing, K. (2023). Simultaneous Emotional Stimuli Prolong the Timing of Vibrotactile Events. IEEE Transactions on Haptics. https://doi.org/10.1109/TOH.2023.3275190
Many large-scale multi-robot systems require human input during operation in different applications. To still minimize the human effort, interaction is intermittent or restricted to a subset of robots. Despite this reduced demand for human interaction, the mental load and stress can be challenging for the human operator. One hypothesized effect of human-swarm interaction is a change in the operator's subjective time perception. In a series of simple human-swarm interaction experiments with robot swarms of up to 15 physical robots, we study whether human operators show altered time perception depending on the number of controlled robots or on robot speeds. Using data gathered by questionnaires, we found that increased swarm size shrinks perceived time and decreased robot speeds expand perceived time. We introduce the concept of subjective time perception to human-swarm interaction. Future research will enable swarm systems to autonomously modulate subjective timing to ease the job of human operators.
Kaduk, J., Cavdan, M., Drewing, K., Vatakis, A., & Hamann, H. (2023). Effects of human-swarm interactions on subjective time perception: Swarm size and speed. In Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’23). https://doi.org/10.1145/3568162.3578626
The approximate number system (ANS) is thought to be an innate cognitive system that allows humans to perceive numbers (>4) in a fuzzy manner. One assumption of the ANS is that numerosity is represented amodally by a mechanism that filters out nonnumerical information from the stimulus material. However, some studies show that nonnumerical information (e.g., spatial parameters) influences the numerosity percept as well. Here, we investigated whether there is a cross-modal transfer of spatial information between the haptic and visual modality in an approximate cross-modal number matching task. We presented different arrays of dowels (haptic stimuli) to 50 undergraduates and asked them to compare the haptically perceived numerosity to two visually presented dot arrays. Participants chose which visually presented array matched the numerosity of the haptic stimulus. The distractor varied in number and displayed a random pattern, whereas the matching (target) dot array was either spatially identical to the haptic stimulus or spatially randomized. We hypothesized that if the numerosity percept were based solely on number, neither spatial identity nor spatial congruence between the haptic and the visual target arrays would affect accuracy in the task. However, the results show significant processing advantages for targets with spatially identical patterns and, furthermore, that spatial congruency between the haptic source and the visual target facilitates performance. Our results show that spatial information was extracted from the haptic stimuli and influenced participants’ responses, which challenges the assumption that numerosity is represented in a truly abstract manner by filtering out any other stimulus features.
Ziegler, M.C., Stricker, L.K. & Drewing, K. The role of spatial information in an approximate cross-modal number matching task. Atten Percept Psychophys (2023). https://doi.org/10.3758/s13414-023-02658-9
Vibrations effectively transmit information from objects, surfaces, or events to the human skin through the cutaneous sense. However, due to the diverse densities of receptive fields and mechanoreceptor populations, vibrotactile sensitivity differs across body parts. Hardware that utilizes vibrotactile information should consider such differences. Here, we examined the perceived intensity of vibrotactile stimuli applied to the front and back of the human torso. Participants wore a vibrotactile vest. They had to judge whether a vibration on the back side of the vest was larger or smaller than a fixed vibration given on the front side; the intensity of the stimulus at the back was adapted using staircase methods. We found that stimuli at the back had to be physically more intense by 12.3% than stimuli at the front to be perceived as equally intense. Presentation of vibrotactile information through wearables could thus compensate for differential sensitivity, e.g., to equalize attention-capturing effects.
Celebi, B., Cavdan, M., Drewing, K. (2022). Vibrotactile Stimuli are Perceived More Intense at the Front than at the Back of the Torso. In: Haptics: Science, Technology, Applications. EuroHaptics 2022. Lecture Notes in Computer Science, vol 13235. Springer, Cham. https://doi.org/10.1007/978-3-031-06249-0_7
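A minimal sketch of the kind of adaptive staircase used to find such a point of subjective equality (illustrative only; the step size, reversal rule, and starting level are assumptions, not the study's settings):

```python
def one_up_one_down(respond, start=1.0, step=0.05, n_reversals=8):
    """Simple 1-up/1-down staircase converging on the 50% point (PSE).

    respond(level) -> True if the comparison stimulus (back) feels
    stronger than the fixed reference (front). Returns a PSE estimate
    as the mean of the last reversal levels.
    """
    level, direction, reversals = start, -1, []
    while len(reversals) < n_reversals:
        stronger = respond(level)
        new_direction = -1 if stronger else +1   # felt stronger -> go lower
        if new_direction != direction:           # direction flip = reversal
            reversals.append(level)
        direction = new_direction
        level = max(0.0, level + direction * step)
    return sum(reversals[-6:]) / len(reversals[-6:])
```

A PSE of, say, 1.123 times the front intensity would correspond to the reported 12.3% front-back difference.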
Perceiving mechanical properties of objects, i.e., how they react to physical forces, is a crucial ability in many aspects of life, from choosing an avocado to picking your clothes. There is a wide variety of materials that differ substantially in their mechanical properties. For example, both silk and sand deform and change shape in response to exploration forces, but each does so in very different ways. Studies show that the haptic perceptual space has multiple dimensions corresponding to the physical properties of textures; however, in these experiments the range of materials or exploratory movements was restricted. Here we investigate the perceptual dimensionality of a large set of real materials in a free haptic exploration task. Thirty-two participants actively explored deformable and non-deformable materials with their hands and rated them on several attributes. Using the semantic differential technique, video analysis, and linear classification, we found four haptic dimensions, each associated with a distinct set of hand and finger movements during active exploration. Taken together, our findings suggest that the physical, particularly the mechanical, properties of a material systematically affect how it is explored on a much more fine-grained level than originally thought.
Dövencioğlu, D.N., Üstün, F.S., Doerschner, K. et al. Hand explorations are determined by the characteristics of the perceptual space of real-world materials from silk to sand. Sci Rep 12, 14785 (2022). https://doi.org/10.1038/s41598-022-18901-6
When humans explore objects haptically, they seem to use prior as well as sensory information to adapt their exploratory behavior. For texture discrimination, it was shown that participants adapted the direction of their exploratory movement to be orthogonal to the orientation of textures with a defined direction. That is, they adapted the exploratory direction based on the sensory information gathered over the course of an exploration, and this behavior improved their perceptual precision. In the present study we examined if prior visual information that indicates a texture orientation produces a similar adjustment of exploratory movement direction. We expected an increase of orthogonal initial exploration movements with higher qualities of prior information. In each trial, participants explored two grating textures with equal amplitude, only differing in their spatial period. They had to report the stimulus with the higher spatial frequency. Grating stimuli were given in six different orientations relative to the observer. Prior visual information on grating orientation was given in five different qualities: 50% (excellent information), 35%, 25%, 15% and 0% (none). We analyzed movement directions of the first, middle and last strokes over the textures of each trial. The results show an increase in the amount of initial orthogonal strokes and a decrease in variability of movement directions with higher qualities of prior visual information.
Jeschke, M., Zöller, A.C., Drewing, K. (2022). Influence of Prior Visual Information on Exploratory Movement Direction in Texture Perception. In: Haptics: Science, Technology, Applications. EuroHaptics 2022. Lecture Notes in Computer Science, vol 13235. Springer, Cham. https://doi.org/10.1007/978-3-031-06249-0_4
Haptic perception of objects’ softness plays an important role in the identification of and interaction with objects. How softness is represented in the brain is not yet clear. Here we investigated whether there is a neutral point in the perceptual representation of haptically perceived softness relative to which objects are represented as being “soft” or “hard”. We created a wide range of softness stimuli, varying from very hard (ceramic) to very soft foam, with differently soft foam and silicone stimuli in between. Participants were assigned to one of three stimulus set conditions: the full set (18 stimuli), the soft set (13 softest stimuli), or the hard set (13 hardest stimuli). They categorized each stimulus as “hard” or “soft”, and we estimated the neutral point as the point of subjectively equal categorization as “hard” or “soft”. We found that neutral points differed from the middle stimulus of each set. Furthermore, over the course of the experiment, neutral points moved away from the middle of the stimulus set rather than towards it. Our results indicate that there might be a neutral point in the representation of haptically perceived softness; however, range effects may play a role.
Metzger, A., Lotz, A., Drewing, K. (2022). Neutral Point in Haptic Perception of Softness. In: Haptics: Science, Technology, Applications. EuroHaptics 2022. Lecture Notes in Computer Science, vol 13235. Springer, Cham. https://doi.org/10.1007/978-3-031-06249-0_7
Fingertip friction and the related shear of skin are key mechanical mechanisms in tactile perception, but the perception of friction itself has rarely been explored except for the flat surfaces of tactile displays. We investigated the perception of friction for tactile exploration of a unique set of samples whose fabric-like surfaces are equipped with regular arrays of flexible micropillars. The measured fingertip friction increases with decreasing bending stiffness, where the latter is controlled by the radius (20–75 µm) and aspect ratio of the micropillars. In forced-choice tasks, participants noticed relative differences in friction as small as 0.2, and even smaller when a sample with less than 100 µm distance between pillars was omitted from the analysis. In an affective ranking of samples upon active touch, the perception of pleasantness was anticorrelated with the measured friction. Our results offer insights towards a rational design of materials with well-controlled surface microstructure which elicit a dedicated tactile appeal.
Fehlberg, M., Kim, KS., Drewing, K., Hensel, R., Bennewitz, R. (2022). Perception of Friction in Tactile Exploration of Micro-structured Rubber Samples. In: Haptics: Science, Technology, Applications. EuroHaptics 2022. Lecture Notes in Computer Science, vol 13235. Springer, Cham. https://doi.org/10.1007/978-3-031-06249-0_3
In the flash lag effect (FLE), a moving object is seen ahead of a brief flash that is presented at the same spatial location; a haptic analogue of the FLE has also been observed [1, 2]. Some accounts of the FLE relate the effect to temporal delays in the processing of the stationary stimulus as compared to that of the moving stimulus [3–5]. We tested for movement-related processing effects in haptics. People judged the temporal order of two vibrotactile stimuli at the two hands: one hand was stationary, the other hand was executing a fast, medium, or slow hand movement. Stimuli at the moving hand had to be presented around 36 ms later to be perceived as simultaneous with stimuli at the stationary hand. In a control condition, where both hands were stationary, perceived simultaneity corresponded to physical simultaneity. We conclude that the processing of haptic stimuli at moving hands is accelerated as compared to stationary ones, in line with assumptions derived from the FLE.
Drewing, K., Vroomen, J. (2022). Moving Hands Feel Stimuli Before Stationary Hands. In: Haptics: Science, Technology, Applications. EuroHaptics 2022. Lecture Notes in Computer Science, vol 13235. Springer, Cham. https://doi.org/10.1007/978-3-031-06249-0_2
We interact with different types of soft materials on a daily basis, such as salt and hand cream. Recently, we have shown that soft materials can be described using four perceptual dimensions: deformability, granularity, viscosity, and surface softness [1]. Here, we investigated whether humans can actually perceive systematic differences in materials that selectively vary along one of these four dimensions, as well as how judgments on the different dimensions are correlated with softness judgments. We selected at least two material classes per dimension (e.g., hair gel and hand cream for viscosity) and varied the corresponding feature (e.g., the viscosity of hair gel). Participants ordered four to ten materials from each material class according to their corresponding main feature and, in addition, according to their softness. Rank orders of materials according to the main feature were consistent across participants and repetitions. Rank orders according to softness were correlated either positively or negatively with the judgments along the associated four perceptual dimensions. These findings support our notion of multiple softness dimensions and demonstrate that people can reliably discriminate materials which are artificially varied along each of these softness dimensions.
Cavdan, M., Doerschner, K., & Drewing, K. (2022). Haptic Discrimination of Different Types of Soft Materials. In: Haptics: Science, Technology, Applications. EuroHaptics 2022. Lecture Notes in Computer Science, vol 13235. Springer, Cham. https://doi.org/10.1007/978-3-031-06249-0_1
We recorded vibratory patterns elicited by free haptic exploration of a large set of natural textures with a steel tool tip. Vision and audio signals during the exploration were excluded. After the exploration of each sample, participants provided judgments about its perceptual attributes and material category. We found that vibratory signals can be approximated by a single parameter in the temporal frequency domain, in a similar way as we can describe the spatial frequency spectrum of natural images. This parameter varies systematically between material categories and correlates with human perceptual judgements. It provides an estimate of the spectral composition of the power spectra which is highly correlated with the differential activity of the Rapidly Adapting (RA) and Pacinian Corpuscle (PC) afferents.
Toscani, M., & Metzger, A. (2022). A Database of Vibratory Signals from Free Haptic Exploration of Natural Material Textures and Perceptual Judgments (ViPer): Analysis of Spectral Statistics. In: Haptics: Science, Technology, Applications. EuroHaptics 2022. Lecture Notes in Computer Science, vol 13235. Springer, Cham. https://doi.org/10.1007/978-3-031-06249-0_36
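As a rough illustration of the kind of single-parameter spectral description mentioned above (our assumption: a descriptor akin to the slope of the power spectrum in log-log coordinates, analogous to the 1/f statistics of natural images; the paper's exact parameter may differ), one could compute:

    import numpy as np

    def spectral_slope(signal, sample_rate):
        """Fit a line to log power over log frequency and return its slope."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
        keep = freqs > 0                      # drop the DC component
        slope, _ = np.polyfit(np.log(freqs[keep]), np.log(spectrum[keep]), 1)
        return slope

    # Illustrative use with synthetic data standing in for a recorded vibration.
    rng = np.random.default_rng(0)
    vibration = rng.standard_normal(4096)
    print(spectral_slope(vibration, sample_rate=10_000))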
The ability to sample sensory information with our hands is crucial for smooth and efficient interactions with the world. Despite this important role of touch, tactile sensations on a moving hand are perceived as weaker than when presented on the same but stationary hand. This phenomenon of tactile suppression has been explained by predictive mechanisms, such as internal forward models, that estimate future sensory states of the body on the basis of the motor command and suppress the associated predicted sensory feedback. The origins of tactile suppression have sparked a lot of debate, with contemporary accounts claiming that suppression is independent of sensorimotor predictions and is instead due to an unspecific mechanism. Here, we target this debate and provide evidence for specific tactile suppression due to precise sensorimotor predictions. Participants stroked with their finger over textured objects that caused predictable vibrotactile feedback signals on that finger. Shortly before touching the texture, we probed tactile suppression by applying external vibrotactile probes on the moving finger that either matched or mismatched the frequency generated by the stroking movement along the texture. We found stronger suppression of the probes that matched the predicted sensory feedback. These results show that tactile suppression is specifically tuned to the predicted sensory states of a movement.
Führer, E., Voudouris, D., Lezkan, A., Drewing, K., & Fiehler, K. (2022). Tactile suppression stems from sensation-specific sensorimotor predictions. Proceedings of the National Academy of Sciences, 119, e2118445119. https://doi.org/10.1073/pnas.2118445119
Creativity has traditionally been associated with high independence and low conformity. The present study investigated the moderating role of collectivist (conformity) and individualist (self-direction) values in the link between self-construals and creativity in a collectivist cultural context. We hypothesized that (1) creativity would be related to both independent and interdependent self, and (2) creativity would be higher when individual values fit with cultural norms. We also investigated whether a bicultural self, characterized by high independence coupled with high interdependence, benefits creativity more than nonbicultural combinations, and whether values moderate these relations. The task-specific perceived and actual creativity scores of 201 undergraduate students in Turkey showed expected relations with self-construals and values: First, both independence and interdependence were positively related to higher creativity. Second, high interdependence benefitted creativity more when coupled with high conformity or low self-direction. Finally, people with a bicultural self were more creative, especially when they were also high on conformity. Overall, our study provides initial evidence for the interplay between self-construals, values, and the larger cultural context in affording or limiting individuals' creativity. The results are discussed in terms of the implications for cultivating creativity in educational and intercultural settings.
Güngör, D., Yildiz, G.Y. & Cavdan, M. Values Moderate the Relations Between Self-Construals and Creativity: The Role of Cultural Fit. Psychol Stud (2022). https://doi.org/10.1007/s12646-022-00651-0
When touching the surface of an object, its spatial structure translates into a vibration on the skin. The perceptual system evolved to translate this pattern into a representation that allows us to distinguish between different materials. Here, we show that the perceptual haptic representation of materials emerges from efficient encoding of the vibratory patterns elicited by the interaction with materials. We trained a deep neural network with unsupervised learning (autoencoder) to reconstruct vibratory patterns elicited by human haptic exploration of different materials. The learned compressed representation (i.e., latent space) allows for classification of material categories (i.e., plastic, stone, wood, fabric, leather/wool, paper, and metal). More importantly, classification performance is higher with perceptual category labels as compared to ground-truth ones, and distances between categories in the latent space resemble perceptual distances, suggesting a similar coding. Crucially, the classification performance and the similarity between the perceptual and the latent space decrease with decreasing compression level. We further show that the temporal tuning of the emergent latent dimensions is similar to properties of human tactile receptors.
Metzger, A., & Toscani, M. (2022). Unsupervised learning of haptic material properties. eLife, 11, e64876. https://doi.org/10.7554/eLife.64876
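The core idea, unsupervised compression of vibratory signals through a latent bottleneck, can be sketched as follows. This is a minimal PyTorch stand-in with assumed input size and layer widths, not the published architecture:

    import torch
    import torch.nn as nn

    class VibrationAutoencoder(nn.Module):
        def __init__(self, n_input=1024, n_latent=16):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Linear(n_input, 256), nn.ReLU(),
                nn.Linear(256, n_latent),      # compressed "latent space"
            )
            self.decoder = nn.Sequential(
                nn.Linear(n_latent, 256), nn.ReLU(),
                nn.Linear(256, n_input),
            )

        def forward(self, x):
            z = self.encoder(x)
            return self.decoder(z), z

    model = VibrationAutoencoder()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # One illustrative training step on random stand-in data; the latent
    # vectors z would afterwards be probed for material-category structure.
    batch = torch.randn(32, 1024)
    optimizer.zero_grad()
    reconstruction, z = model(batch)
    loss = loss_fn(reconstruction, batch)
    loss.backward()
    optimizer.step()

Shrinking n_latent corresponds to the compression-level manipulation described in the abstract.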
The Approximate Number System (ANS) is conceptualized as an innate cognitive system that allows humans to perceive numbers of objects or events (>4) in a fuzzy, imprecise manner. The representation of numbers is assumed to be abstract and not bound to a particular sense. In the present study, we test the assumption of a shared cross-sensory system. We investigated approximate number processing in the haptic modality and compared performance to that of the visual modality. We used a dot comparison task (DCT), in which participants compare two dot arrays and decide which one contains more dots. In the haptic DCT, 67 participants had to compare two simultaneously presented dot arrays with the palms of their hands; in the visual DCT, participants inspected and compared dot arrays on a screen. Tested ratios ranged from 2.0 (larger/smaller number) to 1.1. As expected, in both the haptic and the visual DCT responses similarly depended on the ratio of the numbers of dots in the two arrays. However, on an individual level, we found evidence against medium or stronger positive correlations between “ANS acuity” in the visual and haptic DCTs. A regression model furthermore revealed that besides number, spacing-related features of dot patterns (e.g., the pattern’s convex hull) contribute to the percept of numerosity in both modalities. Our results contradict the strong theory of the ANS solely processing number and being independent of a modality. According to our regression and response prediction model, our results rather point towards a modality-specific integration of number and number-related features.
Ziegler, M., & Drewing, K. (2022). Get in touch with numbers – an approximate number comparison task in the haptic modality. Attention, Perception, & Psychophysics. https://doi.org/10.3758/s13414-021-02427-6
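A response-prediction model of the kind described can be sketched as a logistic regression on number- and spacing-related features; everything below (feature choice, simulated observer, data) is an illustrative assumption, not the authors' model:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n_trials = 500
    log_number_ratio = rng.normal(0, 0.3, n_trials)   # log(n_left / n_right)
    log_hull_ratio = rng.normal(0, 0.3, n_trials)     # log(hull_left / hull_right)

    # Simulated observer weighting number more strongly than convex hull.
    p = 1 / (1 + np.exp(-(3.0 * log_number_ratio + 1.0 * log_hull_ratio)))
    chose_left = rng.random(n_trials) < p

    X = np.column_stack([log_number_ratio, log_hull_ratio])
    fit = LogisticRegression().fit(X, chose_left)
    print(fit.coef_)   # relative weights of number vs. spacing features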
The softness of objects can be perceived through several senses. For instance, to judge the softness of a cat's fur, we not only look at it, we often also run our fingers through its coat. Recently, we have shown that haptically perceived softness covaries with the compliance, viscosity, granularity, and furriness of materials (Dovencioglu, Üstün, Doerschner, & Drewing, 2020). However, it is unknown whether vision can provide similar information about the various aspects of perceived softness. Here, we investigated this question in an experiment with three conditions: in the haptic condition, blindfolded participants explored materials with their hands, in the static visual condition participants were presented with close-up photographs of the same materials, and in the dynamic visual condition participants watched videos of the hand-material interactions that were recorded in the haptic condition. After haptically or visually exploring the materials, participants rated them on various attributes. Our results show a high overall perceptual correspondence among the three experimental conditions. With a few exceptions, this correspondence tended to be strongest between the haptic and dynamic visual conditions. These results are discussed with respect to information potentially available through the senses, or through prior experience, when judging the softness of materials.
Cavdan, M., Drewing, K., & Doerschner, K. (2021). The look and feel of soft are similar across different softness dimensions. Journal of vision, 21(10), 20-20.
Adaptation to delays between actions and sensory feedback is important for efficiently interacting with our environment. Adaptation may rely on predictions of action-feedback pairing (motor-sensory component), or predictions of tactile-proprioceptive sensation from the action and sensory feedback of the action (inter-sensory component). Reliability of temporal information might differ across sensory feedback modalities (e.g. auditory or visual), which in turn influences adaptation. Here, we investigated the role of motor-sensory and inter-sensory components in sensorimotor temporal recalibration for motor-auditory (button press-tone) and motor-visual (button press-Gabor patch) events. In the adaptation phase of the experiment, action-feedback pairs were presented with systematic temporal delays (0 ms or 150 ms). In the subsequent test phase, audio/visual feedback of the action was presented with variable delays. The participants were then asked whether they detected a delay. To disentangle the motor-sensory from the inter-sensory component, we varied movements (active button press or passive depression of button) at adaptation and test. Our results suggest that motor-auditory recalibration is mainly driven by the motor-sensory component, whereas motor-visual recalibration is mainly driven by the inter-sensory component. Recalibration transferred from vision to audition, but not from audition to vision. These results indicate that motor-sensory and inter-sensory components contribute to recalibration in a modality-dependent manner.
Arikan, B. E., van Kemenade, B. M., Fiehler, K., Kircher, T., Drewing, K., & Straube, B. (2021). Different contributions of efferent and reafferent feedback to sensorimotor temporal recalibration. Scientific reports, 11(1), 1-15.
Haptic exploration of objects usually consists of repeated exploratory movements, and our perception of their properties results from the integration of the information gained during each of these single movements. The serial nature of information integration in haptic perception requires that sensory estimates from single exploratory movements are retained in memory. Here we propose an optimal model for the serial integration of information in haptic exploration that takes memory limitations into account. We tested the model by predicting discrimination performance in free and restricted (fixed number of indentations and varied number of switches between the stimuli) explorations of softness. Overall, the model predicts performance well across different exploratory patterns in both free and restricted explorations: it slightly overestimates performance in restricted explorations, whereas its predictions are accurate in free explorations. These results suggest that the integration of information can be well approximated by our model, in particular in free haptic exploration. We further tested whether participants prefer explorations that maximize performance. The model predicts that, with a constant number of indentations, switching between the stimuli increases performance. Our results show that participants increase the number of switches only up to three switches, suggesting a trade-off between muscular switching costs and performance.
Metzger, A., & Drewing, K. (2021, July). A Kalman filter model for predicting discrimination performance in free and restricted haptic explorations. In 2021 IEEE World Haptics Conference (WHC) (pp. 439-444). IEEE.
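A minimal sketch of such an optimal serial-integration model with memory limitations (our generic one-dimensional formulation; the paper's parameterization may differ): each indentation is assumed to yield a noisy compliance estimate with variance r, and memory limitations are modeled as variance q added between indentations.

    def kalman_integrate(estimates, r=1.0, q=0.2):
        """Serially integrate noisy estimates; return final mean and variance."""
        mean, var = estimates[0], r           # initialize with first indentation
        for obs in estimates[1:]:
            var += q                          # memory decay inflates uncertainty
            gain = var / (var + r)            # weight given to the new observation
            mean += gain * (obs - mean)
            var *= (1 - gain)
        return mean, var

    # Illustrative use: five noisy estimates of one stimulus' compliance (a.u.).
    print(kalman_integrate([0.9, 1.1, 1.0, 1.2, 0.95]))

With q = 0 this reduces to simple averaging; q > 0 captures why later estimates carry more weight and why performance saturates with exploration length.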
Müge Cavdan, Robert Ennis, Knut Drewing, Katja Doerschner
Humans typically interact with the environment using bare hands. However, sometimes this is not possible or not preferred, e.g., when wearing protective gloves for work or sensor gloves in mixed/augmented reality (AR). Softness is a particularly relevant property to study here, since its perception draws on tactile and proprioceptive cues and might therefore be highly sensitive to such restrictions. Here we tested how corresponding haptic constraints affect perceived softness. Participants manually explored and rated 10 materials on 15 sensory adjectives under four constraint conditions: bare hand, open-fingered glove, open-fingered glove with rigid sensors, and full glove. The materials represented extreme values on different softness dimensions; the adjectives were chosen to assess these dimensions. Principal Component Analysis (PCA), Procrustes distances, and correlation analyses showed that softness perception is overall highly similar across constraint conditions. However, when we inspected responses on a more detailed level, per material-adjective combination, we observed that the full glove condition differed from the others, especially for judgments on surface softness. Overall, the results suggest that sensor gloves hardly change the perception of different dimensions of softness if the fingertips are left bare.
Cavdan, M., Ennis, R., Drewing, K., & Doerschner, K. (2021, July). Constraining haptic exploration with sensors and gloves hardly changes the multidimensional structure of softness perception. In 2021 IEEE World Haptics Conference (WHC) (pp. 31-36). IEEE.
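The comparison of perceptual spaces across constraint conditions can be illustrated with a PCA-plus-Procrustes sketch; the data shapes and numbers below are placeholders, not the study's ratings:

    import numpy as np
    from sklearn.decomposition import PCA
    from scipy.spatial import procrustes

    rng = np.random.default_rng(2)
    ratings_bare = rng.random((10, 15))       # 10 materials x 15 adjectives
    ratings_glove = ratings_bare + rng.normal(0, 0.05, (10, 15))

    space_bare = PCA(n_components=2).fit_transform(ratings_bare)
    space_glove = PCA(n_components=2).fit_transform(ratings_glove)

    # Disparity near 0 means the two perceptual spaces are nearly identical
    # up to translation, scaling, and rotation.
    _, _, disparity = procrustes(space_bare, space_glove)
    print(disparity)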
Humans can optimize haptic perception by tuning their exploratory behavior. In softness exploration humans use more force when expecting a pair of hard objects as compared to soft ones, and this force control improves softness discrimination. Such force tuning seems to be based on implicit prior information about the upcoming compliance category. In previous studies, prior information was implicitly induced by presenting blocks of trials of the same category (hard or soft). Here, we studied force control when hard and soft stimulus pairs alternate according to a predictable pattern. Participants had to decide which one of two silicone stimuli was softer. Soft and hard trials were presented in random order, in blocks, in alternating order (short pattern), or alternating always two hard and two soft trials (long pattern). We confirmed the finding of force tuning to compliance for blocked as compared to random presentation. The predictable presentation patterns also influenced force control, but not in the expected direction. We conclude that implicit expectations from sequences can be used in force control, but they are not sufficient for successful tuning. A further sequential analysis shows that forces are not only adapted by simple reactive trial-by-trial mechanisms.
Drewing, K., & Zoeller, A. C. (2021, July). Influence of presentation order on force control in softness exploration. In 2021 IEEE World Haptics Conference (WHC) (pp. 19-24). IEEE.
Haptic texture perception is based on sensory information sequentially gathered during several lateral movements (“strokes”). In this process, sensory information of earlier strokes must be preserved in a memory system. We investigated whether this system may be a haptic sensory memory. In the first experiment, participants performed three strokes across each of two textures in a frequency discrimination task. Between the strokes over the first texture, participants explored an intermediate area, which presented either a mask (high-energy tactile pattern) or minimal stimulation (low-energy smooth surface). Perceptual precision was significantly lower with the mask compared with a three-strokes control condition without an intermediate area, approaching performance in a one-stroke-control condition. In contrast, precision in the minimal stimulation condition was significantly better than in the one-stroke control condition and similar to the three-strokes control condition. In a second experiment, we varied the number of strokes across the first stimulus (one, three, five, or seven strokes) and either presented no masking or repeated masking after each stroke. Again, masking between the strokes decreased perceptual precision relative to the control conditions without masking. Precision effects of masking over different numbers of strokes were fit by a proven model on haptic serial integration that modeled masking by repeated disturbances in the ongoing integration. Taken together, results suggest that masking impedes the processes of haptic information preservation and integration. We conclude that a haptic sensory memory, which is comparable to iconic memory in vision, is used for integrating sequentially gathered sensory information.
Drewing, K., & Lezkan, A. (2021). Masking interferes with haptic texture perception from sequential exploratory movements. Attention, Perception, & Psychophysics, 83(4), 1766-1776.
Haptic search is a common everyday task, usually consisting of two processes: target search and target analysis. During target search we need to know where our fingers are in space, remember the already completed path and the outline of the remaining space. During target analysis we need to understand whether the detected potential target is the desired one. Here we characterized the dynamics of exploratory movements in these two processes. In our experiments participants searched for a particular configuration of symbols on a rectangular tactile display. We observed that participants preferentially moved the hand parallel to the edges of the tactile display during target search, which possibly eased orientation within the search space. After a potential target was detected by any of the fingers, there was a higher probability that subsequent exploration was performed by the index or the middle finger. At the same time, these fingers dramatically slowed down. Being in contact with the potential target, the index and the middle finger moved within a smaller area than the other fingers, which rather seemed to move away to leave them space. These results suggest that the middle and the index finger are specialized for fine analysis in haptic search.
Metzger, A., Toscani, M., Valsecchi, M., & Drewing, K. (2021). Target search and inspection strategies in haptic search. IEEE Transactions on Haptics.
Haptic exploration usually involves stereotypical systematic movements that are adapted to the task. Here we tested whether exploration movements are also driven by physical stimulus features. We designed haptic stimuli whose surface relief varied locally in spatial frequency, height, orientation, and anisotropy. In Experiment 1, participants successively explored two stimuli in order to decide whether they were the same or different. We trained a variational autoencoder to predict the spatial distribution of touch duration from the surface relief of the haptic stimuli. The model successfully predicted where participants touched the stimuli. It could also predict participants' touch distribution from the stimulus' surface relief when tested with two new groups of participants, who performed a different task (Exp. 2) or explored different stimuli (Exp. 3). We further generated a large number of virtual surface reliefs (each uniformly expressing a certain combination of features) and correlated the model's responses with stimulus properties to understand the model's preferences, in order to infer which stimulus features were preferentially touched by participants. Our results indicate that haptic exploratory behavior is to some extent driven by the physical features of the stimuli, with, e.g., edge-like structures, vertical and horizontal patterns, and rough regions being explored in more detail.
Metzger, A., Toscani, M., Akbarinia, A., Valsecchi, M., & Drewing, K. (2021). Deep neural network model of haptic saliency. Scientific reports, 11(1), 1-14.
The ability to sample sensory information with our hands is crucial for smooth and efficient interactions with the world. Despite this important role of touch, tactile sensations on a moving hand are perceived as weaker than when presented on the same but stationary hand [1-3]. This phenomenon of tactile suppression has been explained by predictive mechanisms, such as forward models, that estimate future sensory states of the body on the basis of the motor command and suppress the associated predicted sensory feedback [4]. The origins of tactile suppression have sparked a lot of debate, with contemporary accounts claiming that suppression is independent of predictive mechanisms and is instead akin to unspecific gating [5]. Here, we target this debate and provide evidence for sensation-specific tactile suppression due to sensorimotor predictions. Participants stroked with their finger over textured surfaces that caused predictable vibrotactile feedback signals on that finger. Shortly before touching the texture, we applied external vibrotactile probes on the moving finger that either matched or mismatched the frequency generated by the stroking movement. We found stronger suppression of the probes that matched the predicted sensory feedback. These results show that tactile suppression is not limited to unspecific gating but is specifically tuned to the predicted sensory states of a movement.
Führer, E., Voudouris, D., Lezkan, A., Drewing, K., & Fiehler, K. (2021). Tactile suppression stems from sensation-specific sensorimotor predictions. bioRxiv.
Adaptation to delays between actions and sensory feedback is important for efficiently interacting with our environment. Adaptation may rely on predictions of action-feedback pairing (motor-sensory component), or predictions of tactile-proprioceptive sensation from the action and sensory feedback of the action (inter-sensory component). Reliability of temporal information might differ across sensory feedback modalities (e.g. auditory or visual), influencing adaptation. Here, we investigated the role of motor-sensory and inter-sensory components in sensorimotor temporal recalibration for motor-auditory events (button press-tone) and motor-visual events (button press-Gabor patch). In the adaptation phase of the experiment, the action-feedback event pairs were presented with systematic temporal delays (0 ms or 150 ms). In the subsequent test phase, sensory feedback of the action was presented with variable delays. The participants were then asked whether this delay could be detected. To disentangle the motor-sensory from the inter-sensory component, we varied movements (active button press or passive depression of button) at adaptation and test. Our results suggest that motor-auditory recalibration is mainly driven by the motor-sensory component, whereas motor-visual recalibration is mainly driven by the inter-sensory component. Recalibration transferred from vision to audition, but not from audition to vision. These results indicate that motor-sensory and inter-sensory components of recalibration are weighted in a modality-dependent manner.
Arikan, B. E., van Kemenade, B. M., Fiehler, K., Kircher, T., Drewing, K., & Straube, B. (2021). Sensorimotor temporal recalibration: the contribution of motor-sensory and inter-sensory components. bioRxiv.
The softness of objects can be perceived through several senses. For instance, to judge the softness of our cat's fur, we not only look at it, we also run our fingers in idiosyncratic ways through its coat. Recently, we have shown that haptically perceived softness covaries with the compliance, viscosity, granularity, and furriness of materials (Dovencioglu et al., 2020). However, it is unknown whether vision can provide similar information about the various aspects of perceived softness. Here, we investigated this question in an experiment with three conditions: in the haptic condition, blindfolded participants explored materials with their hands, in the visual-static condition participants were presented with close-up photographs of the same materials, and in the visual-dynamic condition participants watched videos of the hand-material interactions that were recorded in the haptic condition. After haptically or visually exploring the materials, participants rated them on various attributes. Our results show a high overall perceptual correspondence between the three experimental conditions. With a few exceptions, this correspondence tended to be strongest between haptic and visual-dynamic conditions. These results are discussed with respect to information potentially available through the senses, or through prior experience, when judging the softness of materials.
Cavdan, M., Drewing, K., & Doerschner, K. (2021). Materials in action: The look and feel of soft. bioRxiv.
In vision, only the information projected onto the central portion of our retina is perceived with high resolution. Therefore, the visual system needs to process the full visual scene with coarse resolution through peripheral vision and shift the eye in order to analyse a selected portion in detail (foveation). This process reduces the complexity of visual processing by serializing detailed analysis. A haptic process analogous to foveation has been described in the behavior of the blind star-nosed mole, which detects potential prey with any of its tactile appendages but analyzes it with a specific pair characterized by higher tactile resolution. Here we tested the hypothesis of haptic foveation behavior in humans. Nine participants searched for a particular configuration of symbols on a planar rigid tactile display. We computed the probability for each finger of touching a potential target after it was previously encountered by any of the other fingers, and the exploration speed of each finger while exploring a potential target. Independent of which finger encountered a potential target first, there was a higher probability that subsequent exploration was performed by the index or the middle finger. At the same time, these fingers dramatically slowed down, suggesting that these specialized fingers are involved in detailed analysis. In a second experiment we tested the hypothesis that foveation is performed to gain information. Ten participants searched either for an easy target (a rough patch among smooth ones) or a difficult one (a hole in a certain corner of a patch). Overall, we replicated the results of the first experiment. Corroborating our hypothesis, specialized detailed analysis was reduced in the easy search, suggesting that foveation behavior was employed less when it provided less information gain. Our results suggest that in haptic search humans employ foveation-like behavior similar to that in vision.
Metzger, A., Toscani, M., Valsecchi, M., & Drewing, K. (2020). Foveation-like behavior in human haptic search. Journal of Vision, 20(11), 1105-1105.
In studies investigating haptic softness perception, participants are typically instructed to explore soft objects by indenting them with their index finger. In contrast, performance with other fingers has rarely been investigated. We wondered which fingers are used in spontaneous exploration and if performance differences between fingers can explain spontaneous usage. In Experiment 1 participants discriminated the softness of two rubber stimuli with hardly any constraints on finger movements. Results indicate that humans use successive phases of different fingers and finger combinations during an exploration, preferring index, middle, and (to a lesser extent) ring finger. In Experiment 2 we compared discrimination thresholds between conditions, with participants using one of the four fingers of the dominant hand. Participants compared the softness of rubber stimuli in a two-interval forced choice discrimination task. Performance with index and middle finger was better as compared to ring and little finger, the little finger was the worst. In Experiment 3 we again compared discrimination thresholds, but participants were told to use constant peak force. Performance with the little finger was worst, whereas performance for the other fingers did not differ. We conclude that in spontaneous exploration the preference of combinations of index, middle, and partly ring finger seems to be well chosen, as indicated by improved performance with the spontaneously used fingers. Better performance seems to be based on both different motor abilities to produce force, mainly linked to using index and middle finger, and different sensory sensitivities, mainly linked to avoiding the little finger.
Zoeller, A. C., & Drewing, K. (2020). A Systematic Comparison of Perceptual Performance in Softness Discrimination with Different Fingers. Attention, Perception, & Psychophysics, 82(7), 3696-3709.
People display systematic affective reactions to specific properties of touched materials. For example, granular materials such as fine sand feel pleasant, while rough materials feel unpleasant. We wondered how far such relationships between sensory material properties and affective responses can be changed by learning. Manipulations in the present experiment aimed at unlearning the previously observed negative relationship between roughness and valence and the positive one between granularity and valence. In the learning phase, participants haptically explored materials that were either very rough or very fine-grained while they simultaneously watched positive or negative stimuli, respectively, from the International Affective Picture System (IAPS). A control group did not interact with granular or rough materials during the learning phase. In the experimental phase, participants rated a representative diverse set of 28 materials according to twelve affective adjectives. We found a significantly weaker relationship between granularity and valence in the experimental group compared to the control group, whereas roughness-valence correlations did not differ between groups. That is, the existing valence of granular materials was unlearned, but not that of rough materials. This points to differences in the strength of perceptuo-affective relations, which we discuss in terms of hard-wired versus learned connections.
Haptic perception involves active exploration usually consisting of repeated stereotypical movements. The choice of such exploratory movements and their parameters are tuned to achieve high perceptual precision. Information obtained from repeated exploratory movements (e.g. repeated indentations of an object to perceive its softness) is integrated but improvement of discrimination performance is limited by memory if the two objects are explored one after the other in order to compare them. In natural haptic exploration humans tend to switch between the objects multiple times when comparing them. Using the example of softness perception here we test the hypothesis that given the same amount of information, discrimination improves if memory demands are lower. In our experiment participants explored two softness stimuli by indenting each of the stimuli four times. They were allowed to switch between the stimuli after every single indentation (7 switches), after every second indentation (3 switches) or only once after four indentations (1 switch). We found better discrimination performance with seven switches as compared to one switch, indicating that humans naturally apply an exploratory strategy which might reduce memory demands and thus leads to improved performance.
Metzger, A., & Drewing, K. (2020, September). Switching Between Objects Improves Precision in Haptic Perception of Softness. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 69-77). Springer, Cham.
When interacting haptically with objects, humans enhance their perception by using prior information to adapt their behavior. When discriminating the softness of objects, humans use higher initial peak forces when expecting harder objects or a smaller difference between the two objects, which increases differential sensitivity. Here we investigated if prior information about constraints in exploration duration yields behavioral adaptation as well. When exploring freely, humans use successive indentations to gather sufficient sensory information about softness. When constraining the number of indentations, also sensory input is limited. We hypothesize that humans compensate limited input in short explorations by using higher initial peak forces. In two experiments, participants performed a 2 Interval Forced Choice task discriminating the softness of two rubber stimuli out of one compliance category (hard, soft). Trials of different compliance categories were presented in blocks containing only trials of one category or in randomly mixed blocks (category expected vs. not expected). Exploration was limited to one vs. five indentations per stimulus (Exp. 1), or to one vs. a freely chosen number of indentations (Exp. 2). Initial peak forces were higher when indenting stimuli only once. We did not find a difference in initial peak forces when expecting hard vs. soft stimuli. We conclude that humans trade off different ways to gather sufficient sensory information for perceptual tasks, integrating prior information to enhance performance.
Zoeller, A. C., & Drewing, K. (2020, September). Systematic Adaptation of Exploration Force to Exploration Duration in Softness Discrimination. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 105-112). Springer, Cham.
When people judge the weight of two objects of equal mass but different size, they perceive the smaller one as being heavier. To date, there is no consensus about the mechanisms that give rise to this size-weight illusion. We recently suggested a model that describes heaviness perception as a weighted average of two sensory heaviness estimates with correlated noise: one estimate derived from mass, the other one derived from density. The density estimate is first derived from mass and size, but at the final perceptual level, perceived heaviness is biased by an object's density, not by its size. Here, we tested the model's prediction that weight discrimination of equal-size objects is better in lifting conditions which are prone to the size-weight illusion as compared to conditions lacking (the essentially uninformative) size information. This is predicted because in these objects density covaries with mass, and according to the model density serves as an additional sensory cue. Participants performed a two-interval forced-choice weight discrimination task. We manipulated the quality of either haptic (Experiment 1) or visual (Experiment 2) size information and measured just-noticeable differences (JNDs). Both for the haptic and the visual illusion, JNDs were lower in lifting conditions in which size information was available. Thus, when heaviness perception can be influenced by an object's density, it is more reliable. This discrimination benefit under conditions that provide the additional information that objects are of equal size is further support for the role of density and the integration of sensory estimates in the size-weight illusion.
Wolf, C., & Drewing, K. (2020). The size-weight illusion comes along with improved weight discrimination. PLOS ONE, 15(7), e0236440.
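In LaTeX notation (ours, not taken from the paper), the model's core computation can be written as a reliability-weighted average of a mass estimate \hat{H}_m and a density estimate \hat{H}_d with correlated noise:

    \hat{H} = w_m \hat{H}_m + (1 - w_m)\,\hat{H}_d, \qquad
    w_m = \frac{\sigma_d^2 - \rho\,\sigma_m\sigma_d}{\sigma_m^2 + \sigma_d^2 - 2\rho\,\sigma_m\sigma_d}

where \sigma_m^2 and \sigma_d^2 are the variances of the two estimates, \rho is their noise correlation, and w_m is the standard minimum-variance weight for correlated cues. Under this reading, better volume information reduces \sigma_d, shifts weight towards the density estimate, and thereby predicts both the illusion and the lower JNDs reported above.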
Exploring an object's shape by touch also renders information about its surface roughness. It has been suggested that shape and roughness are processed distinctly in the brain, a result based on comparing brain activation when exploring objects that differed in one of these features. To investigate the neural mechanisms of top-down control on haptic perception of shape and roughness, we presented the same multidimensional objects but varied the relevance of each feature. Specifically, participants explored two objects that varied in shape (oblongness of cuboids) and surface roughness. They either had to compare the shape or the roughness in an alternative forced-choice task. Moreover, we examined whether the activation strength of the identified brain regions as measured by functional magnetic resonance imaging (fMRI) can predict behavioral performance in the haptic discrimination task. We observed a widespread network of activation for shape and roughness perception comprising bilateral precentral and postcentral gyrus, cerebellum, and insula. Task-relevance of the object's shape increased activation in the right supramarginal gyrus (SMG/BA 40) and the right precentral gyrus (PreCG/BA 44), suggesting that activation in these areas does not merely reflect stimulus-driven processes, such as exploring shape, but also entails top-down controlled processes driven by task-relevance. Moreover, the strength of the SMG/PreCG activation predicted individual performance in the shape but not in the roughness discrimination task. No activation was found for the reversed contrast (roughness > shape). We conclude that macrogeometric properties, such as shape, can be modulated by top-down mechanisms, whereas roughness, a microgeometric feature, seems to be processed automatically.
Mueller, S., de Haas, B., Metzger, A., Drewing, K., & Fiehler, K. (2019). Neural correlates of top‐down modulation of haptic shape versus roughness perception. Human brain mapping, 40(18), 5172-5184.
The memory of an object's property (e.g. its typical colour) can affect its visual perception. We investigated whether memory of the softness of everyday objects influences their haptic perception. We produced bipartite silicone rubber stimuli: one half of each stimulus was covered with a layer of a well-known object (sponge, wood, tennis ball, foam ball); the other half was uncovered silicone. Participants were not aware of the partition. They first used their bare finger to stroke laterally over the covering layer to recognize the well-known object and then indented the other half of the stimulus with a probe to compare its softness to that of an uncovered silicone stimulus. Across four experiments with different methods we showed that silicone stimuli covered with a layer of rather hard objects (tennis ball and wood) were perceived as harder than the same silicone stimuli covered with a layer of rather soft objects (sponge and foam ball), indicating that haptic perception of softness is affected by memory.
Metzger, A., & Drewing, K. (2019). Memory influences haptic perception of softness. Scientific reports, 9(1), 1-10.
Haptic search is a common everyday task. Here we characterize the movement dynamics in haptic search. Participants searched for a particular configuration of symbols on a tactile display. We compared the exploratory behavior of the fingers in proximity to potential targets: when any of the fingers encountered a potential target, there was a higher probability that subsequent exploration was performed by the index or the middle finger. At the same time, the middle and the index fingers dramatically slowed down. Being in contact with the potential target, the index and the middle finger moved within a smaller area than the other fingers, which rather seemed to move away to leave them space. Our results corroborate a previous hypothesis [1] that haptic search consists of two phases: a process of target search using all fingers, and a target analysis using the middle and the index finger, which might be specialized for fine analysis.
Metzger, A., Toscani, M., Valsecchi, M., & Drewing, K. (2019, July). Dynamics of exploration in haptic search. In 2019 IEEE World Haptics Conference (WHC) (pp. 277-282). IEEE.
When people judge the temporal order (TOJ task) of two tactile stimuli at the two hands while their hands are crossed, performance is much worse than with uncrossed hands [1]. This crossed-hands deficit is widely considered to indicate interference of external spatial coordinates with body-centered coordinates in the localization of touch [2]. Similar deficits have also been observed when people are only about to move their hands towards a crossed position [3]-[5], suggesting a predictive update of external spatial coordinates. Here, we extend the investigation of the dynamics of external coordinates during hand movement. Participants performed a TOJ task while they executed an uncrossing or a crossing movement, and during presentation of the TOJ stimuli the present posture of the hands was crossed, uncrossed or in-between. Present, future and past crossed-hands postures decreased performance in the TOJ task, suggesting that the update of external spatial coordinates of touch includes both predictive processes and processes that preserve the recent past. In addition, our data corroborate the flip model of crossed-hands deficits [1], and suggest that more pronounced deficits come along with higher time requirements to resolve the interference.
K. Drewing, F. Hartmann and J. H. M. Vroomen, "The crossed-hands deficit in temporal order judgments occurs for present, future, and past hand postures," 2019 IEEE World Haptics Conference (WHC), 2019, pp. 145-150, doi: 10.1109/WHC.2019.8816125.
Haptic research has traditionally equated softness with compliance. However, in a recent study we suggested that compliance is not the only perceived object dimension underlying what is commonly called softness [1]. Here, we investigate how the different perceptual dimensions of softness affect how materials are haptically explored. Participants freely explored and rated 19 materials on 15 adjectives. The adjectives defined different perceptual tasks by being associated with different softness dimensions. Materials were chosen to represent extreme values separately for each dimension; some materials served as controls. Hand movements were recorded on video and subsequently categorized into different exploratory procedures (EPs). A multivariate analysis of variance (MANOVA) yielded significant effects of material, of the perceptual task, and of their interaction. Taken together, the results suggest that participants actively adapt their EPs to both the type of material being explored and the judged softness dimension, and thus support the notion of different dimensions of softness.
Cavdan, M., Doerschner, K., & Drewing, K. (2019, July). The many dimensions underlying perceived softness: How exploratory procedures are influenced by material and the perceptual task. In 2019 IEEE World Haptics Conference (WHC) (pp. 437-442). IEEE.
The memory of an object's property (e.g. its typical colour) can affect its visual perception. We investigated whether memory of the softness of everyday objects influences their haptic perception. We produced bipartite silicone rubber stimuli: one half of each stimulus was covered with a layer of a well-known object (sponge, wood, tennis ball, foam ball); the other half was uncovered silicone. Participants were not aware of the partition. They first used their bare finger to stroke laterally over the covering layer to recognize the well-known object and then indented the other half of the stimulus with a probe to compare its softness to that of an uncovered silicone stimulus. Across four experiments with different methods we showed that silicone stimuli covered with a layer of rather hard objects (tennis ball and wood) were perceived as harder than the same silicone stimuli covered with a layer of rather soft objects (sponge and foam ball), indicating that haptic perception of softness is affected by memory.
Metzger, A., & Drewing, K. (2019, April). Haptic Perception of Softness Is Influenced by Memory. In Perception (Vol. 48, p. 110). London, England: SAGE Publications.
When haptically exploring softness, humans use higher peak forces when indenting harder versus softer objects. Here, we investigated the influence of different channels and types of prior knowledge on initial peak forces. Participants explored two stimuli (hard vs. soft) and judged which was softer. In Experiment 1 participants received either semantic (the words "hard" and "soft"), visual (video of indentation), or recurring prior information (blocks of harder or softer pairs only). In a control condition no prior information was given (randomized presentation). In the recurring condition participants used higher initial forces when exploring harder stimuli. No effects were found in the control and semantic conditions. With visual prior information, participants used less force for harder objects. We speculate that these findings reflect differences between implicit knowledge induced by recurring presentation and explicit knowledge induced by visual and semantic information. To test this hypothesis, we investigated whether explicit prior information interferes with implicit information in Experiment 2. Two groups of participants discriminated the softness of harder or softer stimuli in two conditions (blocked and randomized). The interference group received additional explicit information during the blocked condition; the implicit-only group did not. Implicit prior information was only used for force adaptation when no additional explicit information was given, whereas explicit information interfered with movement adaptation. The integration of prior knowledge into exploration only seems possible when prior knowledge is induced implicitly, not explicitly.
Zoeller, A. C., Lezkan, A., Paulun, V. C., Fleming, R. W., & Drewing, K. (2019). Integration of prior knowledge during haptic exploration depends on information type. Journal of vision, 19(4), 20-20.
In haptic perception information is often sampled serially (e.g. a stimulus is repeatedly indented to estimate its softness), requiring that sensory information is retained and integrated over time. Hence, the integration of sequential information is likely affected by memory. Particularly, when two sequentially explored stimuli are compared, the integration of information on the second stimulus might be determined by the fading representation of the first stimulus. We investigated how the exploration length of the first stimulus and a temporal delay affect the contributions of sequentially gathered estimates of the second stimulus in haptic softness discrimination. Participants successively explored two silicone rubber stimuli by indenting the first stimulus 1 or 5 times and the second stimulus always 3 times. In an additional experiment we introduced a 5-s delay after the first stimulus was indented 5 times. We show that the longer the first stimulus is explored, the more estimates of the second stimulus' softness contribute to the discrimination of the two stimuli, independent of the delay. This suggests that the exploration length of the first stimulus influences the strength of its representation, which persists for at least 5 s, and determines how much information about the second stimulus is exploited for the comparison.
Metzger, A., & Drewing, K. (2019). Effects of stimulus exploration length and time on the integration of information in haptic softness discrimination. IEEE transactions on haptics, 12(4), 451-460.
Due to limitations in perceptual processing, information relevant to momentary task goals is selected from the vast amount of available sensory information by top-down mechanisms (e.g. attention) that can increase perceptual performance. We investigated how covert attention affects perception of 3D objects in active touch. In our experiment, participants simultaneously explored the shape and roughness of two objects in sequence, and were told afterwards to compare the two objects with regard to one of the two features. To direct the focus of covert attention to the different features we manipulated the expectation of a shape or roughness judgment by varying the frequency of trials for each task (20%, 50%, 80%), then we measured discrimination thresholds. We found higher discrimination thresholds for both shape and roughness perception when the task was unexpected, compared to the conditions in which the task was expected (or both tasks were expected equally). Our results suggest that active touch perception is modulated by expectations about the task. This implies that despite fundamental differences, active and passive touch are affected by feature selective covert attention in a similar way.
Metzger, A., Mueller, S., Fiehler, K., & Drewing, K. (2019). Top-down modulation of shape and roughness discrimination in active touch by covert attention. Attention, Perception, & Psychophysics, 81(2), 462-475.
Active finger movements play a crucial role in natural haptic perception. For the perception of different haptic properties people use different well-chosen movement schemes (Lederman & Klatzky, 1987). The haptic property of softness is stereotypically judged by repeatedly pressing one's finger against an object's surface, actively indenting the object. It has been shown that people adjust the peak indentation forces of their pressing movements to the expected softness of a stimulus in order to improve perception (Kaim & Drewing, 2011). Here, we aim to clarify the mechanisms underlying such adjustments. We disentangle how people modulate executed peak indentation forces depending on predictive versus sensory signals to softness, and investigate the influence of the participants' motivational state on movement adjustments. In Experiment 1, participants performed a 2AFC softness discrimination task for stimulus pairs from one of four softness categories. We manipulated the predictability of the softness category. Either all stimuli of the same category were presented in a blocked fashion, which allowed predicting the softness category of the upcoming pair (predictive signals high), or stimuli from different categories were randomly intermixed, which made prediction impossible (predictive signals low). Sensory signals to the softness category of the two stimuli in a pair are gathered during exploration. We contrasted the first indentation (sensory signals low) and the last indentation (sensory signals high) in order to examine the effect of sensory signals.
Lezkan, A., Metzger, A., & Drewing, K. (2018). Active haptic exploration of softness: Indentation force is systematically related to prediction, sensation and motivation. Frontiers in integrative neuroscience, 12, 59.
Although the natural haptic perception of textures includes active finger movements, it is unclear how closely perception and movements are linked. Here we investigated this question using oriented textures. Textures that are composed of periodically repeating grooves have a clear orientation defined by the grooves. The direction of finger movement relative to texture orientation determines the availability of temporal cues to the spatial period of the texture. These cues are absent during movements directed in line with texture orientation, whereas movements orthogonal to texture orientation maximize the temporal frequency of stimulation, which may optimize temporal cues. In Experiment 1 we tested whether texture perception becomes more precise the more orthogonal the movement direction is to the texture. We systematically varied the movement direction within a 2IFC spatial period discrimination task. As expected, perception was more precise (lower discrimination thresholds) when finger movements were directed closer to the texture orthogonal as compared to parallel to the texture. In Experiment 2 we investigated whether people adjust movement directions to the texture orthogonal in free exploration. We recorded movement directions during free exploration of standard and comparison gratings. The standard gratings were clearly oriented; the comparison gratings did not have a clear orientation defined by grooves. Participants adjusted movement directions to the texture orthogonal only for clearly oriented textures (standards). The adjustment to the texture orthogonal was present in the final movement but not in the first movement. This suggests that movement adjustment is based on sensory signals for texture orientation that were gathered over the course of exploration. In Experiment 3 we assessed whether the perception of texture orientation and movement adjustments are based on shared sensory signals. We determined perceptual thresholds for orientation discrimination and computed 'movometric' thresholds from the stroke-by-stroke adjustment of movement direction. Perception and movements were influenced by a common factor, the spatial period, suggesting that the same sensory signals for texture orientation contribute to both. We conclude that people optimize texture perception by adjusting their movements in directions that maximize temporal cue frequency. Adjustments are performed on the basis of sensory signals that are also used for perception.
Lezkan, A., & Drewing, K. (2018). Interdependences between finger movement direction and haptic perception of oriented textures. PLoS ONE, 13(12), e0208988.
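The relation between movement direction and the availability of temporal cues can be stated compactly. A minimal sketch, assuming straight strokes at constant speed over a grating of fixed spatial period; the names and example values are mine:

```python
import numpy as np

def temporal_frequency(speed_mm_s, spatial_period_mm, angle_deg):
    """Temporal frequency (Hz) of ridge stimulation during a straight stroke.

    angle_deg is the angle between movement direction and the texture
    orthogonal: 0 deg crosses the grooves at right angles, 90 deg moves
    in line with the grooves and yields no temporal cue.
    """
    return speed_mm_s * abs(np.cos(np.radians(angle_deg))) / spatial_period_mm

print(temporal_frequency(60.0, 2.0, 0.0))   # 30.0 Hz, orthogonal stroke
print(temporal_frequency(60.0, 2.0, 90.0))  # ~0 Hz, stroke along the grooves
```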
When estimating the softness of an object by active touch, humans typically indent the object's surface several times with their finger, applying higher peak indentation forces when they expect to explore harder as compared to softer stimuli [1]. Here, we compared how different types of prior knowledge differentially influence exploratory forces in softness discrimination. On each trial, participants successively explored two silicone rubber stimuli which were either both relatively soft or both relatively hard, and judged which of the two was softer. We measured peak forces of the first indentation. In the control condition, participants obtained no information about whether the upcoming stimulus pair would be from the hard or the soft category. In three test conditions, participants received implicit (pairs from the same category were blocked), semantic (the words "soft" and "hard"), or visual prior knowledge about the softness category. Visual information was provided by displaying the rendering of a compliant object deformed by a probe. Given implicit information, participants again used significantly more force in their first touch when exploring harder as compared to softer objects. Surprisingly, when given visual information, participants used significantly less force in the first touch when exploring harder objects. There was no effect when participants were given semantic information. We conclude that different types of prior knowledge influence exploration behavior in very different ways. Thus, the mechanisms through which prior knowledge is integrated into the exploration process might be more complex than expected.
Zöller, A. C., Lezkan, A., Paulun, V. C., Fleming, R. W., & Drewing, K. (2018, June). Influence of different types of prior knowledge on haptic exploration of soft objects. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 413-424). Springer, Cham.
When judging the heaviness of two objects with equal mass, people perceive the smaller and denser of the two as being heavier. Despite the large number of theories covering bottom-up and top-down approaches, none of them can fully account for all aspects of this size-weight illusion and thus for human heaviness perception. Here we propose a new maximum-likelihood estimation model which describes the illusion as the weighted average of two heaviness estimates with correlated noise: one estimate derived from the object's mass, and the other from the object's density, with the estimates' weights based on their relative reliabilities. While information about mass can be perceived directly, information about density will in some cases first have to be derived from mass and volume. However, according to our model, at the crucial perceptual level heaviness judgments will be biased by the object's density, not by its size. In two magnitude estimation experiments, we tested model predictions for the visual and the haptic size-weight illusion. Participants lifted objects which varied in mass and density. We additionally varied the reliability of the density estimate by varying the quality of either visual (Experiment 1) or haptic (Experiment 2) volume information. As predicted, with increasing quality of volume information, heaviness judgments were increasingly biased towards the object's density: objects of the same density were perceived as more similar, and big objects were perceived as increasingly lighter than small (denser) objects of the same mass. This perceived difference increased with an increasing difference in density.
Wolf, C., Bergmann Tiest, W. M., & Drewing, K. (2018). A mass-density model can account for the size-weight illusion. PLoS ONE, 13(2), e0190624.
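The core computation of the model can be sketched in a few lines. This is a simplified version that ignores the correlated-noise term mentioned in the abstract; all names and numbers are illustrative:

```python
def perceived_heaviness(h_mass, h_density, sigma_mass, sigma_density):
    """Reliability-weighted average of a mass-based and a density-based
    heaviness estimate (reliability = inverse variance). Simplified:
    the published model additionally allows correlated noise."""
    r_mass = 1.0 / sigma_mass ** 2
    r_density = 1.0 / sigma_density ** 2
    w_density = r_density / (r_mass + r_density)
    return (1.0 - w_density) * h_mass + w_density * h_density

# Better volume information makes the density estimate more reliable,
# biasing judgments more strongly towards density (hypothetical values):
print(perceived_heaviness(1.0, 1.4, 0.2, 0.4))  # poor volume info  -> 1.08
print(perceived_heaviness(1.0, 1.4, 0.2, 0.2))  # good volume info  -> 1.20
```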
Redundant estimates of an environmental property derived simultaneously from different senses or cues are typically integrated according to the maximum likelihood estimation model (MLE): Sensory estimates are weighted according to their reliabilities, maximizing the percept’s reliability. Mechanisms underlying the integration of sequentially derived estimates from one sense are less clear. Here we investigate the integration of serially sampled redundant information in softness perception. We developed a method to manipulate haptically perceived softness of silicone rubber stimuli during bare-finger exploration. We then manipulated softness estimates derived from single movement segments (indentations) in a multisegmented exploration to assess their contributions to the overall percept. Participants explored two stimuli in sequence, using 2–5 indentations, and reported which stimulus felt softer.
Metzger, A., Lezkan, A., & Drewing, K. (2018). Integration of serial sensory information in haptic perception of softness. Journal of Experimental Psychology: Human Perception and Performance, 44(4), 551.
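As a sketch of the general logic behind such weight measurements (my notation, not the paper's exact analysis): a small compliance discrepancy delta_i is injected into indentation i, and the induced shift of the point of subjective equality gives that indentation's weight in the overall percept,

```latex
w_i = \frac{\Delta\mathrm{PSE}_i}{\delta_i}, \qquad \sum_i w_i \approx 1
```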
Participants manually explored 47 solid, fluid, and granular materials and rated them according to a list of sensory and affective attributes. In principal component analyses (PCA) of the sensory ratings, we extracted six dimensions: Fluidity, Roughness, Deformability, Fibrousness, Heaviness, and Granularity. PCAs on affective ratings revealed Valence, Arousal, and Dominance. The PCAs explained 87 percent of the variance or more. We found sensory dimensions beyond the surface characteristics on which many previous studies had focused, and the affective dimension of Dominance, which had not been reported previously, probably due to our wide range of materials. Experiment 1 investigated a single sample; Experiment 2 distinguished between participants with more versus less outdoor experience during childhood.
Drewing, K., Weyel, C., Celebi, H., & Kaya, D. (2018). Systematic relations between affective and sensory material dimensions in touch. IEEE Transactions on Haptics, 11(4), 611-622.
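A minimal sketch of the dimensional analysis used in this and the related 2017 study: PCA on a materials-by-attributes rating matrix. The data below are random placeholders with the shapes reported in the abstracts (47 materials, 32 sensory attributes):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
ratings = rng.normal(size=(47, 32))        # placeholder for mean ratings

pca = PCA(n_components=6)
scores = pca.fit_transform(ratings)        # material coordinates per dimension
loadings = pca.components_                 # attribute loadings per dimension
print(pca.explained_variance_ratio_.sum()) # proportion of variance explained
```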
For different types of textures, judged roughness has been shown to be an inverted U-shaped function of inter-element spacing when texture amplitude is low [1, 2]. This may be due to an interplay of two "components" that contribute to the skin's spatial deformation, and thus to a spatial-intensive code to roughness [1, 3, 4]: (1) deformation increases with the depth of the finger's intrusion between elements, which increases with inter-element spacing until the finger contacts the ground; and (2) skin deformation decreases with a decreasing number of inter-element gaps being simultaneously under the skin, i.e. with the texture's spatial frequency (which is negatively correlated with inter-element spacing). The present study systematically tested these ideas. We presented participants with different series of 3D-printed rectangular grating stimuli, in which the width of the grating's grooves varied and the spatial frequency of grooves was constant, or vice versa. Participants touched the stimuli without lateral movement and judged roughness using magnitude estimation. As predicted and previously observed, judged roughness increased with groove width and groove frequency. However, the predicted increase with groove frequency was only found for frequencies below about 0.5 mm⁻¹. For larger frequencies, roughness decreased with increasing frequency. The decrease is at odds with findings from earlier studies that used aluminum rather than plastic gratings [5]. The results corroborate the assumption that the area of skin deformation plays a crucial role for roughness, but at the same time point to the influence of subtle differences between materials that should be investigated in the future.
Drewing, K. (2018, June). Judged roughness as a function of groove frequency and groove width in 3D-printed gratings. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 258-269). Springer, Cham.
Demographic changes in most developed societies have fostered research on functional aging. While cognitive changes have been characterized elaborately, the understanding of perceptual aging lags behind. We investigated age effects on the mechanisms by which multiple sources of sensory information are merged into a common percept. We studied visuo-haptic integration in a length discrimination task. A total of 24 young (20–25 years) and 27 senior (69–77 years) adults compared standard stimuli to appropriate sets of comparison stimuli. Standard stimuli were explored under visual, haptic, or visuo-haptic conditions. The task procedure allowed introducing an intersensory conflict by anamorphic lenses. Comparison stimuli were exclusively explored haptically. We derived psychometric functions for each condition, determining points of subjective equality and discrimination thresholds. We notably evaluated visuo-haptic perception by different models of multisensory processing, i.e., the Maximum-Likelihood-Estimate model of optimal cue integration, a suboptimal integration model, and a cue switching model. Our results support robust visuo-haptic integration across the adult lifespan. We found suboptimal weighted averaging of sensory sources in young adults; senior adults, however, exploited differential sensory reliabilities more efficiently to optimize thresholds. Indeed, evaluation of the MLE model indicates that young adults underweighted visual cues by more than 30%; in contrast, visual weights of senior adults deviated only by about 3% from predictions. We suggest that close to optimal multisensory integration might contribute to successful compensation for age-related sensory losses and provides a critical resource. Differentiation between multisensory integration during healthy aging and age-related pathological challenges on the sensory systems awaits further exploration.
Billino, J., & Drewing, K. (2018). Age effects on visuo-haptic length discrimination: evidence for optimal integration of senses in senior adults. Multisensory Research, 31(3-4), 273-300.
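For reference, these are the standard optimal-integration (MLE) benchmarks against which performance in this and several of the following studies is evaluated, with the sigmas estimated from unimodal discrimination thresholds:

```latex
\hat{S}_{VH} = w_V\,\hat{S}_V + w_H\,\hat{S}_H, \qquad
w_V = \frac{\sigma_H^2}{\sigma_V^2 + \sigma_H^2} = 1 - w_H, \qquad
\sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2}
  \le \min(\sigma_V^2, \sigma_H^2)
```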
When textures are defined by repetitive small spatial structures, exploration covering a greater extent will lead to signal repetition. We investigated how sensory estimates derived from these signals are integrated. In Experiment 1, participants stroked with the index finger one to eight times across two virtual gratings. Half of the participants discriminated according to ridge amplitude, the other half according to ridge spatial period. In both tasks, just noticeable differences (JNDs) decreased with an increasing number of strokes. These gains from additional exploration were more than three times smaller than predicted for optimal observers who have access to equally reliable, and therefore equally weighted, estimates for the entire exploration. We assume that the sequential nature of the exploration leads to memory decay of sensory estimates. Thus, participants compare an overall estimate of the first stimulus, which is affected by memory decay, to stroke-specific estimates during the exploration of the second stimulus. This was tested in Experiments 2 and 3. The spatial period of one stroke across either the first or second of two sequentially presented gratings was slightly discrepant from the periods in all other strokes. This allowed calculating the weights of stroke-specific estimates in the overall percept. As predicted, weights were approximately equal for all strokes in the first stimulus, while weights decreased during the exploration of the second stimulus. A quantitative Kalman filter model of our assumptions was consistent with the data. Hence, our results support an optimal integration model for sequential information, given that memory decay affects comparison processes.
Lezkan, A., & Drewing, K. (2018). Processing of haptic texture information over sequential exploration movements. Attention, Perception, & Psychophysics, 80(1), 177-192.
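One way to make the memory-decay account concrete is a Kalman-filter-style running estimate in which the stored variance is inflated between strokes. This is a sketch under my own simplifying assumptions, not the paper's fitted model; with decay = 1.0 it reduces to the equal-weight ideal observer (variance shrinking as 1/n):

```python
def integrate_strokes(samples, sigma_sample=1.0, decay=1.5):
    """Sequentially integrate noisy per-stroke samples of a texture property.

    Between strokes the stored estimate's variance is inflated by `decay`
    to model memory decay; each new sample is then combined with the stored
    estimate via the Kalman gain. Parameter values are illustrative.
    """
    est, var = samples[0], sigma_sample ** 2
    for x in samples[1:]:
        var *= decay                          # memory decay between strokes
        gain = var / (var + sigma_sample ** 2)
        est += gain * (x - est)
        var *= (1.0 - gain)
    return est, var

print(integrate_strokes([1.0, 1.2, 0.9, 1.1]))             # with decay
print(integrate_strokes([1.0, 1.2, 0.9, 1.1], decay=1.0))  # ideal observer
```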
When small holes are felt with the tongue, they are perceived to be larger compared with when felt with the index finger. This oral illusion has not yet been consistently explained. From present action-specific accounts of perception, we derived a high-level sticking-action hypothesis to explain the oral illusion. In 5 experiments, we contrasted the predictions of this hypothesis with predictions from the low-level bending hypothesis, which states that felt hole size decreases with decreasing bending of the skin at the hole's edges. Results from Experiments 1 to 3 showed that felt hole size increases with the pliability of the exploring effector (tongue > index finger > big toe, big fingers > small fingers), which affects skin bending, and that size perception with the highly pliable tongue is more accurate than with the less pliable finger and toe. Experiment 4 showed that holes of intermediate size are perceived to be larger with the tongue's tip than with its dorsum. Finally, exploration styles that lessen the skin's bending (using low vs. high tongue forces in Experiment 5) decreased perceived hole size. Overall, the results favor the low-level bending hypothesis over the high-level sticking-action hypothesis.
When touching an object, we focus on some of its parts rather than touching the whole object's surface, i.e., some parts are more salient than others. Here we investigated how different physical properties of rigid, plastic, relieved textures determine haptic exploratory behavior. We produced haptic stimuli whose textures were locally defined by random distributions of four independent features: amplitude, spatial frequency, orientation, and isotropy. Participants explored two stimuli one after the other; to promote exploration, we asked them to judge their similarity. We used a linear regression model to relate the features and their gradients to the exploratory behavior (the spatial distribution of touch duration). The model predicts human behavior significantly better than chance, suggesting that exploratory movements are to some extent driven by the low-level features we investigated. Remarkably, the contribution of each predictor changed as a function of the spatial scale at which it was defined, showing that haptic exploration preferences are spatially tuned, i.e., specific features are most salient at different spatial scales.
Metzger, A., Toscani, M., Valsecchi, M., & Drewing, K. (2018, June). Haptic saliency model for rigid textured surfaces. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 389-400). Springer, Cham.
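The saliency model is essentially a regression from local texture features (and their gradients) onto where and how long the finger dwells. A sketch with random placeholder data; the shapes and variable names are mine:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 8))   # per-location predictors: amplitude, spatial
                                # frequency, orientation, isotropy + gradients
y = rng.normal(size=500)        # target: touch duration per texture location

model = LinearRegression().fit(X, y)
print(model.coef_)              # contribution of each feature to exploration
print(model.score(X, y))        # R^2: how well features predict behavior
```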
When a short flash occurs in spatial alignment with a moving object, the moving object is seen ahead of the stationary one. Similar to this visual "flash-lag effect" (FLE), it has recently been observed for the haptic sense that participants judge a moving hand to be ahead of a stationary hand when judged at the moment of a short vibration ("haptic flash") that is applied when the two hands are spatially aligned. We further investigated the haptic FLE. First, we compared participants' performance in two isosensory visual or haptic conditions, in which the moving object and flash were presented in a single modality (visual: sphere and short color change; haptic: hand and vibration), and two bisensory conditions, in which the moving object was presented in both modalities (hand aligned with a visible sphere), but the flash was presented only visually or only haptically. The experiment aimed to disentangle the contributions of the flash's and the object's modalities to the FLEs in haptics versus vision. We observed an FLE when the flash was displayed visually, both when the moving object was visual and when it was visuo-haptic. Because the position of a visual flash, but not of an analogous haptic flash, is misjudged relative to the same visuo-haptic moving object, the difference between visual and haptic conditions can be fully attributed to characteristics of the flash. The second experiment confirmed that a haptic FLE can be observed depending on flash characteristics: the FLE increases with decreasing intensity of the flash (slightly modulated by flash duration), as had previously been observed for vision. These findings underline the high relevance of flash characteristics in different senses, and thus fit well with the temporal-sampling framework, in which the flash triggers a high-level, supra-modal process of position judgment, the time point of which further depends on the processing time of the flash.
Drewing, K., Hitzel, E., & Scocchia, L. (2018). The haptic and the visual flash-lag effect and the role of flash characteristics. PLoS ONE, 13(1), e0189291.
In haptic perception, information is often sampled serially over a certain interval of time. For example, a stimulus is repeatedly indented to repeatedly estimate its softness. Although such redundant estimates are equally reliable, they seem to contribute differently to the overall haptic percept in a comparison task. When comparing the softness of two silicone rubber stimuli, the within-stimulus weights of estimates of the second stimulus' softness decrease during the exploration. Here we test the hypothesis that such a decrease of weights depends on the representation strength of the first stimulus' softness. We varied the length of the first stimulus' exploration. Participants subsequently explored two silicone rubber stimuli by indenting the first stimulus (comparison) 1 or 5 times and the second stimulus (standard) always 3 times. We assessed the weights of indentation-specific estimates from the second stimulus by manipulating perceived softness during single indentations. Our results show that the longer the first stimulus is explored, the more estimates of the second stimulus' softness can be included in the comparison of the two stimuli. This suggests that the exploration length of the first stimulus determines the strength of its representation, which influences the decrease of weights of indentation-specific estimates of the second stimulus.
Metzger, A., & Drewing, K. (2017, June). The longer the first stimulus is explored in softness discrimination the longer it can be compared to the second one. In 2017 IEEE World Haptics Conference (WHC) (pp. 31-36). IEEE.
This chapter provides a first overview of the state of knowledge on the joint processing of information from the different human senses. It covers processes of multisensory integration of redundant information and of multisensory combination, the problem of assigning corresponding information from different senses to one another, mechanisms of recalibration between the senses, the role of attention, and the neurophysiological foundations of multisensory processing. Examples from ergonomics and clinical practice illustrate the applicability of these findings.
Keywords: multisensory; multimodal; intersensory; redundant information; signal integration; adaptation; recalibration; crossmodal attention; multisensory areas
Drewing, K. (2017). Multisensorische Informationsverarbeitung. In Allgemeine Psychologie (pp. 75-101). Springer, Berlin, Heidelberg.
Participants explored a representative set of 47 solid, fluid, and granular materials and rated them according to a list of 32 perceptual and 20 affective attributes. In a principal component analysis (PCA) of the perceptual ratings, we extracted six dimensions (Fluidity, Roughness, Deformability, Fibrousness, Heaviness, and Granularity), which together explained 88% of the variance. A PCA on affective ratings revealed the dimensions Valence, Arousal, and Dominance, explaining 92% of the variance. Greater Valence was significantly associated with reduced Roughness, greater Arousal with more Fluidity, and greater Dominance with decreasing Deformability and decreasing Heaviness. Overall, the present study demonstrates that the range of affective responses to touched materials is broader than previously assumed, and that these affective responses are systematically associated with certain perceptual dimensions.
Drewing, K., Weyel, C., Celebi, H., & Kaya, D. (2017, June). Feeling and feelings: Affective and perceptual dimensions of touched materials and their connection. In 2017 IEEE World Haptics Conference (WHC) (pp. 25-30). IEEE.
Perceiving the sensory consequences of one's own actions is essential to successfully interact with the environment. Previous studies compared self-generated (active) and externally generated (passive) movements to investigate the processing of voluntary action–outcomes. Increased temporal binding (intentional binding) as well as increased detection of delays between action and outcome have been observed for active compared to passive movements. Using transcranial direct current stimulation (tDCS), it has been shown that left-hemispheric anodal stimulation decreased the intentional binding effect. However, whether the left hemisphere contributes to delay detection performance between action and outcome is unknown. We investigated polarization-dependent effects of left and right frontoparietal tDCS on detecting temporal action–outcome discrepancies. We applied anodal and cathodal stimulation to frontal (F3/F4), parietal (CP3/CP4) and frontoparietal (F3/CP4) areas. After stimulation, participants were presented with visual feedback at various delays after a key press. They had to report whether they detected a delay between the key press and the feedback. In half of the trials the key press was self-initiated, in the other half it was externally generated. A main effect of electrode location indicated the highest detection performance after frontal stimulation. Furthermore, we found that the advantage for active versus passive conditions was larger for left-hemispheric anodal stimulation as compared to cathodal stimulation. Whereas the frontal cortex is related to delay detection performance in general, hemispheric differences seem to support the differentiation of self-initiated versus externally generated movement consequences.
Straube, B., Schülke, R., Drewing, K., Kircher, T., & van Kemenade, B. M. (2017). Hemispheric differences in the processing of visual consequences of active vs. passive movements: a transcranial direct current stimulation study. Experimental Brain Research, 235(10), 3207-3216.
Past sensory experience can influence present perception. We studied the effect of adaptation in haptic softness perception. Participants compared two silicone rubber stimuli, a reference and a comparison stimulus, by indenting them simultaneously with the index fingers of their two hands, and decided which one felt softer. In adaptation conditions, the index finger that explored the reference stimulus had previously been adapted to another rubber stimulus. The adaptation stimulus was indented 5 times with a force of >15 N; thus, the two index fingers had a different sensory past. In baseline conditions there was no previous adaptation. We measured the Points of Subjective Equality (PSEs) of one reference stimulus to a set of comparison stimuli. We used four different adaptation stimuli: one was harder, two were softer, and one had approximately the same compliance as the reference stimulus. PSEs shifted as a function of the compliance of the adaptation stimulus: the reference was perceived to be softer when the finger had been adapted to a harder stimulus, and it was perceived to be harder after adaptation to a softer stimulus. We conclude that recent sensory experience causes a shift of haptically perceived softness away from the softness of the adaptation stimulus. The finding that perceived softness is susceptible to adaptation suggests that there might be neural channels tuned to different softness values and that softness is an independent primary perceptual quality.
Metzger, A., & Drewing, K. (2016, July). Haptic aftereffect of softness. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 23-32). Springer, Cham.
In haptic perception, sensory signals depend on how we actively move our hands. For textures with periodically repeating grooves, movement direction can determine temporal cues to spatial frequency. Moving in line with texture orientation does not generate temporal cues. In contrast, moving orthogonally to texture orientation maximizes the temporal frequency of stimulation, and thus optimizes temporal cues. Participants performed a spatial frequency discrimination task between stimuli of two types. The first type showed the described relationship between movement direction and temporal cues; the second stimulus type did not. We expected that when temporal cues can be optimized by moving in a certain direction, movements will be adjusted to this direction. However, movement adjustments were assumed to be based on sensory information, which accumulates over the exploration process. We analyzed three individual segments of the exploration process. As expected, participants only adjusted movement directions in the final exploration segment and only for the stimulus type in which movement direction influenced temporal cues. We conclude that sensory signals on the texture orientation are used online during exploration in order to adjust subsequent movements. Once sufficient sensory evidence on the texture orientation was accumulated, movements were directed to optimize temporal cues.
Lezkan, A., & Drewing, K. (2016, July). Going against the grain–Texture orientation affects direction of exploratory movement. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 430-440). Springer, Cham.
In the flash-lag illusion, a brief visual flash and a moving object presented at the same location appear to be offset, with the flash trailing the moving object. A considerable number of studies have investigated the visual flash-lag effect, and flash-lag-like effects have also been observed in audition, and cross-modally between vision and audition. In the present study, we investigated whether a similar effect can also be observed when using only haptic stimuli. A fast vibration (or buzz, lasting less than 20 ms) was applied to the moving finger of the observers and employed as a "haptic flash." Participants performed a two-alternative forced-choice (2AFC) task in which they had to judge whether the moving finger was located to the right or to the left of the stationary finger at the time of the buzz. We used two different movement velocities (Slow and Fast conditions). We found that the moving finger was systematically misperceived to be ahead of the stationary finger when the two were physically aligned. This result can be interpreted as a purely haptic analogue of the flash-lag effect, which we refer to as the "buzz-lag effect." The buzz-lag effect can be well accounted for by the temporal-sampling explanation of flash-lag-like effects.
Cellini, C., Scocchia, L., & Drewing, K. (2016). The buzz-lag effect. Experimental Brain Research, 234(10), 2849-2857.
Roughness is probably the most salient dimension pertaining to the perception of textures by touch and has been widely investigated. There is controversy about how roughness relates to a texture's spatial period and which factors influence this relation. Here, roughness during bare-finger exploration of coarse textures is studied for different types of textures with elements of low height (0.3 mm). Participants were presented with square-wave gratings that were defined along one dimension and sine-wave gratings that were defined along one or two dimensions. Textures of each type varied in their spatial half-period between 0.25 and 5.17 mm. Participants explored the textures with a lateral movement or a stationary finger contact. In all conditions, judged roughness increased with spatial period up to a peak roughness and then decreased again. The exact function depended on the texture type, but hardly on exploration mode. We conclude that roughness is an inverted U-shaped function of texture period if the textures are of low amplitude. The effects are explained by the interplay of two components contributing to the spatial code to roughness: variability in skin deformation due to the finger's intrusion into the texture, which increases with the texture's period up to a maximum (when the skin contacts the texture's ground), and variability associated with the spatial frequency of the deformation, which decreases with spatial period.
Drewing, K. (2016, July). Low-amplitude textures explored with the bare finger: roughness judgments follow an inverted U-shaped function of texture period modified by texture type. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 206-217). Springer, Cham.
The Gestalt theory of perception offered principles by which distributed visual sensations are combined into a structured experience (“Gestalt”). We demonstrate conditions whereby haptic sensations at two fingertips are integrated in the perception of a single object. When virtual bumps were presented simultaneously to the right hand's thumb and index finger during lateral arm movements, participants reported perceiving a single bump. A discrimination task measured the bump's perceived location and perceptual reliability (assessed by differential thresholds) for four finger configurations, which varied in their adherence to the Gestalt principles of proximity (small versus large finger separation) and synchrony (virtual spring to link movements of the two fingers versus no spring). According to models of integration, reliability should increase with the degree to which multi-finger cues integrate into a unified percept. Differential thresholds were smaller in the virtual-spring condition (synchrony) than when fingers were unlinked. Additionally, in the condition with reduced synchrony, greater proximity led to lower differential thresholds. Thus, with greater adherence to Gestalt principles, thresholds approached values predicted for optimal integration. We conclude that the Gestalt principles of synchrony and proximity apply to haptic perception of surface properties and that these principles can interact to promote multi-finger integration.
Lezkan, A., Manuel, S. G., Colgate, J. E., Klatzky, R. L., Peshkin, M. A., & Drewing, K. (2016). Multiple Fingers–One Gestalt. IEEE transactions on haptics, 9(2), 255-266.
When small holes are felt with the tongue, they are perceived to be larger than when felt with the index finger. It is an open question why the size perceived with the tongue differs from the size perceived with the finger. We suggest that differences in perceived size are due to differences in the effector's deformation at the edge of the explored hole, which correlate with the effector's pliability. The more pliable the effector is, the more it will deform and the larger the hole will be perceived to be. In two experiments we tested this hypothesis. In Experiment I we manipulated the force applied by the tongue, and thus the tongue's deformation. As predicted, the tongue perceived holes to be larger when higher forces were applied. Experiment II investigated how the toe perceives size as compared to the tongue and finger. The toe is less pliable than the finger, and the finger is less pliable than the tongue. Also as predicted, holes at the toe were perceived to be smaller than holes at the finger and considerably smaller than holes at the tongue. The findings corroborate our pliability-deformation hypothesis of haptic size perception.
Drewing, K., Bruckbauer, S., & Szöke, D. (2015, June). Felt hole size depends on force and on the pliability of the effector. In 2015 IEEE World Haptics Conference (WHC) (pp. 100-105). IEEE.
The perception of softness is the result of the integration of information provided by multiple cutaneous and kinesthetic signals. The relative contributions of these signals to the combined percept of softness had not yet been addressed directly. We transmitted subtle external vertical forces to the exploring human finger during the exploration of deformable silicone rubber stimuli to dissociate the force estimates provided by the kinesthetic signals and the efference copy from cutaneous force estimates. This manipulation introduced a conflict between the cutaneous and the kinesthetic/efference-copy information on softness. We measured Points of Subjective Equality (PSEs) of manipulated references to stimuli which were explored without external forces. PSEs shifted as a linear function of external force in the predicted directions: to higher compliances with pushing and to lower compliances with pulling forces. We found the relative contribution of kinesthetic/efference-copy information to perceived softness to be 23% for rather hard and 29% for rather soft stimuli. Our results suggest that an integration of the kinesthetic/efference-copy information and cutaneous information with constant weights underlies softness perception. The kinesthetic/efference-copy information seems to be slightly more important for the perception of rather soft stimuli.
Metzger, A., & Drewing, K. (2015, June). Haptically perceived softness of deformable stimuli can be manipulated by applying external forces during the exploration. In 2015 IEEE World Haptics Conference (WHC) (pp. 75-81). IEEE.
An object's softness is stereotypically judged by pressure movements indenting the surface [1]. In exploration without movement constraints, participants repeat such indentation movements. We investigated how people modulate executed peak forces across indentations depending on stimulus softness. Participants performed a 2AFC discrimination task for stimulus pairs from one of four softness categories. We assumed that movement control at different exploration moments is based on variations in the predictive and sensory signals available. We manipulated the availability of predictive signals on the softness category by presenting either stimuli of the same category in a blocked fashion (high predictability) or by randomly mixing stimuli from different categories (low predictability). Effects of sensory signals were examined by contrasting the first and last indentation, as sensory signals are hardly available when initiating exploration but are gathered during exploration. The results show that participants systematically apply lower forces when sensory or predictive signals indicate softer objects as compared to harder objects. We conclude that softness exploration can be considered a sensorimotor control loop in which predictive and sensory signals determine movement control. Further, the results indicate a high importance of predictive processes throughout the entire exploration, as the effects of predictive signals persist in the last indentation.
Lezkan, A., & Drewing, K. (2015, June). Predictive and sensory signals systematically lower peak forces in the exploration of softer objects. In 2015 IEEE World Haptics Conference (WHC) (pp. 69-74). IEEE.
The integration of multisensory information is an essential mechanism in perception and action control. Research in multisensory integration is concerned with how the information from the different sensory modalities, such as the senses of vision, hearing, smell, taste, touch, and proprioception, is integrated into a coherent representation of objects (for an overview, see e.g., Calvert et al., 2004). The combination of information from the different senses is central for action control. For instance, when you grasp a rubber duck, you can see its size, feel its compliance and hear the sound it produces. Moreover, identical physical properties of an object can be provided by different senses. You can both see and feel the size of the rubber duck. Even when you grasp the rubber duck with a tool (e.g., with tongs), the information from the proximal hand, from the effective part of the distal tool and from the eyes is integrated in a manner to act successfully (for limitations of this integration see Sutter et al., 2013).
Sutter, C., Drewing, K., & Müsseler, J. (2014). Multisensory integration in action control. Frontiers in Psychology, 5, 544.
Perception during active touch essentially depends on the executed exploratory movements. Humans use different movement schemes to perceive different haptic properties, the so-called exploratory procedures (EPs). The stereotypically used EPs are normally superior to other EPs in perceiving the associated property and it has been speculated that the EPs are a means of maximising pickup of the relevant sensory information. However, EPs are not always executed identically as they vary in a number of ways. For instance, the peak force and the number of fingers used during exploration are not fixed. This chapter reviews existing findings on the exploratory movement strategies that humans use in softness perception and gives an overview on how different manners of exploration affect the performance in softness tasks. It is shown that observers adapt their movement strategies depending on variations of the stimulus value and the exact conditions of the exploratory task, and that different movement parameters, e.g. the peak exploratory forces, considerably affect performance. Overall, results suggest that humans adjust their exploratory strategies to achieve the highest levels of performance in softness discrimination.
Drewing, K. (2014). Exploratory movement strategies in softness perception. Multisensory Softness, 109-125.
When participants are given the opportunity to simultaneously feel an object and see it through a magnifying or reducing lens, adults estimate object size to be in-between visual and haptic size. Studies with young children, however, seem to demonstrate that their estimates are dominated by a single sense. In the present study, we examined whether this age difference observed in previous studies can be accounted for by the large discrepancy between felt and seen size in the stimuli used in those studies. In addition, we studied the processes involved in combining the visual and haptic inputs. Adults and 6-year-old children judged objects that were presented to vision, haptics, or simultaneously to both senses. The seen object length was reduced or magnified by different lenses. In the condition inducing large intersensory discrepancies, children's judgments in visuo-haptic conditions were almost completely dominated by vision, whereas adults weighted vision by just 40%. Neither the adults' nor the children's discrimination thresholds were predicted by models of visuo-haptic integration. With smaller discrepancies, the children's visual weight approximated that of the adults, and both the children's and adults' discrimination thresholds were well predicted by an integration model which assumes that both visual and haptic inputs contribute to each single judgment. We conclude that children integrate seemingly corresponding multisensory information in similar ways as adults do, but focus on a single sense when information from different senses is strongly discrepant.
Jovanovic, B., & Drewing, K. (2014). The influence of intersensory discrepancy on visuo-haptic integration is similar in 6-year-old children and adults. Frontiers in Psychology, 5, 57.
The contribution of vision to visuo-haptic softness judgments has been observed to be non-optimal, and it has been speculated that the visual weights are "sticky", i.e., do not account for the senses' reliabilities. The present study tested the hypothesis of sticky weights by varying the quality of the visual information. Participants discriminated between the softness of two objects with deformable surfaces using only visual, only haptic, or bisensory information. Visually, we displayed the finger's positions and stimulus deformations in a noisy or precise quality, or in a precise quality enhanced by visual force information from color changes of the fingernail. We assessed the reliabilities of the judgments using the method of constant stimuli. In bisensory conditions, discrepancies between the two senses' information were used to measure each sense's weight. The reliability of visual judgments was lower with noisy as compared to precise position information; visual force information did not affect reliability. The reliability of bisensory judgments was suboptimal, and visual weights were higher than optimal. Contrary to expectation, however, the visual weights did shift with the visual reliability. The results confirm that visuo-haptic integration of softness information is suboptimal and biased towards vision, but with weights that are "lazy" rather than sticky.
Drewing, K., & Kruse, O. (2014, June). Weights in visuo-haptic softness perception are not sticky. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 68-76). Springer, Berlin, Heidelberg.
The sense of touch is characterized by its sequential nature. In texture perception, enhanced spatio-temporal extension of exploration leads to better discrimination performance due to the combination of repetitive information. We have previously shown that the gains from additional exploration are smaller than the Maximum Likelihood Estimation (MLE) model of an ideal observer would predict. Here we test whether this suboptimal integration can be explained by unequal weighting of information. Participants stroked two to five times across a virtual grating and judged the ridge period in a 2IFC task. We presented slightly discrepant period information in one of the strokes in the standard grating. The results show linearly decreasing weights of this information with spatio-temporal distance (number of intervening strokes) to the comparison grating. For each exploration extension (number of strokes), the stroke with the highest number of intervening strokes to the comparison was completely disregarded. The results are consistent with the notion that memory limitations are responsible for the unequal weights. This study raises the question of whether models of optimal integration should include memory decay as an additional source of variance and thus not expect equal weights.
Lezkan, A., & Drewing, K. (2014, June). Unequal but fair? Weights in the serial integration of haptic texture information. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 386-392). Springer, Berlin, Heidelberg.
Softness perception intrinsically relies on haptic information. However, through everyday experiences we learn correspondences between felt softness and the visual effects of exploratory movements that are executed to feel softness. Here, we studied how visual and haptic information is integrated to assess the softness of deformable objects. Participants discriminated between the softness of two softer or two harder objects using only-visual, only-haptic or both visual and haptic information. We assessed the reliabilities of the softness judgments using the method of constant stimuli. In visuo-haptic trials, discrepancies between the two senses' information allowed us to measure the contribution of the individual senses to the judgments. Visual information (finger movement and object deformation) was simulated using computer graphics; input in visual trials was taken from previous visuo-haptic trials.
Cellini, C., Kaim, L., & Drewing, K. (2013). Visual and haptic integration in the estimation of softness of deformable objects. i-Perception, 4(8), 516-531.
The open-loop model by Wing and Kristofferson has successfully explained many aspects of movement timing. A later adaptation of the model assumes that timing processes do not control the movements themselves, but the sensory consequences of the movements. The present study tested direct predictions from this "sensory-goals model". In two experiments, participants were instructed to produce regular intervals by tapping alternately with the index fingers of the left and the right hand. Auditory feedback tones from the taps of one hand were delayed. As a consequence, regular intervals between taps resulted in irregular intervals between feedback tones. Participants compensated for this auditory irregularity by changing their movement timing. Compensation effects increased with the magnitude of the feedback delay (Experiment 1) and were also observed in a unimanual variant of the task (Experiment 2). The pattern of effects in alternating tapping suggests that compensation processes were anticipatory, that is, they compensated for upcoming feedback delays rather than reacting to delays that had already occurred. All experiments confirmed formal model predictions. Taken together, the findings corroborate the sensory-goals adaptation of the Wing–Kristofferson model.
Drewing, K. (2013). Delayed auditory feedback in repetitive tapping: A role for the sensory goal. The Quarterly Journal of Experimental Psychology, 66(1), 51-68.
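For reference, this is the original Wing–Kristofferson two-level model, whose signature is a negative lag-one covariance of inter-tap intervals; the sensory-goals adaptation tested here relocates the timed events from the taps to their sensory consequences (the feedback tones). Here T_n are central timer intervals and M_n peripheral motor delays:

```latex
I_n = T_n + M_{n+1} - M_n, \qquad
\operatorname{Var}(I) = \sigma_T^2 + 2\sigma_M^2, \qquad
\operatorname{Cov}(I_n, I_{n+1}) = -\sigma_M^2
```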
People perceive a smaller and denser object to be heavier than a larger, less dense object of the same mass. We developed a new model of heaviness perception that can explain this size-weight illusion. Modeling followed recent insights on principles of information integration. Perceived heaviness is modeled as a weighted average of one heaviness estimate derived from object mass and another one derived from object density, with weights that follow the estimates' reliabilities. In an experiment, participants judged the heaviness of 18 objects using magnitude estimation methods. Objects varied in mass and density. We also varied the reliability of density information by varying visual reliability: participants were blindfolded or had strongly impaired, mildly impaired or full vision. Because participants lifted each object via a string, they required visual information on object size to assess object density. The pattern of mass and density influences on judged heaviness confirmed the model predictions. Also as predicted, density influences on judged heaviness increased with increasing reliability, whereas mass influences decreased. Individual and average data were well fit by the model (all r² > 0.96). Density information contributed 14%, 21% and 29% to heaviness when vision was strongly impaired, mildly impaired or not impaired, respectively. Overall, the results highly corroborate our model, which appears to be promising as a unifying framework for a number of findings on the size-weight illusion.
Drewing, K., & Tiest, W. M. B. (2013, April). Mass and density estimates contribute to perceived heaviness with weights that depend on the densities' reliability. In 2013 World Haptics Conference (WHC) (pp. 593-598). IEEE.
We studied whether vision can teach touch to the same extent as touch seems to teach vision. In a 2 x 2 between-participants learning study, we artificially correlated visual gloss cues with haptic compliance cues. In two "natural" tasks, we tested whether visual gloss estimations have an influence on haptic estimations of softness and vice versa. In two "new" tasks, in which participants were either asked to haptically judge glossiness or to visually judge softness, we investigated how perceptual estimates transfer from one sense to the other. Our results showed that vision does not teach touch as efficiently as touch seems to teach vision.
Wismeijer, D. A., Gegenfurtner, K. R., & Drewing, K. (2012). Learning from vision-to-touch is different than learning from touch-to-vision. Frontiers in Integrative Neuroscience, 6, 105.
The active control of exploratory movements is an integral part of active touch. We investigated and manipulated the relationship between the haptic discrimination performance for small bumps and the direction of exploratory movements relative to the body. Shape discrimination performance varied with the direction of stimulus exploration. Experimental manipulations successfully changed the normative relation between exploratory direction and discrimination performance. If participants were rewarded for “good perceptual performance” and had the choice, they displayed clear strategic preferences for exploratory directions that yield optimal performance—but only after having extensive experience with the changed perceptual conditions. Overall, the findings suggest that participants can actively adapt their exploratory movements in order to optimize haptic discrimination performance.
Drewing, K. (2012). After experience with the task humans actively optimize shape discrimination in touch by utilizing effects of exploratory movement direction. Acta Psychologica, 141(3), 295-303.
Haptic perception essentially depends on the executed exploratory movements. It has been speculated that spontaneously executed movements are optimized for the computation of associated haptic properties. We investigated to what extent people strategically execute movements that are tuned for softness discrimination of objects with deformable surfaces. In Experiment 1, we investigated how movement parameters depend on expected stimulus compliance. In a discrimination task, we measured exploratory forces for less compliant (hard) stimuli and for more compliant (soft) stimuli. In Experiment 2, we investigated whether exploratory force also depends on the expected compliance difference between the two stimuli.
Kaim, L., & Drewing, K. (2011). Exploratory strategies in haptic softness discrimination are tuned to achieve high levels of task performance. IEEE Transactions on Haptics, 4(4), 242-252.
The active control of exploratory movements is an integral part of active touch. We investigated how the spatio-temporal extension of an exploration affects texture discrimination via active touch and how participants spontaneously control the extension of their exploration. Participants stroked one to eight times across a virtual grating. Half of the participants judged the grating according to ridge amplitude, the other half judged ridge period. In both tasks, just noticeable differences (JNDs) decreased with an increasing number of strokes, down to a (presumably) minimum JND for stroke numbers of about six to seven or higher (according to a model fit). This extends previous findings on the perceptual effects of extended stimulation to the domain of active touch.
Drewing, K., Lezkan, A., & Ludwig, S. (2011, June). Texture discrimination in active touch: Effects of the extension of the exploration and their exploitation. In 2011 IEEE World Haptics Conference (pp. 215-220). IEEE.
In this study of the haptic perception of small bumps, we investigated the influence of exploratory movement variation on signal integration and the percept’s reliability. When sliding across a bump on a surface, the finger follows the geometry of the bump (i.e., the position signal). At the same time, patterns of forces depending on the gradient of the bump act on the finger (i.e., the force signal; Robles-de-la-Torre & Hayward, 2001). Consistent with the maximum likelihood estimation (MLE) model, haptically perceived shape can be described by a weighted average of the shapes signaled by the position and force signals (Drewing & Ernst, 2006; Ernst & Banks, 2002). Here, we found that the weights of the position and force signals and the reliability of the shape percept depend on the pressure (and velocity) of the exploratory movement (Experiment 1).
Kaim, L., & Drewing, K. (2010). Exploratory pressure influences haptic shape perception via force signals. Attention, Perception, & Psychophysics, 72(3), 823-838.
Many studies have demonstrated a higher accuracy in perception and action when using more than one sense. The maximum-likelihood estimation (MLE) model offers a recent account of how perceptual information is integrated across different sensory modalities, suggesting statistically optimal integration. The purpose of the present study was to investigate how visual and proprioceptive movement information is integrated for the perception of trajectory geometry. To test this, participants sat in front of an apparatus that moved a handle along a horizontal plane. Participants had to decide whether two consecutive trajectories formed an acute or an obtuse movement path. Judgments had to be based on information from a single modality alone, i.e., vision or proprioception, or on the combined information of both modalities.
Reuschel, J., Drewing, K., Henriques, D. Y., Rösler, F., & Fiehler, K. (2010). Optimal integration of visual and proprioceptive movement information for the perception of trajectory geometry. Experimental Brain Research, 201(4), 853-862.
If participants simultaneously feel an object and see it through an anamorphic lens, adults judge object size to be in-between seen and felt size [1]. Young children's judgments were, however, dominated by vision [2]. We investigated whether this age difference depends on the magnitude of the intersensory discrepancy. 6-year-old children and adults judged the length of objects that were presented to vision, haptics, or both senses. Lenses reduced or magnified the seen length. With large intersensory discrepancies, children's visuo-haptic judgments were dominated by vision (~90% visual weight), whereas adults weighted vision by just ~40%. With smaller discrepancies, the children's visual weight (~50%) approximated that of the adults (~35%), and a model of multisensory integration predicted discrimination performance in both age groups.
Drewing, K., & Jovanovic, B. (2010, July). Visuo-haptic length judgments in children and adults. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 438-444). Springer, Berlin, Heidelberg.
Softness perception intrinsically relies on haptic information. Everyday-life experience teaches us correspondence rules between perceived softness and the concurrent visual effects of the exploratory movements that are executed to feel softness. We investigated whether and how the brain integrates visual and haptic information while estimating the softness of deformable objects. We created two sets of rubber specimens whose compliance varied in a controlled fashion: a hard set (0.12 to 0.25 mm/N) and a soft set (0.74 to 1.26 mm/N). In the experiment, participants touched these real stimuli while they watched a simulation of their finger movements and stimulus deformation on a collocated visual 3D display. The experiment used the method of constant stimuli combined with a 2AFC task: participants always explored two stimuli and judged which one was softer.
Cellini, C., Kaim, L., & Drewing, K. (2010). Visuohaptic integration in the estimation of softness of deformable objects. Perception ECVP abstract, 39, 130-130.
We investigated multisensory softness perception using a set of custom-made rubber specimens (compliance 0.11 to 0.96 mm N⁻¹). Participants judged the softness of the stimuli under haptic-only, vision-only and visuo-haptic conditions. In haptic and visuo-haptic conditions, participants explored the stimuli without and with vision of their exploratory movements, respectively. In visual conditions, participants watched how another person explored the stimuli. Participants differentiated well between the stimuli under all three modality conditions. Stimuli were judged to be slightly softer under vision-only conditions than under haptic-only conditions; visuo-haptic judgments were in-between (average visual weight: 55%). These findings demonstrate that (a) participants can reliably infer softness from indirect visual information alone, and that (b) such visual information has a major contribution to the visuo-haptic judgments.
Drewing, K., Ramisch, A., & Bayer, F. (2009). Multisensory softness perception of deformable objects. Perception ECVP abstract, 38, 144-144.
We investigated the effect of stimulus and task properties on the control of exploratory movements in active touch. In Experiment 1, we varied the compliance of silicone rubber stimuli with deformable surfaces. Participants freely explored pairs of less compliant stimuli (harder condition) or of more compliant stimuli (softer condition) with a bare finger, and were asked to select the softer one. Pairs from the two conditions were presented either block-wise or in randomized order (blocked vs. randomized session). In the blocked session, exploratory force was always higher for the harder than for the softer stimuli (long-term adjustment). In randomized sessions, force was only higher for the harder stimuli that were explored second within a trial, and it did not differ between softness conditions for the firstly explored stimuli, indicating even short-term movement adjustment.
Kaim, L. R., & Drewing, K. (2009). Stimulus compliance influences the force of the exploratory movement. Perception ECVP abstract, 38, 69-69.
In this experiment we investigated the influence of stimulus properties on exploratory movement parameters in active touch. More precisely, we investigated whether and to what extent variations in stimulus compliance influence the executed finger force and the velocity of the exploratory movements. To this end, we varied the compliance of silicone rubber stimuli with deformable surfaces. Participants freely explored pairwise presented stimuli with a bare finger, and were asked to select the softer one. We found that both exploratory force and velocity depended on the compliance of the stimulus. Our results suggest in particular that observers strategically adapt their maximum finger force to the expected softness of the stimulus.
L. R. Kaim and K. Drewing, "Finger force of exploratory movements is adapted to the compliance of deformable objects," World Haptics 2009 - Third Joint EuroHaptics conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2009, pp. 565-569, doi: 10.1109/WHC.2009.4810830.
The purpose of this study is to investigate multisensory visual-haptic softness perception using deformable objects. We created a set of rubber specimens, whose compliance varied in a controlled fashion (0.11 to 0.96 mm/N) but which were otherwise indistinguishable. Participants judged the magnitude of the stimuli according to their softness under haptic-only, vision-only and visuo-haptic conditions. In haptic and visuo-haptic conditions participants explored the stimuli without and with vision of their exploratory movements, respectively. In visual conditions, participants watched how another person explored the stimuli. Participants were well able to differentiate between the different stimuli under all three modality conditions. Stimuli were judged to be slightly softer under vision-only conditions than under haptic-only conditions; visuo-haptic judgments were in-between (average visual weight: 55%). These findings demonstrate that (a) participants can reliably infer softness from indirect visual information alone, that is, from watching corresponding exploratory movements and stimulus deformations, and that (b) such visual information makes a major contribution to visuo-haptic softness judgments. We further observed that judgments were more variable under visual as compared to haptic conditions; the variability of visuo-haptic judgments was similar to that of haptic ones. The lack of benefit from adding visual to haptic information, and the contrast between the relatively high visual weight in visuo-haptic judgments on the one hand and the low reliability of visual relative to haptic-only information on the other hand, suggest that the integration of visual and haptic judgments was not optimal, but biased towards vision.
K. Drewing, A. Ramisch and F. Bayer, "Haptic, visual and visuo-haptic softness judgments for objects with deformable surfaces," World Haptics 2009 - Third Joint EuroHaptics conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2009, pp. 640-645, doi: 10.1109/WHC.2009.4810828.
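For orientation, the optimality benchmark referred to in these softness studies is the maximum-likelihood (MLE) integration scheme, in which each modality's weight is proportional to its reliability, i.e., its inverse variance. In generic notation (not the paper's own symbols):

\[
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2} = 1 - w_H .
\]

Since visual judgments were the more variable ones, this scheme would assign vision a weight below 50%; the observed 55% visual weight, together with the missing reduction in visuo-haptic variability, is what identifies the integration as non-optimal and vision-biased.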
We present experimental and computational evidence for the estimation of visual and proprioceptive directional information during forward, visually driven arm movements. We presented noisy directional proprioceptive and visual stimuli simultaneously and in isolation midway during a pointing movement. Directional proprioceptive stimuli were created by brief force pulses, which varied in direction and were applied to the fingertip shortly after movement onset. Subjects indicated the perceived direction of the stimulus after each trial. We measured unimodal performance in trials in which we presented only the visual or only the proprioceptive stimulus. When we presented simultaneous but conflicting bimodal information, subjects' perceived direction fell in between the visual and proprioceptive directions. We find that the judged mean orientation matched the MLE predictions but did not show the expected improvement in reliability as compared to unimodal performance. We present an alternative model (probabilistic cue switching, PCS), which is consistent with our data. According to this model, subjects base their bimodal judgments on only one of two directional cues in a given trial, with relative choice probabilities proportional to the average stimulus reliability. These results suggest that subjects based their decision on a probability mixture of both modalities without integrating information across modalities.
Drewing, K., & Jovanovic, B. (2010, July). Visuo-haptic length judgments in children and adults. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 438-444). Springer, Berlin, Heidelberg.
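For reference, the probabilistic cue switching (PCS) account proposed in the pointing study above can be stated compactly (generic notation): on each bimodal trial the judgment derives from a single cue, chosen with a probability that scales with that cue's average reliability,

\[
\hat{S} =
\begin{cases}
\hat{S}_V & \text{with probability } p_V, \\
\hat{S}_P & \text{with probability } 1 - p_V .
\end{cases}
\]

The mean response, \(p_V \mu_V + (1 - p_V)\mu_P\), then mimics MLE-style weighted averaging, whereas the response variance,

\[
\sigma^2 = p_V \sigma_V^2 + (1 - p_V)\sigma_P^2 + p_V (1 - p_V)(\mu_V - \mu_P)^2 ,
\]

can never fall below the smaller unimodal variance, which matches the absent bimodal improvement in reliability.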
We investigated how exploratory movement influences signal integration in active touch. Participants judged the amplitude of a bump specified by redundant signals: When a finger slides across a bump, the finger’s position follows the bump’s geometry (position signal); simultaneously, it is exposed to patterns of forces depending on the gradient of the bump (force signal). We varied amplitudes specified by force signals independently of amplitudes specified by position signals. Amplitude judgment was a weighted linear function of the amplitudes specified by both signals, under different exploratory conditions. The force signal’s contribution to the judgment was higher when the participants explored with the index finger, as opposed to the thumb, and when they explored along a tangential axis, as opposed to a radial one (pivot ≈ shoulder joint). Furthermore, for tangential, as compared with radial, axis exploration, amplitude judgments were larger (and more accurate), and amplitude discrimination was better. We attribute these exploration-induced differences to biases in estimating bump amplitude from force signals. Given the choice, the participants preferred tangential explorations with the index finger—a behavior that resulted in good discrimination performance. A role for an active explorer, as well as biases that depend on exploration, should be taken into account when signal integration models are extended to active touch.
Drewing, K., & Kaim, L. (2009). Haptic shape perception from force and position signals varies with exploratory movement direction and the exploring finger. Attention, Perception, & Psychophysics, 71(5), 1174-1184.
This chapter presents an overview of interesting scientific findings related to human haptic perception and discusses the usability of these findings for the design and development of virtual environments, including haptic rendering. The first section of the chapter deals with pure haptic perception, whereas the second and third sections are devoted to the integration of kinesthetic information with other sensory inputs such as vision and audition.
Bresciani, J. P., Drewing, K., & Ernst, M. O. (2008). Human haptic perception and the design of haptic-enhanced virtual environments. In The Sense of Touch and its Rendering (pp. 61-106). Springer, Berlin, Heidelberg.
This study investigates the influence of stimulus properties on movement control in active perception. We studied exploratory movement parameters in haptic perception of stimuli with different stiffness values. Virtual stimuli were generated using a PHANToM force-feedback device. Participants freely explored pair-wise presented stimuli and were asked to select the softer one. Afterwards we analyzed their exploratory movements with respect to the parameters velocity, pressure, and indentation depth. We found a systematic influence of stimulus stiffness on pressure/indentation depth and on velocity. We conclude that observers adapted their movement parameters depending on stiffness variations. We discuss whether such adaptation might serve to optimize perception, extending optimal observer models known from vision towards active touch.
Kaim, L., & Drewing, K. (2008). Observers vary movement parameters in active touch depending on stimulus stiffness. Perception ECVP abstract, 37, 49-49.
This chapter examines the application of a psychophysical evaluation technique to quantify the fidelity of haptic rendering methods. The technique is based on multidimensional scaling analysis of similarity ratings provided by users comparing pairs of haptically-presented objects. Unbeknownst to the participants, both real and virtual deformable objects were presented. In addition, virtual objects were either rendered under high fidelity condition or under lower-fidelity condition in which filtering quality was reduced. The analysis of pairwise similarity data provides quantitative confirmation that users perceived a clear difference between real and virtual objects in the lower-fidelity, but not in the higher-fidelity condition. In the latter, a single perceptual dimension, corresponding to stiffness, sufficed to explain similarity data, while two perceptual dimensions were needed in the former condition.
Harders, M., Leskovsky, P., Cooke, T., Ernst, M. O., & Szekely, G. (2008). In The Sense of Touch and its Rendering: Progresses in Haptics Research. Springer.
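As an illustration of the chapter's evaluation technique, the sketch below applies multidimensional scaling to a pairwise dissimilarity matrix; the matrix, object set, and dimensionality decision are invented for the example, not taken from the chapter.

```python
# Illustrative sketch: recovering perceptual dimensions from pairwise
# dissimilarity ratings via multidimensional scaling (MDS).
# The rating matrix below is invented for illustration.
import numpy as np
from sklearn.manifold import MDS

# Dissimilarity matrix for 4 objects (e.g., 2 real, 2 virtual);
# symmetric with a zero diagonal, higher values = rated less similar.
D = np.array([
    [0.0, 1.2, 3.1, 3.4],
    [1.2, 0.0, 2.9, 3.2],
    [3.1, 2.9, 0.0, 1.0],
    [3.4, 3.2, 1.0, 0.0],
])

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)
print(coords)       # object positions in the recovered perceptual space
print(mds.stress_)  # goodness of fit; comparing stress for 1-D vs. 2-D
                    # solutions guides the choice of dimensionality
```

Comparing the stress of one- and two-dimensional solutions is how one decides whether a single perceptual dimension (e.g., stiffness) suffices, mirroring the reasoning reported in the chapter.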
The active control of exploratory movements is an integral part of active touch. In two experiments we investigated (and manipulated) the relationship between the haptic discrimination of small bumps and the direction of exploratory movements relative to the body. Shape discrimination performance systematically varied with the direction of stimulus exploration. Further, if they were rewarded for good perceptual performance and had the choice, participants displayed clear strategic preferences for certain exploratory directions. Chosen directions, at least on average, were accompanied by low discrimination thresholds. Overall, the findings emphasize the necessity to focus on the explorer’s active contribution to haptic perception, and provide first hints that exploratory behavior might be exploited to optimize haptic perception.
Drewing, K. (2008, June). Shape discrimination in active touch: Effects of exploratory direction and their exploitation. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications (pp. 219-228). Springer, Berlin, Heidelberg.
When sliding a finger across a bump on a surface, the finger follows the geometry of the bump (position signal). At the same time, forces related to the slope of the bump accelerate and decelerate the finger (force signal) [1]. Consistent with the Maximum Likelihood Estimate (MLE) model [2], haptically perceived shape can be described by the weighted average of the shape signaled by the force and the position signals [3, 4]. Here we investigated – for the haptic perception of bump amplitude – the effects of the movement parameters pressure and velocity on the signal weighting, as well as on the discrimination threshold (Experiment 1). In Experiment 2 we examined whether the integration of force and position signals within the haptic modality is consistent with the MLE model under varying exploratory pressure.
Kaim, L., & Drewing, K. (2007). Influence of parametric variation in exploratory movement on signal integration for haptic shape perception. In 10th Tübinger Perception Conference (pp. 99-99). Knirsch.
When a finger is moved across a physical bump, it follows, on the one hand, the geometry of the surface (the finger goes up and comes back down = position signals); on the other hand, the finger is decelerated according to the slope of the bump and accelerated again on its far side (= force signals). In haptic shape perception, the overall percept is formed as a weighted average of the force and position signals gathered during exploration (Drewing & Ernst, 2006). Here we investigated the influence of the movement parameters pressure and velocity on the weighting of the force and position signals, as well as on discrimination thresholds in active haptic perception.
Kaim, L., & Drewing, K. (2007). Signalintegration bei haptischer Formwahrnehmung unter Variation von Kraft und Geschwindigkeit der exploratorischen Bewegung. In Beiträge zur 49. Tagung experimentell arbeitender Psychologen (p. 194).
The target article fails to disentangle the functional description from the structural description of the two somatosensory streams. Additional evidence and thorough reconsideration of the evidence cited argue for a functional distinction between the how processing and the what processing of somatosensory information, while questioning the validity and usefulness of the equation of these two types of processing with structural streams. We propose going one step further: to investigate how the distinct functional streams are coordinated via attention.
Drewing, K., & Schneider, W. (2007). Disentangling functional from structural descriptions, and the coordinating role of attention. Behavioral and Brain Sciences, 30(2), 205-206. doi:10.1017/S0140525X07001446
We studied multi-sensory integration of directional information during the execution of goal-directed pointing movements. Subjects pointed at a visual target of 6 cm diameter, presented at 35 cm from the starting position of the arm movement. Subjects performed the pointing movement under open loop conditions, i.e. visual feedback about finger and target position was removed during the movement. Proprioceptive directional information was provided by applying a small force pulse (amplitude 1 N, pulse duration 50 ms) orthogonal to the movement direction early in the movement. In some trials, a noisy visual directional cue was presented. Time and spatial location of presentation of the visual cue were matched to the force pulse. The direction of the visual cue was either consistent with the force pulse direction or differed by 30°, either clockwise or counterclockwise. Subjects were instructed to hit the target within 1200 ms following target presentation. We measured perceived direction of the proprioceptive cue when both cues were provided and perceived direction for each cue alone. In conditions in which both cues were presented simultaneously, we compared subjects' responses to the predictions of an ideal observer model. The model combines visual and proprioceptive direction estimates measured in single-cue conditions by weighted averaging. The weights depend on the reliability of each cue. In accordance with the predictions of the ideal observer model we find that subjects' responses were less variable when both visual and proprioceptive cues were available. In conditions in which the mean direction of proprioceptive cue and visual cue differed, subjects' responses exhibited a bias towards the direction of the visual cue. This bias was larger for more reliable visual cues and smaller for more reliable force pulse directions. These results are consistent with the idea of a reliability weighted combination of both cues.
Serwe, S., Drewing, K., & Trommershäuser, J. (2007). Integration of multi-sensory directional information during goal-directed pointing [Abstract]. Journal of Vision, 7(9):307, 307a, http://journalofvision.org/7/9/307/, doi:10.1167/7.9.307
When a participant’s unseen real hand and a seen artificial hand are stroked in synchrony, the participant reports a vivid illusion of feeling the tactile sensations as originating from the stroking of the artificial hand. Additionally, the felt position of the real hand is shifted towards the seen artificial hand (Botvinick & Cohen, 1998). In two experiments we investigated top-down influences on the illusion, as indicated by position shift and by the subjective onset time and duration of the illusion. Experiment 1 demonstrated for all three measures that the extent of the illusion systematically depends on the plausibility of the seen artificial limb (hand, distorted hand, cell phone). In Experiment 2, we projected changing patterns of dots onto the artificial hand. Participants had to count backwards (step size 1 or 3) whenever a dot was stroked, or they just looked at the dots.
Drewing, K., Albus, P., & Kunkel, A. (2007). Top-down influences on the rubber hand illusion.
When combining a saccade towards a visual target with a concurrent hand movement, the saccade is faster than when it is executed in isolation (Snyder et al., 2002, Journal of Neurophysiology, 87, 2279-2286). This coordination advantage may result from specifying the same movement goal for hand and eye, or it may be due to the shared trajectories of hand and eye. Participants (N = 16) in our experiment made a saccade in isolation, or a saccade accompanied by a pointing movement. Saccades and pointing movements aimed at the same or different goals, and followed similar or different trajectories (i.e., in the same or opposite direction). Start and target points for the movements were visually specified; the 'go' signal was auditory. Simultaneous pointing movements increased saccadic peak velocity only when eye and hand followed similar trajectories, independently of whether eye and hand shared the movement goal.
Drewing, K., & Spering, M. (2006). Saccades are faster when accompanied by a hand movement--effect of shared goals, shared trajectories, or both?. Perception ECVP abstract, 35, 0-0.
This article systematically explores cue integration within active touch. Our research builds upon a recently made distinction between position and force cues for haptic shape perception [Robles-de-la-Torre, G., Hayward, V., 2001. Force can overcome object geometry in the perception of shape through active touch, Nature 412, 445–448]: when sliding a finger across a bumpy surface, the finger follows the surface geometry (position cue). At the same time, the finger is exposed to forces related to the slope of the surface (force cue). Experiment 1 independently varied force and position cues to the curvature of 3D arches. Perceived curvature could be well described as a weighted average of the two cues. Experiment 2 found more weight of the position cue for more convex high arches and higher weight of the force cue for less convex shallow arches—probably mediated through a change in relative cue reliability.
Drewing, K., & Ernst, M. O. (2006). Integration of force and position cues for shape perception through active touch. Brain research, 1078(1), 92-100.
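To make the weighted-averaging description concrete, the sketch below recovers cue weights from cue-conflict trials by linear regression; trial values and responses are invented, and the analysis is a generic illustration rather than the paper's method.

```python
# Illustrative sketch: estimating force- and position-cue weights from
# cue-conflict trials, in the spirit of the weighted-averaging account.
import numpy as np

# Per-trial curvature specified by the position cue and by the force cue
# (1/m), plus the curvature the observer reported (all values invented).
c_pos = np.array([0.0, 8.0, 16.0, 0.0, 8.0, 16.0, 8.0])
c_force = np.array([8.0, 0.0, 8.0, 16.0, 16.0, 0.0, 8.0])
c_perceived = np.array([5.4, 2.9, 13.0, 10.6, 13.4, 5.3, 8.1])

# Model: c_perceived ~ w_pos * c_pos + w_force * c_force
X = np.column_stack([c_pos, c_force])
(w_pos, w_force), *_ = np.linalg.lstsq(X, c_perceived, rcond=None)
print(f"position weight: {w_pos:.2f}, force weight: {w_force:.2f}")
```

In such designs, a shift of the fitted weights across conditions (e.g., arch height in Experiment 2) is the signature of reliability-dependent reweighting.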
The present study investigates the contribution of general processing resources as well as other more specific factors to the life-span development of sensorimotor synchronization and its component processes. Within a synchronization tapping paradigm, a group of 286 participants, 6 to 88 years of age, were asked to synchronize finger taps with sequences of auditory signals. The auditory signals were given either isochronously with short or long interstimulus intervals in a regular condition or in a more demanding condition with alternating short and long intervals. The results provided the first direct life-span evidence showing that performance in these tasks improves substantially during childhood until about late teens, and thereon remains at least relatively stable until old age. This pattern of life-span age gradient holds for measures of different component processes of sensorimotor synchronization, such as basic timekeeping and error correction processes. The findings are not in line with simple general factor accounts of development. They rather suggest a more complex interaction between general resources and other specific factors in the life-span development of different components of sensorimotor synchronization.
Drewing, K., Aschersleben, G., & Li, S.-C. (2006). Sensorimotor synchronization across the life span. International Journal of Behavioral Development, 30(3), 280–287.
When sliding a finger across a bumpy surface, the finger follows the surface geometry (position signal). At the same time the finger is exposed to forces related to the slope of the surface (force signal) [1]. For haptic shape perception the brain uses both signals, integrating them by weighted averaging [2]. This is consistent with the Maximum-Likelihood-Estimate (MLE) model of signal integration, previously only applied to passive perception. The model further predicts that signal weight is proportional to signal reliability. Here, we tested this prediction for the integration of force and position signals into perceived curvature by manipulating material properties of the curve. Low as compared to high compliance decreased the reliability and so the weight of the sensorily transduced position signal. High as compared to low friction decreased the reliability and so the weight of the transduced force signal. These results demonstrate that the MLE model extends to situations involving active touch.
Drewing, K., Ernst, M. O., & Wiecki, T. (2005, March). Material properties determine how we integrate shape signals in active touch. In 1st Joint Worldhaptic Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (WorldHaptics 2005) (pp. 1-6).
We tested whether auditory sequences of beeps can modulate the tactile perception of sequences of taps (two to four taps per sequence) delivered to the index fingertip. In the first experiment, the auditory and tactile sequences were presented simultaneously. The number of beeps delivered in the auditory sequence were either the same as, less than, or more than the number of taps of the simultaneously presented tactile sequence. Though task-irrelevant (subjects were instructed to focus on the tactile stimuli), the auditory stimuli systematically modulated subjects’ tactile perception; in other words subjects’ responses depended significantly on the number of delivered beeps. Such modulation only occurred when the auditory and tactile stimuli were similar enough. In the second experiment, we tested whether the automatic auditory-tactile integration depends on simultaneity or whether a bias can be evoked when the auditory and tactile sequence are presented in temporal asynchrony. Audition significantly modulated tactile perception when the stimuli were presented simultaneously but this effect gradually disappeared when a temporal asynchrony was introduced between auditory and tactile stimuli. These results show that when provided with auditory and tactile sensory signals that are likely to be generated by the same stimulus, the central nervous system (CNS) tends to automatically integrate these signals.
Bresciani, J. P., Ernst, M. O., Drewing, K., Bouyer, G., Maury, V., & Kheddar, A. (2005). Feeling what you hear: auditory signals can modulate tactile tap perception. Experimental brain research, 162(2), 172-180.
Based on existing knowledge on human tactile movement perception, we constructed a prototype of a novel tactile multipin display that controls lateral pin displacement and thus produces shear force. Two experiments focused on the question of whether the prototype display generates tactile stimulation that is appropriate for the sensitivity of human tactile perception. In particular, Experiment I studied human resolution for distinguishing between different directions of pin displacement, and Experiment II explored the perceptual integration of information resulting from the displacement of multiple pins. Both experiments demonstrated that humans can discriminate between directions of the displacements, and also that the technically realized resolution of the display exceeds the perceptual resolution (>14°). Experiment II demonstrated that the human brain does not process stimulation from the different pins of the display independently of one another, at least concerning direction. The psychophysical knowledge acquired with this new technology will in return be used to improve the design of the display.
Drewing, K., Fritschi, M., Zopf, R., Ernst, M. O., & Buss, M. (2005). First evaluation of a novel tactile display exerting shear force via lateral displacement. ACM Transactions on Applied Perception (TAP), 2(2), 118-131.
The successful execution of movements not only requires directing the movement towards the selected goal, but also detecting and compensating for perturbations interfering with the goal of the movement. Here we asked whether participants are able to detect external force perturbations, how the executed movement is affected by the perturbation, and how the perturbation interferes with the goal of the task. Participants were instructed to rapidly hit a visual target, which was presented within a three-dimensional visuo-haptic virtual environment. Late responses and failures to hit the target were penalized. Participants were presented with a force pulse, which was applied to their right fingertip during the initial phase of the pointing movement. Force perturbations were applied orthogonally to the movement direction. We determined detection thresholds for perturbations from six different directions (up, down, upper right/left, lower right/left) using a two-interval forced-choice paradigm. Five participants completed the experiment. Surprisingly, detection thresholds for the applied perturbations (about .10 N) were just slightly higher than for tactile-kinesthetic detection in a single-task context (about .05 N; Lederman & Klatzky, 1999). Detection performance did not depend on the direction of the perturbation, but was better for short perturbations (30 ms presentation time) compared to longer perturbations (50 ms presentation time). Shorter perturbations differed from longer perturbations by a steeper increase in force amplitude (10% of the duration). Locally, perturbations (> about 0.07 N) affected the movement kinematics significantly as compared to trajectories without perturbation. However, the distribution of movement end points at the location of the visual target did not correlate with the direction of the perturbation. These results are a first hint that the brain is able to detect force perturbations during visually guided pointing movements without extra costs.
Drewing, K., & Trommershaeuser, J. (2005). Detection and costs of force perturbations during visually-guided pointing movements. Journal of Vision, 5(8), 625-625.
Most models of object recognition and mental rotation are based on the matching of an object’s 2-D view with representations of the object stored in memory. They propose that a time-consuming normalization process compensates for any difference in viewpoint between the 2-D percept and the stored representation. Our experiment shows that such normalization is less time consuming when it has to compensate for disorientations around the vertical than around the horizontal axis of rotation. By decoupling the different possible reference frames, we demonstrate that this anisotropy of the normalization process is defined not with respect to the retinal frame of reference, but, rather, according to the gravitational or the visuocontextual frame of reference. Our results suggest that the visual system may call upon both the gravitational vertical and the visuocontext to serve as the frame of reference with respect to which 3-D objects are gauged in internal object transformations.
Waszak, F., Drewing, K., & Mausfeld, R. (2005). Viewer-external frames of reference in the mental transformation of 3-D objects. Perception & Psychophysics, 67(7), 1269-1279.
We perceive the world surrounding us via multiple sensory modalities, including touch, vision and audition. The information derived from all these different modalities has to converge in order to form a coherent and robust percept of the world. Here, we review a model (the MLE model) that in the statistical sense describes an optimal integration mechanism. The benefit from integrating sensory information comes from a reduction in variance of the final perceptual estimate. We illustrate this integration mechanism in the human brain with two examples: the first example demonstrates the integration of force and position cues to shape within haptic perception; the second example highlights multimodal perception and shows that tactile and auditory information for temporal perception interacts in a way predicted by the MLE integration model.
Ernst, M. O., Bresciani, J. P., Drewing, K., & Bülthoff, H. H. (2004). Integration of sensory information within touch and across modalities. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004).
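The variance-reduction benefit mentioned in this review has a simple closed form under the MLE model (generic notation): combining two unbiased estimates with variances \(\sigma_1^2\) and \(\sigma_2^2\) yields

\[
\sigma_{12}^2 = \frac{\sigma_1^2 \sigma_2^2}{\sigma_1^2 + \sigma_2^2} \le \min\left(\sigma_1^2, \sigma_2^2\right),
\]

so, for instance, two equally reliable signals halve the variance of the combined estimate relative to either signal alone.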
Previous research indicated that sound can bias visual [1-4] as well as tactile perception [5,6]. The present experiment tested whether auditory stimuli can alter the tactile perception of sequences of taps (2 to 4 taps per sequence) delivered on the index fingertip. The taps were delivered using a PHANToM force feedback device. The subjects did not have any visual or auditory feedback about the tactile stimulation and their task was to report after each sequence how many taps they felt. In the first experiment, for some trials, auditory sequences of beeps were presented concomitantly with the tactile sequences (through earphones). The number of beeps delivered in the auditory sequence could be the same as, less than, or more than the number of taps of the simultaneously presented tactile sequence. Though irrelevant (subjects were instructed to focus on the tactile stimuli), the auditory stimuli systematically biased subjects' tactile perception, i.e., subjects' responses depended significantly on the number of delivered beeps. The results also suggested that for such an auditory-tactile interaction to occur, a certain amount of structural congruency between the simultaneously presented stimuli is required. Indeed, the delivery of an auditory stimulus obviously incongruent with the tactile sequence failed to evoke any bias of tactile perception. In the second experiment, we tested whether the auditory-tactile interaction also requires temporal congruency or whether a bias can be evoked without temporal overlap between the presented auditory and tactile sequences. The tactile and auditory stimuli were the same as in the first experiment (the structurally incongruent auditory stimulus was not used here), but the auditory sequences were presented either simultaneously with, before the beginning of, or after the end of the tactile sequences. Audition strongly biased tactile perception when the stimuli were temporally concomitant (reproducing the results obtained in the first experiment). With a temporally asynchronous audio-tactile stimulus the interaction gradually disappeared. We conclude that auditory and tactile sensory signals are integrated when they both provide redundant information in good temporal coherence.
Bresciani, J. P., Ernst, M. O., Drewing, K., Bouyer, G., Maury, V., & Kheddar, A. (2004, February). Feeling What You Hear: An Auditory-Evoked Tactile Illusion. In 7th Tübingen Perception Conference (TWK 2004) (p. 73). Knirsch.
When sliding a finger across a bumpy surface, the finger follows the geometry of the bumps/holes providing positional cues for the shape. At the same time the finger is opposed by forces related to the steepness of the bumps/holes. With a specific device Robles-de-la-Torre and Hayward [1] dissociated positional and force cues in the haptic perception of small-scale bumps and holes: Participants in this experiment reported to predominantly feel the class of shapes (bumps or holes) indicated by the force cues. Drewing and Ernst [2] extended this research by disentangling force and position cues to the perception of curves more systematically and by also quantifying the perceived curvature. The result was that the perceived curvature could be predicted from weighted averaging of the two cues. This is consistent with current models of cue integration [e.g., 3]. These integration models further predict that the cue weight is proportional to the cue's reliability. Here, we aimed at testing this prediction for the integration of force and position cues to haptic shape by manipulating the shapes' material properties: high softness can be assumed to decrease the reliability of the position cue as compared to low softness, and high friction to decrease the reliability of the force cue. Using the PHANToM force-feedback device we constructed haptic curve stimuli. We systematically intermixed force and position cues indicating curvatures of 14 and 24/m. Using the method of double staircases, we measured the point of subjective equality (PSE) of the curvature of these as compared to 'natural' stimuli (i.e., with consistent position and force cues). From the PSE data we determined the cue weights. This was done under each combination of material properties (low vs. high softness X low vs. high friction). We found that material properties affected the cue weights in a manner consistent with our predictions. These results further confirm the validity of existing models of cue integration in haptic shape perception.
Drewing, K., Wiecki, T., & Ernst, M. O. (2004, February). Cue Reliabilities Affect Cue Integration in Haptic Shape Perception. In 7th Tübingen Perception Conference (TWK 2004) (p. 123). Knirsch.
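As an illustration of the double-staircase procedure named above, the sketch below simulates two interleaved 1-up/1-down staircases converging on a PSE; the simulated observer, step size, and starting levels are assumptions for the example only.

```python
# Illustrative sketch of a double staircase converging on the PSE.
# The simulated observer and all parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
true_pse, noise, step = 19.0, 2.0, 1.0  # curvature units (1/m)

def observer_says_more_curved(comparison):
    """Simulated 2AFC response: is the comparison judged more curved?"""
    return comparison + rng.normal(0.0, noise) > true_pse

def run_staircase(start, n_trials=40):
    level, levels = start, []
    for _ in range(n_trials):
        levels.append(level)
        # 1-up/1-down rule: converges on the 50% point, i.e., the PSE
        level += -step if observer_says_more_curved(level) else step
    return np.mean(levels[-20:])  # mean of late trials estimates the PSE

# Two interleaved staircases, one starting below and one above the PSE
est = np.mean([run_staircase(14.0), run_staircase(24.0)])
print(f"estimated PSE: {est:.1f} (true {true_pse})")
```

Running one staircase from below and one from above, and averaging their convergence points, guards against hysteresis in the observer's responses.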
The purpose of this study is to investigate multimodal visual-haptic texture perception, for which we used virtual reality techniques. Participants judged a broad range of textures according to their roughness and their spatial density under visual, haptic and visual-haptic exploration conditions. Participants were well able to differentiate between the different textures using both the roughness and the spatial density judgment. When provided with visual-haptic textures, subjects' performance increased (for both judgments), indicating sensory combination of visual and haptic texture information. Most interestingly, performance for density and roughness judgments did not differ significantly, indicating that these estimates are highly correlated. This may be due to the fact that our textures were generated in virtual reality using a haptic point-force display (PHANToM). In conclusion, it seems that the roughness and spatial density estimates were based on the same physical parameters given the display technology used.
Drewing, K., Ernst, M. O., Lederman, S. J., & Klatzky, R. (2004, June). Roughness and spatial density judgments on visual and haptic textures using virtual reality. In 4th International Conference EuroHaptics 2004 (pp. 203-206). Institute of Automatic Control Engineering.
At present, tactile displays are constructed either as shape displays or as vibrotactile displays. While shape displays render the shape of objects to the skin, vibrotactile devices display high-frequency but small-amplitude patterns of forces. Existing tactile displays of both types are based on an array of small pins, which move normal to the contact surface. That is, the pins create a pattern of indentation into the skin. Usually, the devices are applied to the human finger pad. However, in order to produce a realistic tactile impression of the environment it is probably just as important to provide forces lateral to the human skin, so-called shear forces. This is particularly reasonable when considering perceptions evoked by movements of the skin relative to the environment, e.g., when stroking with the finger across a surface. We aim at technically realizing a third type of tactile display, which can provide shear forces. The poster presents the prototype of a shear force display for the fingertip and a first psychophysical evaluation. In order to explore whether the stimuli produced by the display are appropriate for human perception, we studied in a first step the discrimination performance of humans for distinguishing between different directions of pin movement. This basic psychophysical knowledge, which so far did not exist because the technology was not yet available, will in return be used to improve the design of the display.
Fritschi, M., Drewing, K., Zopf, R., Ernst, M. O., & Buss, M. (2004, June). Construction and first evaluation of a newly developed tactile shear force display. In 4th International Conference EuroHaptics 2004 (pp. 508-511). Institute of Automatic Control Engineering.
We tested whether the tactile perception of sequences of taps delivered on the index fingertip can be modulated by sequences of auditory beeps. In the first experiment, the tactile and auditory sequences were always presented simultaneously, and were structurally either similar or dissimilar. In the second experiment, the auditory and tactile sequences were always structurally similar but not always presented simultaneously. When structurally similar and presented simultaneously, the auditory sequences significantly modulated tactile tap perception. This automatic combination of “redundant-like” tactile and auditory signals likely constitutes an optimization process taking advantage of multimodal redundancy for perceptual estimates.
Bresciani, J. P., Ernst, M. O., Drewing, K., Bouyer, G., Maury, V., & Kheddar, A. (2004, June). Auditory modulation of tactile taps perception. In 4th International Conference EuroHaptics 2004 (pp. 198-202). Institute of Automatic Control Engineering.
In a repetitive tapping task, the within-hand variability of intertap intervals is reduced when participants tap with both hands instead of single-handedly. This bimanual advantage has been attributed to timer as opposed to motor variance (according to the Wing-Kristofferson model; Helmuth and Ivry 1996) and related to the additional sensory consequences of the movement of the extra hand in the bimanual case (Drewing et al. 2002). In the present study the effect of sensory feedback of the movement on this advantage was investigated by comparing the results of a person (IW) deafferented below the neck with those of age-matched controls. IW showed an even more pronounced bimanual advantage than controls, suggesting that the bimanual advantage is not due to actual sensory feedback. These results support another hypothesis, namely that bimanual timing profits from the averaging of different central control signals that relate to each effector’s movements.
Drewing, K., Stenneken, P., Cole, J., Prinz, W., & Aschersleben, G. (2004). Timing of bimanual movements and deafferentation: Implications for the role of sensory movement effects. Experimental Brain Research, 158(1), 50-57.
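For orientation, the timer-versus-motor-variance argument rests on the standard Wing-Kristofferson decomposition, in which each intertap interval combines a central timer interval \(C_n\) with two peripheral motor delays:

\[
I_n = C_n + M_{n+1} - M_n
\quad\Longrightarrow\quad
\operatorname{Var}(I) = \sigma_C^2 + 2\sigma_M^2 .
\]

Averaging the outputs of two independent timers (as proposed by Helmuth and Ivry) halves \(\sigma_C^2\) while leaving the motor term untouched, which is the predicted bimanual advantage.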
Most models of object recognition assume that object recognition is based on the matching of the 2-D view of the object with representations of the object stored in memory. They propose that a time-consuming normalisation process compensates for any difference in viewpoint between the 2-D percept and the stored representation. Our experiment shows that this normalisation is less time-consuming when it has to compensate for disorientations around the vertical than around the horizontal axis of rotation. By decoupling the different possible reference frames, we demonstrate that this anisotropy of the recognition performance is not defined with respect to the retinal, but with respect to the gravitational or the visuo-contextual frame of reference. Our results suggest that the visual system may call upon both the gravitational vertical and the visual context to serve as the frame of reference with respect to which objects are gauged in 3-D object recognition.
Waszak, F., Drewing, K., & Mausfeld, R. (2004). Viewer-external frames of reference in 3-D object recognition. Poster presented at 27th European Conference on Visual Perception (ECVP 2004), Budapest, Hungary.
Within haptics, tactile feedback is one of the more recent modalities for human-system interaction. Research on tactile feedback using pin-array actuators has been ongoing for the past several years. A survey of technological achievements, human sensing capabilities, and psychophysical evaluation in this area is presented. The focus then turns to novel approaches in actuator technology and to tactile feedback systems providing shear force (force tangential to the fingertip).
Fritschi, M., Buss, M., Drewing, K., Zopf, R., & Ernst, M. O. (2004, September). Tactile feedback systems. In Workshop “Touch and Haptics”, 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004) (pp. 1-21).
Investigating multisensory integration, Shams et al (2000 Nature 408 788) recently found that the number of perceived visual flashes could be altered by a sequence of beeps presented simultaneously. Here, we tested whether auditory sequences of beeps can modulate the tactile perception of sequences of taps (2 to 4 taps per sequence). In experiment 1, the auditory and tactile sequences were presented simultaneously. The number of beeps delivered in the auditory sequence was either the same as, less than, or more than the number of tactile taps. Though task-irrelevant (subjects were instructed to focus on the tactile stimuli), the auditory stimuli significantly modulated subjects' tactile perception. Such modulation occurred only when the auditory and tactile stimuli were structurally similar. In experiment 2, we tested whether auditory-tactile interaction depends on simultaneity or whether a bias can be evoked without temporal overlap between the auditory and tactile sequences. Audition significantly modulated tactile perception when the stimuli were presented simultaneously, but this effect gradually disappeared when a temporal asynchrony was introduced between auditory and tactile stimuli. These results show that when provided with auditory and tactile signals that are likely to be generated by the same stimulus, the brain tends to automatically combine these signals.
Ernst, M. O., Bresciani, J. P., & Drewing, K. (2004, September). Feeling what you hear: Auditory signals can modulate the perception of tactile taps. In 27th European Conference on Visual Perception (ECVP 2004) (p. 143). Pion Ltd.
This work presents the prototype of a shear force display for the fingertip and a first psychophysical evaluation. In order to explore whether the stimuli produced by the display are appropriate for human perception, we studied the discrimination performance of humans for distinguishing between different directions of pin movement. In a second step we explored the perceptual integration of multi-pin movements. This basic psychophysical knowledge, which so far did not exist because the technology was not yet available, will in return be used to improve the design of the display.
Fritschi, M., Drewing, K., Zopf, R., Ernst, M. O., & Buss, M. (2004, September). Construction and psychophysical evaluation of a novel tactile shear force display. In RO-MAN 2004. 13th IEEE International Workshop on Robot and Human Interactive Communication (IEEE Catalog No. 04TH8759) (pp. 509-513). IEEE.
The objective of the TOUCH-HapSys project is the development of new haptic display technology. The present deliverable describes the state of knowledge in human haptic science and attempts to analyze to what extent this knowledge can be used to improve haptic displays: We provide a brief overview of existing haptic displays, continue with the sensory foundations of human haptics and, especially, deal with the question of how the haptic system derives a representation of the environment from its sensory input. Thereby, we report a number of principles and illusions in haptic perception that look promising for the future development of haptic display technology and are thus worth investigating more thoroughly.
Drewing, K., Bresciani, J. P., & Ernst, M. O. (2003). Risk analysis on usability of haptic illusions for designing new haptic systems.
Our sensory channels provide us with direct access to several properties of the surrounding environment. A large part of these properties can be redundantly assessed on the basis of two or more of these channels. For instance, both vision and haptics are informative about the shape, the orientation or the texture of a given object. The different sensory signals are combined in the central nervous system to give rise to a unified percept. The general aim of the present document is to give an overview of the work dealing with the integration of haptic cues and other sensory inputs. More specifically, we will mainly focus on visuo-haptic integration, presenting also some interesting findings from the audio-haptic integration domain.
Bresciani, J. P., Drewing, K., & Ernst, M. (2003). Risk analysis for multidimensional illusions for multimodal systems. In Proceedings of the Workshop “Touch and Haptics” (pp. 4-1).
On a repetitive tapping task, the within-hand variability of intertap intervals is reduced when participants tap with two hands as compared to one-hand tapping. Because this bimanual advantage can be attributed to timer variance (Wing-Kristofferson model, 1973a, b), separate timers have been proposed for each hand, whose outputs are then averaged (Helmuth & Ivry, 1996). An alternative notion is that action timing is based on its sensory reafferences (Aschersleben & Prinz, 1995; Prinz, 1990). The bimanual advantage is then due to increased sensory reafference. We studied bimanual tapping with the continuation paradigm. Participants first synchronized their taps with a metronome and then continued without the pacing signal. Experiment 1 replicated the bimanual advantage. Experiment 2 examined the influence of additional sensory reafferences.
Drewing, K., & Aschersleben, G. (2003). Reduced timing variability during bimanual coupling: a role for sensory information. The Quarterly Journal of Experimental Psychology Section A, 56(2), 329-350.
The sense of presence in virtual environments may be greatly improved by the display of haptic virtual reality. Current haptic display technology, however, mostly remains unsatisfactory and expensive. One way to overcome existing technical limitations might be to “cheat” the haptic system by exploiting its principles. Importantly, human perception of an environmental property normally relies upon the integration of several different cues, which technologically may, at least partly, be substituted for one another. A recent promising starting point for such substitution is the experimental demonstration that haptic perception of three-dimensional shapes can be evoked by just two-dimensional forces (Robles-de-la-Torre & Hayward, 2001, Nature). That experiment dissociated positional and force cues in the perception of small-scale bumps: When sliding a finger across a bump on a surface, the finger follows the geometry of the bump, providing positional cues for the shape. At the same time the finger is opposed by forces related to the steepness of the bump. Participants in this experiment reported feeling the shape indicated by the force cues and not by the positional cues. The present study extended this research. We aimed to disentangle the contributions of force and position cues to haptic shape perception more systematically and to explore their integration principles. For that purpose, we constructed a set of virtual standard curves, in which we intermixed force and position cues related to curvatures of 0, 8 and 16/m using the PHANToM haptic device.
Drewing, K., & Ernst, M. O. (2003). Cue integration in the haptic perception of virtual shapes.
When one slides a finger across a surface with a bump on it, the finger follows the geometry of the bump, providing positional cues for the shape. At the same time, the finger is opposed by forces related to the steepness of the bump. With a specific device, Robles-de-la-Torre and Hayward (2001) dissociated positional and force cues in the haptic perception of small-scale bumps and holes: Participants in this experiment reported feeling the shape indicated by the force cues and not that indicated by the positional cues. We extended this research by systematically disentangling the contributions of these two cues to the perception of curvature. Using the PHANToM haptic device, we presented virtual curves in which we intermixed force and position cues related to curvatures between 0 and 16/m. Participants compared these with pseudonatural curves. Our results suggest that perceived curvature is a weighted average of both positional and force cues.
Drewing, K., & Ernst, M. O. (2003, November). Integration of force and position cues in haptic curvature perception. In 44th Annual Meeting of The Psychonomic Society (p. 112).
In a repetitive tapping task, the within-hand variability of intertap intervals is reduced when participants tap with both hands, as opposed to single-handed tapping. This bimanual advantage can be attributed to timer variance (according to the Wing-Kristofferson model). Separate timers have been proposed for each hand whose outputs are then averaged (Helmuth & Ivry, 1996, Journal of Experimental Psychology: Human Perception and Performance, 22, 278–293). Alternatively, timing might be based on sensory reafference and the bimanual advantage due to the enhancement of sensory reafferences. This alternative hypothesis was tested in three experiments. In the first experiment, we replicated the bimanual advantage in tapping with two fingers of the same hand compared with single finger tapping.
Drewing, K., Hennings, M., & Aschersleben, G. (2002). The contribution of tactile reafference to temporal regularity during bimanual finger tapping. Psychological Research, 66(1), 60-70.