Modality Fusion during Touch-based Interaction (Samsung Electronics)

The goal of this project is to improve perception and performance during touch-based interaction with personal electronic devices. Specifically, we identified appropriate fusions of visual, auditory, and haptic cues during fingertip interaction with touch-screen images (Figure 10a).


This project was initiated by Samsung Telecommunications America based on our prior work in planning, designing, and executing user studies on the effects of multimodal fusion. During the project, our research team consulted for Samsung on study design, the user interface of an Android-based test-bed, and evaluation methods, and also conducted three user studies (pilot, main, and confirmation) on modality fusion. In the main study, I presented more than 100 participants with a series of multimodal effects for 26 images with varying textures. An Android application rendered combined visual, auditory, and haptic cues that varied with the texture of the image region the user's finger was touching or hovering over (Figure 10b). We collected participants' evaluations of the given effects (Figure 10c) and asked them to compose their own preferred effects for another 15 images. The study found that perceptual engagement was highest when vibro-tactile intensity was linearly combined with the visual effects, and the analysis yielded quantifiable proportions of sensory cues for inducing more natural and more appealing/engaging touch-based interactions with image textures.
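To make the best-rated fusion concrete, the sketch below shows one way the linear visual-to-haptic mapping could be realized on Android: local image luminance under the fingertip is used as a stand-in for visual texture intensity and scaled linearly to vibration amplitude. This is a minimal illustration, not the actual test-bed code; the class name TextureHapticsView is hypothetical, and it assumes the bitmap is drawn 1:1 in the view.

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Color;
import android.os.VibrationEffect;
import android.os.Vibrator;
import android.view.MotionEvent;
import android.view.View;

// Hypothetical sketch of a linear visual-to-haptic mapping for a textured image.
public class TextureHapticsView extends View {
    private final Bitmap texture;     // image currently displayed under the finger
    private final Vibrator vibrator;

    public TextureHapticsView(Context context, Bitmap texture) {
        super(context);
        this.texture = texture;
        this.vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_MOVE) {
            // Sample the pixel under the fingertip (assumes view and bitmap coordinates align).
            int x = clamp((int) event.getX(), 0, texture.getWidth() - 1);
            int y = clamp((int) event.getY(), 0, texture.getHeight() - 1);
            int pixel = texture.getPixel(x, y);

            // Local luminance in [0, 1] as a simple proxy for visual texture intensity.
            float intensity = (0.299f * Color.red(pixel)
                             + 0.587f * Color.green(pixel)
                             + 0.114f * Color.blue(pixel)) / 255f;

            // Linear combination: vibro-tactile amplitude proportional to visual intensity.
            int amplitude = Math.max(1, Math.round(intensity * 255));
            vibrator.vibrate(VibrationEffect.createOneShot(20, amplitude)); // short 20 ms pulse
        }
        return true;
    }

    private static int clamp(int v, int lo, int hi) {
        return Math.max(lo, Math.min(hi, v));
    }
}
```

In a full test-bed, the same per-pixel intensity would also drive the auditory cue so that the quantified proportions of the three modalities can be varied per condition.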




Figure 10. Modality fusion for more appealing/engaging touch-based interaction: (a) fingertip interaction with a touch-screen image; (b) the Android test-bed rendering combined visual, auditory, and haptic effects; (c) participant evaluations of the given effects.