PhD Thesis
Understanding, Modeling and Designing Multi-Touch Gesture Interaction.
Prepared at Inria Lille Nord-Europe, Mint team. Defended on December 10th, 2014.
Jury Composition:
Laurence Nigay, Professor, University Joseph Fourier Grenoble 1 (Reviewer)
Joaquim Jorge, Professor, Technical University of Lisbon (Reviewer)
Martin Hachet, Research Scientist, Inria Bordeaux (Examiner)
Eric Lecolinet, Associate Professor, Telecom ParisTech (Examiner)
Laurent Grisoni, Professor, University of Lille 1 (Advisor)
Nicolas Roussel, Senior Researcher, Inria Lille Nord-Europe (Advisor)
Radu-Daniel Vatavu, Associate Professor, University Stefan cel Mare of Suceava, Romania (Invited)
Gilles Grimaud, Professor, University of Lille 1 (President)
Abstract
This thesis presents our investigations of multi-touch gesture variability. We first study this variability from the users' perspective, and then propose tools and techniques applicable to multi-touch interaction.
Towards understanding multi-touch gesture variability, we set up two user studies. From the first study, we outline a taxonomy of multi-touch gestures that captures the different aspects of a single, unified dynamic mechanism governing how users articulate a multi-touch gesture. In particular, we introduce the concept of atomic movement, which reflects users' perception of finger movements during gesture articulation. From the second study, we provide a more comprehensive analysis of multi-touch gesture variability. We differentiate between the major and minor sources of variation during multi-touch gesture articulation and outline eight representative gesture classes. We also analyze the link between gesture shape and gesture articulation.
Moreover, we address the question of whether these different sources of variation induce different degrees of articulation difficulty or whether they are equivalent from a user-centric perspective. We thereby conduct the first investigation of the user-perceived difficulty of multi-touch gesture articulation. We report correlations between users' subjective assessments of difficulty and objective gesture descriptors to enable a better understanding of the mechanisms involved in the perception of articulation difficulty. Through an in-depth analysis of gesture structure, geometry and kinematics descriptors, we reveal new findings about how people synchronize their fingers and hands during gesture articulation. We use this large body of results and observations to compile a set of guidelines for multi-touch gesture design, considering the ergonomics of multi-touch input through the prism of the user-perceived difficulty of gesture articulation.
After studying multi-touch gestures from a purely user-centric perspective, we provide tools and techniques that can be integrated into multi-touch interaction systems. We first propose Match-Up, a new preprocessing step specific to multi-touch gestures (a first in the gesture literature) that consistently structures finger movements into clusters of similar strokes, and add it to the practitioners' toolkit of gesture processing techniques. We then apply Match-Up to recognize multi-touch input articulated without constraints (Match-Up & Conquer), and show an improvement in recognition accuracy of up to 10%.
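The abstract describes Match-Up only at a high level. As a rough illustration of the general idea of grouping finger strokes into clusters of similar strokes, the sketch below resamples and normalizes each stroke and then greedily clusters strokes whose point-wise distance falls below a threshold. The resampling size, distance measure, and threshold are illustrative assumptions, not the thesis' actual Match-Up algorithm.

```python
# Minimal sketch of clustering finger strokes by similarity (illustrative only).
import numpy as np

def resample(stroke, n=32):
    """Resample a stroke (sequence of (x, y) points) to n equidistant points."""
    pts = np.asarray(stroke, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    cum = np.concatenate(([0.0], np.cumsum(seg)))
    if cum[-1] == 0:                       # degenerate stroke (a tap): repeat the point
        return np.repeat(pts[:1], n, axis=0)
    targets = np.linspace(0.0, cum[-1], n)
    xs = np.interp(targets, cum, pts[:, 0])
    ys = np.interp(targets, cum, pts[:, 1])
    return np.stack([xs, ys], axis=1)

def normalize(pts):
    """Center on the centroid and scale to unit size so similarity ignores position and scale."""
    pts = pts - pts.mean(axis=0)
    scale = np.linalg.norm(pts, axis=1).max()
    return pts / scale if scale > 0 else pts

def stroke_distance(a, b, n=32):
    """Mean point-to-point distance between two resampled, normalized strokes."""
    return np.linalg.norm(normalize(resample(a, n)) - normalize(resample(b, n)), axis=1).mean()

def cluster_strokes(strokes, threshold=0.25):
    """Greedily assign each stroke to the first cluster whose representative is close enough."""
    clusters = []                          # each cluster is a list of stroke indices
    for i, s in enumerate(strokes):
        for c in clusters:
            if stroke_distance(strokes[c[0]], s) < threshold:
                c.append(i)
                break
        else:
            clusters.append([i])
    return clusters
```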
Finally, we introduce the concept of rigid movement and investigate its potential use to strengthen interaction and offer users more flexibility in articulating gestures. In particular, we show how it can free users from having to use a predefined number of fingers, and from following a predefined trace, when articulating a gesture.
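The abstract introduces rigid movement only conceptually. One way such movement could be estimated, sketched below under the assumption that the set of finger contacts is treated as a rigid body, is a least-squares (Kabsch-style) fit of the rotation and translation mapping the fingers' previous positions onto their current ones; the function name and the fitting choice are illustrative assumptions, not the procedure used in the thesis.

```python
# Illustrative sketch: fit a 2D rigid motion (rotation + translation) to finger positions.
import numpy as np

def fit_rigid_motion(prev_pts, curr_pts):
    """Least-squares rotation R (2x2) and translation t such that curr ~= prev @ R.T + t."""
    P = np.asarray(prev_pts, dtype=float)
    Q = np.asarray(curr_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)              # 2x2 cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # keep det(R) = +1 (no reflection)
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```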
Keywords: Multi-touch gestures, user studies, gesture taxonomy, gesture variability, gesture analysis, gesture structure, gesture geometry, gesture kinematics, gesture articulation difficulty, gesture recognition, structuring finger movements, gestural interaction techniques.