Yosra Rekik, Inria Lille & LIFL, University of Lille 1, France
Radu-Daniel Vatavu, University Stefan cel Mare of Suceava, Romania
Laurent Grisoni, Inria Lille & LIFL, University of Lille 1, France
About
We show that users are consistent in their assessments of the articulation difficulty of multi-touch gestures, even under the many degrees of freedom afforded by multi-touch input, such as (1) varying numbers of fingers touching the surface, (2) varying numbers of strokes that structure the gesture shape, and (3) single-handed and bimanual input. To understand more about perceived difficulty, we characterize gesture articulations captured under these conditions with geometric and kinematic descriptors computed on a dataset of 7,200 samples of 30 distinct gesture types collected from 18 participants. We correlate the values of the objective descriptors with users' subjective assessments of articulation difficulty and report path length, production time, and gesture size as the highest correlators (max Pearson's r = .95). We also report new findings about multi-touch gesture input, e.g., gestures produced with more fingers are larger in size and take more time to produce than single-touch gestures; bimanual articulations are not only faster than single-handed input, but they are also longer in path length, contain more strokes, and result in gesture shapes that are deformed horizontally by 35% on average. We use our findings to outline 14 guidelines that assist multi-touch gesture set design and recognizer development, and inform gesture-to-function mappings through the prism of the user-perceived difficulty of gesture articulation.
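As an illustration only (not the code used in the paper), the sketch below shows one way to compute the three descriptors that correlated most with perceived difficulty, i.e., path length, production time, and gesture size, from a single logged gesture sample, together with a plain Pearson correlation. The point representation (x, y, touchId, t) follows the logging format described further down this page; everything else is an assumption.

import math

def descriptors(points):
    """points: list of (x, y, touchId, t) tuples, t in milliseconds."""
    # Path length: sum of Euclidean distances between consecutive points
    # of the same touch (i.e., within each stroke).
    by_touch = {}
    for x, y, touch_id, t in points:
        by_touch.setdefault(touch_id, []).append((x, y, t))
    path_length = 0.0
    for stroke in by_touch.values():
        for (x0, y0, _), (x1, y1, _) in zip(stroke, stroke[1:]):
            path_length += math.hypot(x1 - x0, y1 - y0)
    # Production time: elapsed time between the first and last touch event.
    times = [t for _, _, _, t in points]
    production_time = max(times) - min(times)
    # Gesture size: diagonal of the bounding box of all touch points.
    xs = [x for x, _, _, _ in points]
    ys = [y for _, y, _, _ in points]
    size = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    return path_length, production_time, size

def pearson_r(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

To relate a descriptor to perceived difficulty, compute it for each sample, average per gesture type, and pass the descriptor values and the corresponding difficulty ratings to pearson_r.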
If you find the dataset useful for your work, please let me know. If you use the dataset to report results in publications, please reference the work below.
Multi-touch gestures dataset
We employed 30 gesture types for our experiment addressing the effect of the number of (a) fingers, (b) strokes, and (c) hands on perceived difficulty (with 10 gestures per task). All gestures for the stroke task were carefully chosen so that they could be articulated both as a single stroke and as multiple strokes. All gestures selected for the number-of-hands task exhibit a symmetry axis, as it has been observed previously that only symmetric gestures can be conveniently parallelized during articulation. In addition, half of the gesture shapes (the left 15 gestures (d)) were selected to appear familiar to our participants, where we defined the familiarity of a shape as previous, frequent practice articulating that specific shape during everyday handwriting (e.g., letters and simple geometric shapes). The remaining half of the gesture set (the right 15 gestures (e)) was designed to be unfamiliar to our participants, which enabled us to collect their subjective perceptions of articulation difficulty on the first encounter with a new geometric shape (e.g., the "four curlicue" or the "stroke-through" gestures).
Please note that we chose to investigate symbolic multi-touch gestures for this experiment instead of gestures traditionally used for multi-touch interaction (e.g., pan, zoom, rotate), as they generalize more readily to other applications.
Downloads are available at the bottom of this page:
* The dataset
* Rating Scores
* Ranking Scores
Gestures were collected on a 32-inch (81.3 cm) multi-touch display, 3M C3266PW, supporting up to 40 simultaneous touches. The application logged touch coordinates with touch identification numbers and associated timestamps (x, y, touchId, t).
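The snippet below is a minimal loading sketch, not part of the dataset tools: it groups the logged touch events of one gesture into strokes by touchId. The exact file layout of the downloads is not described on this page, so the one-event-per-line, comma-separated parsing is an assumption that may need adjusting.

from collections import defaultdict

def load_gesture(path):
    """Return a dict mapping touchId -> list of (x, y, t), in logged order."""
    strokes = defaultdict(list)
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            # Assumed layout per line: x,y,touchId,t (comma-separated).
            x, y, touch_id, t = line.split(",")
            strokes[int(touch_id)].append((float(x), float(y), float(t)))
    return strokes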
Overall, the Multi-touch Gestures dataset contains 7,200 samples:
Finger count:
18 participants x
10 gesture types x
3 variations (1F, 2F, and 3+F) x
5 executions
= 2,700 gesture samples
Stroke count:
18 participants x
10 gesture types x
3 variations (1S, 2S, and 3+S) x
5 executions
= 2,700 gesture samples
Synchronicity:
18 participants x
10 gesture types x
2 variations (1H and 2H) x
5 executions
= 1,800 gesture samples