MAHATMA

Multiscale Analysis of Human and Artificial Trajectories: Models and Applications


Brief

Trajectories are paths followed by objects in motion through space, expressed as functions of time. Our senses continuously detect and interpret trajectories so that we can survive in environments populated by human, animal, and artificial moving agents. In general, trajectories unfold in both space and time, with degenerate cases arising when one of the two domains collapses to a single point: in space (e.g., point-like tactile stimulation) or in time (e.g., the trace left by a snake on the sand).
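For concreteness, a trajectory can be represented computationally as a time-stamped sequence of spatial samples, with the two degenerate cases corresponding to a collapsed spatial or temporal domain. The Python sketch below is purely illustrative; the class and method names are assumptions, not part of the project.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Trajectory:
    """A trajectory sampled as (time, position) pairs; position is (x, y, z)."""
    samples: List[Tuple[float, Tuple[float, float, float]]]

    def is_degenerate_in_space(self) -> bool:
        # All samples share one position, e.g., point-like tactile stimulation.
        positions = {p for _, p in self.samples}
        return len(positions) <= 1

    def is_degenerate_in_time(self) -> bool:
        # All samples share one instant, e.g., a static trace left on sand.
        times = {t for t, _ in self.samples}
        return len(times) <= 1
```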

The project aims to provide consistent representations of trajectories at different scales of the egocentric space, through a variety of technological means that exploit each sense at the scale where it is most effective. In particular, the proximal space is where the stimulus is in direct contact with the body, touch is most effective, and communication is inherently intimate; the peripersonal space is where displayed trajectories are within reach, and communication is private or semi-private, occurring acoustically all around the listener or visually within the field of view; the public space is where trajectories unfold across large areas, are shareable within crowds, and are audio-visual.
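As an illustrative sketch only (the scale names follow the paragraph above; the modality assignments are a reading of it, not a project specification), the three scales and their dominant display modalities could be encoded as:

```python
from enum import Enum

class Scale(Enum):
    PROXIMAL = "proximal"          # stimulus in direct contact with the body
    PERIPERSONAL = "peripersonal"  # displayed trajectories within reach
    PUBLIC = "public"              # trajectories unfolding across large areas

# Dominant display modalities per scale, as described in the brief.
MODALITIES = {
    Scale.PROXIMAL: {"touch"},
    Scale.PERIPERSONAL: {"audio", "visual"},
    Scale.PUBLIC: {"audio", "visual"},
}
```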

At all scales, the trajectories produced by humans or other animals have characteristics that let us recognize them as biological motion. In order to extend natural social interaction to virtual or hybrid settings, it is important to gain a robust understanding of biological trajectories at all scales and across the senses, and to transfer such knowledge into technological solutions that seamlessly extend communication via natural trajectories to mixed realities.
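One such characteristic, taken from the motor-control literature rather than prescribed by this brief, is the smoothness of the velocity profile, often quantified by a dimensionless jerk metric: human point-to-point movements tend to minimize jerk, so lower values suggest more biological-looking motion. A minimal sketch, assuming a uniformly sampled trajectory (normalization conventions vary):

```python
import numpy as np

def dimensionless_jerk(positions: np.ndarray, dt: float) -> float:
    """Integrated squared jerk of a sampled trajectory, normalized by
    duration and peak speed so that the result is dimensionless.

    positions: array of shape (T, D), sampled every dt seconds.
    Lower values indicate smoother, more biological-looking motion.
    """
    velocity = np.gradient(positions, dt, axis=0)
    acceleration = np.gradient(velocity, dt, axis=0)
    jerk = np.gradient(acceleration, dt, axis=0)
    duration = dt * (positions.shape[0] - 1)
    peak_speed = np.linalg.norm(velocity, axis=1).max()
    integrated_sq_jerk = np.sum(np.linalg.norm(jerk, axis=1) ** 2) * dt
    return integrated_sq_jerk * duration ** 3 / peak_speed ** 2
```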

In some fields the characteristics of human-generated trajectories have been thoroughly studied. For example, there are empirical mathematical laws describing the motor act of drawing with pencil on paper. In human-computer interaction, Fitts' law, the steering law, and kinematic theories describe fundamental actions such as pointing or path following. However, it is still not clear whether and how such laws extend across scales and senses, and whether laws and models can be found that capture the expressive content of trajectories.
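As a concrete illustration of the kind of laws referred to above, the sketch below writes down Fitts' law for pointing (Shannon formulation) and the two-thirds power law relating speed and curvature in drawing and handwriting; the constants a, b, and gain are empirically fitted, and the values used here are placeholders.

```python
import math

def fitts_movement_time(distance: float, width: float,
                        a: float = 0.1, b: float = 0.15) -> float:
    """Fitts' law (Shannon formulation): MT = a + b * log2(D/W + 1).
    a and b are empirically fitted constants; placeholder values here."""
    return a + b * math.log2(distance / width + 1)

def two_thirds_power_speed(curvature: float, gain: float = 1.0) -> float:
    """Two-thirds power law of drawing: tangential speed is proportional
    to curvature raised to the power -1/3 (equivalently, angular speed is
    proportional to curvature to the power 2/3). 'gain' is an empirical
    velocity gain factor; placeholder value here."""
    return gain * curvature ** (-1.0 / 3.0)

# Example: pointing at a 2 cm wide target 20 cm away (placeholder constants).
print(f"Predicted movement time: {fitts_movement_time(0.20, 0.02):.3f} s")
```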

The project develops new mathematical and computational models of expressive trajectories for under-investigated senses (touch) and scales (public-address sound). Supramodal models of trajectories will be derived through cross-sensory translation by means of experimental phenomenology, and validated through controlled experiments. Explainable, feature-based models will be developed as extensions of existing models and will inform the development of featureless, data-driven models. Applications will be developed, as proofs of concept for the trajectory models, in the areas of sport and the performing arts, with implications for a variety of fields, including culture- and art-enabled motor reactivation and rehabilitation, and navigation aids for sensory-impaired people.