
Research interests


My research interests focus on the biomechanical analysis and modelling of human locomotor trajectories, on interactions between people, and on the use of Virtual Reality for experimental analysis.


Human locomotion

Walking is an essential human motor function that depends on the walker's goals and on the properties of the environment (e.g., static or moving obstacles, geometry of the surroundings). Humans must therefore constantly adapt their path to their surroundings in order to move efficiently. This behaviour allows them to avoid obstacles, to change walking direction to meet somebody, or to turn in a corridor. Straight walking does not reflect all the complexity of everyday-life trajectories; therefore, the analysis and modelling of human walking needs to consider both straight and curved trajectories.

Straight-walking modelling

During straight walking, the CoM does not actually follow a pure straight line. Indeed, with respect to the stepping activity, the CoM trajectory oscillates from side to side. These oscillations suggest that the walking trajectory could be controlled, at least partially, according to a discrete scheme. Velocity can be associated with an antero-posterior control of displacement, and curvature (i.e., the inverse of the radius of curvature) with a medio-lateral control. Velocity and curvature adjustments therefore have to be studied instantaneously but also in relation to the stepping activity.


We first examined human walking using a new step-by-step approach. The CoM trajectory of walking participants was modelled as a succession of circular arcs, one per step (figure a). Each arc was characterized by a curvature and a velocity (average values between two consecutive steps). This step-by-step model proved accurate, as the RMS error between the model and the actual CoM trajectory was negligible (figure b). The model therefore gives a discrete and simple geometric description of the walker's trajectory, which can be reduced to one velocity-curvature pair per step.
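As an illustration, the per-step parameters can be estimated from sampled CoM data. The sketch below is ours, not the published implementation: function names are hypothetical, and the per-step curvature is approximated by the circumcircle through the two footfall points and the midpoint of the step.

```python
import numpy as np

def arc_curvature(p0, p1, p2):
    """Curvature (1/R) of the circumcircle through three 2D points."""
    a = np.linalg.norm(p1 - p0)
    b = np.linalg.norm(p2 - p1)
    c = np.linalg.norm(p2 - p0)
    cross = (p1[0] - p0[0]) * (p2[1] - p0[1]) - (p1[1] - p0[1]) * (p2[0] - p0[0])
    area = 0.5 * abs(cross)
    if a * b * c == 0.0:
        return 0.0  # degenerate (coincident points)
    return 4.0 * area / (a * b * c)

def step_parameters(com_xy, t, step_idx):
    """Per-step (velocity, curvature) pairs.

    com_xy: (N, 2) array of CoM positions, t: (N,) times,
    step_idx: sample indices of successive footfalls.
    """
    params = []
    for i0, i1 in zip(step_idx[:-1], step_idx[1:]):
        seg = com_xy[i0:i1 + 1]
        dist = np.sum(np.linalg.norm(np.diff(seg, axis=0), axis=1))
        v = dist / (t[i1] - t[i0])            # average speed over the step
        mid = (i0 + i1) // 2
        c = arc_curvature(com_xy[i0], com_xy[mid], com_xy[i1])
        params.append((v, c))
    return params
```

For a trajectory sampled on a circle of radius 5 m travelled at 1.2 m/s, each step recovers a curvature of 0.2 m⁻¹ and a speed of 1.2 m/s, as expected.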




Secondly, we analysed the velocity-curvature relation in straight walking. Results showed a power law relation between these parameters in the [V, Cs] space, for all steps of all participants. The regulation of velocity and curvature may therefore follow a common principle that is invariant in human straight walking. This invariance provides a law of motion that can be used to understand trajectory generation or to model locomotion.
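A power law of the form V = k·Cs^β is commonly fitted by linear regression in log-log space. A minimal sketch (the function name and use of `numpy.polyfit` are our choices, not the published method):

```python
import numpy as np

def fit_power_law(v, c):
    """Least-squares fit of v = k * c**beta in log-log space.

    Assumes strictly positive velocity (v) and curvature (c) arrays.
    Returns the estimated (k, beta).
    """
    beta, log_k = np.polyfit(np.log(c), np.log(v), 1)
    return float(np.exp(log_k)), float(beta)
```

On synthetic data generated exactly from v = 1.5·c^(-0.33), the fit recovers k = 1.5 and β = -0.33.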


 



Turn detection

 

 

When a [V, Cs] pair falls outside the 95% confidence interval of the V-Cs relation, the participant's behaviour differs from straight walking. This law can thus be used to detect steps that are not performed in straight walking: in this way, a turn can be easily and automatically detected.
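The detection rule can be sketched as an outlier test on the log-residuals of the straight-walking power law. This is our own minimal illustration (a symmetric ±1.96σ band around the fit, with hypothetical parameter values), not the published procedure:

```python
import numpy as np

def detect_turns(v, c, k, beta, sigma):
    """Flag steps whose [V, Cs] pair leaves the ~95% band around the
    straight-walking power law v = k * c**beta.

    sigma: standard deviation of log-residuals measured on
    straight-walking steps (assumed known here).
    """
    resid = np.log(v) - (np.log(k) + beta * np.log(c))
    return np.abs(resid) > 1.96 * sigma
```

A step whose velocity departs strongly from the law (e.g., doubled) is flagged while steps lying on the law are not.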







Power Law Relation

Neuroscientific approaches have provided an important invariant linking kinematics and geometry in locomotion: a power law governs the relation between the radius of curvature and the velocity of the trajectory followed. However, these trajectories were predefined and cyclic, and thus cannot be considered fully natural. We investigated whether this relationship still holds for a single unconstrained turn, which is closer to an everyday-life movement. Two approaches were developed: an intra-individual one, along each turn of each trial, and an inter-individual one, based on the specific instant at which a subject's trajectory reaches its maximal curvature. The intra-individual approach did not reveal any power law between velocity and curvature along a single trial. Nevertheless, the inter-individual approach showed a power law across the whole set of "minimal radius of curvature / associated velocity" pairs. The speed-curvature relation thus appears to be a "long-term" motor control law linked to the goal of the turning task, rather than a "short-term" one governing trajectory following throughout the motion.






Interaction between walkers

Human navigation consists not only in moving from a point A to a point B but also in avoiding stationary and moving obstacles on the travel path while interacting with other people. A challenging question is therefore to understand how humans adapt their trajectory to external disturbances.


Collision avoidance strategies


We investigated collision avoidance between two walkers. We first focused on the conditions that induce avoidance manoeuvres in locomotor trajectories. Under the hypothesis of a reciprocal interaction, we proposed a variable common to both walkers, denoted the Minimum Predicted Distance (MPD, i.e., the distance of closest approach), to predict motion adaptations. Results showed that participants adapt their motion only when required, i.e., when MPD is too low (< 1 m). Human walkers are thus able to accurately estimate the future crossing distance and to mutually adapt it. Collision avoidance can be described by three successive phases: observation, reaction, and regulation. These phases respectively correspond to periods of time when, first, MPD is constant; second, MPD increases to acceptable values through motion adaptation; and third, MPD reaches a plateau and even slightly decreases. This regulation phase demonstrates that collision avoidance is actually performed with anticipation.

Second, we studied the role of each walker. A specific question was: do the walker giving way (second at the crossing) and the one passing first set similar and coordinated strategies? To answer this question, we inspected the effect of motion adaptations on MPD. Results showed that collision avoidance is performed collaboratively, but that the crossing order impacts both the contribution and the strategies used: the participant giving way contributed more than the one passing first to avoiding the collision. Both walkers reoriented their path, but the participant giving way also adapted their speed. Future work is planned to investigate the influence of crossing angle and time-to-collision (TTC) on adaptations, as well as new types of interactions, such as intercepting or meeting tasks.
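MPD is the distance of closest approach; a common way to compute it, sketched below under the assumption that both walkers keep their current velocity (linear extrapolation, our illustration rather than the published estimator):

```python
import numpy as np

def minimum_predicted_distance(p1, v1, p2, v2):
    """Closest-approach distance between two walkers, assuming each keeps
    its current position p and velocity v (linear extrapolation).

    Returns (mpd, time_of_closest_approach); only future times (t >= 0)
    are considered.
    """
    dp = np.asarray(p2, float) - np.asarray(p1, float)
    dv = np.asarray(v2, float) - np.asarray(v1, float)
    dv2 = float(dv @ dv)
    # Minimise |dp + t*dv| over t >= 0
    t_star = 0.0 if dv2 == 0.0 else max(0.0, -float(dp @ dv) / dv2)
    return float(np.linalg.norm(dp + t_star * dv)), t_star
```

For two walkers on exactly intersecting orthogonal paths the MPD is zero; shifting one start point reduces the threat, and an MPD below 1 m would be the trigger for adaptation in the description above.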

 

Emotionally expressive interactions

The expression of emotions and their recognition by others play a major role in everyday human social life. Emotionally tinted facial expressions have been studied the most, but more recent studies also highlighted the importance of whole-body movement when facial expressions or verbal communication cannot be perceived. Montepare et al. (1987) suggested that we often make social judgments from gait information, identifying for example others' goals, attitudes, or emotional states. In the case of walking, emotions affect the global posture, the gait kinematics, and coordination. Previous studies on locomotion and emotion concerned single individuals. However, daily social life is defined by interaction with others, in which coordination and social synchrony were shown to play a very important role. As emotions impact individual human behaviour, they also influence interactions between humans. Despite the extensive literature on emotions and interaction, none of the previous studies addressed the commonly occurring combined effect of emotions and interaction in locomotor tasks. Interactions occur during locomotion as soon as two people walk in the same vicinity: they can be avoiding, meeting, following, or intercepting each other, etc. We hypothesize that the kinematics of locomotion bears important clues for understanding the interactions within a group and the nature of the link between walkers.



Emotionally expressive walking for unconstrained trajectories

The effect of emotions on walking gait has been demonstrated in previous studies, which were limited to predefined straight-walking paths. However, these conditions are not fully representative of human walking. Moreover, to study interaction between walkers, each walker needs to freely adapt their locomotion trajectory to the other. We therefore measured the influence of emotions on individual walking gait during free locomotion (including curved paths). Our observations are consistent with previous straight-walking studies, with an effect of emotions on both the posture and the kinematics of walking. Free trajectories thus appear to be a relevant case study for investigating the influence of emotions on motion while enhancing the ecological realism of the situation.



Interaction mechanisms during emotional walking

We inspected the effect of emotions on the kinematics of two actors walking together. We showed that anger resulted in faster and more jerky motion than a neutral state, and the opposite for fear. This signature was stronger in symmetric conditions of interaction: when both actors played anger, i.e., an anger-anger interaction, the couple's speed was even higher than for an anger-neutral interaction. Moreover, we showed that symmetric conditions of emotion resulted in symmetric physical interactions, whereas asymmetric conditions unbalanced the couple's interaction. In symmetric conditions (anger-anger, fear-fear, or neutral-neutral), actors walked more side by side and were more in phase. In contrast, in asymmetric conditions (anger-neutral or fear-neutral), one of the actors walked ahead of, and in phase advance over, the other. We highlight leader-follower emergence in asymmetric conditions and cooperation in symmetric conditions of emotion.




Virtual Reality

Virtual reality is a powerful tool for perception-action experiments. Human data acquisition in real situations is fundamental to provide strong motion analysis while maximizing the ecological validity of the task. Nevertheless, it is sometimes challenging to accurately control the experimental conditions or to modify them with subtle changes, especially when the experiment involves interaction between humans. In this context, virtual reality is a relevant and complementary tool to overcome these difficulties. However, being immersed in a virtual environment can modify distance perception or gait speed, and can lead to unstable gait or motion sickness. Moreover, to let a user move through a large virtual environment, motion interfaces are required. In that case, the motion applied to the device differs from the motion the user would have performed in real conditions (e.g., the hand acting on a joystick to simulate a walking motion), and the resulting motion also depends on the associated transfer function. It is therefore essential to determine, for a given task, the most appropriate techniques and interfaces to achieve the best match between real and virtual conditions.


Visual perception of collision avoidance model

Enabling realistic interactions between real and virtual humans during navigation tasks in virtual reality first requires validating that a real user can correctly perceive the motion of a virtual human. We focus on collision avoidance tasks. Our goal is to evaluate the information conveyed by an animation of a walking virtual human (right image), in comparison with similar real situations (left image). Our main question is: does the conveyed information allow a real observer to make realistic navigation decisions from the visualization of a synthetic walker?



Results using a simple screen display showed that judging crossing order was easier than detecting collisions. Crossing order implies different collision avoidance strategies between two real walkers: roles in the interaction are then correctly anticipated. We can therefore assume that this display modality allows users to perceive the right kind of reaction, but not the appropriate amount of adaptation. This difference can be explained by studying the bearing angle (i.e., the angle between the walking direction and the direction in which another walker is perceived). Indeed, the relative motion between the observer and a moving obstacle can be defined by the bearing angle and its derivative. A collision is detected when the bearing angle derivative is close to zero, i.e., below a threshold: observers thus have to accurately determine whether the bearing angle derivative falls below this value. The crossing order, on the contrary, depends on the signs of both the bearing angle and its derivative: when their signs are equal, the observer passes first; conversely, when their signs are opposite, the observer gives way to the virtual walker. Judging crossing order then amounts to determining the quadrant matching the actual situation.
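The bearing-angle reasoning above can be sketched in a few lines. Function names, the decision rule's packaging, and the threshold value are our own illustrative choices, not the experimental implementation:

```python
import math

def bearing_angle(obs_pos, obs_heading, target_pos):
    """Angle between the observer's walking direction (obs_heading, rad)
    and the direction in which the target walker is perceived."""
    dx = target_pos[0] - obs_pos[0]
    dy = target_pos[1] - obs_pos[1]
    ang = math.atan2(dy, dx) - obs_heading
    return (ang + math.pi) % (2.0 * math.pi) - math.pi  # wrap to (-pi, pi]

def crossing_judgement(bearing, bearing_dot, eps=0.01):
    """Decision rule from the text: a near-zero bearing-angle derivative
    signals collision; equal signs of angle and derivative mean the
    observer passes first, opposite signs mean the observer gives way."""
    if abs(bearing_dot) < eps:
        return "collision"
    return "passes first" if (bearing > 0) == (bearing_dot > 0) else "gives way"
```

Collision detection thus requires judging the magnitude of a derivative against a threshold, whereas crossing order only requires its sign relative to the angle's sign, which is consistent with order being the easier judgement.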




We found similar results when conducting this experiment in a CAVE. On average, participants were able to correctly estimate the condition of interaction with the virtual human, and the crossing-order condition was identified earlier than the collision condition. As explained above, estimating order is equivalent to determining whether the relative motion is above or below a threshold (which results in passing in front or behind), and may be simpler than estimating the risk of collision. Correctly estimating order is also more important, as order determines which kind of adaptation improves the situation. Collision accuracy was lower in some conditions, as if there were a gap in participants' perception of their own envelope in the virtual world, i.e., as if they felt they were behind their actual virtual position. In conclusion, participants correctly perceive the situation of interaction with the virtual character, but information about collision may be delayed in comparison with reality, and the position in the virtual environment is perceived with an offset.




Evaluation of locomotion metaphors and transfer functions for locomotion and collision avoidance studies

Walking virtually through virtual environments is a fundamental requirement in numerous virtual reality (VR) applications. This task is greatly influenced by the locomotion interface, by the specificities of input and output devices, and by the way the virtual environment is represented. No matter how virtual walking is controlled, generating realistic virtual trajectories is essential, especially for applications dedicated to the study of walking behaviours in VR, navigation through virtual places for architecture, rehabilitation, and training. Previous studies evaluating the realism of locomotion trajectories have mostly considered the outcome of the locomotion task (efficiency, accuracy) and its subjective perception (presence, cybersickness). Few focused on the locomotion trajectory itself, and only in geometrically constrained tasks. We studied the realism of unconstrained trajectories produced during virtual walking by addressing the following question: did the user reach the destination by virtually walking along a trajectory they would have followed in similar real conditions? We proposed a comprehensive evaluation framework consisting of a set of trajectographical criteria and a locomotion model to generate reference trajectories. We considered a simple locomotion task in which users walked between two oriented points in space. The travel path was analysed both geometrically and temporally against simulated reference trajectories. In addition, we demonstrated the framework through a user study covering an initial set of common virtual walking conditions, namely different input devices, output display devices, control laws, and visualization modalities. The study provides insight into the relative contribution of each condition to the overall realism of the resulting virtual trajectories.




We evaluated the level of realism of collision avoidance trajectories performed using a VR system. We tested different locomotion interfaces and asked which of them enable participants to adapt their trajectories as they would have done in reality. As a first evaluation, we considered four main conditions to control motion: natural walking in VR, a joystick with several transfer functions, and two locomotion metaphors involving whole-body motion. All the studied interfaces led to qualitatively realistic trajectories, with some quantitative differences in avoidance distances or strategies. The interface that best matched human behaviour observed in real conditions was the joystick with a transfer function corresponding to automatic forward motion at comfortable speed and rotation-based control of orientation. These initial results are promising and open several perspectives for the study of human interactions.
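The best-matching transfer function described above can be sketched as follows; the function name, gains, and comfortable-speed value are illustrative assumptions, not the published parameters:

```python
import math

def joystick_transfer(x, y, heading, yaw_axis, dt,
                      v_comfort=1.33, turn_rate_max=1.0):
    """Sketch of a 'automatic forward motion + rotation control' transfer
    function: the avatar always advances at a comfortable speed, while the
    joystick lateral axis (yaw_axis in [-1, 1]) controls the rotation rate.

    x, y: position (m); heading: orientation (rad); dt: time step (s).
    """
    heading += yaw_axis * turn_rate_max * dt   # orientation control only
    x += v_comfort * math.cos(heading) * dt    # constant forward speed
    y += v_comfort * math.sin(heading) * dt
    return x, y, heading
```

With this scheme the user never controls speed directly, which removes one degree of freedom and, per the results above, yields avoidance trajectories closest to real walking.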




Interaction with a crowd


Some crowd simulators consider that agents may be moving in groups. An important and yet unaddressed question is: how do individuals interact with such groups? In particular, during collision avoidance, should individuals go through groups or around them? We propose an experiment based on our VR platform to study this behaviour. We consider groups of constant size and study how individuals avoid collisions when encountering them. We consider three factors: the spacing between people within the group, the visual aspect of people in the group, and the relative direction of motion of the group. Results showed that spacing between characters in the group is the most important factor in individuals' decision to go through or around groups. These results, based on real data acquired in VR, were used to adapt an existing crowd simulator (RVO2) in a realistic way.








Interaction with an emotionally expressive virtual walker


We are interested in whole-body, emotionally tinted situations of interaction between real and virtual walkers. We first measured the effect of emotions on the kinematics and metrics of interaction between two real walkers. Then, we reproduced a similar situation of interaction between a real subject and a virtual walker expressing emotions, and compared real-real interactions with real-virtual ones. Similar effects were observed on the kinematics of interaction in both experiments. We demonstrated the ability of real subjects to perceive emotions expressed by a virtual character through whole-body motion, and showed that real subjects' reactions to the behaviour of an expressive virtual character complied with their reactions to the behaviour of an expressive real human walker. This result is one step toward using such a virtual reality platform to study social interactions through fully controlled experiments.