Defining paths

This functionality allows the user to define a task for the robot via a set of predefined gestures, that is, to define a path that the robot will later execute autonomously.

The locomotion orders are calculated according to the user's position relative to the robot, using the gestures presented in the following images. The movement displayed in Image A is recognized as the starting gesture, Image B represents the gesture for reference point recognition (spotting), and the movement displayed in Image C is recognized as the stop gesture.

[Figure: A. Starting gesture recognized by the robot; B. Spotting gesture recognized by the robot; C. Stop gesture recognized by the robot]
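To make the follow-the-user behavior concrete, the sketch below derives a locomotion order from the user's position relative to the robot. This is a minimal illustration, not the system's actual controller; the function name, thresholds, and order labels are assumptions.

```python
import math

def locomotion_order(user_pos, robot_pos, robot_heading):
    """Derive a simple follow command from the user's position relative
    to the robot. Hypothetical sketch; thresholds are illustrative."""
    dx = user_pos[0] - robot_pos[0]
    dy = user_pos[1] - robot_pos[1]
    distance = math.hypot(dx, dy)
    # Bearing of the user in the robot's frame, normalized to (-pi, pi]
    bearing = math.atan2(dy, dx) - robot_heading
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))
    if distance < 0.5:
        return ("stand", 0.0)      # close enough: hold position
    if abs(bearing) > math.radians(20):
        return ("turn", bearing)   # face the user first
    return ("forward", distance)   # then approach

# User 2 m directly ahead of a robot facing along +x:
print(locomotion_order((2.0, 0.0), (0.0, 0.0), 0.0))  # ('forward', 2.0)
```

The two-stage decision (turn until roughly aligned, then advance) is one common way to keep the user in the gesture camera's field of view while following.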

Thus, the user must perform a gesture to indicate the beginning of the robot's path. As the robot follows the user, the latter performs gestures to add landmarks to the path. To complete the process, the user performs the stop gesture. The memorized points are then associated with a mission that may later be selected for the robot to run autonomously.
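The recording workflow above (start, memorize landmarks, stop, store as a mission) can be sketched as a small state machine. This is an illustrative sketch only; the class and gesture names are assumptions, not the system's actual implementation.

```python
from enum import Enum, auto

class Gesture(Enum):
    START = auto()  # begin recording a path
    SPOT = auto()   # memorize the current pose as a landmark
    STOP = auto()   # finish and store the path as a mission

class PathRecorder:
    """Records landmarks between START and STOP gestures (hypothetical sketch)."""
    def __init__(self):
        self.recording = False
        self.landmarks = []  # reference points of the path being recorded
        self.missions = []   # completed paths, selectable for autonomous runs

    def on_gesture(self, gesture, robot_pose):
        if gesture is Gesture.START:
            self.recording = True
            self.landmarks = [robot_pose]           # path begins at the current pose
        elif gesture is Gesture.SPOT and self.recording:
            self.landmarks.append(robot_pose)       # memorize a reference point
        elif gesture is Gesture.STOP and self.recording:
            self.recording = False
            self.missions.append(list(self.landmarks))  # associate points with a mission

# Example session: start, two landmarks, stop
rec = PathRecorder()
rec.on_gesture(Gesture.START, (0.0, 0.0))
rec.on_gesture(Gesture.SPOT, (1.0, 0.5))
rec.on_gesture(Gesture.SPOT, (2.0, 1.0))
rec.on_gesture(Gesture.STOP, (2.0, 1.0))
print(len(rec.missions), len(rec.missions[0]))  # 1 3
```

Guarding SPOT and STOP on the recording flag means stray gestures outside a session are simply ignored, which mirrors the described flow where only the start gesture opens a path definition.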