Robot Localization & Navigation Using Semantic Floor Maps

This was an ambitious project that I explored during the initial years of my PhD as part of the ADACOMP lab. The goal was to enable a robot to navigate visually in indoor environments given just a floor map of the space, instead of the traditional high-quality SLAM maps that are difficult to collect at scale. For this, we envisioned an agent that could localize itself by correlating its visual feed with semantic landmarks marked in the floor map (the map legend). I developed a localization module to support this: an end-to-end deep particle filter that localized the agent on a topological Voronoi graph of the environment. Navigation was achieved by interfacing the localization module with a custom Intention-Net controller that generated actions based on the agent's current position in the graph and a high-level path planner. Although the project showed promise, it unfortunately had to be shelved. Still, the experience was enlightening and I learned a lot in the process. You can find my report and the resources I collected on this topic here.
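To give a flavor of the idea, here is a minimal toy sketch of particle-filter localization on a topological graph with semantic landmarks. Everything here is illustrative: the graph, landmark labels, and weighting scheme are my own simplifications for this post, not the deep, learned version used in the project.

```python
import random
from collections import Counter

# Hypothetical toy environment: a topological graph whose nodes carry
# semantic landmark labels (as might come from a floor-map legend).
GRAPH = {
    0: {"neighbors": [1], "landmark": "door"},
    1: {"neighbors": [0, 2], "landmark": "stairs"},
    2: {"neighbors": [1, 3], "landmark": "elevator"},
    3: {"neighbors": [2], "landmark": "door"},
}

def motion_update(particles):
    """Move each particle to a random neighboring node (no odometry model)."""
    return [random.choice(GRAPH[p]["neighbors"]) for p in particles]

def measurement_update(particles, observed):
    """Weight particles by agreement with the observed semantic landmark,
    then resample. A small floor weight keeps mismatched particles alive."""
    weights = [1.0 if GRAPH[p]["landmark"] == observed else 0.05
               for p in particles]
    return random.choices(particles, weights=weights, k=len(particles))

def estimate(particles):
    """The most common node among the particles is the position estimate."""
    return Counter(particles).most_common(1)[0][0]

random.seed(0)
particles = [random.randrange(len(GRAPH)) for _ in range(200)]
# Simulated walk 0 -> 1 -> 2, with a landmark observation after each move.
for obs in ["stairs", "elevator"]:
    particles = motion_update(particles)
    particles = measurement_update(particles, obs)
print(estimate(particles))
```

In the actual project, the hand-written motion and measurement models above were replaced by learned components, and the whole filter was trained end to end; the graph structure came from a Voronoi decomposition of the floor map rather than being specified by hand.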