Kartik Garg
I am a researcher at the Stochastic Robotics Lab, IISc Bangalore, headed by Prof. Shishir Kolathaya, where I work on making a robot dog see and walk. I also collaborate with Dr. Sourav Garg from AIML, University of Adelaide, on developing viewpoint-invariant visual place recognition systems. I completed my bachelor's in Mechanical Engineering at the Indian Institute of Technology (BHU) Varanasi.
Previously:
I worked with Prof. Michael Milford and Dr. Sourav Garg at the Centre for Robotics, Queensland University of Technology, developing robust 3D localization models using sparse convolutions.
I interned at the Flight Robotics and Perception Group, University of Stuttgart, as a DAAD WISE Scholar (Indo-German exchange program), where I developed a deep-RL-based drone docking system under the supervision of Prof. Aamir Ahmad and Pascal Goldschmid.
I have also worked on biomedical control for prosthetics and control of bipedal robots.
In my free time I quiz and design cool posters :)
I am always happy to learn and open to networking.
Research
My research interest lies in creating robots that can interact with their environment and make autonomous decisions. Perception-based control is a fundamental "skill" any robot must be taught in order to be useful to human society. I am interested in adopting methods from SLAM and learning-based control to develop this "common sense" in robots. I am a huge fan of sci-fi movies and want to build robots like Baymax and WALL-E to help humans and society.
This video perfectly encapsulates my vision for the future of society with robots.
Publications
A Standalone Real-Time Gait Phase Detection Using Fuzzy-Logic Implementation in Arduino Nano. SN COMPUT. SCI. 3, 21 (2022). [Link]
Sachin Negi, Kartik Garg, Milind Prajapat, Neeraj Sharma
Multi-stage model to optimize disaster preparedness and response under uncertainty. Socio-Economic Planning Sciences (Under Review)
Surabhit Gupta, Kartik Garg, Lakshay Taneja
Projects
Segment Recognition (Ongoing)
Kartik Garg, Shubodh Sai, Sourav Garg, Madhava Krishna
Developing viewpoint-invariant visual place recognition models by leveraging the zero-shot capabilities of large vision models like SAM and DINOv2.
Visual Locomotion for Quadrupeds (Ongoing)
Kartik Garg, Shishir Kolathaya
Developing visual locomotion policies using deep reinforcement learning to enable locomotion on challenging terrains.
3D Place Recognition
Kartik Garg, Sourav Garg, Michael Milford
Developed patch-pooling techniques for robust deep LiDAR-based place recognition systems.
Smart Emergency Response System
Kartik Garg, Surabhit Gupta, Yashasvi Singh, Lakshay Taneja
Developing a two-stage disaster response system to handle demand uncertainty and optimize the distribution of relief material to affected areas as quickly as possible.
Deep RL for Drone Decision Making
Kartik Garg, Pascal Goldschmid, Aamir Ahmad
Developed a deep reinforcement learning model for drone docking decisions that maximises the time a drone can safely spend on a task before it needs to dock.
Jerbot Beta
Kartik Garg, Raghav Soni, Surabhit Gupta, Lokesh Krishna, Niranth Sai
Addressed the challenges of bipedal locomotion, motivated by the agility and nimbleness of the jerboa. Formulated an MPC controller, inspired by the MIT Cheetah 3 controller, for bipedal locomotion. Jerbot Beta can't run yet, but it can fly :)
Powered Prosthetic Ankle
Kartik Garg, Sachin Negi
Developed a control algorithm for a powered prosthetic foot. The aim was a computationally efficient classification algorithm to detect human gait phases. A major limitation of existing powered foot prosthetics is their reliance on costly, high-power computational hardware, which puts them out of reach for the majority of the Indian demographic. I formulated a fuzzy-logic-based algorithm to perform real-time classification on low-cost hardware. The algorithm produced strong real-time results, leading to a journal publication in Springer Nature's SN Computer Science.
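A minimal sketch of the fuzzy-logic classification idea described above. The sensor names, membership breakpoints, and phase labels here are illustrative assumptions for exposition, not the actual rules or parameters from the published work:

```python
# Illustrative fuzzy-logic gait phase classifier.
# Sensor inputs, membership breakpoints, and phase names are assumptions,
# not the values used in the published paper.

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_gait_phase(heel_pressure, toe_pressure):
    """Return the gait phase with the highest rule activation.

    Inputs are assumed to be pressure readings normalized to [0, 1].
    """
    # Fuzzify each input into "low"/"high" membership degrees.
    heel = {"low": tri(heel_pressure, -0.5, 0.0, 0.5),
            "high": tri(heel_pressure, 0.5, 1.0, 1.5)}
    toe = {"low": tri(toe_pressure, -0.5, 0.0, 0.5),
           "high": tri(toe_pressure, 0.5, 1.0, 1.5)}
    # Each rule's activation is the min of its antecedents (fuzzy AND).
    rules = {
        "heel_strike": min(heel["high"], toe["low"]),
        "mid_stance": min(heel["high"], toe["high"]),
        "toe_off": min(heel["low"], toe["high"]),
        "swing": min(heel["low"], toe["low"]),
    }
    # Defuzzify by picking the most activated phase.
    return max(rules, key=rules.get)
```

The appeal of this style of classifier for embedded use is that it needs only a handful of comparisons and arithmetic operations per sample, so it runs comfortably in real time on a microcontroller like the Arduino Nano.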