Projects
Current
Designed a control framework for the dynamic rendezvous of an autonomous underwater vehicle (AUV) with a wave energy converter (WEC) using model predictive control (MPC) under various ocean flow conditions (a simplified sketch appears below).
Integrated active perception for flow state estimation.
Conducted laboratory experiments where a BlueROV2 autonomously docked with a station while subjected to mechanically generated waves, testing its response to various sea conditions.
Developing a wave prediction algorithm to compensate for unknown wave forces impacting the AUV's motion.
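For illustration, the sketch below shows a minimal receding-horizon formulation of this kind of problem, assuming a linearized planar double-integrator AUV model, a quadratic tracking cost, and a known wave-force preview, solved with cvxpy. The model, weights, and limits are placeholder assumptions, not the project's controller.

    # Minimal linear MPC sketch: track a rendezvous reference under a known
    # wave-force preview (illustrative; model and weights are assumed).
    import numpy as np
    import cvxpy as cp

    dt, N = 0.1, 20                                   # step size and horizon (assumed)
    # Planar double-integrator AUV model: state [x, z, vx, vz], input [Fx, Fz]
    A = np.block([[np.eye(2), dt * np.eye(2)], [np.zeros((2, 2)), np.eye(2)]])
    B = np.vstack([0.5 * dt**2 * np.eye(2), dt * np.eye(2)])
    Q, R = np.diag([10.0, 10.0, 1.0, 1.0]), 0.01 * np.eye(2)

    def mpc_step(x0, x_ref, wave_force):
        """One receding-horizon solve; wave_force is a (2, N) disturbance preview."""
        x, u = cp.Variable((4, N + 1)), cp.Variable((2, N))
        cost, constraints = 0, [x[:, 0] == x0]
        for k in range(N):
            cost += cp.quad_form(x[:, k] - x_ref, Q) + cp.quad_form(u[:, k], R)
            constraints += [x[:, k + 1] == A @ x[:, k] + B @ (u[:, k] + wave_force[:, k]),
                            cp.norm(u[:, k], "inf") <= 50.0]      # thrust limit (assumed)
        cp.Problem(cp.Minimize(cost), constraints).solve()
        return u[:, 0].value                                      # apply only the first input

    u0 = mpc_step(np.array([2.0, 1.0, 0.0, 0.0]), np.zeros(4), np.zeros((2, N)))

Only the first input of each solve is applied; the problem is re-solved at the next step with the updated state and wave preview.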
Developing trajectory optimization algorithms for autonomous underwater inspection and 3D reconstruction of the environment.
Designing a collection of chainable controllers for AUVs using ros2_control.
The controllers are being designed to support the complete AUV control hierarchy and to enable benchmarking against other commonly used control algorithms (a toy illustration of the chaining idea appears below).
Developing the core architecture that enables developers to create and deploy custom controllers for underwater systems.
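ros2_control chainable controllers are written in C++; the toy Python sketch below only illustrates the chaining idea behind the hierarchy, where one controller's output becomes the next controller's reference. The structure and gains are assumptions for illustration only.

    # Toy illustration of controller chaining: an outer depth loop feeds its
    # output into an inner velocity loop, mirroring how chainable controllers
    # expose reference interfaces to one another. Gains are assumed.
    class PController:
        def __init__(self, kp):
            self.kp = kp

        def update(self, reference, measurement):
            return self.kp * (reference - measurement)

    depth_ctrl = PController(kp=2.0)       # outer loop: depth error -> velocity reference
    velocity_ctrl = PController(kp=5.0)    # inner loop: velocity error -> thrust command

    def chained_update(depth_ref, depth_meas, velocity_meas):
        vel_ref = depth_ctrl.update(depth_ref, depth_meas)     # outer controller output
        return velocity_ctrl.update(vel_ref, velocity_meas)    # becomes inner reference

    print(chained_update(depth_ref=-5.0, depth_meas=-4.2, velocity_meas=0.1))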
Past
Designed an end-to-end deep learning framework that directly consumes point cloud data from laser scans and learns an implicit surface function used to predict grasp quality for a two-finger parallel-jaw gripper (a rough architectural sketch appears below).
Compared and analyzed the performance of the end-to-end network against a sequentially trained network.
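As a rough illustration, the PyTorch sketch below shows a PointNet-style network that maps a raw point cloud to a scalar grasp-quality score; the layer sizes, pooling, and output head are assumptions rather than the framework described above.

    # PointNet-style sketch: shared per-point MLP, symmetric pooling, and a
    # global head predicting a grasp-quality score in [0, 1]. Sizes are assumed.
    import torch
    import torch.nn as nn

    class GraspQualityNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.point_mlp = nn.Sequential(          # shared per-point MLP via 1-D convs
                nn.Conv1d(3, 64, 1), nn.ReLU(),
                nn.Conv1d(64, 128, 1), nn.ReLU(),
                nn.Conv1d(128, 256, 1), nn.ReLU(),
            )
            self.head = nn.Sequential(               # global grasp-quality head
                nn.Linear(256, 64), nn.ReLU(),
                nn.Linear(64, 1), nn.Sigmoid(),
            )

        def forward(self, points):                   # points: (batch, num_points, 3)
            feats = self.point_mlp(points.transpose(1, 2))    # (batch, 256, num_points)
            global_feat = feats.max(dim=2).values             # symmetric max-pooling
            return self.head(global_feat)                     # (batch, 1) quality score

    quality = GraspQualityNet()(torch.rand(4, 1024, 3))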
Designed a learning-based algorithm for autonomous underwater inspection that considers multiple objectives and constraints.
Compared and analyzed the performance of the algorithm against state-of-the-art multi-objective optimization methods such as NSGA-II and SPEA2 (a minimal baseline setup is sketched below).
Successfully performed 3D reconstruction of a turbine lander and an underwater blowout preventer (BOP) panel in a simulated environment.
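For context, an NSGA-II baseline of the kind used for comparison can be set up with the pymoo library roughly as follows (assuming pymoo's standard API); the benchmark problem here is a stand-in for the actual inspection-planning problem.

    # Minimal NSGA-II run via pymoo; "zdt1" is a placeholder bi-objective benchmark.
    from pymoo.algorithms.moo.nsga2 import NSGA2
    from pymoo.optimize import minimize
    from pymoo.problems import get_problem

    problem = get_problem("zdt1")
    algorithm = NSGA2(pop_size=100)
    res = minimize(problem, algorithm, ("n_gen", 200), seed=1, verbose=False)
    print(res.F[:5])                       # sample of the obtained Pareto-front objectives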
Developed an autonomous navigation system to perform precise landing of an aerial vehicle on a moving target.
Used ArUco markers to detect and track the moving target so that the UAV could follow and land on it (a detection sketch appears below).
Integrated ORB-SLAM2 for mapping and localization.
Designed an MPC-based controller to land the aerial vehicle precisely on the moving target.
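The detection step can be sketched with OpenCV's cv2.aruco module roughly as below (assuming the pre-4.7 API); the camera intrinsics and marker size are placeholders, not the project's calibration.

    # ArUco detection and pose estimation sketch (legacy cv2.aruco API assumed).
    import cv2
    import numpy as np

    camera_matrix = np.array([[600.0, 0.0, 320.0],
                              [0.0, 600.0, 240.0],
                              [0.0, 0.0, 1.0]])      # placeholder intrinsics
    dist_coeffs = np.zeros(5)
    marker_length = 0.15                             # marker side length in metres (assumed)

    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    params = cv2.aruco.DetectorParameters_create()

    def target_pose(frame):
        """Return the translation of the first detected marker in the camera frame."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict, parameters=params)
        if ids is None:
            return None                              # no marker in view
        _, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, marker_length, camera_matrix, dist_coeffs)
        return tvecs[0].ravel()                      # (x, y, z) landing set-point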
Rover-Bin & E-Bin
Developed a vision-based automated waste segregation system.
Fabricated a 4-DOF robotic arm and mounted it on a ground vehicle to perform active exploration.
Created a custom dataset of 3000+ images spanning four waste categories: glass, metal, paper, and trash.
Designed several CNN architectures for object detection and compared their performance. The location of the detected object was used as a set-point for the arm’s end-effector.
Implemented inverse velocity kinematics to obtain the desired joint velocities (see the sketch below); based on the detected label, the object is disposed of in the appropriate bin.
Utilized the ROS navigation stack (move_base) for autonomous control of the vehicle.
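The inverse velocity kinematics step amounts to mapping a desired end-effector velocity to joint velocities through the Jacobian pseudo-inverse, q̇ = J⁺ẋ; the sketch below uses a placeholder Jacobian rather than the 4-DOF arm's actual kinematics.

    # Resolved-rate (inverse velocity) kinematics via the Jacobian pseudo-inverse.
    import numpy as np

    def joint_velocities(jacobian, ee_velocity):
        """Map a desired end-effector velocity to joint velocities: qdot = pinv(J) @ xdot."""
        return np.linalg.pinv(jacobian) @ ee_velocity

    J = np.random.rand(3, 4)                 # placeholder 3x4 Jacobian (3-D linear vel., 4 joints)
    xdot = np.array([0.05, 0.0, -0.02])      # desired end-effector velocity set-point
    qdot = joint_velocities(J, xdot)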