In this project, my team and I explored using visuo-tactile information in imitation-learning frameworks to tackle complex manipulation problems, leveraging our multimodal dataset to pretrain our model with a contrastive loss. We show that this pretraining strategy, which gives a visuo-tactile agent a moderate performance improvement, can significantly improve the performance of a vision-only agent: by pretraining with tactile information, vision-only agents achieved success rates on par with their visuo-tactile counterparts, without requiring tactile sensing at deployment.
We evaluated our method on USB cable plugging, a dexterous manipulation task that relies on fine-grained visuo-tactile servoing, along with two block-stacking tasks.
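To give a sense of the pretraining objective, here is a minimal sketch of a symmetric InfoNCE-style contrastive loss between paired vision and tactile embeddings. This is illustrative only: the function and argument names are my own, and the exact loss used in the paper may differ in its details.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(vision_emb, tactile_emb, temperature=0.07):
    """Symmetric InfoNCE loss over a batch of paired (vision, tactile) embeddings."""
    # Normalize so the dot product is cosine similarity.
    v = F.normalize(vision_emb, dim=1)
    t = F.normalize(tactile_emb, dim=1)
    # Similarity logits between every vision/tactile pair in the batch.
    logits = v @ t.T / temperature
    # Matching pairs share a batch index; treat them as the positives.
    labels = torch.arange(v.shape[0], device=v.device)
    # Align vision to tactile and tactile to vision.
    return (F.cross_entropy(logits, labels) + F.cross_entropy(logits.T, labels)) / 2
```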
My primary role was to design the interpolation algorithm used to estimate deformation across the GelSight's tracking pad. The GelSight determines deformation from pixel-based maps, but we needed finer-grained detail than the sensor could provide on its own, which motivated the creation of this algorithm.
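As a rough illustration of the idea, the sketch below turns sparse marker displacements into a dense per-pixel deformation field using generic scattered-data interpolation (SciPy's griddata). The function name, grid size, and nearest-neighbor fallback are assumptions for illustration, not the exact algorithm we used.

```python
import numpy as np
from scipy.interpolate import griddata

def dense_deformation(marker_xy, marker_disp, grid_shape=(240, 320)):
    """Upsample sparse marker displacements into a dense per-pixel field.

    marker_xy:   (n, 2) marker positions in pixel coordinates (x, y)
    marker_disp: (n, 2) measured (dx, dy) displacement at each marker
    """
    h, w = grid_shape
    gx, gy = np.meshgrid(np.arange(w), np.arange(h))
    field = np.empty((h, w, 2))
    for c in range(2):
        # Smooth interpolation inside the convex hull of the markers...
        dense = griddata(marker_xy, marker_disp[:, c], (gx, gy), method='cubic')
        # ...with a nearest-neighbor fill for pixels outside it.
        near = griddata(marker_xy, marker_disp[:, c], (gx, gy), method='nearest')
        field[:, :, c] = np.where(np.isnan(dense), near, dense)
    return field  # shape (h, w, 2)
```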
More details of this project (paper, video, etc.) can be found by clicking the box below.
The Automated Canopy System is an outdoor umbrella that reacts to environmental stimuli, such as rain and temperature, to automatically determine when it needs to open and close. On my team, I was primarily responsible for the Finite Element Analysis (FEA) of the system and assisted with the development of the user interface. The project was a great success, and our team was awarded ‘Best Prototype’ at the CMU Design Expo 2021.
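The actual controller ran on an Arduino, but the decision logic is easy to convey in a short Python sketch. The thresholds, the sensor semantics, and the assumption that the canopy opens in rain (to shelter, rather than closing to protect the fabric) are all hypothetical.

```python
RAIN_THRESHOLD = 500   # hypothetical raw ADC cutoff for "rain detected"
TEMP_OPEN_C = 25.0     # hypothetical temperature above which to open for shade

def desired_state(rain_reading, temperature_c):
    """Map the latest sensor readings to a desired canopy state (open/closed)."""
    if rain_reading > RAIN_THRESHOLD:
        return True    # open to shelter from rain (assumed design intent)
    if temperature_c >= TEMP_OPEN_C:
        return True    # open to provide shade on hot days
    return False       # otherwise keep the canopy closed
```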
This experience deepened my knowledge of SolidWorks, especially its FEA capabilities, while also giving me a great introduction to Arduino-based systems and feedback control.
More information can be found in this YouTube video and on my home page.
The goal of this project was to use YOLOv5 to identify common grocery items and announce their names through a speaker, helping visually impaired individuals. My partner and I classified and labeled images across 16 categories, ranging from apples to salami, using the LabelImg software. We modified the YOLOv5 detection script (detect.py) to voice the class names through the speaker. The system ran on a Jetson Nano to keep its footprint small and make it easy to carry around.
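In the actual project we modified detect.py inside the YOLOv5 repository; the sketch below approximates the same pipeline using YOLOv5's torch.hub interface and the pyttsx3 text-to-speech library. The weights path and function name are hypothetical.

```python
import torch
import pyttsx3

# Load custom weights trained on the 16 grocery classes (path is hypothetical).
model = torch.hub.load('ultralytics/yolov5', 'custom', path='grocery_best.pt')
engine = pyttsx3.init()  # offline text-to-speech, runs on the Jetson Nano

def announce(frame):
    """Detect grocery items in one camera frame and speak each class name."""
    results = model(frame)
    # results.pandas().xyxy[0] has one row per detection, with a 'name' column.
    for name in set(results.pandas().xyxy[0]['name']):
        engine.say(name)
    engine.runAndWait()
```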
This project helped me reinforce Machine Learning concepts while building a product that aligns with my community-driven goals.
More information can be found in the PowerPoint presentation linked here as well as on my home page.
This project focused on determining whether an aerodynamic tail could outperform a rigid tail (a tail with a mass attached to the end) in helping a remote-controlled (RC) car turn. Our team compared turning speed and lateral acceleration across trials to determine which tail provided better turning capability. My primary contributions were designing the rigid tail and selecting a motor that met our experimental needs.
This experience strengthened my core Mechanical Engineering fundamentals while giving me great insight into experimentally driven projects, i.e., projects where data must be collected and analyzed. I was also the narrator for the project video linked below and learned how best to communicate our project goals and results to viewers.
A detailed explanation of the project can be viewed at the YouTube link here as well as on my home page.