Seeking Exciting Robotics Employment Opportunities


Welcome to my homepage. I am a creative engineer, dedicated researcher, and passionate robotics enthusiast. This website serves as a platform for me to showcase a curated selection of projects that I have diligently worked on over the past few years. I hope that you find these projects intriguing and inspiring. For further insights into my past or ongoing work, I warmly encourage you to reach out to me.

My primary focus lies in the development of intelligent robotic systems that seamlessly bridge the gap between the virtual and physical realms. Specifically, I am deeply fascinated by the intricate interplay of haptic and visual perception, the advancement of robotic hand technologies, and the elegant algorithms encompassing control, planning, and learning. These pursuits find their purpose in the domain of robotic manipulation, where I strive to make significant contributions.

In my most recent professional role, I held the position of Senior Robotics Engineer at Lab0, where I harnessed my expertise to drive an impactful project. Prior to that, I had the privilege of serving as a Senior Robotics Engineer at GelSight Inc., a pioneering startup commercializing high-resolution tactile sensing that emerged from the research group led by Professor Edward Adelson at MIT, where I also worked as a postdoctoral researcher. My academic journey includes the completion of a Ph.D. and an M.S. in Computer Science from the University of Colorado at Boulder, under the guidance of Professor Nikolaus Correll. Furthermore, I earned a B.S. in Electrical Engineering from the National Institute of Technology, Surat, India.

I am driven by a relentless pursuit of excellence in the field of robotics and strive to make meaningful contributions to this ever-evolving domain. Feel free to explore my projects further and don't hesitate to get in touch with me for more detailed discussions.

GitHub | LinkedIn

radhen dot 17 at gmail dot com

PROJECTS

Hardness classification

As part of the demo prepared for the Consumer Electronics Show (CES), 2023, we aimed to utilize high-resolution, vision-based GelSight tactile sensors for the classification of three rubber samples with varying hardness (Shore 00-40, Shore 00-65, Shore 00-85). My responsibilities included molding the rubber samples, hardware setup, including the robot arm and tactile sensors, as well as establishing the data collection pipeline. Additionally, I developed and trained a Recurrent Neural Network (RNN) to successfully perform the classification task.
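The idea of classifying hardness from a sequence of tactile frames can be sketched as follows. This is a minimal illustration, not the production model: the frame count, feature size, hidden width, and random (untrained) weights are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 20 tactile frames per press, 64 features per frame,
# 3 hardness classes (Shore 00-40, 00-65, 00-85).
T, D, H, C = 20, 64, 32, 3

# Randomly initialized Elman-RNN weights; the training loop is omitted.
Wx = rng.normal(0, 0.1, (H, D))   # input-to-hidden
Wh = rng.normal(0, 0.1, (H, H))   # hidden-to-hidden
Wo = rng.normal(0, 0.1, (C, H))   # hidden-to-output

def classify_press(frames):
    """Run one tactile press (T x D feature sequence) through the RNN."""
    h = np.zeros(H)
    for x in frames:                  # one recurrent step per frame
        h = np.tanh(Wx @ x + Wh @ h)
    logits = Wo @ h                   # classify from the final hidden state
    p = np.exp(logits - logits.max())
    return p / p.sum()                # softmax over the 3 hardness classes

probs = classify_press(rng.normal(size=(T, D)))
```

The recurrence is the point: how the gel deforms over time under a press carries the hardness signal, so the whole sequence, not a single frame, feeds the classifier.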

Roughness classification

As part of the demo prepared for the Consumer Electronics Show (CES), 2023, we aimed to utilize high-resolution, vision-based GelSight tactile sensors for the classification of three sandpaper sponges with varying roughness (36, 80, 220 grit). My responsibilities included hardware setup, including the robot arm and tactile sensors, as well as establishing the data collection pipeline. Additionally, I developed and trained a Convolutional Neural Network (CNN) to successfully perform the classification task.
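The kind of feature a CNN extracts from a tactile image can be illustrated with a single hand-written convolution layer. This is a sketch only: the image size and the edge kernel are hypothetical, and a trained network would learn many such kernels.

```python
import numpy as np

rng = np.random.default_rng(1)

def conv2d_valid(img, kernel):
    """Naive 'valid' 2D convolution (no padding, stride 1)."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# A Laplacian-style edge kernel: rougher surfaces imprint more high-frequency
# texture into the gel, producing stronger filter responses.
kernel = np.array([[0.,  1., 0.],
                   [1., -4., 1.],
                   [0.,  1., 0.]])

tactile_img = rng.normal(size=(32, 32))    # stand-in for a GelSight frame
fmap = np.maximum(conv2d_valid(tactile_img, kernel), 0.0)   # ReLU
feature = fmap.mean()                       # global average pooling
```

A real CNN stacks learned layers of exactly this conv/ReLU/pool pattern and ends in a softmax over the three grit classes.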

Pose estimation

This project focused on tracking the 6D pose of an object in contact with GelSight tactile sensors mounted on a robot gripper (end-effector/hand). By tracking the movement of the 3D geometry of the object in contact with the sensor, we could determine the position and orientation (pose) of the object relative to the sensor surface, which was crucial for in-hand manipulation, industrial insertion, and assembly tasks. The 3D geometry of the object in contact with the tactile sensor was registered using a standard point cloud registration algorithm, utilizing a known object model point cloud. This allowed us to accurately determine the 6D pose of the object with respect to the world frame. The above method was successfully implemented at a client site for an industrial insertion task, where we provided the code and technical support. This practical application demonstrated the effectiveness and robustness of the pose estimation technique in a real-world scenario, showcasing its potential for various industrial automation tasks.
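The core registration step, recovering a rigid transform between the known object-model point cloud and the points seen by the sensor, can be sketched with the standard SVD (Kabsch) solution. This assumes known point correspondences; the actual pipeline used a full registration algorithm on top of steps like this.

```python
import numpy as np

def rigid_transform(model, observed):
    """Kabsch/SVD: least-squares R, t with observed ~= R @ model + t,
    given known point correspondences (N x 3 arrays)."""
    cm, co = model.mean(axis=0), observed.mean(axis=0)
    H = (model - cm).T @ (observed - co)    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = co - R @ cm
    return R, t

# Synthetic check: transform a model cloud by a known pose and recover it.
rng = np.random.default_rng(2)
model = rng.normal(size=(50, 3))
a = 0.3
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
t_true = np.array([0.05, -0.02, 0.10])
R_est, t_est = rigid_transform(model, model @ R_true.T + t_true)
```

Chaining the recovered sensor-to-object transform with the gripper's known pose gives the object's 6D pose in the world frame.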

GelSight Tactile Sensors

This work at Gelsight Inc. encompassed various responsibilities aimed at advancing the commercialization of tactile sensors for robotics applications. These responsibilities included, but were not limited to: 1) Development of software and hardware for vision-based tactile sensors, with a specific focus on applications in robotic manipulation. 2) Conducting research and development activities to drive innovation and create new products. 3) Providing direct technical support to customers, ensuring their needs were addressed and assisting them in implementing our tactile sensor solutions. 4) Creating compelling demonstrations (e.g. roughness and hardness demos) to showcase the capabilities and potential of both current and future products. I played a significant role in the development of two key products at the company. The first was GelSight R1.5, a wedge-shaped tactile sensor designed specifically for parallel jaw robot grippers, targeting the early adopters of this technology. The second product was Gelsight Mini, a miniature and cost-effective tactile sensor, which aimed to democratize tactile sensing by making it accessible to hobbyists and enthusiasts.

Digger Finger

The task of identifying objects buried in granular media using tactile sensors poses significant challenges. Firstly, particle jamming within the granular media hinders downward movement. Additionally, the presence of granular media particles between the sensing surface and the object of interest often distorts the object's actual shape. To address these challenges, we introduce the Digger Finger, an early prototype designed to overcome these limitations. The Digger Finger incorporates mechanical vibrations to fluidize the granular media during penetration, enabling smoother movement. Furthermore, it is equipped with high-resolution vision-based tactile sensing capabilities, allowing for the identification of buried objects within the granular media. The utilization of such robotic fingers holds great potential for applications such as explosive ordnance disposal and Improvised Explosive Device (IED) detection, offering a finer resolution compared to conventional techniques like Ground Penetration Radars (GPRs).

PDF | VIDEO | WEBPAGE

Tablet Defect Identification

This study aims to develop a pharmaceutical tablet inspection machine designed to identify manufacturing defects, including cracks, chipping, contamination, and more, through image analysis captured by a camera. The primary focus of my research involves the development of a robust algorithm utilizing computer vision and deep learning techniques to accurately detect these defects.

Tele-op-Glove

Designed and implemented a tele-operation system for precise manipulation of a four-fingered Allegro robot hand. The system incorporates a glove equipped with flex sensors on the finger joints, which change resistance when bent. Kinematic re-targeting, i.e., mapping the flex sensor readings to the corresponding movements of the robot hand, was achieved using the Sequential Least-Squares Quadratic Programming (SLSQP) algorithm, inspired by the work titled DexPilot. To enhance the system's performance, magnetic sensors were later adopted, offering improved accuracy and response compared to the bend sensors previously employed.


Pre-Grasping Reflex

In this study, we leverage proximity measurements obtained from the PCF sensor to establish a reflex-like behavior for a five-fingered robot hand. Our objective is to enable the hand to dynamically adapt to the shape of objects and achieve gentle contact without imparting movement or causing any damage. To substantiate the effectiveness of our approach, we conduct empirical validation using a motion capture system. This system allows us to accurately track the position of the object both before and after the grasp, providing quantitative evidence of the hand's successful adaptation and gentle touch capabilities.

PDF

PCF Sensor for Prosthetic Hands

In this project, we undertook modifications to the initial design of the PCF sensor to seamlessly integrate it into an upper limb prosthesis device, specifically the Bebionic Hand. Our enhanced sensor design incorporates an infrared proximity sensor chip and a barometric pressure sensor chip, both carefully embedded within an elastomer layer. My involvement in this endeavor encompassed various responsibilities, including sensor prototyping, electronic testing and debugging, mechanical testing, and fitting. Additionally, I implemented a signal fusion algorithm to combine data from both sensor chips, enabling measurements of proximity within the 0-10 mm range, contact detection at 0 N, and force localization (0-50 N) at five spatial locations and three angles of incidence. Notably, our work has garnered attention in the media, with news articles featuring our advancements published in prestigious outlets such as CU News and CBS4.
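The fusion logic can be sketched as a simple hand-off between the two channels: the infrared chip covers the approach phase and the barometric chip covers the loaded phase. The normalization, threshold, and calibration constants below are hypothetical placeholders for the fitted calibration.

```python
def fuse(ir_raw, baro_raw, contact_thresh=0.95):
    """Blend the two PCF channels into (proximity_mm, force_N).

    ir_raw and baro_raw are normalized 0-1 readings; the linear scalings
    stand in for the real calibration curves.
    """
    if ir_raw < contact_thresh:
        proximity_mm = 10.0 * (1.0 - ir_raw)   # far: IR gives distance (0-10 mm)
        force_n = 0.0                           # no contact yet
    else:
        proximity_mm = 0.0                      # contact: distance saturates at 0
        force_n = 50.0 * baro_raw               # barometer gives load (0-50 N)
    return proximity_mm, force_n

approach = fuse(0.5, 0.0)    # mid-approach: distance reported, zero force
pressing = fuse(1.0, 0.4)    # in contact: zero distance, force reported
```

The single output stream means the prosthesis controller sees a continuous signal from first approach through full grasp force.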

PDF

Robotic Grasping and Manipulation Competition @ IROS

In this project, our focus was on the design and implementation of a comprehensive robotic grasping and manipulation system. The system leveraged multi-modal sensory data, combining vision and tactile feedback, to enable effective grasping and manipulation capabilities. The culmination of our efforts was showcased in two prestigious international competitions. The first was held in Daejeon, South Korea, as part of the International Conference on Intelligent Robots and Systems (IROS) 2016. The second took place in Vancouver, Canada, at IROS 2017. These competitions provided a platform for researchers and practitioners to demonstrate their advancements in the field of robotic grasping and manipulation.

PDF

Grasp Event Detection

Our study demonstrates the significant enhancement in the reliability of basic manipulation tasks through the integration of low-cost in-hand proximity and dynamic tactile sensing techniques. Inspired by the mechanoreceptors in human skin, we employed an array of infrared proximity sensors embedded in a transparent elastic polymer, along with an accelerometer located in the robot's wrist. By breaking down the manipulation task into eight distinct phases, we showcased the efficacy of our approach. Firstly, we illustrated how proximity information derived from our sensors can greatly improve the reliability of picking and placing objects. Additionally, we highlighted the utilization of dynamic tactile information to discern different phases of grasping. Our experimental findings, conducted with a Baxter robot engaged in a tower construction task, validate the effectiveness of our proposed methodology. The results reinforce the potential of incorporating proximity and dynamic tactile sensing for achieving more robust and accurate manipulation capabilities.
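The phase-detection idea can be sketched as a small state machine driven by the two sensing modalities. The phase names, thresholds, and the reduction to four phases are illustrative; the study itself segmented the task into eight phases.

```python
# The eight phases in the study are condensed here to four for brevity.
PHASES = ["approach", "pre-contact", "grasp", "lift"]

def next_phase(phase, proximity_mm, accel_spike):
    """Advance the grasp state machine by one control tick.

    proximity_mm comes from the in-hand IR array; accel_spike is True when
    the wrist accelerometer registers a transient (thresholds hypothetical).
    """
    if phase == "approach" and proximity_mm < 20.0:
        return "pre-contact"   # object entering the fingers
    if phase == "pre-contact" and accel_spike:
        return "grasp"         # contact transient felt at the wrist
    if phase == "grasp" and accel_spike:
        return "lift"          # second transient: object leaves the table
    return phase

# Simulated tick stream: far approach, near approach, contact, then lift-off.
phase = "approach"
for prox, spike in [(80, False), (15, False), (15, True), (12, True)]:
    phase = next_phase(phase, prox, spike)
```

Keying actions to detected phase transitions, rather than to open-loop timing, is what made picking and placing more reliable.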

PDF

PCF Sensor for Robot Hands

In our research, we introduce a novel combined force and distance sensor designed for robotic manipulation tasks. This sensor utilizes a readily available infrared distance sensor embedded in a transparent elastomer material. Before contact is made, the sensor functions as a distance sensor, providing accurate measurements within the range of 0 to 10 cm. Once contact is established, the elastomer material acts as a spring, with the force exerted being proportional to the compression of the elastomer, with a range of 0 to 5 N. We provide a detailed description of the sensor's operational principle and key design parameters, such as polymer thickness, mixing ratio, and emitter current. Importantly, we demonstrate that the sensor response exhibits an inflection point at the moment of contact, which remains consistent regardless of the surface properties of the object. Furthermore, we showcase the practical applications of this sensor by employing two arrays of eight sensors each, mounted on a standard Baxter gripper. Through this configuration, we demonstrate three significant functionalities: (1) improved gripper alignment during grasping, (2) accurate determination of contact points with objects, and (3) generation of approximate 3D models that aid in identifying potential grasp locations. Our findings underscore the effectiveness and versatility of the proposed combined force and distance sensor, offering valuable insights for advancing robotic manipulation capabilities in various domains.
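The contact-detection property, an inflection point in the sensor response at the moment of contact, can be sketched as a sign change in the discrete second difference of the signal. The synthetic trace below is illustrative: a convex rise as the IR return grows during approach, then a concave saturation once the elastomer compresses.

```python
import numpy as np

def contact_index(signal):
    """First sample index where the discrete second difference changes sign,
    i.e. the inflection the sensor exhibits at the moment of contact.
    The +2 maps the second-difference index back to a sample index."""
    d2 = np.diff(signal, n=2)
    change = np.where(np.diff(np.sign(d2)) != 0)[0]
    return int(change[0]) + 2 if change.size else None

# Synthetic approach-then-press trace (units arbitrary):
pre = np.exp(np.linspace(0.0, 2.0, 10))          # convex rise: approaching
post = pre[-1] + np.log(np.arange(1, 11))        # concave: pressing into gel
idx = contact_index(np.concatenate([pre, post])) # inflection near sample 10
```

Because the inflection depends only on the geometry change at contact, not on surface reflectance, this trigger works across object materials, which is the property reported for the sensor.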

PDF

Haptic Gesture Classification

In this project, we outline our endeavors to leverage machine learning methods for the classification of hand gestures. To capture these gestures, we utilize a wristband equipped with an array of 16 embedded capacitive sensors. Throughout our study, we explore diverse classifiers and employ different data processing approaches. Our data collection involves gathering samples from 14 subjects performing five distinct hand gestures. Subsequently, we conduct experiments using various classifiers to evaluate the classification performance. Our findings indicate that it is indeed feasible to classify hand gestures using data obtained from our capacitive sensing wristband. However, we acknowledge that further refinement is necessary to optimize this method for real-time gesture recognition applications. This work sheds light on the potential of machine learning techniques and capacitive sensing in accurately discerning hand gestures. Nonetheless, additional research and development efforts are essential to enhance the effectiveness and practicality of this approach in real-world scenarios.
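One of the simplest classifiers we could apply to the 16-channel wristband data is nearest-centroid, sketched below on synthetic data. The gesture names, channel statistics, and sample counts are hypothetical; the study compared several stronger classifiers on real recordings from 14 subjects.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical training data: five gestures, 16 capacitive channels each.
GESTURES = ["fist", "open", "pinch", "point", "wave"]
train = {g: rng.normal(loc=i, scale=0.3, size=(20, 16))
         for i, g in enumerate(GESTURES)}

# One mean 16-channel template per gesture.
centroids = {g: X.mean(axis=0) for g, X in train.items()}

def classify(sample):
    """Label a 16-channel reading by its nearest gesture template."""
    return min(centroids, key=lambda g: np.linalg.norm(sample - centroids[g]))

# A query drawn near the "pinch" cluster should land on that template.
pred = classify(rng.normal(loc=2, scale=0.3, size=16))
```

Per-subject normalization of the channels matters in practice, since capacitive baselines vary with wrist size and skin, which is part of why real-time use needed further refinement.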

PDF

Autonomous RC Car

This work presents a comprehensive solution that addresses several key challenges associated with autonomous vehicles. Specifically, we focus on the following challenges: 1) Stop sign detection and stopping, 2) Collision avoidance with rolling objects, 3) Visual-Inertial Simultaneous Localization and Mapping (SLAM), and 4) Development of an accurate sparse map. We begin by providing a detailed overview of the hardware requirements for this project and outline the integration process. Next, we delve into the perception and control techniques employed to tackle each of the aforementioned challenges. We describe the experiments conducted and present the results obtained from our approach. Furthermore, we emphasize that the tools and algorithms developed for these challenges have been seamlessly integrated into the widely used open-source Robot Operating System (ROS). This integration enables broader accessibility and facilitates further research and development in the field of autonomous vehicles. By addressing these challenges and sharing our findings, we contribute to the advancement of autonomous vehicle technologies, paving the way for safer and more reliable autonomous driving systems.

PDF

Point Cloud Registration

This project assesses various algorithms for data set registration on the SQUID database, which was developed by the University of Surrey. The evaluation is performed using MATLAB and focuses on three algorithms: the Iterative Closest Point (ICP) algorithm, the Levenberg-Marquardt ICP (LMICP) algorithm, and the Coherent Point Drift (CPD) algorithm. The comparison between these algorithms primarily revolves around the distribution of resulting registration configurations. Given the limited research in this field, the project explores and evaluates the methodology required for conducting such a comparison. To facilitate dense sampling, an experiment suite is designed. Upon analyzing the experimental results, it becomes evident that the CPD algorithm surpasses the other two algorithms in terms of cluster size and the percentage of successful registrations. However, due to computational and time limitations, the 3D experiments were not conducted in depth, highlighting the need for further exploration in this area. In conclusion, this project sheds light on the performance of different data set registration algorithms using the SQUID database. The findings highlight the superiority of the CPD algorithm in certain aspects, while also acknowledging the necessity for additional investigations to delve deeper into 3D experiments.
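The ICP baseline from the comparison can be sketched in 2D (SQUID is a database of 2D contours): alternate nearest-neighbor matching with a closed-form best-fit transform. The circle test shape and iteration count are illustrative, not the experiment suite itself.

```python
import numpy as np

def kabsch_2d(A, B):
    """Best-fit rotation/translation with B ~= R @ A + t (paired 2D points)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    return R, cb - R @ ca

def icp(source, target, iters=30):
    """Basic ICP: alternate nearest-neighbor matching and a Kabsch step."""
    src = source.copy()
    for _ in range(iters):
        # brute-force nearest target point for every source point
        nn = target[np.argmin(((src[:, None] - target[None]) ** 2).sum(-1),
                              axis=1)]
        R, t = kabsch_2d(src, nn)
        src = src @ R.T + t
    return src

# Synthetic contour (a circle, for simplicity) registered against a
# rotated and shifted copy of itself.
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.c_[np.cos(theta), np.sin(theta)]
target = circle + [0.2, -0.1]
a = 0.2
R0 = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
aligned = icp(circle @ R0.T, target)
err = np.sqrt(((aligned[:, None] - target[None]) ** 2)
              .sum(-1)).min(axis=1).max()
```

The basin-of-convergence behavior of exactly this inner loop, run from many initial poses, is what the experiment suite sampled when comparing ICP against LMICP and CPD.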

PDF

State Estimation of Dynamical System

This project focuses on comparing two localization algorithms that utilize radio beacons capable of measuring range only. One key advantage of obtaining range information from radio signals is that it eliminates the need for a direct line of sight between the beacons and the transponder, thereby completely avoiding the data association problem. To fuse the range data obtained from the radio beacons with the dead reckoning data collected from an actual system, the project employs two filtering algorithms: the Extended Kalman Filter and the Particle Filter. These algorithms effectively integrate the range measurements and dead reckoning data to estimate the system's localization. The results obtained from the experiments demonstrate the performance and efficacy of the localization algorithms. By leveraging radio beacons for range measurements and employing the filtering algorithms, the project successfully achieves accurate localization without requiring direct line-of-sight communication. In summary, this project presents a comprehensive comparison of two localization algorithms that leverage radio beacons to measure range. By utilizing the Extended Kalman Filter and Particle Filter, the project successfully fuses range data with dead reckoning information, resulting in accurate and robust localization.
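The particle-filter half of the comparison can be sketched as a predict/update/resample cycle over range-only beacon measurements. Beacon positions, noise levels, particle count, and the simulated trajectory below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

BEACONS = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # known positions
N = 2000                                                     # particle count

def pf_step(particles, odom, ranges, sigma_r=0.3):
    """One predict/update/resample cycle of a range-only particle filter."""
    # predict: apply the dead-reckoning displacement plus process noise
    particles = particles + odom + rng.normal(0, 0.05, particles.shape)
    # update: log-likelihood of the measured beacon ranges (max-subtracted
    # for numerical stability before exponentiating)
    d = np.linalg.norm(particles[:, None] - BEACONS[None], axis=2)
    logw = -0.5 * ((d - ranges) ** 2).sum(axis=1) / sigma_r**2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # resample: draw particles in proportion to their weights
    return particles[rng.choice(N, size=N, p=w)]

# Simulate a robot driving a straight line while filtering its position.
true_pos = np.array([2.0, 3.0])
particles = rng.uniform(0, 10, (N, 2))
for _ in range(10):
    true_pos = true_pos + [0.5, 0.0]
    ranges = np.linalg.norm(BEACONS - true_pos, axis=1) + rng.normal(0, 0.3, 3)
    particles = pf_step(particles, np.array([0.5, 0.0]), ranges)
estimate = particles.mean(axis=0)
```

Note how the data-association advantage shows up here: each range is tied to a known beacon, so the update needs no correspondence search. The EKF variant replaces the particle set with a Gaussian and linearizes the range measurement around the current estimate.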

PDF

Mechatronic Design

As an undergraduate student, I had the opportunity to represent my college, the National Institute of Technology (NIT) Surat, in the Robocon Asia Pacific Robotics Competition in 2012 and 2013. As part of the team, I played a key role in designing and fabricating a group of robots that were tasked with completing a complex set of objectives. My contributions primarily focused on mechatronic and electronic design, as well as prototyping and fabrication.