Experience

Senior Data Engineer, Self-Driving (Feb 2022 - Present)

  • Building automated data pipelines and architecting data workflows to help analysts and engineers visualize, curate, annotate and analyze data for deep learning models.

  • Leading the operations effort to build a data management solution for perception data across the ADAS group, in conjunction with external vendors.

Software Engineer, Experience AI Cloud (Apr 2020 - Feb 2022)

  • Lead Engineer and Product Owner (North America) for the SmartHome Integration Project for Mercedes-Benz (live in production vehicles since December 2020). Led and coordinated a team of 4 across Germany and the USA. Kick-started the project to give Mercedes-Benz customers the ability to interact with different smart home ecosystems from their vehicle. Built a microservices-based cloud architecture using Python-Flask, NodeJS-Koa and MongoDB, deployed and maintained through Kubernetes and Azure DevOps tools.

  • Engineered data pipelines through Azure Event Hubs, Data Lakes and Databricks for the ingestion, storage and analysis of speech data for the Hey Mercedes voice assistant.

  • Built interactive dashboards and visualizations for data analysis through Grafana and SQL.

  • Worked on full-stack development for an internal platform for speech engineers in the worldwide Mercedes-Benz ecosystem, utilizing the MongoDB-Express-Vue-NodeJS (MEVN) stack.

Software Engineer - Early Talent Program (Mar 2019 - Mar 2020)

Sensor Fusion (Autonomous Driving) [October 2019 - March 2020]

  • Worked on multi-stage multi-modal road topology learning for map-less urban driving using deep learning techniques (patent pending).

  • Worked on feature engineering, data collection and validation, optimization, evaluation and testing methods.

  • This system currently predicts the static grid map with 90% accuracy. It will be used as a fail-safe in case the current HD map fails, and in the future as the first stage of true map-less driving.

Localization (Autonomous Driving) [July 2019 - October 2019]

  • Developed and integrated the Surround View System (SVS) module into the localization package.

  • SVS reduces localization errors on highways and expressways by over 50%.

Speech and Digital Assistants [March 2019 - July 2019]

  • Worked on Hey Mercedes digital assistant integration with external providers.

  • Created new Hey Mercedes applications for the Mercedes-Benz UX (MBUX) infotainment unit using internal tools.

  • Built a chatbot engine based on Bidirectional Encoder Representations from Transformers (BERT) and multi-armed bandit learning to determine user sentiment and conversation state with over 85% accuracy.

Inovision Inc., Rochester Hills, Michigan

Software Engineering Intern (May 2018 - Dec 2018)

  • Worked as part of the Software Engineering team on the SMART_INSPECT Quality Verification System (QVS) in production at the General Motors assembly plant in Flint, Michigan.

  • This system is a dynamic, moving-line automatic paint-defect detection system with an extremely small footprint that runs at plant line speed. As each vehicle passes through the system, thousands of images are captured using an array of cameras and lights.

  • These images are then passed through an image-processing system that finds regions of interest (ROIs) where the paint may be defective.

  • The ROIs are then passed through a cascaded binary classifier to separate real defects from false positives. This classifier currently detects paint defects with over 98% accuracy.

  • From September to December, worked on Phase 2 of the project, in which paint defects are further classified by their cause, such as dirt, cratering and orange peel.

Project Trainee (Feb 2017 - Jun 2017)

Supervisor: Mr. Bishwajit Sharma, Scientist Gr 'E', CAIR, DRDO

  • Worked with the Intelligent Systems Research and Development (ISRD) department on real-time 3-D object and environment mapping using a modified variant of the Iterative Closest Point (ICP) algorithm in conjunction with Simultaneous Localization and Mapping (SLAM).

  • The goal of the project was to enable a semi-autonomous robot to climb a flight of stairs on its own while rendering a full 3-D reconstruction of its environment in real time.

  • RGB-D images captured by an Xbox Kinect camera mounted atop the robot were aligned and reconstructed into a 3-D model using a custom algorithm.

  • This algorithm was based on a modified variant of the Comprehensive Iterative Point Algorithm (CICP), implemented in conjunction with RANSAC and SLAM.

  • The efficiency of the model was measured using two metrics: performance time, and the Modified Hausdorff Distance used to calculate correspondence between two point clouds.

Industrial Trainee (Jun 2016 - Jul 2016)

Supervisor: Mr. Sanjay Kumar Sharma, Chief Manager, HR Division, HAL Helicopter Division

  • Interned at HAL for one month, learning about the workings of the mechanical and aeronautical departments, especially the design, manufacturing processes and mechanisms of the avionics components in a helicopter.

  • Studied the Weapons System Integration (WSI) for the Mk IV Advanced Light Helicopter (ALH-Rudra) in depth, including global positioning systems, Forward-Looking Infrared (FLIR), HF/UHF communications radio, the Identification Friend or Foe (IFF) system, Doppler navigation and autopilot control systems.

Southern Generating Station, Calcutta Electric Supply Corporation (CESC)

Intern (Jun 2015)

Supervisor: Mr. Sudipta Kumar Laha, Station Manager

  • Interned at the Southern Generating Station of the Calcutta Electric Supply Corporation (CESC) for 14 days, gaining a general overview of the workings of a thermal power station.