Kiran Bhattacharyya

     Google Scholar    GitHub    LinkedIn

Hi. I'm Kiran.

I am a scientist and engineer with extensive experience in machine learning, computer vision, software engineering, and optical and electronic instrumentation. Currently, I live in Atlanta and work as a machine learning scientist and software engineering manager at Intuitive Surgical, Inc., a surgical robotics company. Check out some of my projects to see what I'm up to.

Patents (in machine learning and AI)

A machine learning system to remove patient, surgeon, and OR staff identifiers from a surgical video recording.

A machine learning system that determines the surgical procedure type from a surgical video recording.

A machine learning system that can recognize surgical activities from surgical videos and segment the video into surgical steps.

A machine learning system that computes performance scores for surgeons based on tool movements extracted from surgical video during minimally invasive surgery.

Industry projects (2019 - Present)

Identifying and quantifying activities during surgery is essential for the advancement of surgical data science. Recent studies have shown that objective metrics computed during key surgical tasks correspond with surgeon skill and patient outcomes. However, identifying these surgical tasks is challenging. Here we present a probabilistic method of identifying tasks and computing objective metrics.
Timely and effective feedback within surgical training plays a critical role in developing the skills required to perform surgery. Feedback from expert surgeons, while especially valuable, is challenging to acquire due to their busy schedules. Using virtual reality (VR) surgical tasks, challenge competitors were tasked with localizing instruments and predicting surgical skill. Here we summarize the winning approaches and how they performed. 
Robot-assisted surgery provides a unique opportunity to quantify technical skill, since intra-operative surgeon behavior, like instrument kinematics and button presses, can be unobtrusively recorded. We build models based on metrics for specific types of surgical skill in order to provide interpretable, simple, and targeted feedback for surgeon improvement.
We proposed a challenge to see whether virtual reality simulations of robot-assisted surgical tasks can be leveraged to learn transferable visual representations of surgical tasks in real-life settings. This work was completed with others at Intuitive Surgical, the organizers of the EndoVis Challenge at MICCAI 2020, and the challenge participants.

PhD projects (2013 - 2019)

Larval zebrafish are increasingly common in biomedical and genetics research. A close study of their behavior requires tracking their body during swimming to test whether genetic or neurological manipulations produce specific behavioral outcomes. Here I share an automated tracker that I wrote for larval zebrafish swimming in a dish.
A volumetric imaging pipeline I built that starts with lightfield microscopy, which collects fluorescent volumes in the larval zebrafish hindbrain. The pipeline then performs stripe-artifact removal, volume registration, segmentation, and particle filtering for object tracking. It is used to track the activity of neurons in the brain of a fish while it behaves in a virtual reality environment.
Evolutionary pressures have shaped locomotion and the neuro-muscular system which drives it. Even the relatively small nervous system (about 100,000 neurons) of a 5-day-old larval zebrafish is able to produce a variety of movements, ranging from the fast movements used to escape from predators to the fine movements necessary to capture prey. Here I build a virtual reality system to induce and quantify these behaviors under a microscope.
I recorded neural activity from the hindbrain of larval zebrafish escaping from predators in virtual reality. Using naturalistic representations of virtual predators while simultaneously monitoring escape behavior and the recruitment of multiple reticulospinal neurons, we found that larval zebrafish perform a calibrated assessment of threat when attacked.

Machine learning and AI (personal projects) 

I hack a Hexbug Spider robot with an Arduino and control it with a computer so it can navigate around obstacles and reach a goal position. This project involves a lot of good old-fashioned AI and computer vision methods including color detection, shape detection, path finding, path following and robotic control.
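The path-finding step can be sketched as A* search on a 4-connected grid. The grid, obstacle layout, and function names below are illustrative stand-ins, not the robot's actual code:

```python
from heapq import heappush, heappop

def astar(grid, start, goal):
    """A* on a 4-connected grid; cells with value 1 are obstacles."""
    rows, cols = len(grid), len(grid[0])
    # Manhattan distance: an admissible heuristic for 4-connected moves.
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(h(start), 0, start, [start])]  # (priority, cost, cell, path)
    seen = set()
    while frontier:
        _, cost, pos, path = heappop(frontier)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heappush(frontier,
                         (cost + 1 + h((nr, nc)), cost + 1, (nr, nc), path + [(nr, nc)]))
    return None  # no route around the obstacles

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
```

The path-following and control layers would then steer the robot along the returned waypoints.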
I take a deep dive into the Sketchy data set compiled at Georgia Tech, which contains over 75,400 sketches of 125 different object categories made by different people. To understand the patterns in the data, I explore different methods of visualization and perceptualization, including sonifications and encodings in the latent dimensions of deep neural nets. This is an ongoing project.
How much can the "anonymized" data reported by an Apple Watch or a Fitbit actually reveal about our daily lives? To get a quantitative measure of this, I use accelerometer data with labels of the activity the wearer was performing to build a classifier that predicts those activities. The findings are a little scary. You probably want to take that Apple Watch off. 
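The gist of such a classifier can be sketched with windowed accelerometer features and a nearest-centroid rule. The activities, features, and synthetic data below are illustrative stand-ins, not the actual wearable data or model:

```python
import random
from statistics import mean, stdev

random.seed(0)

def synth_window(activity, n=50):
    """Hypothetical accelerometer magnitudes: walking is noisier than sitting."""
    scale = {"sitting": 0.05, "walking": 0.6}[activity]
    return [1.0 + random.gauss(0, scale) for _ in range(n)]

def features(window):
    # Per-window summary statistics: (mean, standard deviation).
    return (mean(window), stdev(window))

# "Train" by averaging the feature vectors (centroid) of each activity.
train = {a: [features(synth_window(a)) for _ in range(20)]
         for a in ("sitting", "walking")}
centroids = {a: tuple(mean(f[i] for f in fs) for i in range(2))
             for a, fs in train.items()}

def predict(window):
    """Label a new window by its nearest activity centroid."""
    f = features(window)
    return min(centroids,
               key=lambda a: sum((f[i] - centroids[a][i]) ** 2 for i in range(2)))

label = predict(synth_window("walking"))
```

Even this toy version shows why "anonymized" motion traces are revealing: two summary statistics per window already separate the activities.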
In the world of "fake news" and many "mistruths", we all need a little help in identifying what statements need some additional support. This is a web app I made that helps identify sentences in text that need citations. It was trained on a Wikipedia corpus that I scraped to include both sentences with and without citations.
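The core idea can be sketched as a naive Bayes comparison between cited and uncited sentences. The toy corpus and word-level smoothing below are a simplification of the Wikipedia-trained model:

```python
from collections import Counter
from math import log

# Toy labeled sentences standing in for the scraped Wikipedia corpus.
cited = ["the study found a 40 percent increase",
         "researchers reported significant results in 2019"]
uncited = ["the sky is blue",
           "paris is the capital of france"]

vocab = {w for s in cited + uncited for w in s.split()}
c_counts = Counter(w for s in cited for w in s.split())
u_counts = Counter(w for s in uncited for w in s.split())

def log_likelihood(sentence, counts):
    """Sum of log word probabilities with add-one (Laplace) smoothing."""
    total = sum(counts.values())
    return sum(log((counts[w] + 1) / (total + len(vocab)))
               for w in sentence.split())

def needs_citation(sentence):
    # Flag the sentence if it looks more like the cited class.
    return log_likelihood(sentence, c_counts) > log_likelihood(sentence, u_counts)

flag = needs_citation("researchers found a significant increase")
```

Claim-like wording (statistics, attributions, dates) drives a sentence toward the cited class, which is the signal the web app exploits.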

Data science

Project submitted to the Congressional Data Competition. Using NLP and network analysis, we demonstrate a method of organization and propose a formal system we term the congressional data complex to bring order to the expanse of congressional data produced every year. A collaboration with Sara Milkes.
Most music comes with human-assigned labels like the artist, the genre, and even the production company. Here I test whether the music itself (features of the audio signal) can provide the features necessary for classification and categorization.
How have neuroscience and the concepts that drive it changed over the last 20 years? Which topics have gained ground? And which have gone by the wayside? To study these questions, I systematically analyzed 20 years of abstracts from five reputable neuroscience review journals using NLP, automated summarization, and statistical modeling.
Here I analyze data released by the CDC to understand how COVID-19 deaths relate to racial demographics and education. As we may have expected, the pandemic emphasized and amplified existing inequities in the US, but the degree to which it did so is truly striking.

Art

A generative piece that grows random branching systems using stochastic L-systems. A collaboration with Sara Milkes.
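A minimal sketch of how a stochastic L-system grows a branching string; the rules and probabilities here are illustrative, not the ones used in the piece:

```python
import random

random.seed(4)

# Stochastic rules: each symbol maps to (probability, replacement) options.
# "[" and "]" push and pop turtle state; "+" and "-" turn.
RULES = {"F": [(0.6, "F[+F]F"), (0.4, "F[-F]")]}

def expand(axiom, steps):
    """Rewrite the string, choosing a replacement at random for each symbol."""
    s = axiom
    for _ in range(steps):
        out = []
        for ch in s:
            options = RULES.get(ch)
            if options is None:
                out.append(ch)  # symbols without rules pass through unchanged
            else:
                r, acc = random.random(), 0.0
                for p, repl in options:
                    acc += p
                    if r <= acc:
                        out.append(repl)
                        break
        s = "".join(out)
    return s

result = expand("F", 3)
```

A turtle-graphics interpreter then draws the string, so every run produces a different branching form.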
I model the parts of the bird brain responsible for sending commands to the motor centers that control singing and the syrinx, the sound-producing apparatus of the bird. I run the model through its parameter space and then cluster the resulting chirps using a similarity metric. Then, using a randomly generated HMM, I create sequences of bird chirps which make a song, and string together multiple songs to make an entire bird song composition. Here is a bird song symphony from many birds.
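The HMM sequencing step can be sketched like this; the chirp labels and transition weights are placeholders, not the clustered chirps from the model:

```python
import random

random.seed(7)

def random_hmm(n_states):
    """Row-stochastic transition matrix with random weights (illustrative)."""
    mat = []
    for _ in range(n_states):
        row = [random.random() for _ in range(n_states)]
        total = sum(row)
        mat.append([w / total for w in row])
    return mat

def sample_song(transitions, chirps, length, state=0):
    """Walk the Markov chain, emitting one chirp per state visited."""
    song = []
    for _ in range(length):
        song.append(chirps[state])
        r, acc = random.random(), 0.0
        for nxt, p in enumerate(transitions[state]):
            acc += p
            if r <= acc:
                state = nxt
                break
    return song

chirps = ["chirp-a", "chirp-b", "trill", "whistle"]  # placeholder chirp labels
song = sample_song(random_hmm(len(chirps)), chirps, length=12)
```

Stringing several sampled songs together, each from its own random HMM, gives the full composition.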
Winky is a haptic communication interface (concept) that connects people through a set of vibrational motors located in different areas of their bodies. Ultimately, our purpose is to study and create a haptic-digital language relying on vectorized dictionaries that account for movement and emotion. The vibrational motors are connected through Bluetooth to a drawing and voice-activated application on their smartphones. An ongoing collaboration with Sara Milkes. 
An interactive installation presented at the Museum of Contemporary Art in Chicago. Viewers could participate by voting through pressing the noses of stuffed animals. The plushies were equipped with touch sensors and cameras to record the voters during the process. This was a collaboration with Sara Milkes.