Kiran Bhattacharyya

Hi. I'm Kiran.

I am a scientist and engineer with extensive experience in machine learning, computer vision, optics and electronics instrumentation. Currently, I live in Chicago and am finishing my PhD in Biomedical Engineering at Northwestern University. Check out some of my personal and research projects.

Research projects

Larval zebrafish are increasingly common in biomedical and genetics research. Close study of their behavior requires tracking the body during swimming to test whether genetic or neurological manipulations produce the expected outcomes. This is my effort to share the automated tracker for larval zebrafish swimming in a dish that I wrote for my research.
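
A minimal sketch of the tracker's core idea, assuming a fixed overhead camera: background subtraction followed by centroid extraction in OpenCV. The file name swim_video.avi is a placeholder, and the real tracker does more than this.

    # Minimal sketch: track a larval zebrafish in a dish by background
    # subtraction and centroid extraction. Assumes a fixed overhead camera;
    # "swim_video.avi" is a hypothetical file name.
    import cv2

    cap = cv2.VideoCapture("swim_video.avi")
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)
    trajectory = []  # (frame_index, x, y) centroids of the fish

    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        mask = subtractor.apply(gray)              # foreground = moving fish
        mask = cv2.medianBlur(mask, 5)             # suppress speckle noise
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            fish = max(contours, key=cv2.contourArea)  # largest blob = body
            m = cv2.moments(fish)
            if m["m00"] > 0:
                trajectory.append((frame_idx, m["m10"] / m["m00"],
                                   m["m01"] / m["m00"]))
        frame_idx += 1

    cap.release()
    print(f"tracked {len(trajectory)} frames")
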
How does the body organize smooth movement while controlling a large number of anatomical and neurophysiological degrees of freedom (DOF)? Each DOF is not controlled independently; rather, behaviors are composed of coordinated activity across groups of muscles. Co-activating muscles in groups, instead of controlling each one separately, reduces the effective DOF and the computational load of control.
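
As a rough illustration of the synergy idea, non-negative matrix factorization can recover a few shared muscle groupings from many channels of activation. The data below are synthetic stand-ins for EMG envelopes, not recordings from my experiments.

    # Sketch of the synergy idea: factor non-negative muscle activations into
    # a few shared "synergies" with NMF, reducing the effective DOF.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    n_muscles, n_samples, n_synergies = 12, 500, 3

    # Ground-truth synergies (muscle weightings) and their activation in time.
    W_true = rng.random((n_muscles, n_synergies))
    H_true = np.abs(np.sin(rng.random((n_synergies, 1)) * 5
                           + np.linspace(0, 6, n_samples)))
    emg = W_true @ H_true + 0.05 * rng.random((n_muscles, n_samples))

    model = NMF(n_components=n_synergies, init="nndsvda", max_iter=500)
    W = model.fit_transform(emg)   # muscle-by-synergy weights
    H = model.components_          # synergy activations over time

    print(n_muscles, "muscles explained by", W.shape[1], "synergies;")
    print("reconstruction error:", round(model.reconstruction_err_, 3))
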
Since the ability to move is essential to many animals, selection pressures have shaped locomotion and the neuromuscular system that drives it. Even the relatively small nervous system (about 100,000 neurons) of the larval zebrafish at 5 days post-fertilization (dpf) produces a range of movements, from the fast maneuvers used to escape predators to the fine adjustments needed for prey capture. Here I describe an approach to understanding the neuromechanics of locomotion in this fish.
A volumetric imaging pipeline I built that begins with lightfield microscopy, which collects fluorescent volumes from the larval zebrafish hindbrain. The pipeline then proceeds through stripe-artifact removal, volume registration, segmentation, and particle filtering for object tracking. I use it to track the activity of neurons in the brain of a fish while it behaves in a virtual reality environment; the output is a time series of neural activity based on a fluorescent reporter.
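
A structural sketch of those stages with deliberately simple stand-ins: an FFT notch for stripe removal, phase cross-correlation for registration, and thresholding for segmentation. The particle-filter tracking stage is omitted, and the toy volumes replace real lightfield reconstructions.

    # Skeleton of the volume-processing stages described above, with simple
    # stand-ins for each step; not the actual pipeline.
    import numpy as np
    from scipy import ndimage
    from skimage.registration import phase_cross_correlation

    def remove_stripes(volume):
        """Damp a low-frequency band in k-space (crude stripe suppression)."""
        f = np.fft.fftn(volume)
        f[:, :, :2] *= 0.1
        return np.real(np.fft.ifftn(f))

    def register(volume, reference):
        """Rigid translation estimated by phase cross-correlation."""
        shift, _, _ = phase_cross_correlation(reference, volume)
        return ndimage.shift(volume, shift)

    def segment(volume, n_sigmas=4.0):
        """Centroids of bright blobs (putative neurons) above a threshold."""
        mask = volume > volume.mean() + n_sigmas * volume.std()
        labels, n = ndimage.label(mask)
        return ndimage.center_of_mass(volume, labels, range(1, n + 1))

    # Toy data: a shifted, noisy copy of a volume with two bright "neurons".
    rng = np.random.default_rng(1)
    ref = rng.random((16, 64, 64))
    ref[8, 20, 20] = ref[8, 40, 40] = 30.0
    vol = ndimage.shift(ref, (0, 2, -1)) + 0.1 * rng.random(ref.shape)

    centers = segment(register(remove_stripes(vol), ref))
    print("detected centroids:", [tuple(np.round(c, 1)) for c in centers])
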
Neural activity recorded from the hindbrain of larval zebrafish escaping from predators in virtual reality. By presenting virtual predators at naturalistic approach rates while simultaneously monitoring escape behavior and the recruitment of multiple reticulospinal neurons, we found that larval zebrafish perform a calibrated assessment of threat when attacked.
Breast cancer is the most common cancer among women. Metastasis, the spread of cancer cells via the circulatory or lymphatic system to form secondary tumors, significantly worsens the prognosis of any breast cancer patient. Here I develop a technique to detect circulating breast cancer cells in human blood using photoacoustic flow cytometry. The method can be used not only to determine a patient's disease state and response to therapy, but also for genetic testing and in vitro drug trials, since the circulating cells can be captured and studied.
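
The detection step can be illustrated with simple peak finding on a photoacoustic time trace: a circulating cell passing the laser produces a transient above the blood background. The trace, sampling rate, and event times below are all synthetic.

    # Illustrative sketch of the detection step in photoacoustic flow
    # cytometry: find cell-like transients above the background noise.
    import numpy as np
    from scipy.signal import find_peaks

    rng = np.random.default_rng(2)
    fs = 10_000                                  # assumed sampling rate, Hz
    t = np.arange(0, 2.0, 1 / fs)
    trace = 0.05 * rng.standard_normal(t.size)   # background acoustic noise

    # Inject three cell-like transients (Gaussian pulses) at known times.
    for t0 in (0.4, 1.1, 1.7):
        trace += 0.8 * np.exp(-((t - t0) ** 2) / (2 * 0.002 ** 2))

    threshold = trace.mean() + 5 * trace.std()   # conservative detection level
    peaks, _ = find_peaks(trace, height=threshold, distance=fs // 100)
    print("candidate cell events at t =", np.round(t[peaks], 2), "s")
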

Personal projects

Project submitted to the Congressional Data Competition. We demonstrate a method of organization and propose a formal system we term the congressional data complex to bring order to the expanse of congressional data being produced every year using NLP and network analysis. A collaboration with Sara Milkes.
Grows a random branching system using stochastic L-systems. A collaboration with Sara Milkes.
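
A minimal sketch of how such a system grows: each rewrite of F randomly picks one of several branching productions, so every run yields a different tree. The rules and probabilities here are illustrative, not the project's exact ones.

    # A minimal stochastic L-system. The resulting string can be drawn with
    # turtle graphics: F = forward, +/- = turn, [ ] = push/pop position.
    import random

    RULES = {
        "F": [
            (0.4, "F[+F]F[-F]F"),   # branch both ways
            (0.3, "F[+F]F"),        # branch right only
            (0.3, "F[-F]F"),        # branch left only
        ]
    }

    def rewrite(symbol):
        """Pick a production for `symbol` by probability, or keep it as-is."""
        options = RULES.get(symbol)
        if not options:
            return symbol
        r, cumulative = random.random(), 0.0
        for p, production in options:
            cumulative += p
            if r < cumulative:
                return production
        return options[-1][1]

    def grow(axiom="F", iterations=3):
        state = axiom
        for _ in range(iterations):
            state = "".join(rewrite(s) for s in state)
        return state

    print(grow())
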
How have neuroscience and the concepts that drive it changed over the last 20 years? Which topics have gained ground, and which have gone by the wayside? To study these questions scientifically, I systematically analyzed 20 years of abstracts from five reputable neuroscience review journals using NLP, automated summarization, and statistical modeling.
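
One way to illustrate the trend analysis: count term frequencies per year, then rank terms by the slope of their frequency over time. The abstracts dictionary below is a tiny hypothetical stand-in for the scraped corpus.

    # Sketch: per-year term frequencies, ranked by their linear trend.
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer

    abstracts = {  # year -> concatenated abstract text (hypothetical)
        1998: "synapse plasticity receptor lesion lesion patch clamp",
        2008: "plasticity fmri network imaging receptor network",
        2018: "network connectome optogenetics fmri deep learning network",
    }

    years = sorted(abstracts)
    vec = CountVectorizer()
    counts = vec.fit_transform(abstracts[y] for y in years).toarray().astype(float)
    counts /= counts.sum(axis=1, keepdims=True)   # per-year relative frequency

    # Slope of each term's frequency against time = its rise or fall.
    slopes = np.polyfit(years, counts, deg=1)[0]
    terms = np.array(vec.get_feature_names_out())
    order = np.argsort(slopes)
    print("declining:", terms[order[:3]].tolist())
    print("rising:   ", terms[order[-3:]].tolist())
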
Most produced music comes with human-assigned labels such as the artist, the genre, and even the production company. Here I test whether the music itself, meaning features of the audio signal, can provide the features necessary for classification and categorization.
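
A sketch of the signal-only approach: summarize each track with MFCC statistics and train a classifier on those features alone. The file paths and labels are placeholders, and the real project explores a wider feature set.

    # Sketch: classify tracks from audio features only (MFCC statistics).
    import librosa
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    tracks = [("song_a.wav", "jazz"), ("song_b.wav", "rock"),
              ("song_c.wav", "jazz"), ("song_d.wav", "rock")]  # placeholders

    def audio_features(path):
        """Mean and std of 13 MFCCs: a compact signal-only descriptor."""
        y, sr = librosa.load(path, duration=30.0)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
        return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

    X = np.array([audio_features(path) for path, _ in tracks])
    labels = [genre for _, genre in tracks]

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    print("CV accuracy:", cross_val_score(clf, X, labels, cv=2).mean())
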
I find myself drawn to the concept of connectivity in the physical world. Where we travel every day, and how we get there, directly influences what we are exposed to and what is exposed to us, with consequences for everything from disease epidemics to advertising. Moreover, a record of movement would let an individual build an auxiliary memory: a detailed history of the places they have visited. For these reasons, I decided to track everywhere I go, every day, with a homemade GPS logger, so I can be my own Big Brother and reclaim my life-logging.
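
For a flavor of what the homemade logger does, here is a minimal parser turning an NMEA $GPGGA sentence from a GPS module into a decimal-degree fix. The sentence is a made-up example; a real logger would read from a serial port and append fixes to a log.

    # Parse one NMEA $GPGGA sentence into (latitude, longitude) in degrees.
    def parse_gga(sentence):
        fields = sentence.split(",")
        if not fields[0].endswith("GGA") or not fields[2]:
            return None
        def to_degrees(value, hemisphere, head):  # ddmm.mmmm -> decimal deg
            degrees = float(value[:head]) + float(value[head:]) / 60.0
            return -degrees if hemisphere in "SW" else degrees
        lat = to_degrees(fields[2], fields[3], 2)
        lon = to_degrees(fields[4], fields[5], 3)
        return lat, lon

    fix = parse_gga("$GPGGA,123519,4151.700,N,08737.900,W,1,08,0.9,180.0,M,,M,,")
    print(fix)  # roughly (41.86, -87.63): downtown Chicago
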
I model the parts of the bird brain that send commands to the motor centers controlling singing and the syrinx, the bird's sound-producing organ. I sweep the model through its parameter space and cluster the resulting chirps using a similarity metric. Then, using a randomly generated hidden Markov model (HMM), I create sequences of chirps that form a song, and string multiple songs together into an entire birdsong composition. Here is a bird song symphony from many birds.
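
A sketch of the sequencing step: a randomly generated HMM whose hidden states emit chirp labels, sampled to string chirps into songs. The chirp names below stand in for the clusters found in the parameter sweep.

    # Sample songs from a random HMM over chirp cluster labels.
    import numpy as np

    rng = np.random.default_rng(3)
    n_states = 4
    chirps = ["chirp_A", "chirp_B", "chirp_C", "trill", "rest"]

    # Random row-stochastic transition and emission matrices define the HMM.
    A = rng.random((n_states, n_states)); A /= A.sum(axis=1, keepdims=True)
    B = rng.random((n_states, len(chirps))); B /= B.sum(axis=1, keepdims=True)

    def sing(length=12, state=0):
        """Sample one song: walk the hidden states, emit a chirp each step."""
        song = []
        for _ in range(length):
            song.append(chirps[rng.choice(len(chirps), p=B[state])])
            state = rng.choice(n_states, p=A[state])
        return song

    # A composition is just several songs strung together.
    for song in [sing() for _ in range(3)]:
        print(" ".join(song))
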
Winky is a haptic communication interface that connects people through a set of vibration motors placed on different areas of their bodies. Ultimately, our purpose is to study and create a haptic-digital language relying on vectorized dictionaries that account for movement and emotion. The vibration motors connect over Bluetooth to a drawing and voice-activated application on the users' smartphones. An ongoing collaboration with Sara Milkes.
I take a deep dive into the Sketchy dataset compiled at Georgia Tech: over 75,400 sketches spanning 125 object categories, drawn by many different people. To understand the patterns in the data, I explore different methods of visualization and perceptualization, including sonification and encoding into latent dimensions learned by deep neural networks. This is an ongoing project.
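
As a small illustration of the latent-embedding idea, the sketch below projects flattened images into a low-dimensional space and inspects neighborhoods there. PCA stands in for the deep-net encoders, and random arrays stand in for the Sketchy images.

    # Embed flattened "sketches" in a latent space; find nearest neighbors.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(4)
    sketches = rng.random((200, 64 * 64))   # 200 flattened 64x64 stand-ins

    latent = PCA(n_components=16).fit_transform(sketches)

    # Nearest neighbors in latent space: which sketches "look alike"?
    dists = np.linalg.norm(latent - latent[0], axis=1)
    print("closest sketches to #0:", np.argsort(dists)[1:6].tolist())
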
How much can the "anonymized" data reported by an Apple Watch or a Fitbit actually reveal about our daily lives? To get a quantitative measure, I use data from a wrist-worn 3-axis accelerometer, labeled with the activity the wearer was performing, to build a classifier that predicts those activities. The findings are a little scary. You probably want to take that Apple Watch off.
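
A sketch of the classifier pipeline: slice the 3-axis stream into windows, summarize each window with simple statistics, and fit a model against activity labels. The synthetic walking and sitting signals below stand in for the labeled wrist recordings.

    # Windowed features from a 3-axis accelerometer stream, then a classifier.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)
    fs, window = 50, 100                 # 50 Hz sampling, 2 s windows

    def synth(activity, n=4000):
        """Crude stand-ins: walking = oscillation, sitting = near-still."""
        t = np.arange(n) / fs
        amp = 1.0 if activity == "walking" else 0.05
        return amp * np.sin(2 * np.pi * 2 * t)[:, None] + \
            0.1 * rng.standard_normal((n, 3))

    def features(stream):
        """Per-window mean, std, and range for each axis."""
        wins = stream[: len(stream) // window * window].reshape(-1, window, 3)
        return np.hstack([wins.mean(1), wins.std(1), np.ptp(wins, axis=1)])

    X, y = [], []
    for activity in ("walking", "sitting"):
        f = features(synth(activity))
        X.append(f); y += [activity] * len(f)
    X = np.vstack(X)

    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
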
An interactive installation where viewers participated by pressing the noses of stuffed animals to vote. The plushies were equipped with touch sensors and cameras that recorded the voters during the process. The piece was presented at the Museum of Contemporary Art in Chicago, in collaboration with Sara Milkes. The presentation, the architecture of the full system, the collaboration archives, and associated videos are on Sara Milkes's project page.