Computer Science 

PhD Student at Virginia Tech


Featured Articles

VQ-VAE: Neural Discrete Representation Learning

Research Interests

Deep Generative Models: Generative modeling aims to learn the unknown, complicated, high-dimensional probability distribution from which data samples are drawn. With a deep generative model, one can evaluate the likelihood of an arbitrary image to investigate whether it comes from the same distribution as the data, and further generate new samples from the underlying data distribution.


Representation Learning: I am also interested in extracting high-level, abstract, and robust representations that can later be used for downstream tasks via transfer learning. Recently I have focused on diffusion-based representation learning, specifically diffusion-based autoencoders that learn decodable representations.


Probabilistic Inference: Probabilistic inference describes statistical problems in the language of probability theory, using probabilistic models and probability distributions to reason about unknown quantities.

Work Experience


RadiusAI Inc. 

(October 2022 - February 2023)

Research Engineer

University College London

(June 2022 - October 2022)

Research Intern

University of Edinburgh

(July 2022 - October 2022)

Research Intern



KTH Royal Institute of Technology 

(July 2021 - October 2021) 

Research Intern


Middle East Technical University 

(October 2021 - January 2022)

CENG240 | Student Assistant


(October 2020 - January 2021) 

CENG111 | Student Assistant