About Me

Last updated September 2023

I joined EPFL in the winter of 2024, and prior to that, I spent three wonderful years at Mila and Université de Montréal (UdeM) from 2020-2023 as a research Master's student in Machine Learning under the supervision of Irina Rish.

During my time at Mila, I had many great opportunities to do research alongside brilliant researchers. I am enthusiastic about exploring situations where causality and invariance coincide, since they yield much richer representations that can be exploited in a variety of tasks. I spent time researching reusable independent mechanisms in the form of factor graphs, disentangled object-centric representations, and how the two interplay. These works resulted in publications at ICLR 2024 and NeurIPS 2023 (see publications).

I also had the opportunity to participate in the Summer@EPFL internship program. I joined the Data Science Lab under the supervision of Robert West, inspired by Daniel Kahneman's "Thinking, Fast and Slow"; I wanted to complement my understanding of "thinking slow" in the context of natural language. We developed a symbolic autoencoding mechanism that trains sequence-to-sequence-to-sequence models end-to-end, such that a rich symbolic representation emerges in the bottleneck through weak supervision. The framework is versatile: the bottleneck representation can be an encoding in the form of structured text or natural language. The full version is currently under review; the code is available at https://github.com/epfl-dlab/symbol-lab-dev/
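To give a rough flavor of the setup, here is a toy sketch of my own for this page (not the actual code in the repository above): one seq2seq model maps text to a discrete symbolic sequence, a second maps that sequence back to text, and a straight-through Gumbel-softmax keeps the bottleneck discrete while still letting gradients flow end-to-end. All module names and sizes below are made up for illustration.

```python
# Toy illustration of a seq2seq-to-seq2seq autoencoder with a discrete
# symbolic bottleneck (my own sketch, not the epfl-dlab implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Seq2Seq(nn.Module):
    """Tiny GRU encoder-decoder that emits logits over an output vocabulary."""
    def __init__(self, in_vocab, out_vocab, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(in_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, out_vocab)

    def forward(self, x, out_len):
        # x: (batch, seq, in_vocab), one-hot or soft distribution over tokens
        h = x @ self.embed.weight                      # soft embedding lookup
        _, state = self.encoder(h)
        # feed the encoder state as a constant input at every decoding step
        dec_in = state.transpose(0, 1).repeat(1, out_len, 1)
        dec_out, _ = self.decoder(dec_in, state)
        return self.out(dec_out)                       # (batch, out_len, out_vocab)

SYM_VOCAB, TXT_VOCAB, SYM_LEN, TXT_LEN = 32, 100, 8, 12
txt2sym = Seq2Seq(TXT_VOCAB, SYM_VOCAB)   # text -> symbolic bottleneck
sym2txt = Seq2Seq(SYM_VOCAB, TXT_VOCAB)   # symbolic bottleneck -> text

def autoencode(txt_onehot):
    sym_logits = txt2sym(txt_onehot, SYM_LEN)
    # straight-through Gumbel-softmax keeps the bottleneck discrete in the
    # forward pass while letting gradients flow back into txt2sym
    sym = F.gumbel_softmax(sym_logits, tau=1.0, hard=True)
    return sym2txt(sym, TXT_LEN), sym

# toy usage: reconstruct a random batch of "sentences" end-to-end
tokens = torch.randint(0, TXT_VOCAB, (4, TXT_LEN))
txt_onehot = F.one_hot(tokens, TXT_VOCAB).float()
recon_logits, symbols = autoencode(txt_onehot)
loss = F.cross_entropy(recon_logits.reshape(-1, TXT_VOCAB), tokens.reshape(-1))
loss.backward()
```

In the actual framework, weak supervision on either side of the bottleneck (rather than pure reconstruction as above) is what shapes the emergent symbolic representation.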

Prior to joining Mila, I did my bachelor's in Electrical Engineering, with a specialization in Digital Systems, at Sharif University of Technology in Tehran. I spent some time getting hands-on with deep RL, following the seminal work of Mnih et al. on Atari games. I was also fortunate to participate in research at leading institutes in Switzerland and Japan with wonderful teams. In the summer of 2019, I joined RIKEN AIP in Fukuoka, Japan, to work on a novel boosting method, co-supervised by Kohei Hatano and Eiji Takimoto. We exploited Zero-suppressed Decision Diagrams (ZDDs) to boost classification on very large datasets using a counterpart to AdaBoost. In the summer of 2018, I joined the amazing Institute of Neuroinformatics at ETH Zurich and the University of Zurich, where I was supervised by Yulia Sandamirskaya. We designed a Spiking Neural Network (SNN), inspired by known mechanisms in rodents, to localize an agent in a simple map with obstacles.

My earlier interests included the intersection of AI and brain mechanisms, Continual Reinforcement Learning, and Deep RL. In recent years, I've become more interested in Causal Structure Learning on Graphs and OOD Generalization in Vision and Language, through learning compositional and disentangled representations that facilitate modular architectures composed of independent causal mechanisms.

Apart from AI research, I'm deeply passionate about music and the art of composing it, and I love singing, hiking, and cycling. I'm also an avid photographer and traveler. I've visited more than 120 cities across 12 countries, and in 2022 I completed a personal milestone: touching the shores on opposite sides of the Pacific Ocean (2019: the Japanese coast; 2022: Vancouver; update: back on the Japanese coast in 2024)!

Other interests include Modern Physics, Astrophysics, Cosmology, and Economics. However, life is too short to explore all of these in depth, so I've chosen AI research as the main direction to devote my time to. I pursued my interest in Physics and Astrophysics during high school, which earned me a Silver and a Bronze medal in the National Olympiad of Astronomy and Astrophysics.

I'm currently very excited about learning to play the piano and composing short pieces. I used to practice interesting pieces and post my recordings here. Hopefully I'll buy a keyboard soon and continue recording ;)