Simla Burcu Harma
Ph.D. Student, Computer Science, EPFL
Contact me:
Email
simla[dot]harma[at]epfl.ch
EPFL IC IINFCOM PARSA
INJ 236 (Bâtiment INJ), Station 14
CH-1015 Lausanne, Switzerland
Hello! I am a sixth-year Ph.D. student at the École Polytechnique Fédérale de Lausanne (EPFL), supervised by Prof. Babak Falsafi and affiliated with the Parallel Systems Architecture Lab (PARSA).
My current research focuses on novel numerical encodings and sparsity for time- and energy-efficient Deep Neural Network (DNN) workloads.
I am a recipient of the 2022 Microsoft Research PhD Fellowship. (Announcement, EPFL, EcoCloud)
I am also a recipient of the Generation Google Scholarship (EMEA). (Announcement, EcoCloud)
You can find my CV (updated June 2025) here.
You can find my footprint on the internet on GitHub and LinkedIn.
Doctor of Philosophy (Ph.D.), École Polytechnique Fédérale de Lausanne (EPFL), Feb. 2020 - Present
School of Computer and Communication Sciences
Supervisor: Prof. Babak Falsafi, Parallel Systems Architecture Lab (PARSA)
Master of Science (M.S.), TOBB University of Economics and Technology, Sept. 2018 - Jan. 2020
Computer Science
Thesis: An in-depth Performance Analysis of Neural Machine Translation Tasks
Supervisor: Prof. Oğuz Ergin
Bachelor of Science (B.S.), TOBB University of Economics and Technology, Sept. 2014 - Aug. 2018
Mathematics (Double Major)
G.P.A. 4.0/4.0
Bachelor of Science (B.S.), TOBB University of Economics and Technology, Sept. 2013 - Aug. 2018
Computer Science
G.P.A. 4.0/4.0
Amazon Web Services (AWS), Tübingen, Germany, Sep 2024 - Feb 2025
Applied Scientist Intern
École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland, Jun - Aug 2018
Research Intern
Distributed Artificial Intelligence Lab. (DAI-Labor), Berlin, Germany, May - Aug 2017
Research Intern
TOBB ETU Theoretical Computer Science Lab., Ankara, Turkey, Sep - Dec 2016
Research Intern
Havelsan Technology Radar, Ankara, Turkey, Jan - Mar 2016
Software Engineer Intern
Avant-Garde: Empowering GPUs with Scaled Numeric Formats
Minseong Gil, Dongho Ha, Simla Burcu Harma, Myung Kuk Yoon, Babak Falsafi, Won Woo Ro, Yunho Oh.
In ISCA'25.
Effective Interplay between Sparsity and Quantization: From Theory to Practice.
Simla Burcu Harma, Ayan Chakraborty, Elizaveta Kostenok, Danila Mishin, Dongho Ha, Babak Falsafi, Martin Jaggi, Ming Liu, Yunho Oh, Suvinay Subramanian, Amir Yazdanbakhsh
In ICLR'25 (Spotlight).
Accuracy Boosters: Epoch-Driven Mixed-Mantissa Block Floating-Point for DNN Training.
Simla Burcu Harma, Ayan Chakraborty, Babak Falsafi, Martin Jaggi, Yunho Oh.
In ISCA'23 Workshop on ML for Computer Architecture and Systems (MLArchSys).
Numerical Encoding for DNN Accelerators
Simla Burcu Harma, Mario Drumond, Babak Falsafi
In Computer Architecture Today (blog). September 20, 2021.
An in-depth Study of Neural Machine Translation Performance
Simla Burcu Harma, Mario Drumond, Babak Falsafi, Oğuz Ergin.
In ASPLOS’20 Workshop on the Convergence of Machine Learning and High-Performance Computing (ML4HPC).
In HiPEAC Workshop on Accelerated Machine Learning (AccML), 2020 (Full Paper).