GitHub       Twitter       Google Scholar       LinkedIn

Email: lakshyasinghaliitd [at] gmail [dot] com

About me

I am a research assistant at the Serre Lab, Brown University, advised by Prof. Thomas Serre. I recently obtained my bachelor's degree from the Indian Institute of Technology, Delhi.

My interests lie in theoretical machine learning, specifically in representation learning and generative modeling, and in improving these methods by reconciling them with the human brain's generative mechanisms.

My research focuses on improving generative models, such as Variational Autoencoders and Energy-Based Models, on out-of-distribution generalization tasks.

Publications

Diffusion Models as Artists: Are we Closing the Gap between Humans and Machines?

Victor Boutin, Thomas Fel, Lakshya Singhal, Rishav Mukherji, Akash Nagaraj, Julien Colin and Thomas Serre

International Conference on Machine Learning (ICML) 2023

[Code]  [PDF]

Minority Oversampling for Imbalanced Data via Class-Preserving Regularized Auto-Encoders

Arnab Mondal, Lakshya Singhal, Piyush Tiwari, Parag Singla and Prathosh AP   

International Conference on Artificial Intelligence and Statistics (AISTATS) 2023

[Code]  [PDF]

Diversity vs. Recognizability: Human-like generalization in one-shot generative models

Victor Boutin, Lakshya Singhal, Xavier Thomas and Thomas Serre    

Neural Information Processing Systems (NeurIPS) 2022

[Code]  [PDF]

eARDS: A multi-center validation of an interpretable machine learning algorithm of early onset Acute Respiratory Distress Syndrome (ARDS) among critically ill adults with COVID-19 

Lakshya Singhal, Yash Garg, Philip Yang, Azade Tabaie, A. Ian Wong, Akram Mohammed, Lokesh Chinthala, Dipen Kadaria, Amik Sodhi, Andre L. Holder, Annette Esper, James M. Blum, Robert L. Davis, Gari D. Clifford, Greg S. Martin and Rishikesan Kamaleswaran

PLOS ONE, 2021

[Code]  [PDF]

[Re] Training Binary Neural Networks using the Bayesian Learning Rule

Prateek Garg, Lakshya Singhal and Ashish Sardana

ReScience C Journal (MLRC 2020)

[Code]  [PDF]

Professional Work