Yehuda Dar
I am a senior lecturer (assistant professor) in the Computer Science Department at Ben-Gurion University, working in the area of machine and deep learning.
I hold a BSc in Computer Engineering, an MSc in Electrical Engineering, and a PhD in Computer Science, all from the Technion — Israel Institute of Technology. I conducted my PhD research (completed in 2018) under the supervision of Prof. Alfred Bruckstein and Prof. Michael Elad. I was a postdoctoral fellow in the Computer Science Department at the Technion, and a postdoctoral research associate in the Department of Electrical and Computer Engineering at Rice University, where I worked with Prof. Richard Baraniuk.
I had the great pleasure of co-organizing the 2021 and 2022 editions of the Workshop on the Theory of Overparameterized Machine Learning (TOPML). More recently, I served as an area chair at NeurIPS 2024.
Research Area: Foundations of Overparameterized Machine Learning
My research provides fundamental insights into overparameterized machine learning, where the learned models are highly complex and have many more parameters than training data examples. Deep neural networks are a prominent example of such overparameterized models.
In my research, I address scientific questions in machine learning using both theoretical and empirical techniques from statistics, optimization, and signal processing.
My studies address one of the big challenges in understanding modern machine learning: explaining the impressive generalization ability of overparameterized models, which often perfectly fit their training data (in apparent defiance of the classical bias-variance tradeoff!) and therefore raise fascinating questions about the reasons for their success.
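As a rough illustration of this phenomenon (a minimal sketch of my own, not taken from the papers below, with assumed problem sizes and random ReLU features), the snippet below fits minimum-norm least-squares solutions with a growing number of features: once the number of features exceeds the number of training examples, the fit interpolates the training data exactly, and in many runs the test error peaks near that interpolation threshold and can decrease again as the model grows further.

```python
# Minimal sketch: interpolation and double-descent-like behavior in
# overparameterized linear regression with random ReLU features.
# Problem sizes and noise level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, d = 40, 1000, 20          # training/test sizes and input dimension
w_true = rng.normal(size=d)                 # ground-truth linear model

X_train = rng.normal(size=(n_train, d))
y_train = X_train @ w_true + 0.5 * rng.normal(size=n_train)
X_test = rng.normal(size=(n_test, d))
y_test = X_test @ w_true + 0.5 * rng.normal(size=n_test)

for p in [5, 20, 40, 80, 400]:              # number of random ReLU features
    W = rng.normal(size=(d, p)) / np.sqrt(d)
    phi_train = np.maximum(X_train @ W, 0.0)
    phi_test = np.maximum(X_test @ W, 0.0)
    # Minimum-norm least-squares solution via the pseudo-inverse;
    # for p > n_train it interpolates the training set (zero train error).
    theta = np.linalg.pinv(phi_train) @ y_train
    train_mse = np.mean((phi_train @ theta - y_train) ** 2)
    test_mse = np.mean((phi_test @ theta - y_test) ** 2)
    print(f"p={p:4d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")
```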
My papers in this area:
Y. Dar, V. Muthukumar, R. G. Baraniuk, "A Farewell to the Bias-Variance Tradeoff? An Overview of the Theory of Overparameterized Machine Learning", arXiv:2109.02355
G. Alon and Y. Dar, "How Does Overparameterization Affect Machine Unlearning of Deep Neural Networks?", arXiv:2503.08633
S. Hendy and Y. Dar, "TL-PCA: Transfer Learning of Principal Component Analysis", arXiv:2410.10805
Y. Sharon and Y. Dar, "How Do the Architecture and Optimizer Affect Representation Learning? On the Training Dynamics of Representations in Deep Neural Networks", arXiv:2405.17377
K. Abitbul and Y. Dar, "How Much Training Data is Memorized in Overparameterized Autoencoders? An Inverse Problem Perspective on Memorization Evaluation", in ECML 2024.
Y. Dar, D. LeJeune, and R. G. Baraniuk, "The Common Intuition to Transfer Learning Can Win or Lose: Case Studies for Linear Regression", SIAM Journal on Mathematics of Data Science, vol. 6, issue 2, 2024.
Y. Dar and R. G. Baraniuk, "Double Double Descent: On Generalization Errors in Transfer Learning between Linear Regression Tasks", SIAM Journal on Mathematics of Data Science, vol. 4, issue 4, 2022.
Y. Dar, P. Mayer, L. Luzi, and R. G. Baraniuk, "Subspace Fitting Meets Regression: The Effects of Supervision and Orthonormality Constraints on Double Descent of Generalization Errors", in ICML 2020.
G. Somepalli, L. Fowl, A. Bansal, P. Yeh-Chiang, Y. Dar, R. Baraniuk, M. Goldblum, T. Goldstein, "Can Neural Nets Learn the Same Model Twice? Investigating Reproducibility and Double Descent from the Decision Boundary Perspective", in CVPR 2022.
Y. Dar, L. Luzi, and R. G. Baraniuk, "Frozen Overparameterization: A Double Descent Perspective on Transfer Learning of Deep Neural Networks", arXiv:2211.11074
L. Luzi, Y. Dar, and R. G. Baraniuk, "Double Descent and Other Interpolation Phenomena in GANs", arXiv:2106.04003
Research Fields
Machine and deep learning
Computer vision
Signal, image, and video processing
Optimization
Image, video, and data compression
For more details on my research, see the publications page.
E-mail: ydar@bgu.ac.il