Yehuda Dar
I am a senior lecturer (assistant professor) in the Computer Science Department at Ben-Gurion University, working in the area of machine and deep learning.
I hold a BSc in Computer Engineering, an MSc in Electrical Engineering, and a PhD in Computer Science, all from the Technion — Israel Institute of Technology. I conducted my PhD research (completed in 2018) under the supervision of Prof. Alfred Bruckstein and Prof. Michael Elad. I was a postdoctoral fellow in the Computer Science Department at the Technion, and a postdoctoral research associate in the Department of Electrical and Computer Engineering at Rice University, where I worked with Prof. Richard Baraniuk.
I had the great pleasure of co-organizing the 2021 and 2022 Workshop on the Theory of Overparameterized Machine Learning (TOPML). More details are available at https://topml.rice.edu/
Research Area: Foundations of Overparameterized Machine Learning
My research provides fundamental insights into overparameterized machine learning, where the learned models are highly complex, with many more parameters than training data examples. Deep neural networks are a prominent example of such overparameterized models.
In my research work, I address scientific questions in machine learning using both theoretical and empirical techniques from statistics, optimization, and signal processing.
My studies address one of the big challenges in the understanding of modern machine learning: explaining the remarkable generalization ability of overparameterized models, which often perfectly fit their training data (apparently defying the classical bias-variance tradeoff) and therefore pose fascinating questions about the reasons for their success.
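The phenomenon above can be illustrated with a minimal toy sketch (a hypothetical setup for illustration only, not taken from any of the papers below): minimum-norm linear regression interpolates its training data exactly once the number of parameters reaches the number of training examples, yet its test error can still be well behaved in the heavily overparameterized regime.

```python
import numpy as np

rng = np.random.default_rng(0)

def min_norm_fit(X, y):
    # Minimum-norm least-squares solution via the pseudoinverse.
    # When X has more columns (parameters) than rows (samples),
    # this solution interpolates the training data exactly.
    return np.linalg.pinv(X) @ y

# Toy data model (hypothetical): only the first d_true features
# carry signal; extra features act as excess parameters.
n, d_true, noise = 20, 5, 0.1
beta = rng.standard_normal(d_true)

def make_data(n_samples, d):
    X = rng.standard_normal((n_samples, d))
    y = X[:, :d_true] @ beta + noise * rng.standard_normal(n_samples)
    return X, y

results = {}
for d in [5, 15, 20, 40, 200]:  # under-, critically, and overparameterized
    Xtr, ytr = make_data(n, d)
    Xte, yte = make_data(1000, d)
    w = min_norm_fit(Xtr, ytr)
    results[d] = (np.mean((Xtr @ w - ytr) ** 2),
                  np.mean((Xte @ w - yte) ** 2))
    print(f"d={d:4d}  train MSE={results[d][0]:.4f}  test MSE={results[d][1]:.4f}")
```

For d at or above n = 20 the train error vanishes (interpolation), while the test error traces the double descent behavior studied in the papers below.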
My papers in this area:
Y. Dar, V. Muthukumar, R. G. Baraniuk, "A Farewell to the Bias-Variance Tradeoff? An Overview of the Theory of Overparameterized Machine Learning", arXiv:2109.02355
Y. Dar, D. LeJeune, and R. G. Baraniuk, "The Common Intuition to Transfer Learning Can Win or Lose: Case Studies for Linear Regression", Accepted to SIAM Journal on Mathematics of Data Science, 2024.
Y. Dar and R. G. Baraniuk, "Double Double Descent: On Generalization Errors in Transfer Learning between Linear Regression Tasks", SIAM Journal on Mathematics of Data Science, vol. 4, issue 4, 2022.
Y. Dar, P. Mayer, L. Luzi, and R. G. Baraniuk, "Subspace Fitting Meets Regression: The Effects of Supervision and Orthonormality Constraints on Double Descent of Generalization Errors", in ICML 2020.
G. Somepalli, L. Fowl, A. Bansal, P. Yeh-Chiang, Y. Dar, R. Baraniuk, M. Goldblum, T. Goldstein, "Can Neural Nets Learn the Same Model Twice? Investigating Reproducibility and Double Descent from the Decision Boundary Perspective", in CVPR 2022.
Y. Dar, L. Luzi, and R. G. Baraniuk, "Frozen Overparameterization: A Double Descent Perspective on Transfer Learning of Deep Neural Networks", arXiv:2211.11074.
L. Luzi, Y. Dar, and R. G. Baraniuk, "Double Descent and Other Interpolation Phenomena in GANs", arXiv:2106.04003.
K. Abitbul and Y. Dar, "Recovery of Training Data from Overparameterized Autoencoders: An Inverse Problem Perspective", arXiv:2310.02897.
Research Fields
Machine and deep learning
Signal, image, and video processing
Computer vision
Optimization
Image, video, and data compression
For more details on my research, see the publications page.
E-mail: ydar@bgu.ac.il