I am a tenure-track Assistant Professor in the Department of Statistics at the University of California, Berkeley.
From January 2021 to October 2022 I was a postdoctoral researcher at the Department of Mathematics, ETH Zürich, hosted by Afonso Bandeira.
Between January 2019 and December 2020 I was a postdoctoral researcher at Google Research, Zürich, hosted by Olivier Bousquet.
Before that, I spent half a year at the Department of Mathematics, Technion I.I.T., hosted by Shahar Mendelson.
I defended my thesis at the Moscow Institute of Physics and Technology in 2018 under the supervision of Vladimir Spokoiny and Konstantin Vorontsov. During my time in Moscow, I was affiliated (part-time) with the Institute for Information Transmission Problems, the Higher School of Economics, and Skoltech.
My main interests lie at the intersection of mathematical statistics, probability, and learning theory.
Refined Risk Bounds for Unbounded Losses via Transductive Priors, 2024. [arxiv] (with J. Qian, A. Rakhlin)
High-Probability Risk Bounds via Sequential Predictors, 2023. [arxiv] (with N. Cesa-Bianchi and D. van der Hoeven)
A remark on Kashin's discrepancy argument and partial coloring in the Komlós conjecture, Portugaliae Mathematica, 2022. [arxiv] (with A. S. Bandeira, A. Maillard)
Dimension-free Private Mean Estimation for Anisotropic Distributions, NeurIPS, 2024. [arxiv] (with Y. Dagan, M. I. Jordan, X. Yang, L. Zakynthinou)
Derandomizing Multi-Distribution Learning, NeurIPS, 2024. [arxiv] (with K. Green Larsen and O. Montasser)
Revisiting Agnostic PAC Learning, IEEE Symposium on Foundations of Computer Science (FOCS), 2024. [arxiv] (with K. Green Larsen and S. Hanneke)
Majority-of-Three: The Simplest Optimal Learner?, Conference on Learning Theory (COLT), 2024. [arxiv] (with I. Aden-Ali, M. Møller Høgsgaard, K. Green Larsen)
Optimal PAC Bounds Without Uniform Convergence, IEEE Symposium on Foundations of Computer Science (FOCS), 2023. [arxiv] (with I. Aden-Ali, Y. Cherapanamjeri, A. Shetty)
Local Risk Bounds for Statistical Aggregation, Conference on Learning Theory (COLT), 2023. [arxiv] (with J. Mourtada and T. Vaškevičius)
Exploring Local Norms in Exp-concave Statistical Learning, Conference on Learning Theory (COLT), 2023. [arxiv] (with N. Puchkin)
The One-Inclusion Graph Algorithm is not Always Optimal, Conference on Learning Theory (COLT), 2023. [arxiv] (with I. Aden-Ali, Y. Cherapanamjeri, A. Shetty)
A Regret-Variance Trade-Off in Online Learning, NeurIPS, 2022. [arxiv] (with N. Cesa-Bianchi and D. van der Hoeven)
Stability and Deviation Optimal Risk Bounds with Convergence Rate O(1/n), NeurIPS (Oral Presentation), 2021. [arxiv] (with Y. Klochkov)
Proper Learning, Helly Number, and an Optimal SVM Bound, Conference on Learning Theory (COLT) (Best Paper Award), 2020. [arxiv] (with O. Bousquet, S. Hanneke and S. Moran)
Fast Rates for Online Prediction with Abstention, Conference on Learning Theory (COLT), 2020. [arxiv] (with G. Neu)
Sharper bounds for uniformly stable algorithms, Conference on Learning Theory (COLT), 2020. [arxiv] (with O. Bousquet and Y. Klochkov)
Optimal learning via local entropies and sample compression. Conference on Learning Theory (COLT), 2017. [arXiv]
Permutational Rademacher Complexity: a New Complexity Measure for Transductive Learning. Algorithmic Learning Theory (ALT), 2015. [arXiv] (with G. Blanchard and I. Tolstikhin)
Statistically Optimal Robust Mean and Covariance Estimation for Anisotropic Gaussians, Mathematical Statistics and Learning (accepted), 2025. [arxiv] (with A. Minasyan)
Covariance Estimation: Optimal Dimension-free Guarantees for Adversarial Corruption and Heavy Tails, Journal of the European Mathematical Society, 2024. [arxiv] (with P. Abdalla)
Dimension-free Bounds for Sums of Independent Matrices and Simple Tensors via the Variational Principle, Electronic Journal of Probability, 2023. [arxiv]
Robustifying Markowitz, Journal of Econometrics, 2022. [arxiv] (with W. Härdle, Y. Klochkov and A. Petukhina)
Exponential Savings in Agnostic Active Learning through Abstention, IEEE Transactions on Information Theory, 2022. [arxiv] Preliminary version in Conference on Learning Theory (COLT), 2021. (with N. Puchkin)
On Mean Estimation for Heteroscedastic Random Variables, Annales de l'Institut Henri Poincaré, Probabilités et Statistiques, 2021. [arxiv] (with L. Devroye, S. Lattanzi, and G. Lugosi)
Distribution-Free Robust Linear Regression, Mathematical Statistics and Learning, 2022. [arxiv] (with J. Mourtada and T. Vaškevičius)
Suboptimality of Constrained Least Squares and Improvements via Non-Linear Predictors, Bernoulli, 2022. [arxiv] (with T. Vaškevičius)
Fast classification rates without standard margin assumptions, Information and Inference: A Journal of the IMA, 2021. [arxiv] (with O. Bousquet)
Robust k-means Clustering for Distributions with Two Moments, Annals of Statistics, 2021. [arxiv] (with Y. Klochkov and A. Kroshnin)
Empirical Variance Minimization with Applications in Variance Reduction and Optimal Control, Bernoulli, 2021. [arXiv] (with D. Belomestny, L. Iosipoi and Q. Paris)
Noise sensitivity of the top eigenvector of a Wigner matrix, Probability Theory and Related Fields, 2019. [arxiv] (with C. Bordenave and G. Lugosi)
Uniform Hanson-Wright type concentration inequalities for unbounded entries via the entropy method, Electronic Journal of Probability, 2020. [arXiv] (with Y. Klochkov)
Robust covariance estimation under L4 − L2 moment equivalence, Annals of Statistics, 2020. [arxiv] (with S. Mendelson)
Concentration of the spectral norm of Erdős–Rényi random graphs, Bernoulli, 2018. [arXiv] (with G. Lugosi and S. Mendelson)
When are epsilon-nets small?, Journal of Computer and System Sciences, 2020. [arXiv] (with A. Kupavskii)
Localization of VC Classes: Beyond Local Rademacher Complexities. Theoretical Computer Science (Invited paper, ALT special issue), 2018. [arXiv] Preliminary version in Algorithmic Learning Theory (ALT), 2016. (with S. Hanneke)
Classification with Abstention: Applications in Statistical, Online, and Active Learning
Estimation of the Covariance Matrix in the Presence of Outliers
Exponential savings in agnostic active learning through abstention [Video]
Robust k-means clustering for distributions with two bounded moments
Robust covariance estimation for vectors with bounded kurtosis
Reviewer Journals: Annals of Statistics, Probability Theory and Related Fields, Annals of Applied Probability, Bernoulli, Journal of the American Statistical Association, IEEE Transactions on Information Theory, Discrete and Computational Geometry, Journal of Machine Learning Research
Senior PC Member Conferences: COLT (2023, 2024), ALT (2020, 2024)
Reviewer Conferences: COLT (2017–2022), ALT (2019–2021), ICML 2019, NeurIPS (2018, 2021), AISTATS 2019
In Spring 2021 and Spring 2022, I co-lectured the undergraduate course Mathematics of Machine Learning at ETH Zürich.
Between January 2014 and October 2017, I taught statistics, probability, stochastic processes, abstract algebra, and learning theory in Moscow.