CV and arXiv

About me

I am a tenure-track Assistant Professor in the Department of Statistics at the University of California, Berkeley.

From January 2021 to October 2022, I was a postdoctoral researcher at the Department of Mathematics, ETH Zürich, hosted by Afonso Bandeira.

Between January 2019 and December 2020, I was a postdoctoral researcher at Google Research, Zürich, hosted by Olivier Bousquet.

Before that, I spent half a year at the Department of Mathematics, Technion I.I.T., hosted by Shahar Mendelson.

I defended my thesis at the Moscow Institute of Physics and Technology in 2018 under the supervision of Vladimir Spokoiny and Konstantin Vorontsov. During my time in Moscow, I was affiliated (part-time) with the Institute for Information Transmission Problems, the Higher School of Economics, and Skoltech.

My main interests lie at the intersection of mathematical statistics, probability, and learning theory.

Publications (scholar page)


  • Statistically Optimal Robust Mean and Covariance Estimation for Anisotropic Gaussians, 2023. [arXiv] (with A. Minasyan)

  • The One-Inclusion Graph Algorithm is not Always Optimal, 2022. [arXiv] (with I. Aden-Ali, Y. Cherapanamjeri, and A. Shetty)

  • Covariance Estimation: Optimal Dimension-free Guarantees for Adversarial Corruption and Heavy Tails, 2022. [arXiv] (with P. Abdalla)

  • Dimension-free Bounds for Sums of Independent Matrices and Simple Tensors via the Variational Principle, 2021. [arXiv]

Expository notes and surveys

  • A remark on Kashin's discrepancy argument and partial coloring in the Komlós conjecture, Portugaliae Mathematica, 2022. [arXiv] (with A. S. Bandeira and A. Maillard)

Refereed conference papers (with no journal version)

  • A Regret-Variance Trade-Off in Online Learning, NeurIPS 2022. [arXiv] (with N. Cesa-Bianchi and D. van der Hoeven)

  • Stability and Deviation Optimal Risk Bounds with Convergence Rate O(1/n), NeurIPS 2021 (Oral presentation). [arXiv] (with Y. Klochkov)

  • Proper Learning, Helly Number, and an Optimal SVM Bound, Conference on Learning Theory (COLT), 2020 (Best Paper Award). [arXiv] (with O. Bousquet, S. Hanneke, and S. Moran)

  • Fast Rates for Online Prediction with Abstention, Conference on Learning Theory (COLT), 2020. [arXiv] (with G. Neu)

  • Sharper bounds for uniformly stable algorithms, Conference on Learning Theory (COLT), 2020. [arXiv] (with O. Bousquet and Y. Klochkov)

  • Optimal learning via local entropies and sample compression. Conference on Learning Theory (COLT), 2017. [arXiv]

  • Permutational Rademacher Complexity: a New Complexity Measure for Transductive Learning. Algorithmic Learning Theory (ALT), 2015. [arXiv] (with G. Blanchard and I. Tolstikhin)

Journal publications

  • Robustifying Markowitz, Journal of Econometrics (forthcoming), 2022. [arXiv] (with W. Härdle, Y. Klochkov, and A. Petukhina)

  • Exponential Savings in Agnostic Active Learning through Abstention, IEEE Transactions on Information Theory, 2022. [arXiv] Preliminary version in Conference on Learning Theory (COLT), 2021. (with N. Puchkin)

  • On Mean Estimation for Heteroscedastic Random Variables, Annales de l'Institut Henri Poincaré, Probabilités et Statistiques, 2021. [arXiv] (with L. Devroye, S. Lattanzi, and G. Lugosi)

  • Distribution-Free Robust Linear Regression, Mathematical Statistics and Learning, 2022. [arXiv] (with J. Mourtada and T. Vaškevičius)

  • Suboptimality of Constrained Least Squares and Improvements via Non-Linear Predictors, Bernoulli, 2022. [arXiv] (with T. Vaškevičius)

  • Fast classification rates without standard margin assumptions, Information and Inference: A Journal of the IMA, 2021. [arXiv] (with O. Bousquet)

  • Robust k-means Clustering for Distributions with Two Moments, Annals of Statistics, 2021. [arXiv] (with Y. Klochkov and A. Kroshnin)

  • Empirical Variance Minimization with Applications in Variance Reduction and Optimal Control, Bernoulli, 2021. [arXiv] (with D. Belomestny, L. Iosipoi, and Q. Paris)

  • Noise sensitivity of the top eigenvector of a Wigner matrix, Probability Theory and Related Fields, 2019. [arXiv] (with C. Bordenave and G. Lugosi)

  • Uniform Hanson-Wright type concentration inequalities for unbounded entries via the entropy method, Electronic Journal of Probability, 2020. [arXiv] (with Y. Klochkov)

  • Robust covariance estimation under L4–L2 moment equivalence, Annals of Statistics, 2020. [arXiv] (with S. Mendelson)

  • Concentration of the spectral norm of Erdős–Rényi random graphs, Bernoulli, 2018. [arXiv] (with G. Lugosi and S. Mendelson)

  • When are epsilon-nets small?, Journal of Computer and System Sciences, 2020. [arXiv] (with A. Kupavskii)

  • Localization of VC Classes: Beyond Local Rademacher Complexities. Theoretical Computer Science (Invited paper, ALT special issue), 2018. [arXiv] Preliminary version in Algorithmic Learning Theory (ALT), 2016. (with S. Hanneke)

Research Awards

Best Paper Award, Conference on Learning Theory (COLT), 2020.


Reviewer Journals: Annals of Statistics, Probability Theory and Related Fields, Annals of Applied Probability, Bernoulli, Journal of the American Statistical Association, IEEE Transactions on Information Theory, Discrete and Computational Geometry, Journal of Machine Learning Research

Senior PC member conferences: COLT 2023, ALT 2020

Reviewer Conferences: COLT (2017 -- 2022), ALT (2019 -- 2021), ICML 2019, NeurIPS (2018, 2021), AISTATS 2019

In Spring 2021 and Spring 2022, I co-lectured the undergraduate course Mathematics of Machine Learning at ETH Zürich.

Between January 2014 and October 2017, I taught courses on statistics, probability, stochastic processes, abstract algebra, and learning theory in Moscow.