About me

Since January 2021, I have been a postdoctoral researcher at the Department of Mathematics, ETH Zürich, hosted by Afonso Bandeira.

Between January 2019 and December 2020, I was a postdoctoral researcher at Google Research, Brain Team, Zürich.

Before that, I spent a year at the Department of Mathematics, Technion I.I.T., hosted by Shahar Mendelson.

I defended my thesis at MIPT (Moscow) in 2018 under the supervision of Vladimir Spokoiny and Konstantin Vorontsov.

My main interests lie at the intersection of mathematical statistics, probability, and learning theory.

Publications

Preprints

  • N. Zhivotovskiy. Dimension-free Bounds for Sums of Independent Matrices and Simple Tensors via the Variational Principle, 2021. [arXiv]

  • T. Vaškevičius, N. Zhivotovskiy. Suboptimality of Constrained Least Squares and Improvements via Non-Linear Predictors, 2020. [arXiv]

  • L. Devroye, S. Lattanzi, G. Lugosi, N. Zhivotovskiy. On Mean Estimation for Heteroscedastic Random Variables, 2020. [arXiv]


Refereed conference papers

  • Y. Klochkov, N. Zhivotovskiy. Stability and Deviation Optimal Risk Bounds with Convergence Rate O(1/n), NeurIPS (oral presentation), 2021. [arXiv]

  • N. Puchkin, N. Zhivotovskiy. Exponential Savings in Agnostic Active Learning through Abstention, Conference on Learning Theory (COLT), 2021. [arXiv]

  • O. Bousquet, S. Hanneke, S. Moran, N. Zhivotovskiy. Proper Learning, Helly Number, and an Optimal SVM Bound, Conference on Learning Theory (COLT) (Best Paper Award), 2020. [arXiv]

  • G. Neu, N. Zhivotovskiy. Fast Rates for Online Prediction with Abstention, Conference on Learning Theory (COLT), 2020. [arXiv]

  • O. Bousquet, Y. Klochkov, N. Zhivotovskiy. Sharper bounds for uniformly stable algorithms, Conference on Learning Theory (COLT), 2020. [arXiv]

  • N. Zhivotovskiy. Optimal learning via local entropies and sample compression. Conference on Learning Theory (COLT), 2017. [arXiv]

  • G. Blanchard, I. Tolstikhin, N. Zhivotovskiy. Permutational Rademacher Complexity: a New Complexity Measure for Transductive Learning. Algorithmic Learning Theory (ALT), 2015. [arXiv]



Journal publications

  • J. Mourtada, T. Vaškevičius, N. Zhivotovskiy. Distribution-Free Robust Linear Regression, Mathematical Statistics and Learning (forthcoming), 2021. [arXiv]

  • O. Bousquet, N. Zhivotovskiy. Fast classification rates without standard margin assumptions, Information and Inference: A Journal of the IMA, 2021. [arXiv]

  • Y. Klochkov, A. Kroshnin, N. Zhivotovskiy. Robust k-means Clustering for Distributions with Two Moments, Annals of Statistics, 2021. [arXiv]

  • D. Belomestny, L. Iosipoi, Q. Paris, N. Zhivotovskiy. Empirical Variance Minimization with Applications in Variance Reduction and Optimal Control, Bernoulli (forthcoming), 2020+. [arXiv]

  • C. Bordenave, G. Lugosi, N. Zhivotovskiy. Noise sensitivity of the top eigenvector of a Wigner matrix, Probability Theory and Related Fields, 2019. [arXiv]

  • Y. Klochkov, N. Zhivotovskiy. Uniform Hanson–Wright type concentration inequalities for unbounded entries via the entropy method, Electronic Journal of Probability, 2020. [arXiv]

  • S. Mendelson, N. Zhivotovskiy. Robust covariance estimation under L4–L2 moment equivalence, Annals of Statistics, 2020. [arXiv]

  • G. Lugosi, S. Mendelson, N. Zhivotovskiy. Concentration of the spectral norm of Erdős–Rényi random graphs, Bernoulli, 2018. [arXiv]

  • A. Kupavskii, N. Zhivotovskiy. When are epsilon-nets small?, Journal of Computer and System Sciences, 2020. [arXiv]

  • N. Zhivotovskiy, S. Hanneke. Localization of VC Classes: Beyond Local Rademacher Complexities. Theoretical Computer Science (Invited paper, ALT special issue), 2018. [arXiv] Preliminary version in Algorithmic Learning Theory (ALT), 2016.


Research Awards

Best Paper Award, Conference on Learning Theory (COLT), 2020.

Service

Journal reviewer: Annals of Statistics, Annals of Applied Probability, Bernoulli, Journal of the American Statistical Association, IEEE Transactions on Information Theory, Discrete and Computational Geometry, Journal of Machine Learning Research

Conference PC member: ALT (2020–2022)

Conference reviewer: COLT (2017–2021), ALT 2019, ICML 2019, NeurIPS (2018, 2021), AISTATS 2019

Teaching

In Spring 2021, I am co-lecturing the undergraduate course Mathematics of Machine Learning at ETH Zürich.

Between January 2014 and October 2017, I taught statistics, probability, stochastic processes, abstract algebra, and learning theory in Moscow.