N.Yoshida, I.Ishikawa, M.Imaizumi. "Zero Generalization Error Theorem for Random Interpolators via Algebraic Geometry", arXiv:2512.06347 (submitted)
G. Braun, B. Loureiro, H. Minh, M. Imaizumi. "Fast Escape, Slow Convergence: Learning Dynamics of Phase Retrieval under Power-Law Data", arXiv:2511.18661 (submitted)
Y.Takida, S.Hayakawa, T.Shibuya, M.Imaizumi, N.Murata, B.Nguyen, T.Uesaka, S-H.Lai, Y.Mitsufuji. "SONA: Learning Conditional, Unconditional, and Mismatching-Aware Discriminator", arXiv:2510.04576 (submitted)
S.Hayakawa, Y.Takida, M.Imaizumi, H.Wakaki, Y.Mitsufuji. "Demystifying MaskGIT Sampler and Beyond: Adaptive Order Selection in Masked Diffusion", arXiv:2510.04525 (submitted)
S.Nishiyama, M.Imaizumi "Precise Dynamics of Diagonal Linear Networks: A Unifying Analysis by Dynamical Mean-Field Theory", arXiv:2510.01930 (submitted)
D.Ponnoprat, M.Imaizumi, "Minimax Rates of Estimation for Optimal Transport Map between Infinite-Dimensional Spaces", arXiv:2505.13570 (submitted)
S.Iwazaki, J.Komiyama, M.Imaizumi, "High-dimensional Nonparametric Contextual Bandit Problem", arXiv:2505.14102 (submitted)
Q.Han, M.Imaizumi, "Precise gradient descent training dynamics for finite-width multi-layer neural networks", arXiv:2505.04898 (submitted)
T.Wakayama, M.Imaizumi, "Bayesian Analysis for Over-parameterized Linear Model via Effective Spectra", arXiv:2305.15754 (submitted)
N.Takeshita, M.Imaizumi, "Approximation of Permutation Invariant Polynomials by Transformers: Efficient Construction in Column-Size", arXiv:2502.11467 (Revision requested by Anal.App.)
S.Nakakita, T.Kaneko, S.Takamaeda-Yamazaki, M.Imaizumi, "Federated Learning with Relative Fairness", arXiv:2411.01161 (submitted)
K.Sawaya, Y.Uematsu, M.Imaizumi, "High-Dimensional Single-Index Models: Link Estimation and Marginal Inference", arXiv:2404.17812 (Revision requested by Inf.Inference)
M.Imaizumi, "Sup-Norm Convergence of Deep Neural Network Estimator for Nonparametric Regression by Adversarial Training", arXiv:2307.04042
K.Sawaya, Y.Uematsu, M.Imaizumi, "Moment-Based Adjustments of Statistical Inference in High-Dimensional Generalized Linear Models", arXiv:2305.17731 (Revision requested by Elect.J.Stat.)
M.Uehara, M.Imaizumi, N.Jiang, N.Kallus, W.Sun, T.Xie, "Finite Sample Analysis of Minimax Offline Reinforcement Learning: Completeness, Fast Rates and First-Order Efficiency". arXiv:2102.02981 (Resubmission encouraged by Ann.Stat.)
S.Nakakita, M.Imaizumi (2026), "Benign Overfitting in Time Series Linear Model with Over-Parameterization", Bernoulli, 32. paper arXiv:2204.08369
T.Tsuda, M.Imaizumi (2025+), "Universality of estimators for high-dimensional linear models with block dependency", Bernoulli, to appear. arXiv:2410.19244
B.Chen, S.Ito, M.Imaizumi (2025), "Optimal Dynamic Regret by Transformers for Non-Stationary Reinforcement Learning", Advances in Neural Information Processing Systems, to appear. arXiv:2508.16027 press
M.Sakai, R.Karakida, M.Imaizumi (2025), "Infinite-Width Limit of a Single Attention Layer: Analysis via Tensor Programs", Advances in Neural Information Processing Systems, to appear. arXiv:2506.00846
N.Yoshida, S.Nakakita, M.Imaizumi (2025), "Effect of Random Learning Rate: Theoretical Analysis of SGD Dynamics in Non-Convex Optimization via Stationary Distribution", Transactions on Machine Learning Research. paper movie code arXiv:2406.16032
S.Nakakita, P.Alquier, M. Imaizumi (2025), "Corrigendum to “Dimension-free bounds for sums of dependent matrices and operators with heavy-tailed distributions”", Electronic Journal of Statistics, 19(2). paper
R.Okano, M.Imaizumi (2025), "Wasserstein k-Centers Clustering for Distributional Data", Statistics and Computing, 35. paper arxiv:2407.08228, code
J.Leong, M.Imaizumi, H.Innan, N.Irie (2025), "Implications from the analogous relationship between evolutionary and learning processes", BioEssays, 47. paper
M.Nakata, M.Imaizumi (2025), "Landscape Computations for the Edge of Chaos in Nonlinear Dynamical Systems", JSIAM Letters, 17. paper arXiv:2503.06393
S.Hayakawa, Y.Takida, M.Imaizumi, H.Wakaki, Y.Mitsufuji (2025), "Distillation of Discrete Diffusion through Dimensional Correlations", PMLR: International Conference on Machine Learning, 267. arXiv:2410.08709 code
Y.Norimatsu, M.Imaizumi (2025), "Encoder-Decoder-based GAN for Estimating Counterfactual Outcomes under Sequential Selection Bias and Combinatorial Explosion", PMLR: Causal Learning and Reasoning, 275. paper
G.Braun, M.Quang, M.Imaizumi (2025), "Learning a Single Index Model from Anisotropic Data with Vanilla Stochastic Gradient Descent", PMLR: Artificial Intelligence and Statistics, 258. paper arXiv:2503.23642
T.Tsuda, M.Imaizumi (2024), "Benign Overfitting of Non-Sparse High-Dimensional Linear Regression with Correlated Noise", Electronic Journal of Statistics, 18(2). arXiv:2304.04037 paper
R.Okano, M.Imaizumi (2024), "Distribution-on-Distribution Regression with Wasserstein Metric: Multivariate Gaussian Case", Journal of Multivariate Analysis, 203. arXiv:2307.06137 paper
M. Sugimoto, R. Okano, M. Imaizumi (2024) "Augmented Projection Wasserstein Distances: Multi-Dimensional Projection with Neural Surface", Journal of Statistical Planning and Inference, 233. paper
S. Kashiwamura, A. Sakata, M.Imaizumi (2024) "Effect of Weight Quantization on Learning Models by Typical Case Analysis", IEEE International Symposium on Information Theory 2024, 357-362. arXiv:2401.17269 paper
S.Nakakita, P.Alquier, M. Imaizumi (2024), "Dimension-free Bounds for Sum of Dependent Matrices and Operators with Heavy-Tailed Distribution", Electronic Journal of Statistics, 18(1). arXiv:2210.09756 paper
Y.Takida, M.Imaizumi, T.Shibuya, C.Lai, T.Uesaka, N.Murata, Y.Mitsufuji (2024), "SAN: Inducing Metrizability of GAN with Discriminative Normalized Linear Layer", International Conference on Learning Representations. paper arXiv:2301.12811
D.Ponnoprat, R.Okano, M.Imaizumi (2024), "Uniform Confidence Band for Optimal Transport Map on One-Dimensional Data", Electronic Journal of Statistics, 18(1). arxiv:2307.09257 paper
A. Okuno, M.Imaizumi (2024), "Minimax Analysis for Inverse Risk in Nonparametric Planer Invertible Regression", Electronic Journal of Statistics, 18(1). arXiv:2112.00213 paper comment
T.Wakayama, M.Imaizumi (2024), "Fast Convergence on Perfect Classification for Functional Data". Statistica Sinica, 34. paper arXiv:2104.02978
R.Okano, M.Imaizumi (2024), "Inference for Projection-Based Wasserstein Distances on Finite Spaces", Statistica Sinica, 34. arXiv:2202.05495 paper.
J.Komiyama, M.Imaizumi (2023), "High-dimensional Contextual Bandit Problem without Sparsity", Advances in Neural Information Processing Systems, 36. arXiv:2306.11017 paper
R.Zhang, M.Imaizumi, B.Schölkopf, K.Muandet (2023), "Instrumental Variable Regression via Kernel Maximum Moment Loss". Journal of Causal Inference, 11(1). paper arXiv:2010.07684.
M.Kato, M.Imaizumi, K.Minami (2023), "Unified Perspective on Probability Divergence via Maximum Likelihood Density Ratio Estimation: Bridging KL-Divergence and Integral Probability Metrics", PMLR: Artificial Intelligence and Statistics, 20. paper arXiv:2201.13127.
M. Imaizumi, J. Schmidt-Hieber (2023), "On Generalization Bounds for Deep Networks Based on Loss Surface Implicit Regularization", IEEE Transactions on Information Theory, 69(2). arXiv:2201.04545 paper press.
M.Imaizumi (2022), "In Estimating Functions with Singularity on Hypersurfaces and Advantages of Deep Neural Networks", Journal of the Japan Statistical Society, Japanese Issue (Special Topic: The JSS Ogawa Prize Lecture) 52, paper.
M.Imaizumi, K.Fukumizu, (2022), "Advantage of Deep Neural Networks for Estimating Functions with Singularity on Hypersurface". Journal of Machine Learning Research, 23(111). arXiv:2011.02256 paper.
M.Imaizumi, H.Ota, T.Hamaguchi (2022), "Hypothesis Test and Confidence Analysis with Wasserstein Distance on General Dimension", Neural Computation, 34(6). arXiv:1910.07773 paper.
M.Kato, M.Imaizumi, K.McAlinn, S.Yasui, H.Kakehi (2022) "Learning Causal Models from Conditional Moment Restrictions by Importance Weighting", International Conference on Learning Representations (spotlight). paper
K.Takeuchi, M.Imaizumi, S.Kanda, K.Fujii, M.Ishihata, T.Maekawa, K.Yoda, Y.Tabei (2021) "Frechet Kernel for Trajectory Data Analysis", ACM SIGSPATIAL, paper.
A.Sannai, M.Imaizumi, M.Kawano (2021), "Improved Generalization Bounds of Group Invariant / Equivariant Deep Networks via Quotient Feature Spaces", PMLR: Uncertainty in Artificial Intelligence 2021. arXiv:1910.06552 paper.
M.Imaizumi (2021), "Analysis on Mechanism of Deep Learning: Perspective of Generalization Error", Journal of the Japan Statistical Society, Japanese Issue (Special Section: Machine Learning and Its Related Fields) 50. paper
R.Nakada, M.Imaizumi (2020), "Adaptive Approximation and Generalization of Deep Neural Network with Intrinsic Dimensionality", Journal of Machine Learning Research 21(174). paper arXiv:1907.02177 press.
K.Hayashi, M.Imaizumi, Y.Yoshida (2020), "On Random Subsampling of Gaussian Process Regression: A Graphon-Based Analysis", PMLR: Artificial Intelligence and Statistics 2020. paper arXiv:1901.09541.
M.Imaizumi, K.Fukumizu (2019). "Deep Neural Networks Learn Non-Smooth Functions Effectively". PMLR: Artificial Intelligence and Statistics 2019. paper arXiv:1802.04474
M.Imaizumi, K.Kato (2019), "A simple method to construct confidence bands in functional linear regression". Statistica Sinica. 29. paper arXiv:1612.07490
M.Imaizumi, T.Maehara, Y.Yoshida (2018). "Statistically Efficient Estimation for Non-Smooth Probability Densities". PMLR: Artificial Intelligence and Statistics 2018 (Best Paper Award). paper
M.Imaizumi, K.Kato (2018). "PCA-based estimation for functional linear regression with functional responses". Journal of Multivariate Analysis 163. paper arXiv:1609.00286
M.Imaizumi, T.Maehara, K.Hayashi (2017) "On Tensor Train Rank Minimization: Statistical Efficiency and Scalable Algorithm". Advances in Neural Information Processing Systems 30. paper arXiv:1708.00132
M.Imaizumi, K.Hayashi (2017). "Tensor Decomposition with Smoothness". PMLR: International Conference on Machine Learning 2017. paper
M.Imaizumi, R.Fujimaki (2017). "Factorized Asymptotic Bayesian Policy Search for POMDPs". Proc. of International Joint Conference on Artificial Intelligence. paper
M.Imaizumi, K.Hayashi (2016). "Doubly Decomposing Nonparametric Tensor Regression". PMLR: International Conference on Machine Learning 2016. paper arXiv:1506.05967
Japan Statistical Association, "Statistical Testing: Data Science Expert-Level Exercises", Tokyo-Tosho, 2025/11. link
T.Suzuki, T.Shibata, M.Imaizumi, J.Otsuka, E.Nakazawa, T.Ono, R.Uehara, K.Tachibana, S.Uesugi, K.Hori, K.Sekiguchi (2023) "How to deal with Artificial Intelligence", Keiso-Shobo, 2023/07. link
S.Akaho, M.Imaizumi, S.Uchida, T.Sei, S.Takano, S.Tsuji, N.Hara, R.Hisano, J.Matsubara, M.Miyaji, A.Morihata, Y.Yadohisa. (2023), "Data Science as Foundation of Applications", Kodan-sha, 2023/02. link
C.Aggarwal (author), A.Takemura, M.Imaizumi (translators) (2022), "Neural Networks and Deep Learning", Gakujutsu-Tosho Shuppan-Sha, 2022/03. link
M.Imaizumi (2021), "Approaching the Principles of Deep Learning: A Mathematical Challenge", Iwanami Science Library, 2021/04. link