Research
Preprint
R.Okano, M.Imaizumi, "Wasserstein k-Centres Clustering for Distributional Data", arxiv:2407.08228, code
N.Yoshida, S.Nakakita, M.Imaizumi, "Effect of Random Learning Rate: Theoretical Analysis of SGD Dynamics in Non-Convex Optimization via Stationary Distribution", arXiv:2406.16032
R.Hataya, K.Matsui, M.Imaizumi, "Automatic Domain Adaptation by Transformers in In-Context Learning", arXiv:2405.16819
K.Sawaya, Y.Uematsu, M.Imaizumi, "High-Dimensional Single-Index Models: Link Estimation and Marginal Inference", arxiv:2404.17812
M.Imaizumi, "Sup-Norm Convergence of Deep Neural Network Estimator for Nonparametric Regression by Adversarial Training", arxiv:2307.04042
K.Sawaya, Y.Uematsu, M.Imaizumi, "Moment-Based Adjustments of Statistical Inference in High-Dimensional Generalized Linear Models", arxiv:2305.17731
T.Wakayama, M.Imaizumi, "Bayesian Analysis for Over-parameterized Linear Model without Sparsity", arxiv:2305.15754
T.Tsuda, M.Imaizumi, "Benign Overfitting of Non-Sparse High-Dimensional Linear Regression with Correlated Noise", arxiv:2304.04037 (R&R)
S.Nakakita, M.Imaizumi, "Benign Overfitting in Time Series Linear Model with Over-Parameterization", arXiv:2204.08369 (RbutR)
M.Uehara, M.Imaizumi, N.Jiang, N.Kallus, W.Sun, T.Xie, "Finite Sample Analysis of Minimax Offline Reinforcement Learning: Completeness, Fast Rates and First-Order Efficiency". arXiv:2102.02981 (RbutR)
M.Imaizumi, T.Otsu, "On Gaussian Approximation for M-Estimator". arXiv:2012.15678
Publication
R.Okano, M.Imaizumi (2024), "Distribution-on-Distribution Regression with Wasserstein Metric: Multivariate Gaussian Case", Journal of Multivariate Analysis, to appear. arXiv:2307.06137 paper
M. Sugimoto, R. Okano, M. Imaizumi (2024) "Augmented Projection Wasserstein Distances: Multi-Dimensional Projection with Neural Surface", Journal of Statistical Planning and Inference, 233. paper
S. Kashiwamura, A. Sakata, M.Imaizumi (2024) "Effect of Weight Quantization on Learning Models by Typical Case Analysis", IEEE International Symposium on Information Theory, to appear. arXiv:2401.17269
S.Nakakita, P.Alquier, M. Imaizumi (2024), "Dimension-free Bounds for Sum of Dependent Matrices and Operators with Heavy-Tailed Distribution", Electronic Journal of Statistics, 18(1). arXiv:2210.09756 paper
Y.Takida, M.Imaizumi, T.Shibata, C.Lai, T.Uesaka, N.Murata, Y.Mitsufuji (2024), "SAN: Inducing Metrizability of GAN with Discriminative Normalized Linear Layer", International Conference on Learning Representations, to appear. arXiv:2301.12811
D.Ponnoprat, R.Okano, M.Imaizumi (2024), "Uniform Confidence Band for Optimal Transport Map on One-Dimensional Data", Electronic Journal of Statistics, 18(1). arXiv:2307.09257 paper
A. Okuno, M.Imaizumi (2024), "Minimax Analysis for Inverse Risk in Nonparametric Planer Invertible Regression", Electronic Journal of Statistics, 18(1). arXiv:2112.00213 paper comment
J.Komiyama, M.Imaizumi (2023), "High-dimensional Contextual Bandit Problem without Sparsity", Advances in Neural Information Processing Systems, 36. arXiv:2306.11017 paper
R.Zhang, M.Imaizumi, B.Schölkopf, K.Muandet (2023), "Instrumental Variable Regression via Kernel Maximum Moment Loss". Journal of Causal Inference, 11(1). paper arXiv:2010.07684.
M.Kato, M.Imaizumi, K.Minami (2023), "Unified Perspective on Probability Divergence via Maximum Likelihood Density Ratio Estimation: Bridging KL-Divergence and Integral Probability Metrics", PMLR: Artificial Intelligence and Statistics, 20. paper arXiv:2201.13127.
T. Wakayama, M.Imaizumi (2024), "Fast Convergence on Perfect Classification for Functional Data". Statistica Sinica, to appear. arXiv:2104.02978
M.Imaizumi, J.Schmidt-Hieber (2023), "On Generalization Bounds for Deep Networks Based on Loss Surface Implicit Regularization", IEEE Transactions on Information Theory, 69(2). arXiv:2201.04545 paper press.
R.Okano, M.Imaizumi (2024), "Inference for Projection-Based Wasserstein Distances on Finite Spaces", Statistica Sinica, 34. arXiv:2202.05495 paper.
M.Imaizumi (2022), "In Estimating Functions with Singularity on Hypersurfaces and Advantages of Deep Neural Networks", Journal of the Japan Statistical Society, Japanese Issue (Special Topic: The JSS Ogawa Prize Lecture) 52, paper.
M.Imaizumi, K.Fukumizu, (2022), "Advantage of Deep Neural Networks for Estimating Functions with Singularity on Hypersurface". Journal of Machine Learning Research, 23(111). arXiv:2011.02256 paper.
M.Imaizumi, H.Ota, T.Hamaguchi (2022), "Hypothesis Test and Confidence Analysis with Wasserstein Distance on General Dimension", Neural Computation, 34(6). arXiv:1910.07773 paper.
M.Kato, M.Imaizumi, K.McAlinn, S.Yasui, H.Kakehi (2022) "Learning Causal Models from Conditional Moment Restrictions by Importance Weighting", International Conference on Learning Representations (spotlight). paper
K.Takeuchi, M.Imaizumi, S.Kanda, K.Fujii, M.Ishihata, T.Maekawa, K.Yoda, Y.Tabei (2021) "Frechet Kernel for Trajectory Data Analysis", ACM SIGSPATIAL, paper.
A.Sannai, M.Imaizumi, M.Kawano, (2021), "Improved Generalization Bounds of Group Invariant / Equivariant Deep Networks via Quotient Feature Spaces", PMLR: Uncertainty in Artificial Intelligence 2021. arXiv:1910.06552 paper.
M.Imaizumi (2021), "Analysis on Mechanism of Deep Learning: Perspective of Generalization Error", Journal of the Japan Statistical Society, Japanese Issue (Special Section: Machine Learning and Its Related Fields) 50. paper
R.Nakada, M.Imaizumi (2020), "Adaptive Approximation and Generalization of Deep Neural Network with Intrinsic Dimensionality", Journal of Machine Learning Research 21(174). paper arXiv:1907.02177 press.
K.Hayashi, M.Imaizumi, Y.Yoshida (2020), "On Random Subsampling of Gaussian Process Regression: A Graphon-Based Analysis", PMLR: Artificial Intelligence and Statistics 2020. paper arXiv:1901.09541.
M.Imaizumi, K.Fukumizu (2019). "Deep Neural Networks Learn Non-Smooth Functions Effectively". PMLR: Artificial Intelligence and Statistics 2019. paper arXiv:1802.04474
M.Imaizumi, K.Kato (2019), "A simple method to construct confidence bands in functional linear regression". Statistica Sinica. 29. paper arXiv:1612.07490
M.Imaizumi, T.Maehara, Y.Yoshida (2018). "Statistically Efficient Estimation for Non-Smooth Probability Densities". PMLR: Artificial Intelligence and Statistics 2018 (Best Paper Award). paper
M.Imaizumi, K.Kato (2018). "PCA-based estimation for functional linear regression with functional responses". Journal of Multivariate Analysis 163. paper arXiv:1609.00286
M.Imaizumi, T.Maehara, K.Hayashi (2017) "On Tensor Train Rank Minimization: Statistical Efficiency and Scalable Algorithm". Advances in Neural Information Processing Systems 30. paper arXiv:1708.00132
M.Imaizumi, K.Hayashi (2017). "Tensor Decomposition with Smoothness". PMLR: International Conference on Machine Learning 2017. paper
M.Imaizumi, R.Fujimaki (2017). "Factorized Asymptotic Bayesian Policy Search for POMDPs". Proc. of International Joint Conference on Artificial Intelligence. paper
M.Imaizumi, K.Hayashi (2016). "Doubly Decomposing Nonparametric Tensor Regression". PMLR: International Conference on Machine Learning 2016. paper arXiv:1506.05967
Book
T.Suzuki, T.Shibata, M.Imaizumi, J.Otsuka, E.Nakazawa, T.Ono, R.Uehara, K.Tachibana, S.Uesugi, K.Hori, K.Sekiguchi (2023) "How to deal with Artificial Intelligence", Keiso-Shobo, 2023/07. link
S.Akaho, M.Imaizumi, S.Uchida, T.Sei, S.Takano, S.Tsuji, N.Hara, R.Hisano, J.Matsubara, M.Miyaji, A.Morihata, Y.Yadohisa. (2023), "Data Science as Foundation of Applications", Kodan-sha, 2023/02. link
C.Aggarwal (author), A.Takemura, M.Imaizumi (translators) (2022), "Neural Network and Deep Learning", Gakujutsu-Tosho Shuppan-Sha, 2022/03. link
M.Imaizumi (2021), "Approaching the Principles of Deep Learning: A Mathematical Challenge", Iwanami Science Library, 2021/04. link
International Presentation
(*): invited / invited session
H.Ibayashi, M.Imaizumi (2022), "Why does SGD prefer flat minima?: Through the lens of dynamical systems", AAAI When Machine Learning meets Dynamical Systems: Theory and Applications, United States, 2022/02.
M.Kato, K.Ariu, M.Imaizumi, M.Uehara, M.Nomura (2023), "Best Arm Identification with a Fixed Budget under a Small Gap", 2023 ASA Annual Meeting, United States, 2023/01.
Y. Uematsu, K. Sawaya, M. Imaizumi (2022), "High-dimensional asymptotics for single-index models via approximate message passing", CMStatistics, London, 2022/12. (*)
M.Imaizumi (2022), "Sup-norm convergence of deep network estimator for nonparametric regression with corrected adversarial training", CMStatistics, London, 2022/12. (*)
M.Imaizumi (2022), "Theory of Deep Learning and Overparmeterization", Online Asian Machine Learning School, Asian Conference on Machine Learning, India, 2022/12.
M.Imaizumi (2022), "Hypothesis Test and Confidence Analysis with Wasserstein Distance on General Dimension", EcoSta, Kyoto, 2022/06. (*)
S.Nakakita, M.Imaizumi (2022), "Benign overfitting in stochastic regression", EcoSta, Kyoto, 2022/06. (*)
A.Okuno, M.Imaizumi (2021), "Minimax Analysis for Inverse Risk in Nonparametric Planer Invertible Regression", CMStatistics, London, UK, 2021/12. (*)
M.Kato, M.Imaizumi, K.McAlinn, S.Yasui, H.Kakehi (2021), "Learning Causal Relationships from Conditional Moment Restrictions by Importance Weighting", MLECON Workshop at NeurIPS 2021, virtual, 2021/12.
M. Imaizumi (2021), "Generalization of deep learning with singularity and implicit regularization", Statistics 2021 Canada conference, Montreal(virtual), Canada, 2021/07.
H.Ibayashi, T.Hamaguchi, M.Imaizumi, (2021), "Minimum sharpness: Scale-invariant parameter-robustness of neural networks", ICML Workshop on Theoretic Foundation, Criticism, and Application Trend of Explainable AI, US (Virtual), 2021/07. link
M.Imaizumi, T.Wakayama (2021), "Fast Convergence on Perfect Classification for Functional Data", Econometrics and Statistics, Hong Kong (virtual), China, 2021/06. (*)
R.Nakada, M.Imaizumi (2021), "Asymptotic Risk of Overparameterized Likelihood Models with Application to Double Descent on Deep Neural Networks", Econometrics and Statistics, Hong Kong (virtual), China, 2021/06. (*)
R.Nakada, M.Imaizumi, "Asymptotic Risk of Overparameterized Likelihood Models with Application to Double Descent on Deep Neural Networks", Workshop on the Theory of Overparameterized Machine Learning, Texas (virtual), US, 2021/04.
M.Imaizumi, "Generalization Analysis of Deep Models with Loss Surface and Likelihood Models", Workshop on Functional Inference and Machine Intelligence, Tokyo (virtual), Japan, 2021/03. (*).
M.Imaizumi, "Fast Convergence on Perfect Classification for Functional Data", CM-Statistics, London (virtual), UK, 2020/12. (*).
K.Hayashi, M.Imaizumi, Y,Yoshida, "On Random Subsampling of Gaussian Process Regression: A Graphon-Based Analysis", Artificial Intelligence and Statistics, online, 2020/08.
M. Imaizumi, "Statistical inference on M-estimators by high-dimensional Gaussian approximation", Workshop on Functional Inference and Machine Intelligence 2020, Sophia Antipolis, France, 2020/02. (*)
M.Imaizumi, "Generalization Analysis for Mechanism of Deep Learning by Statistics and Learning Theory", Seminar, Seoul, Korea, 2020/02.
M.Imaizumi, "Generalization Analysis for Mechanism of Deep Learning via Nonparametric Statistics", International Chinese Statistical Association, HangZhou, China, 2019/12. (*)
M. Imaizumi, "Statistical inference on M-estimators by high-dimensional Gaussian approximation", CM-Statistics, London, UK, 2019/12. (*)
M.Imaizumi, H.Ota, T.Hamaguchi, "Hypothesis Test and Confidence Analysis with Wasserstein Distance on General Dimension", Asian Conference on Machine Learning Workshop on Statistics & Machine Learning Researchers in Japan, Nagoya, 2019/11.
M.Imaizumi, "Generalization Analysis for Mechanism of Deep Learning via Nonparametric Statistics", Asian Conference on Machine Learning Workshop on Statistics & Machine Learning Researchers in Japan, Nagoya, 2019/11. (*)
M.Imaizumi, "Statistical Estimation for Non-Smooth Functions by Deep Neural Networks", International Statistical Institute World Statistics Congress 2019, Kuala Lumpur, 2019/08. (*)
M.Imaizumi, "Generalization Analysis for Mechanism of Deep Learning via Nonparametric Statistics", Joint Statistical Meeting, Denver, 2019/07. (*)
M.Imaizumi, "Generalization Analysis for Mechanism of Deep Learning via Nonparametric Statistics", Third International Workshop on Symbolic-Neural Learning (SNL-2019), Tokyo, 2019/07. (*)
M.Imaizumi, "Deep Neural Networks Learn Non-Smooth Functions Effectively", IMS-China International Conference on Statistics and Probability, Dalian, 2019/07. (*)
M.Imaizumi, "Inference on level sets in functional linear regression", Econometrics and Statistics, Taichung, 2019/06. (*)
M.Imaizumi, K.Fukumizu, "Deep Neural Networks Learn Non-Smooth Functions Effectively", Artificial Intelligence and Statistics (AISTATS) 2019, Naha, 2019/04.
M.Imaizumi, "Statistical Generalization Analysis for Generative Adversarial Networks", Workshop on Functional Inference and Machine Intelligence 2019, Tokyo, 2019/03. (*)
M.Imaizumi, "Analysis for Deep Learning by Function Estimation Theory", ISI-ISM-ISSAS Joint Conference 2019, Taipei, 2019/01.
M.Imaizumi, K.Kato, "Inference on active domains of functional data via functional linear regression", Computational and Methodological Statistics, Pisa, 2018/12. (*)
M.Imaizumi, "Deep Neural Networks Learn Non-Smooth Functions Effectively", Statistics Seminar, Mathematical Institute, Leiden University, 2018/08.
M.Imaizumi, T.Maehara, Y.Yoshida, "Statistical Estimation for Non-Smooth Functions with the Regularity Lemma", Discrete Optimization and Machine Learning Workshop, Tokyo, 2018/07. (*)
M.Imaizumi, K.Fukumizu "Deep Neural Networks Learn Non-Smooth Functions Effectively". ICML 2018 Workshop on Theory of Deep Learning, Stockholm, 2018/07.
M.Imaizumi, K.Kato, "A simple method to construct confidence bands in functional linear regression", Econometrics and Statistics 2018, Hong Kong, 2018/16. (*)
M.Imaizumi, T.Maehara, Y.Yoshida "Statistically Efficient Estimation for Non-Smooth Probability Densities". Artificial Intelligence & Statistics (AISTATS), Canary Islands, 2018/04.
M.Imaizumi, K.Fukumizu "Statistical Estimation for Non-Smooth Functions by Deep Neural Networks". Workshop on Deep Learning: Theory, Algorithms, and Applications, Tokyo, 2018/03. (*)
M.Imaizumi, K.Fukumizu "Statistical Estimation for Non-Smooth Functions by Deep Neural Networks". AIP-IIS-MLGT Workshop 2018, Atlanta, 2018/02.
M.Imaizumi, K.Fukumizu "Statistical Estimation for Non-Smooth Functions by Deep Neural Networks". Workshop on Functional Inference and Machine Intelligence, Tokyo, 2018/02. (*)
M.Imaizumi, K.Kato, "A simple method to construct confidence bands in functional linear regression", Joint Meeting of 10th Asian Regional Section of the International Association for Statistical Computing and the NZ Statistical Association, New Zealand, 2017/12. (*)
M.Imaizumi, T.Maehara, K.Hayashi "On Tensor Train Rank Minimization: Statistical Efficiency and Scalable Algorithm". Neural Information and Processing Systems (NIPS), Long Beach, 2017/12.
M.Imaizumi, K.Yano, "Invariance Selection for Manifold Regression". ISI-ISM-ISSAS Joint Conference Tokyo 2017 , Tokyo, 2017/11.
M.Imaizumi, K.Hayashi, "Tensor Decomposition with Smoothness", The International Conference on Machine Learning (ICML) 2017, Sydney, 2017/08.
M.Imaizumi, K.Hayashi, "Tensor Decomposition with Smoothness", Seminar in Data61, The Commonwealth Scientific and Industrial Research Organisation (CSIRO), Canberra, 2017/08.
M.Imaizumi, R.Fujimaki, "Factorized Asymptotic Bayesian Policy Search for POMDPs", 27th International Joint Conference of Artificial Intelligence (IJCAI), Melbourne, 2017/08.
M.Imaizumi, K.Hayashi, "Tensor Decomposition with Smoothness", Neural Information Processing Systems workshop on Learning with Tensors (NIPS WS), Barcelona, 2016/12.
M.Imaizumi, "Nonlinear operator estimation with Bayes sieve prior", The 4th Institute of Mathematical Statistics Asia Pacific Rim Meeting (APRM), HongKong, 2016/07.
M.Imaizumi, "Regression with infinite dimensional spaces by reproducing kernel Hilbert space approach", 9th World Congress in Probability and Statistics (WCPS), Toronto, 2016/07.
M.Imaizumi, K.Hayashi, "Doubly Decomposing Nonparametric Tensor Regression", The International Conference on Machine Learning (ICML) 2016, New York, 2016/06.
M.Imaizumi, "Nonparametric multivariate regression with tensor product RKHS", International Meeting on High-Dimensional Data Driven Science (HD3), Kyoto, 2015/12.
M.Imaizumi, "Efficient estimation for semiparametric models by reproducing kernel Hilbert space", American Statistical Association, Joint Statistical Meeting (JSM), Seattle, 2015/08.
M.Imaizumi, K.Hayashi, "Bayesian estimation for nonparametric regression with low-rank tensor data", 10th Conference on Bayesian Nonparametrics (BNP), Raleigh, 2015/06.
M.Imaizumi, "Efficient estimation for semiparametric models by reproducing kernel Hilbert space", STICERD Econometrics Seminar Series (LSE), London, 2014/12.
M.Imaizumi, "An approximation method for discrete Markov decision models with high dimensional state space", Econometric Society European Winter Meeting, Madrid, 2014/12.