About
I am an independent William Gordon Seggie Brown Research Fellow in the School of Mathematics at the University of Edinburgh. Previously, I was a Turing doctoral student at The Alan Turing Institute and a PhD student at Newcastle University, supervised by Prof. Chris Oates.
My research interests lie in (1) methodologies for estimating and assessing the predictive uncertainty of machine learning models, (2) computationally efficient Bayesian methodologies applicable to modern complex models, and (3) the theoretical foundations of robustness in Bayesian statistics. In my latest project, I worked on Wasserstein gradient boosting, a method that returns a set of particles approximating a target probability distribution assigned to each input. I also developed theory revealing a Hamiltonian dynamical structure behind Bayesian inference, introducing a new class of systems for saddle Hamiltonian functions over metric spaces. In my past projects, I worked on intractable likelihoods, Bayesian robustness to outliers, posterior calibration, and Bayesian neural networks, drawing on elegant tools from kernel methods, Stein's method, statistical learning theory, and Monte Carlo methods.
Publications / Preprints
Matsubara, T. Wasserstein Gradient Boosting: A Framework for Distribution-Valued Supervised Learning. 2024 (arXiv:2405.09536v2).
Matsubara, T. Hamiltonian Dynamics of Bayesian Inference Formalised by Arc Hamiltonian Systems. 2023 (arXiv:2310.07680).
Matsubara, T., Knoblauch, J., Briol, F.-X., Oates, C. J. Generalised Bayesian Inference for Discrete Intractable Likelihoods. Journal of the American Statistical Association, 0(0):1-11, 2023 (arXiv:2206.08420).
Matsubara, T., Mudd, R., Tax, N., Guy, I. TCE: A Test-Based Approach to Measuring Calibration Error. The 39th Conference on Uncertainty in Artificial Intelligence, 2023 (arXiv:2306.14343).
Matsubara, T., Knoblauch, J., Briol, F.-X., Oates, C. J. Robust Generalised Bayesian Inference for Intractable Likelihoods. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 84(3):997-1022, 2022 (arXiv:2104.07359).
Matsubara, T., Oates, C. J., Briol, F.-X. The Ridgelet Prior: A Covariance Function Approach to Prior Specification for Bayesian Neural Networks. Journal of Machine Learning Research, 22:1-57, 2021 (arXiv:2010.08488).
Academic Service
Organiser:
The University of Edinburgh Stats Seminar (2024 - present)
DCE Reading Group in The Alan Turing Institute (2020 - 2021)
Reviewer: Annals of Statistics, Journal of Machine Learning Research, Statistics and Computing, Electronic Journal of Statistics, NeurIPS, AISTATS.
Contact
Email : takuo[dot]matsubara[at]domain (replace "domain" with "ed.ac.uk")
Address : James Clerk Maxwell Building, EH9 3JZ
Awards
American Statistical Association SBSS Student Paper Competition Award 2022
NeurIPS 2021 Outstanding Reviewer Award
ISBA 2021 Best Student / Postdoc Paper Award
Newcastle University Postgraduate Research Prize
The Alan Turing Institute Doctoral Studentship
Archived News (2019 - 2023)
(12 / 2023) Preprint online at arXiv:2310.07680: Hamiltonian Dynamics of Bayesian Inference Formalised by Arc Hamiltonian Systems.
(09 / 2023) Started as a William Gordon Seggie Brown fellow at the University of Edinburgh!
(02 / 2023) Honoured to be a William Gordon Seggie Brown fellow from September at the University of Edinburgh!
(02 / 2023) Finished my academic collaborator role at Meta with a resulting paper.
(10 / 2022) Finished my Research Scientist Internship at Meta and will continue as a part-time academic collaborator with Meta.
(06 / 2022) Preprint online at arXiv:2206.08420: "Generalised Bayesian Inference for Discrete Intractable Likelihoods".
(06 / 2022) Invited Talk at EcoSta2022 on "Robust Generalised Bayesian Inference for Intractable Likelihoods".
(04 / 2022) Visited RIKEN Center for Advanced Intelligence Project for a month.
(01 / 2022) Paper "Robust Generalised Bayesian Inference for Intractable Likelihoods" accepted at Journal of the Royal Statistical Society: Series B.
(01 / 2022) Awarded the "Student Paper Competition Award 2022" by the American Statistical Association Section on Bayesian Statistical Science.
(01 / 2022) Will start a Research Scientist Internship at Meta in Summer 2022.
(11 / 2021) Workshop Paper at NeurIPS 2021 Workshop Your Model is Wrong on "Robust Generalised Bayesian Inference for Intractable Likelihoods".
(10 / 2021) Awarded "Outstanding Reviewer Award" at NeurIPS 2021.
(09 / 2021) Contributed Talk at Bayes at CIRM 2021 on "Robust Generalised Bayesian Inference for Intractable Likelihoods".
(07 / 2021) Awarded the "Best Student / Postdoc Paper Award" at the ISBA 2021 World Meeting.
(06 / 2021) Paper "The ridgelet prior: ... prior specification for Bayesian neural networks" accepted at Journal of Machine Learning Research.
(04 / 2021) Contributed Talk at ISBA 2021 World Meeting on "Robust Generalised Bayesian Inference for Intractable Likelihoods".
(04 / 2021) Preprint online at arXiv:2104.07359: "Robust Generalised Bayesian Inference for Intractable Likelihoods".
(12 / 2020) Poster Presentation at NeurIPS Europe meetup on Bayesian Deep Learning on "The ridgelet prior: ... prior specification for Bayesian neural networks".
(11 / 2020) Preprint online at arXiv:2010.08488: "The ridgelet prior: ... prior specification for Bayesian neural networks".
(10 / 2020) Invited Talk at Laplace’s Demon Seminar Series on "The ridgelet prior: ... prior specification for Bayesian neural networks".
(07 / 2020) Invited Talk at MCQMC 2020, Minisymposium on PN and Kernel-Based Methods, on "Quadrature of Bayesian neural networks".
(02 / 2020) Poster Presentation at Workshop on FIMI 2020 on "Approximation of Gaussian Processes by Bayesian Neural Networks".
(12 / 2019) Invited Talk at Deep Structures 2019 on "Quadrature of neural networks based on the ridgelet transform, and the possibility of the extension".
(09 / 2019) My PhD started.