I am a researcher at NEC Laboratories Japan. I have two years of experience working at NEC Laboratories Europe in Germany. I was also a visiting researcher in the SNAP group at Stanford University.
My research focuses on representation learning for structured data and its applications to computational science. I am particularly interested in representation learning on higher-order structured data such as mesh data, simplicial complexes, and hyperstructures. I have a background in homotopy theory, algebraic topology, and fractal theory, and a strong interest in the study and application of these geometric theories.
Previously, I completed my Ph.D. in Mathematical Science at Nagoya University, Japan, in 2017, where I had a great time working with Professor Hitoshi Moriyoshi and Professor Thomas Geisser.
"LRIM: a Physics-Based Benchmark for Provably Evaluating Long-Range Capabilities in Graph Learning,"
Joël Mathys, Henrik Christiansen, Federico Errica, Takashi Maruyama, and Francesco Alesiani
ICLR 2026
[OpenReview]
"Adaptive Message Passing: A General Framework to Mitigate Oversmoothing, Oversquashing, and Underreaching,"
Federico Errica, Henrik Christiansen, Viktor Zaverkin, Takashi Maruyama, Mathias Niepert, and Francesco Alesiani
ICML 2025
[arXiv] [PMLR] [LinkedIn] [Medium]
"Higher-Rank Irreducible Cartesian Tensors for Equivariant Message Passing,"
Viktor Zaverkin, Francesco Alesiani*, Takashi Maruyama*, Federico Errica, Henrik Christiansen, Makoto Takamoto, Nicolas Weber, and Mathias Niepert
NeurIPS 2024
[arXiv] [LinkedIn] [OpenReview]
"Compositional Generative Inverse Design"
Tailin Wu*, Takashi Maruyama*, Long Wei*, Tao Zhang*, Yilun Du*, Gianluca Iaccarino, and Jure Leskovec
ICLR 2024 (Spotlight), NeurIPS 2023 Workshop AI4Science
"Learning Controllable Adaptive Simulation for Multi-scale Physics"
Tailin Wu*, Takashi Maruyama*, Qingqing Zhao*, Gordon Wetzstein, and Jure Leskovec
ICLR 2023 (notable top-25%)
[Project page] [Github] [OpenReview] [arXiv]
"Learning to accelerate simulation and inverse optimization of PDEs via latent global evolution"
Tailin Wu, Takashi Maruyama, and Jure Leskovec
NeurIPS 2022
[Project page] [Github] [OpenReview] [arXiv]
"Fast, Modular, and Differentiable Framework for Machine Learning-Enhanced Molecular Simulations," Henrik Christiansen, Takashi Maruyama, Federico Errica, Viktor Zaverkin, Makoto Takamoto, and Francesco Alesiani, J. Chem. Phys. 163, 182501, 2025
[The Journal of Chemical Physics]
"A combinatorial Fredholm module on self-similar sets built on n-cubes,"
Takashi Maruyama and Tatsuki Seto
Journal of Fractal Geometry, 2023
[arXiv] [Journal of Fractal Geometry]
"Cyclic Cohomology Groups of Some Self-similar Sets"
Takashi Maruyama
Ph.D. thesis
"A Leibniz rule of distributional pairing and hyperforce sum rule," Takashi Maruyama, Tatsuki Seto, Viktor Zaverkin, and Henrik Christiansen, under review (2026)
arXiv:2603.01519
"Geometric Kolmogorov-Arnold Superposition Theorem," Francesco Alesiani, Takashi Maruyama, Henrik Christiansen, and Viktor Zaverkin, under review (2025)
"A Young-type integration on self-similar sets in intervals," Takashi Maruyama and Tatsuki Seto, under review (2025)
"Gradient of Clifford Neural Networks," Takashi Maruyama and Francesco Alesiani,
NeurIPS 2024 Workshop: Data-driven and Differentiable Simulations, Surrogates, and Solvers
"Clifford flows," Francesco Alesiani and Takashi Maruyama,
NeurIPS 2024 Workshop: Machine Learning and the Physical Sciences
"Hierarchy-based Clifford Group Equivariant Message Passing Neural Networks," Takashi Maruyama and Francesco Alesiani,
ICLR 2024 Workshop on AI4DifferentialEquations In Science
"A Young-type integration on self-similar sets in intervals," 2024
Takashi Maruyama and Tatsuki Seto
[arXiv]
Copenhagen-Nagoya graduate students seminar
Nagoya, 8/2016
Meeting for exchange between academia and industry
Tokyo, 11/2015
Workshop of graduate students from the institutes of mathematics, VAST Hanoi and Nagoya University,
Halong, 3/2015