Publications and Talks

Publications
    Refereed Papers
  1. K. Sato and H. Sato, Structure preserving H^2 optimal model reduction based on Riemannian trust-region method, IEEE Transactions on Automatic Control, 63(2), 505–512, 2018.
  2. K. Aihara and H. Sato, A matrix-free implementation of Riemannian Newton's method on the Stiefel manifold, Optimization Letters, 11(8), 1729–1741, 2017.
  3. H. Sato, Riemannian Newton-type methods for joint diagonalization on the Stiefel manifold with application to independent component analysis, Optimization, 66(12), 2211–2231, 2017.
  4. H. Sato and K. Sato, Riemannian optimal system identification algorithm for linear MIMO systems, IEEE Control Systems Letters, 1(2), 376–381, 2017.
  5. H. Sato, Theory and applications of optimization on Riemannian manifolds, Ōyō Sūri (Applied Mathematics), 27(1), 21–30, 2017. (in Japanese)
  6. H. Sato, A Dai–Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions, Computational Optimization and Applications, 64(1), 101–118, 2016.
  7. H. Sato and T. Iwai, A new, globally convergent Riemannian conjugate gradient method, Optimization64(4), 1011–1031, 2015.
  8. H. Sato, Joint singular value decomposition algorithm based on the Riemannian trust-region method, JSIAM Letters, 7, 13–16, 2015.
  9. H. Sato and T. Iwai, Optimization algorithms on the Grassmann manifold with application to matrix eigenvalue problems, Japan Journal of Industrial and Applied Mathematics, 31(2), 355–400, 2014.
  10. H. Sato and T. Iwai, A Riemannian optimization approach to the matrix singular value decomposition, SIAM Journal on Optimization, 23(1), 188–212, 2013.
    Refereed Conference Proceedings
  1. H. Kasai, H. Sato, and B. Mishra, Riemannian stochastic recursive gradient algorithm with retraction and vector transport and its convergence analysis, Proceedings of the 35th International Conference on Machine Learning (ICML 2018), Proceedings of Machine Learning Research, 80, 2521–2529, 2018.
  2. H. Kasai, H. Sato, and B. Mishra, Riemannian stochastic quasi-Newton algorithm with variance reduction and its convergence analysis, Proceedings of the 21st International Conference on Artificial Intelligence and Statistics (AISTATS 2018), Proceedings of Machine Learning Research, 84, 269–278, 2018.
  3. M. Kawai, T. Shiohama, and H. Sato, Supervised-topic-model-based hybrid filtering for recommender systems, Proceedings of the 2nd International Conference on Big Data, Cloud Computing, and Data Science Engineering (BCD 2017), 272–277, 2017.
  4. H. Sato and K. Sato, A new H^2 optimal model reduction method based on Riemannian conjugate gradient method, Proceedings of the 55th IEEE Conference on Decision and Control (CDC 2016), 5762–5768, 2016.
  5. H. Sato and K. Sato, Riemannian trust-region methods for H^2 optimal model reduction, Proceedings of the 54th IEEE Conference on Decision and Control (CDC 2015), 4648–4655, 2015.
  6. H. Sato, Riemannian conjugate gradient method for complex singular value decomposition problem, Proceedings of the 53rd IEEE Conference on Decision and Control (CDC 2014), 5849–5854, 2014.
  7. H. Sato and T. Iwai, A complex singular value decomposition algorithm based on the Riemannian Newton method, Proceedings of the 52nd IEEE Conference on Decision and Control (CDC 2013), 2972–2978, 2013.
    Non-refereed Papers and Others
        Please see the Japanese version.

Talks
    International Conferences
  1. H. Kasai, H. Sato, and B. Mishra, Riemannian stochastic recursive gradient algorithm, The 35th International Conference on Machine Learning (ICML 2018), Stockholmsmässan, Stockholm, Sweden, July, 2018.
  2. H. Kasai, H. Sato, and B. Mishra, Stochastic recursive gradient on Riemannian manifolds, Geometry in Machine Learning (GiMLi 2018), Stockholmsmässan, Stockholm, Sweden, July, 2018.
  3. H. Kasai, H. Sato, and B. Mishra, Riemannian stochastic quasi-Newton algorithm with variance reduction and its convergence analysis, The 21st International Conference on Artificial Intelligence and Statistics (AISTATS 2018), H10 Rubicón Palace, Playa Blanca, Spain, April, 2018.
  4. K. Aihara and H. Sato, Solving a Newton equation on the Stiefel manifold with matrix-free Krylov subspace methods, 2018 SIAM Conference on Parallel Processing for Scientific Computing, Waseda University, Tokyo, Japan, March, 2018.
  5. H. Sato and K. Sato, Riemannian optimal system identification algorithm for linear MIMO systems, The 56th IEEE Conference on Decision and Control, Melbourne Convention Center, Melbourne, Australia, December, 2017.
  6. M. Kawai, T. Shiohama, and H. Sato, Supervised-topic-model-based hybrid filtering for recommender systems, 2nd International Conference on Big Data, Cloud Computing, and Data Science Engineering, ACT CITY Hamamatsu, Hamamatsu, Japan, July, 2017.
  7. H. Sato and K. Sato, A new H^2 optimal model reduction method based on Riemannian conjugate gradient method, The 55th IEEE Conference on Decision and Control, ARIA Resort & Casino, Las Vegas, USA, December, 2016.
  8. H. Kasai, H. Sato, and B. Mishra, Riemannian stochastic variance reduced gradient on Grassmann manifold, The 9th NIPS Workshop on Optimization for Machine Learning, Centre Convencions Internacional Barcelona, Barcelona, Spain, December, 2016. (Poster Presentation)
  9. H. Kasai, H. Sato, and B. Mishra, Riemannian stochastic variance reduced gradient on Grassmann manifold, The Fifth International Conference on Continuous Optimization, National Graduate Institute for Policy Studies, Tokyo, Japan, August, 2016.
  10. H. Sato and K. Sato, Riemannian trust-region methods for H^2 optimal model reduction, The 54th IEEE Conference on Decision and Control, Osaka International Convention Center, Osaka, Japan, December, 2015.
  11. K. Aihara and H. Sato, Matrix-free Krylov subspace methods for solving a Riemannian Newton equation, 2015 SIAM Conference on Applied Linear Algebra, Hyatt Regency Atlanta, Atlanta, USA, October, 2015.
  12. H. Sato and K. Aihara, Riemannian Newton's method for optimization problems on the Stiefel manifold, 22nd International Symposium on Mathematical Programming, Wyndham Grand Pittsburgh Downtown, Pittsburgh, USA, July, 2015.
  13. A. Kitao, T. Shiohama, and H. Sato, Financial news classification based on topographic independent component analysis: Optimization on the Stiefel manifold, 16th Conference of the Applied Stochastic Models and Data Analysis, University of Piraeus, Piraeus, Greece, June, 2015.
  14. H. Sato, Riemannian optimization and its applications, Hong Kong-Tokyo Workshop on Scientific Computing, National Institute of Informatics, Tokyo, Japan, April, 2015.
  15. H. Sato, Riemannian conjugate gradient method for complex singular value decomposition problem, The 53rd IEEE Conference on Decision and Control, JW Marriott Los Angeles L.A. LIVE, Los Angeles, USA, December, 2014.
  16. H. Sato, Global convergence analysis of several Riemannian conjugate gradient methods, 2014 SIAM Conference on Optimization, Town and Country Resort & Convention Center, San Diego, USA, May, 2014.
  17. H. Sato, Several matrix computation algorithms based on Riemannian optimization techniques, International Workshop on Eigenvalue Problems: Algorithms, Software and Applications in Petascale Computing, Tsukuba International Congress Center EPOCHAL TSUKUBA, Tsukuba, Japan, March, 2014. (Poster Presentation)
  18. H. Sato and T. Iwai, A complex singular value decomposition algorithm based on the Riemannian Newton method, The 52nd IEEE Conference on Decision and Control, Firenze Fiera Congress & Exhibition Center, Florence, Italy, December, 2013.
  19. H. Sato, Optimization algorithms on the Grassmann and the Stiefel manifolds with applications to numerical linear algebra, Nanjing-Kyoto Joint Workshop on Algorithms, Optimization and Numerical Analysis 2012, Kyoto University, Kyoto, Japan, March, 2012.
  20. H. Sato, Optimization on manifolds, Joint Workshop on Modeling, Systems, and Control 2011, Hanoi University of Science and Technology, Hanoi, Vietnam, March, 2011.
    Domestic Conferences and Awards
        Please see the Japanese version.