Home

Keisuke Yano

Assistant Professor,

Mathematical Informatics 4th Laboratory,

Department of Mathematical Informatics,

Graduate School of Information Science and Technology,

The University of Tokyo,

Bunkyo-ku, Tokyo, 113-8656, Japan

Email: yano[at]mist.i.u-tokyo.ac.jp

Research Interests

I am especially interested in

  • Statistics;
  • Information theory;
  • Probability.

Curriculum Vitae

  • Research Experience
    • 4/2017-present: Assistant Professor at The University of Tokyo
    • 4/2015-3/2017: JSPS Research Fellow (DC2)
  • Grants
    • 8/2017-3/2019: Grant-in-Aid for Research Activity Start-up
    • 4/2015-3/2017: Grant-in-Aid for JSPS Fellows
  • Education
    • Doctor of Information Science and Technology from the Department of Mathematical Informatics, Graduate School of Information Science and Technology, The University of Tokyo, 2017. (Supervisor: Professor Fumiyasu Komaki)
    • Master of Information Science and Technology from the Department of Mathematical Informatics, Graduate School of Information Science and Technology, The University of Tokyo, 2014. (Supervisor: Professor Fumiyasu Komaki)
    • Bachelor of Engineering from the Information Physics Course, Department of Mathematical Engineering and Information Physics, Faculty of Engineering, The University of Tokyo, 2012. (Supervisor: Professor Hiroshi Nakamura)
  • Born in Ehime Prefecture, Japan, in March 1989.


Research Interests (more detailed)

I am interested in statistics and information theory.

I would like to identify what kind of information is ultimately useful for statistical inference (and for our lives).

My main tools so far are (i) the Bayesian approach, (ii) divergences, and (iii) nonparametric statistics.

I employ the Bayesian approach because its combination with decision theory readily reveals which kinds of "information" are useful for inference.

We simply encode "prior information" about unknown quantities via prior distributions, and decision rules then judge whether the resulting solutions, called Bayes solutions, are useful for inference.
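To make this concrete, here is the standard decision-theoretic formulation in generic notation (a minimal sketch, not tied to any particular piece of work): given data $x$, a loss $L(\theta, d)$ on an unknown quantity $\theta$, and a prior $\pi$, the Bayes solution minimizes the posterior expected loss,

\[
  d^{\mathrm{Bayes}}(x) \;=\; \operatorname*{arg\,min}_{d} \int L(\theta, d)\, \pi(\theta \mid x)\, \mathrm{d}\theta,
  \qquad
  \pi(\theta \mid x) \;\propto\; p(x \mid \theta)\, \pi(\theta).
\]

Whether this Bayes solution performs well (for instance, in terms of its risk) then indicates whether the prior information encoded in $\pi$ was actually useful.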

Divergences are measures of discrepancy between probability measures.

Among divergences, I like the relative entropy (the Kullback–Leibler divergence) best because of its beautiful duality relationship and its role as a bridge between statistics and information theory.
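For reference, the relative entropy between probability measures $P$ and $Q$, with $P$ absolutely continuous with respect to $Q$, is the standard quantity

\[
  D(P \,\|\, Q) \;=\; \int \log\frac{\mathrm{d}P}{\mathrm{d}Q}\, \mathrm{d}P,
\]

which is nonnegative and vanishes only when $P = Q$, but is not symmetric; this is why divergences are measures of discrepancy rather than metrics.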

Nonparametric statistics infers unknown quantities while making as few assumptions about them as possible.

I like nonparametric statistics because I believe that I can see "abstract" information through the combination of the Bayesian approach and nonparametric statistics.