The First IEEE VR Workshop on Human Augmentation and Its Applications (HAA2019)

March 24th (Sun), 2019

co-located with IEEE VR 2019 (OSAKA)


Human augmentation is regarded as an important research field for a future society in which computer technology, Virtual and Augmented Reality, Artificial Intelligence, Computer Vision, Robotics, etc. are highly integrated. Human augmentation is not merely research into making humans stronger; it should also assist people in their daily activities. For example, it can be used to teach sports or musical performance more effectively, or to assist people with disabilities. This workshop invites novel research results or late-breaking results on designs, methods, implementations, or applications that augment or enhance human ability, both physical and intellectual, using advanced technologies in VR/AR, AI, CV, and Robotics.

Main topics:

・VR/AR/MR/Robotics for human augmentation;

・Computer vision and Artificial Intelligence for human augmentation;

・Brain machine interface and human augmentation;

・Design consideration and cognitive aspect for human augmentation;

・Applications of human augmentation such as sports, music and art performance;

Keynote speakers:

We are very happy to announce two splendid keynote speakers.

Dr. Maria Herrojo Ruiz

Department of Psychology, Goldsmiths, University of London, UK

Prof. Dr. Didier Stricker

German Research Center for Artificial Intelligence, DFKI, Germany

Accepted papers:

We received many submissions, and the following papers were accepted for publication and presentation at the workshop.

  • Aftereffect of Visuomotor Adaptation to Gradually Distorted Reality Displayed on a See-Through Head-Mounted Device Is Associated with Pre-Movement Readiness Cortical Activity: Ryota Mori, Shoko Kasuga, Shunichi Kasahara, Junichi Rekimoto, Junichi Ushiba
  • A Novel Vibrotactile Biofeedback Device for Optimizing Neuromuscular Control in Piano Playing: Takanori Oku, Shinichi Furuya
  • Lucid Virtual/Augmented Reality (LVAR) Integrated with an Endoskeletal Robot Suit: StillSuit --- A new framework for cognitive and physical interventions to support the ageing society: Satoshi Oota, Akihiko Murai, and Masaaki Mochimaru
  • Generating Synthetic Humans for Learning 3D Pose Estimation: Kohei Aso, Dong-Hyun Hwang, Hideki Koike
  • Development of Sensitive Glove Type Wearable Robot System: Bin Zhang, Atsufumi Suzuki, Hunok Lim
  • SuppleView: decreasing physically limitations on the movement imitation with viewing motions in the video: Natsuki Hamanishi, Jun Rekimoto
  • Toward human motion capturing with an ultra-wide fisheye camera on the chest: Dong-Hyun Hwang, Kohei Aso, Hideki Koike
  • The method of reducing Phantom Limb Pain using optical see-through Head Mounted Display: Kenta Saito, Takashi Miyaki, Jun Rekimoto
  • wavEMS: Improving Signal Variation Freedom of Electrical Muscle Stimulation: Michinari Kono, Jun Rekimoto
  • HYPERSPECTIVE: Shaping Experiences beyond Perspectives: Jun Nishida and Kenji Suzuki
  • Falconer: A Tethered Aerial Companion for Enhancing Personal Space: Romain Nith, Jun Rekimoto
  • A Novel Soft Exoskeleton Glove for Motor Skill Acquisition Resembling Anatomical Structure of Forearm Muscles: Nobuhiro Takahashi, Hayato Takahashi, Hideki Koike
  • InterPoser: Visualizing Interpolated Movements for Bouldering Training: Keisuke Shiro, Kazme Egawa, Takashi Miyaki, Jun Rekimoto
  • Design of Control Method for Soft Exoskeleton Glove: Hayato Takahashi, Nobuhiro Takahashi, Shinichi Furuya, Hideki Koike
  • Post-Data Augmentation to Improve Deep Pose Estimation of Extreme and Wild Motions: Kohei Toyoda, Michinari Kono, Jun Rekimoto
  • Augmented Sports For Learning Using Wearable Head-worn and Wrist-worn Devices: Hui-Shyong Yeo, Hideki Koike, Aaron Quigley
  • A Real-Time Projection System for Golf Training using Virtual Shadow: Atsuki Ikeda, Dong-Hyun Hwang, Hideki Koike
  • Near-Future Body Motion Foresight and Its Applications in Sports: Yasutoshi Makino, Yuuki Horiuchi, Shuya Suda and Hiroyuki Shinoda
  • CompoundDome: A wearable dome device that enables interaction with the real world by partially transmitting the screen: Eriko Maruyama, Jun Rekimoto
  • Real-Time Human Motion Forecasting using a RGB Camera: Erwin Wu, Hideki Koike


Papers must be between 2 and 6 pages in VGTC format and submitted via email to koike [@].

Important dates:

Submission deadline: January 16th, 2019 (extended to February 1st, 2019).

Notification: February 5th, 2019.

Camera ready: February 17th, 2019.

Workshop: March 24th, 2019.


Organizers:

Hideki Koike, Tokyo Institute of Technology, Japan

Jun Rekimoto, The University of Tokyo, Japan

Junichi Ushiba, Keio University, Japan

Shinichi Furuya, Sony Computer Science Laboratory, Japan

Asa Ito, Tokyo Institute of Technology, Japan

Supported by:

JST CREST Project "A Study on Skill Acquisition and Development of Skill Transfer Systems"