Yuki Onishi

Institution: Research Institute of Electrical Communication, Tohoku University

Research Topics: Human-Computer Interaction, Robotic Displays, Shape-Changing Interfaces, Spatial Interfaces

Contact:    Email: yuki.onishi.c5[at]tohoku.ac.jp
            yukionishi87[at]gmail.com
            Twitter / LinkedIn / Google Scholar / ICD-Lab

Resume:    CV (updated 2023.05)

Employment

2024.06-Present        Postdoctoral Fellow at Research Institute of Electrical Communication, Tohoku University

2023.05-2024.05       Postdoctoral Fellow at School of Computing and Information Systems, Singapore Management University

2023.04-2023.05       Project Researcher at Research Institute of Electrical Communication, Tohoku University

Education

2020.04-2023.03      Ph.D. in Information Sciences at Graduate School of Information Sciences, Tohoku University

2018.04-2020.03  M.Sc. in Information Sciences at Graduate School of Information Sciences, Tohoku University

2014.04-2018.03  Bachelor of Engineering, Tohoku University

2011.04-2014.03  Kichijo Girls' High School

Experience

2020.04-2023.03        Japan Society for the Promotion of Science (JSPS) Research Fellow (DC1)

2018.06-2023.03        Tohoku University Science Ambassador

2021.12       VRST'21 Student Volunteer Chair

2020.04-2021.09      Teaching Assistant, Basics of Information, Miyagi University of Education

2020.04-2020.09  Teaching Assistant, Basics of Computer Science A, Tohoku University

2019.05  ACM CHI 2019 Student Volunteer

2018.05  Japan-Russia Student Forum, Japan Student Representative

2017.11      ACM SIGGRAPH Asia 2017 Student Volunteer

2017.10      World Festival of Youth and Students, Japan Delegate

Research

Anesth-on-the-Go

Gaining extensive practice in anesthesia procedures is critical for medical students. However, current high-fidelity simulators for clinical anesthesia practice limit such vital opportunities due to their professional-level functions, large size, and high installation costs. We propose Anesth-on-the-Go, a portable game-based anesthesia simulator that allows medical students to repeatedly practice clinical anesthesia procedures anywhere, anytime, on a conventional personal computer. The simulator is designed for medical students and allows them to manipulate controls appropriately for various anesthesia procedures according to pre-determined surgical scenarios. Based on iterative interviews with clinical anesthesia instructors, we designed a simulation interface with four key components: 1) a monitor, 2) a decision-making board, 3) a surgical field view, and 4) a communication view. As an initial evaluation, we introduced the simulator into the training curriculum for medical students and verified its effect on their proficiency in subsequent training with a high-fidelity simulator. The results show that prior individual training with our simulator improved each student's overall performance (i.e., score) and stimulated discussion among teammates during the subsequent formal training. The findings also indicate that the simulator experience can facilitate post-training reflection.

CHI'21 Late-Breaking Work: [DOI] [Video Preview] [Presentation]

UbiSurface

Room-scale VR has been considered an alternative to physical office workspaces. For office activities, users frequently require planar input methods, such as typing or handwriting, to quickly record annotations on virtual content. However, current off-the-shelf VR HMD setups rely on mid-air interactions, which can cause arm fatigue and decrease input accuracy. To address this issue, we propose UbiSurface, a robotic touch surface that automatically repositions itself to physically present a virtual planar input surface (VR whiteboard, VR canvas, etc.) to users, permitting accurate, low-fatigue input while they walk around a virtual room. We design and implement a prototype of UbiSurface that can dynamically change a canvas-sized touch surface's position, height, and pitch and yaw angles to adapt to virtual surfaces spatially arranged at various locations and angles around a virtual room. We then conduct studies to validate its technical performance and examine how UbiSurface facilitates the user's primary mid-air planar interactions, such as painting and writing, in a room-scale VR setup. Our results indicate that this system reduces arm fatigue and increases input accuracy, especially for writing tasks. We then discuss the potential benefits and challenges of robotic touch devices for future room-scale VR setups.

Proc. ACM Hum.-Comput. Interact. (ISS'23): [DOI] [Video]

WaddleWalls

We propose WaddleWalls, a room-scale interactive partitioning system using a swarm of robotic partitions that allows occupants to interactively reconfigure workspace partitions to satisfy their privacy and interaction needs. The system can automatically arrange a partition layout designed by the user on demand. The user specifies each target partition's position, orientation, and height using the controller's 3D manipulations. In this work, we discuss the design considerations of the interactive partition system and implement WaddleWalls' proof-of-concept prototype assembled from off-the-shelf materials. We demonstrate the functionalities of WaddleWalls through several application scenarios in an open-plan office environment. Through an initial user evaluation and interviews with experts, we clarify the feasibility, potential, and future challenges of WaddleWalls.

UIST'22 Technical Paper: [DOI] [Video] [Video Preview]

Self-actuated Stretchable Partition

We propose a self-actuated stretchable partition whose physical height, width, and position can dynamically change to create secure workplaces (e.g., mitigating privacy and infection risks) without inhibiting group collaboration. To support secure workplace layouts and space reconfiguration, the partition's height, width, and position adapt automatically and dynamically. We implement a proof-of-concept prototype with a height-adjustable stand, a roll-up screen, and a mobile robot. We then present example application scenarios to discuss potential future actuated territorial offices.

CHI'21 Late-Breaking Work: [DOI] [Video] [Video Preview] [Presentation]

BouncyScreen

We explore BouncyScreen, an actuated 1D display system that enriches indirect interaction with a virtual object through pseudo-haptic feedback enhanced by the screen's physical movements. When the user manipulates a virtual object using virtual reality (VR) controllers, the screen moves in accordance with the virtual object. We built a proof-of-concept prototype of BouncyScreen with a flat screen mounted on a mobile robot. A psychophysical study confirmed that the screen's synchronous physical motions significantly enhance the reality of the interaction and the sense of presence.

IEEE VR'21 Conference Paper: [DOI] [Video] [Video Preview] [Presentation]
The 24th Annual Conference of the Virtual Reality Society of Japan (2019): [PDF]

Living Wall Display

Living Wall Display presents interactive content on a mobile wall screen that moves in concert with the content's animation. In addition to audio-visual information, the display dynamically changes its position and orientation in response to content animations triggered by user interactions, augmenting the interaction experience. We implement three proof-of-concept prototypes (an FPS shooting game, a drive simulation, and a baseball pitching simulation) that represent the pseudo-force impact of the interactive content through physical screen movement.

SIGGRAPH Asia'18 Emerging Technologies: [DOI] [Video] [Media]
Transactions of the Virtual Reality Society of Japan (2019): [DOI]
Entertainment Computing (2017): [DOI]

Be Bait!

We present “Be Bait!”, a unique virtual fishing experience that allows the user to become the bait by lying on a hammock instead of holding a fishing rod. We implement a hammock-based locomotion method with haptic feedback mechanisms. In our demonstration, users can enjoy exploring a virtual underwater world and fighting fish in a direct and intuitive way.

IEEE VR'19 Demo: [DOI]
IVRC'18: [Link] (Unity Award, Dospara Award)

Publications

All Rights Reserved. Yuki Onishi, 2020.