Yuichi Kurita is a Professor in the Graduate School of Engineering, director of the Applied Human Augmentation Project Research Center, and director of the KOBELCO Construction Machinery Dream-Driven Co-Creation Research Center at Hiroshima University. He received his B.E. degree from Osaka University and his M.E. and Ph.D. degrees in computer science from Nara Institute of Science and Technology. His research interests include haptics, human-machine systems, and human augmentation.
Invited talk: The Frontiers of Motor Assistance and Rehabilitation Using Physical VR
Abstract: Maintaining, enhancing, and restoring motor function requires continuous physical training; however, many individuals find it difficult to engage in such training proactively. In recent years, advanced technologies, particularly virtual reality (VR), have been increasingly explored as a means of enhancing motivation and engagement in physical training. Among these technologies, haptics, including tactile and force feedback, plays a crucial role in reproducing bodily sensations and enabling physical interaction during training. Following vision and audition, haptics is now recognized as a key modality for extending VR experiences into the physical world. In this talk, we define Physical VR (PVR) as VR systems that provide physical feedback to the human body. We introduce an approach to realizing PVR using soft actuators and present recent developments in motor assistance and rehabilitation support enabled by this framework.
Dr. Dražen Brščić is an Associate Professor at Kyoto University’s Graduate School of Informatics, Department of Social Informatics. He received his Ph.D. from the University of Tokyo and subsequently conducted research at the Technical University of Munich and the Advanced Telecommunications Research Institute International (ATR) in Kyoto. Prior to joining Kyoto University, he was an Assistant Professor at the University of Rijeka. His research focuses on human behavior modeling and human–robot interaction, with an emphasis on robots operating in public spaces.
Invited talk: Challenges in Bringing Social Robots to Public Spaces
Abstract: Integrating socially interactive robots into everyday human environments presents many challenges. Robots operating in public spaces such as shops and malls must interact appropriately with people in dynamic and socially complex settings, where uncertainty, human behavior, and social norms are difficult to model explicitly. This talk will present examples from our work on the real-world use of social robots, highlighting key challenges related to autonomy, interaction, and robustness. I will discuss different approaches to addressing these challenges, including fully autonomous robots and teleoperated avatar robots in retail environments, and touch upon how immersive and augmented interfaces can play a role in supporting future social robot systems.
Dr. Arévalo Arboleda is an incoming Assistant Professor in Human-Computer Interaction at Birmingham City University in the UK. Previously, she was a postdoctoral fellow at RWTH Aachen University and TU Ilmenau, where her work focused on avatar- and robot-mediated communication. She received her PhD from the University of Duisburg-Essen, where she investigated the use of augmented reality for human-robot interaction. Her current research includes understanding how people perceive and behave in extended reality environments, as well as human-robot relationships and interactions. Her approach involves designing technology from a human-centered perspective.
Invited talk: Blending Extended Reality and Human-Robot Interaction: Collaboration, Telepresence, and Embodiment
Abstract: Extended Reality (XR) enables adding virtual elements to the real world or creating fully virtual environments, allowing for new ways of interaction. Human-Robot Interaction (HRI) creates a unique synergy in which humans and robotic systems can complement each other's abilities to perform different tasks. In this talk, I will present examples of using XR for human-robot collaboration and discuss how AR avatars and telepresence robots compare in mediated communication.