Hybrid conferences have become more common since the COVID-19 pandemic, enabling onsite face-to-face interaction and remote participation via video. However, platforms like Zoom or Google Meet lack spatial and nonverbal cues, limiting remote users’ sense of presence. We propose a hybrid poster presentation system using a 360-degree camera surrounded by 2×6 face-to-face displays to simulate mutual gaze and presence. Remote HMD users view the onsite environment from a shared or movable viewpoint, with their faces shown on screens corresponding to their gaze direction, enabling natural online-online and online-onsite interactions. A virtual screen, placed in front of the real poster or projector screen, provides high-resolution content visibility for remote users.
Relaxation is a critical counterbalance to the demands of modern business life. Footbaths, a simple yet highly effective therapeutic practice, have been used for centuries across various cultures to promote relaxation and overall well-being. This study presents a novel approach to simulating the experience of a public footbath through tactile and thermal airflow stimulation of the calves and foot soles. Our system aims to offer a realistic and immersive virtual footbath experience without actual water, controlling the temperature and airflow to mimic the sensation of soaking the feet in still water or in a water wave. By avoiding actual water, the system can be more compact, more responsive, and more reproducible. The airflow layer is kept as thin as possible by adjusting the air outlet, and the Coanda effect is exploited to render the water surface more realistically. The system also provides visual and audio feedback of water flow, creating a multi-sensory experience that enhances the relaxation and therapeutic benefits of a footbath.
Traveling to different places simultaneously is a dream for many people, but physical constraints make this aspiration difficult to realize. In this study, we used two autonomous mobile robots, a dog-type robot and a wheeled robot, whose direction of movement can be controlled by an operator. The operator can switch at any time between the spaces (robots) to attend and can move the viewpoint using a head-mounted display (HMD) controller. Live video at 4K resolution is transmitted to the HMD over a Web Real-Time Communication (WebRTC) network from a 360° camera mounted on top of each robot. The operator perceives viewpoint-movement feedback as a visual cue, as a vestibular sensation via waist motion, and as proprioception in the legs. Our system also allows viewpoint sharing, in which up to fifty users can enjoy omnidirectional viewing of the remote environments through HMDs, without walking-sensation feedback.
Walking is a daily activity for most people, and a lack of opportunity or ability to walk may cause both mental and physical health problems. However, in some circumstances, such as during a global pandemic, with a fear of heights, or in withdrawal from society (hikikomori), people tend to walk less. To address these issues, we developed a walking rehabilitation system, Action Reproducer, that encourages people to walk in a virtual environment, e.g., around sightseeing spots, or to practice walking atop a tall building. The proposed system comprises a motion seat that presents vestibular sensation to the waist, slider-pedal devices that provide motion sensation to the lower limbs, wearable pseudo-force devices that present a pulling sensation to the fingers, and an avatar in the virtual environment that holds the user's hand while walking.
Dual Body was developed as a telexistence or telepresence system in which the user does not need to continuously operate an avatar robot but can still passively perceive feedback sensations as the robot performs actions. The system recognizes the user's speech commands, and the robot performs the task cooperatively. This combination of passive sensation feedback and robot cooperation greatly reduces perceived latency and fatigue, increasing both the quality of experience and task efficiency. In the demo experience, participants will be able to command the robot from individual rooms via a URL and RoomID, and they will perceive sound and visual feedback, such as images and landscapes of the Tokyo Metropolitan University campus, from the robot as it travels.
Controllers such as joysticks or keypads are commonly used to operate telepresence robots. However, such devices are ill-suited to complex robot movements such as walking-like motion. To increase the sensation of presence and to reduce operation errors, we proposed a tele-experience system in which the user's arm swing controls the robot. We developed an omniwheel telepresence robot equipped with two 360-degree cameras, together with a walking-motion feedback device. The robot moves with three degrees of freedom, covering walking movement on a surface, and the feedback device can present a walking sensation to the user without requiring walking space.
Personal vehicles such as the Segway have been actively used for security patrols or supervision of construction sites because of their mobility. In the current study, we proposed a vehicle-ride sensation sharing system that enables a rider to collaborate remotely with a driver while receiving both 3D visual perception and vibro-vestibular sensation. We developed a prototype personal vehicle system with two 360° cameras attached to the Segway via a stabilizer to capture stereoscopic 3D images and send them to each eye of a head-mounted display worn by the remotely collaborating rider. We also developed a prototype vibro-vestibular display by modifying a conventional wheelchair with a simple, lightweight mechanism for actuation and vibration driven by two DC motors. In our presentation algorithm, each wheel of the wheelchair is accelerated or decelerated proportionally to the acceleration of the corresponding wheel of the Segway. When the velocity of each wheel is almost constant and the acceleration is nearly zero, the wheelchair slowly returns to its initial position, with movement the rider cannot perceive, so that it can continue to present acceleration and deceleration within a limited space.
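The presentation algorithm described above can be sketched as a per-wheel control rule: track the Segway's wheel acceleration when it is significant, and otherwise drift back toward the start position below the rider's perception threshold. The function name, gains, and thresholds below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the wheelchair presentation algorithm (one wheel).
# All names, gains, and thresholds are illustrative assumptions.

def wheel_command(segway_accel, position, gain=1.0,
                  accel_eps=0.05, washout_speed=0.02):
    """Return an acceleration command for one wheelchair wheel.

    segway_accel -- measured acceleration of the matching Segway wheel (m/s^2)
    position     -- wheelchair displacement from its initial position (m)
    """
    if abs(segway_accel) > accel_eps:
        # Accelerate or decelerate proportionally to the Segway wheel.
        return gain * segway_accel
    # Velocity nearly constant, acceleration near zero:
    # creep back toward the initial position below the perception threshold,
    # so acceleration can keep being presented in a limited space.
    if abs(position) > 1e-3:
        direction = -1.0 if position > 0 else 1.0
        return direction * washout_speed
    return 0.0
```

A real controller would additionally low-pass the measured acceleration and ramp the washout motion in and out, but the same tracking-then-recentering structure applies.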
We developed a vehicle-ride simulation system for immersive virtual reality, consisting of a wheelchair for vibration and vestibular sensation and a pedestal with a curved surface for the wheelchair to run on, which exploits a gravitational component. Vehicle motion-feedback systems often use a six-degrees-of-freedom motion platform to induce virtual vehicle acceleration on the user's body. However, because motion platforms are typically complex and expensive, their use is limited to relatively large-scale systems. The proposed system can present a variety of road-surface sensations as well as continuous vehicle acceleration using high-bandwidth wheel torque produced by two direct-current (DC) motors. Our unique combination of wheels and a curved pedestal presents the vibration and vestibular sensations of vehicle acceleration with simple, lightweight, low-cost equipment.
It is known that our touch sensation results from the activity of four types of mechanoreceptors, each of which responds to a different type of skin deformation: pressure, low-frequency vibration, high-frequency vibration, and shear stretch. If we could selectively activate these receptors, we could combine them to present any type of tactile sensation. This approach has been studied but not fully achieved. In our study, we developed FinGAR (Finger Glove for Augmented Reality), which combines electrical and mechanical stimulation to selectively stimulate these four channels and thus achieve high-fidelity tactile sensation. Electrical stimulation with an array of electrodes presents pressure and low-frequency vibration with high spatial resolution, while mechanical stimulation with a DC motor presents high-frequency vibration and shear deformation of the whole finger. Furthermore, FinGAR is lightweight, simple in mechanism, easy to wear, and does not disturb natural finger movement, all of which are necessary for a general-purpose virtual reality system.
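The division of labor between the two stimulators can be sketched as a simple routing table: the electrode array renders the spatially fine channels, and the DC motor renders the whole-finger channels. The identifiers below are illustrative assumptions, not FinGAR's actual software interface.

```python
# Sketch of FinGAR's channel assignment: electrical stimulation covers
# pressure and low-frequency vibration; a DC motor covers high-frequency
# vibration and shear. All names are illustrative assumptions.

STIMULATOR_FOR = {
    "pressure": "electrode_array",            # high spatial resolution
    "low_freq_vibration": "electrode_array",  # high spatial resolution
    "high_freq_vibration": "dc_motor",        # whole-finger stimulation
    "shear": "dc_motor",                      # whole-finger deformation
}

def route_sensation(components):
    """Group the requested tactile components by the stimulator that renders them."""
    commands = {"electrode_array": [], "dc_motor": []}
    for c in components:
        commands[STIMULATOR_FOR[c]].append(c)
    return commands
```

Composing a target texture then amounts to selecting a subset of the four channels and driving each stimulator with its share, e.g. `route_sensation(["pressure", "shear"])`.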
We propose a new haptic presentation method using the reaction force that a motor produces while its rotation accelerates. Because the motor's rotor itself serves as the vibration mass, the mass can move indefinitely without limitation. We also found that a DC motor can provide a rotational pseudo-force sensation. The combination of vibration and pseudo-force produced by a single motor allows a wide range of haptic presentations to the fingertips.
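A common way to obtain a directional pseudo-force from a vibrating mass, consistent with the asymmetric-vibration approach these abstracts describe, is an asymmetric drive signal: a brief, strong pulse in the target direction followed by a longer, weaker return, so the net impulse over one period is zero but only the strong phase exceeds the perception threshold. The waveform below is a minimal sketch under that assumption; the period and amplitude values are illustrative, not the authors' parameters.

```python
# Sketch of an asymmetric drive signal for pseudo-force presentation.
# Short strong pulse one way, long weak return the other way; the mean
# over one period is zero. Parameter values are illustrative assumptions.

def asymmetric_wave(t, period=0.01, strong_ratio=0.25, strong_amp=1.0):
    """Return the drive amplitude at time t (seconds)."""
    phase = (t % period) / period
    # Choose the weak amplitude so the net impulse per period is zero:
    # strong_amp * strong_ratio == weak_amp * (1 - strong_ratio)
    weak_amp = strong_amp * strong_ratio / (1.0 - strong_ratio)
    if phase < strong_ratio:
        return strong_amp   # short, strong pulse (perceived as force)
    return -weak_amp        # long, weak return (below perception threshold)
```

Feeding this signal to the motor current yields a sensation of being pulled in the strong-pulse direction; swapping the sign of the output reverses the perceived direction.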
We developed a 3D virtual reality system comprising two fingertip gloves and a finger-motion capture device to deliver a force feedback sensation when grasping a virtual object. Each glove provides a pseudo-force sensation to a fingertip via asymmetric vibration of a DC motor.