This week we tested the robot VR simulation and iterated on the VR prototype based on user feedback. We then applied what we learned to a more realistic test in a building corridor, with a blindfolded user following the robot.
After testing the virtual environment experience that simulates the interaction between humans and robots, we found many interesting insights that can be used to improve the interaction experience. Most of these findings concerned the confusion that can arise when the user is prompted with instructions from the robot. In the VR experience, the user hears a spatial audio effect, with sound emitted from the robot as if the robot were in real space. For example, if the robot is to the right of the user in VR, the user hears the sound from the right side of their headphones.
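The directional cue described above can be approximated with simple stereo panning. The sketch below is illustrative only (it is not the actual VR engine's audio API): it maps the robot's bearing relative to the listener's heading onto left/right channel gains using a constant-power pan law.

```python
import math

def stereo_gains(listener_pos, listener_heading, source_pos):
    """Map the robot's bearing relative to the listener's heading onto
    left/right channel gains (constant-power pan).

    Positions are (x, y) tuples in metres; heading is in radians with
    0 pointing along +x and counter-clockwise positive, so +y is to the
    listener's left. All names here are illustrative assumptions, not
    the real engine's API.
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    # Bearing of the source relative to where the listener is facing.
    bearing = math.atan2(dy, dx) - listener_heading
    # Normalise to [-pi, pi]; positive means the source is on the left.
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))
    # Map bearing to a pan value in [-1, 1]: -1 full right, +1 full left.
    pan = max(-1.0, min(1.0, bearing / (math.pi / 2)))
    # Constant-power law keeps perceived loudness steady while panning.
    theta = (pan + 1.0) * math.pi / 4  # 0 -> hard right, pi/2 -> hard left
    return math.sin(theta), math.cos(theta)  # (left_gain, right_gain)
```

With the robot directly to the user's right, the right channel dominates; straight ahead, both channels carry equal gain, which is exactly the ambiguity the close-range findings below point at.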
On top of testing in VR, we also conducted a mock demo with the robot in a real corridor. This test gave us a much better perspective on what it is like to walk blindfolded while following a robot. Although the test was only conducted with Jak, blindfolded, in a simple corridor, it also yielded some useful insights about the experience.
With the VR setup, the robot emitted simple tones that the user could use to navigate based on the direction of the sound. The participants successfully reached the destination at the end of the virtual maze solely by following the sound emitted from the virtual robot, though not without some complications.
Initially, we thought that emitting sounds from the robot and letting the user follow that sound would be enough to guide the user to the destination. In one particular case, however, the user knew the robot was at the waypoint but believed they had already reached it when they actually had not. This left the user confused, not knowing which way to go. It may partly be a limitation of the spatial-audio accuracy in the VR environment, but in general, when the user is close to the robot it is difficult to tell which direction the sound is coming from.
Throughout the navigation journey, users became confused whenever instructions were not clear and concise. Initially, we used simple sounds and tones to indicate that the robot had reached a waypoint. However, during testing we found that we had to explain to the users what the tones meant, and they consistently became confused about what was expected of them whenever the instructions were unclear. Therefore, when the robot reaches a waypoint and wants the user to follow it, it should give exact instructions on how to do so.
Similar to the previous point, one source of confusion arose when the user reached a waypoint at a corner and the robot changed direction as it moved on to the next waypoint. The user did not expect the sudden change in direction without any instruction. One user specifically mentioned that they would like a feature similar to Google Maps, where any change in the current route is announced through speech ahead of time. Therefore, we added clear phrases such as "turn left 90 degrees and walk 10 steps" to help users build a better understanding of the route they are traveling.
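Phrases of this form can be derived directly from the geometry between the user's pose and the next waypoint. The following is a minimal sketch of that idea, assuming a fixed average stride length of 0.7 m (a value we made up for illustration; the function name is hypothetical as well):

```python
import math

STEP_LENGTH_M = 0.7  # assumed average stride length; would be tuned per user

def route_instruction(heading_deg, current, waypoint):
    """Turn the geometry between the user's pose and the next waypoint
    into a spoken phrase like "turn left 90 degrees and walk 10 steps".

    Coordinates are (x, y) in metres; heading is in degrees, 0 along +x,
    counter-clockwise positive. Illustrative sketch only.
    """
    dx = waypoint[0] - current[0]
    dy = waypoint[1] - current[1]
    target_deg = math.degrees(math.atan2(dy, dx))
    # Signed turn in (-180, 180]: positive = turn left, negative = turn right.
    turn = (target_deg - heading_deg + 180) % 360 - 180
    steps = max(1, round(math.hypot(dx, dy) / STEP_LENGTH_M))
    if abs(turn) < 15:  # small heading corrections aren't worth announcing
        return f"walk {steps} steps forward"
    direction = "left" if turn > 0 else "right"
    return f"turn {direction} {abs(round(turn))} degrees and walk {steps} steps"
```

For a user facing along +x with a waypoint 7 m to their left, this produces exactly the "turn left 90 degrees and walk 10 steps" phrasing mentioned above.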
During testing in the real environment, we found that counting steps can introduce errors. For example, the robot might tell the user to walk 15 steps forward, yet at the end of those 15 steps the user still had not reached the waypoint, requiring the robot to instruct the user to take more steps. In another case, the user nearly crashed into the robot even though they had taken exactly the number of steps the robot instructed. There will need to be feedback for handling these cases as well, so that the user can safely and accurately reach each waypoint along the navigation journey.
When testing in VR without this kind of feedback, the user also mentioned feeling somewhat uneasy walking blindly toward the robot, not knowing whether they were going the right way.
Finally, the delay before instructing the user to move to the next waypoint should be minimized. With this approach, some delay while the robot travels to the next waypoint is inevitable, but the robot should issue the next instruction as soon as possible after arriving.