This virtual reality application was built for a study that aims to understand human behavior and performance in a simple point-and-select task under different walking conditions. In particular, we are interested in whether people use their left hand (controller) for objects that are on the left side of their body.
The setting is a hallway with selectable objects (cubes) on both sides, and users should select all the cubes as fast as they can while minimizing errors. A ray is attached to the pointer; it can be activated and then used for selection. The color of the ray changes based on where the user is pointing, as visual feedback: red when pointing outside the cubes and green when pointing at a selectable object. Different sounds are played when the user clicks inside or outside a selectable object, as audio feedback.
Objects are positioned so that a person of average height can see them easily within their FoV.
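Below is a minimal sketch of how the ray feedback could be implemented in Unity; the component name, the "Selectable" tag, the "Fire1" button (a stand-in for the controller trigger), and the audio fields are illustrative assumptions, not the exact names used in the project.

```csharp
using UnityEngine;

// Illustrative sketch of the pointer feedback logic (names are hypothetical).
public class SelectablePointer : MonoBehaviour
{
    public LineRenderer ray;        // the ray attached to the controller
    public AudioSource audioSource;
    public AudioClip correctClip;   // played when clicking on a cube
    public AudioClip errorClip;     // played when clicking outside a cube
    public float maxDistance = 20f;

    void Update()
    {
        // Check whether the ray is currently pointing at a selectable cube.
        RaycastHit hit;
        bool onTarget = Physics.Raycast(transform.position, transform.forward,
                                        out hit, maxDistance)
                        && hit.collider.CompareTag("Selectable");

        // Visual feedback: green over a selectable cube, red otherwise.
        Color c = onTarget ? Color.green : Color.red;
        ray.startColor = c;
        ray.endColor = c;

        // Audio feedback on click ("Fire1" stands in for the controller trigger).
        if (Input.GetButtonDown("Fire1"))
            audioSource.PlayOneShot(onTarget ? correctClip : errorClip);
    }
}
```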
Different walking conditions
Without walking conditions
one controller, for interacting (dominant hand) - c1
two controllers, for interacting - c2
Actual walking conditions
one controller for interacting (dominant hand), actual walking - c3
two controllers for interacting, actual walking - c4
Virtual walking conditions (teleport)
one controller, for interacting and virtual walking (dominant hand) - c5
two controllers, both for interacting and virtual walking - c6
two controllers, one for interacting (dominant hand), one for virtual walking (non-dominant hand) - c7
For each condition, six different sets of objects will be shown, one for each combination of three widths (10, 20, 30 cm) and two densities (10, 20 objects). For each density, half of the objects will be on the right side of the user and half on the left side.
The order of widths and densities is randomized, and the order of conditions is based on a Latin square (see the sketch below).
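A minimal sketch of how this counterbalancing could be generated, assuming a simple cyclic Latin square over the 7 conditions and a per-condition shuffle of the 6 width × density sets; the class and method names are illustrative, not the project's actual code.

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch of the counterbalancing (not the project's actual code).
public static class OrderGenerator
{
    // Cyclic Latin square: the row for a given participant gives their condition order (c1..c7).
    public static int[] ConditionOrder(int participantIndex, int nConditions = 7)
    {
        return Enumerable.Range(0, nConditions)
                         .Select(i => (participantIndex + i) % nConditions + 1)
                         .ToArray();
    }

    // Randomized order of the 6 sets: 3 widths (cm) x 2 densities.
    public static List<string> SetOrder(System.Random rng)
    {
        var sets = new List<string>();
        foreach (int width in new[] { 10, 20, 30 })
            foreach (int density in new[] { 10, 20 })
                sets.Add($"width{width}cm_density{density}");
        return sets.OrderBy(_ => rng.Next()).ToList(); // shuffle
    }
}
```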
Without Walk
Actual Walk
Virtual Walk
Download the project from GitHub
Open Unity Hub
Add the project folder
Change the Unity version to "2018.3.14f1"
After opening the project, go to Project tab -> Assets -> Scenes-p
Open one of the scenes (conditions)
example: c1Z-v2-withoutWalkOne
Run the application
Follow the instructions below, based on the scene (condition) you chose and the platform you are using, to see and use the application
The scene (condition) can be changed while playing by pressing the key assigned to that condition (see the sketch below)
c1: z, c2: x, c3: c, c4: v, c5: b, c6: n, c7: m
(If you are testing on a PC, to use the key presses, switch off the XR Camera Rig and the Camera Rig switcher in the scene and make the Sim Camera Rig active)
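A minimal sketch of how this key-to-condition switching could look; the scene names passed to LoadScene are placeholders (the actual scene names in the project follow the pattern shown above, e.g. c1Z-v2-withoutWalkOne).

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Illustrative sketch of switching conditions by key press (scene names are placeholders).
public class ConditionSwitcher : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Z)) SceneManager.LoadScene("c1"); // c1: z
        if (Input.GetKeyDown(KeyCode.X)) SceneManager.LoadScene("c2"); // c2: x
        if (Input.GetKeyDown(KeyCode.C)) SceneManager.LoadScene("c3"); // c3: c
        if (Input.GetKeyDown(KeyCode.V)) SceneManager.LoadScene("c4"); // c4: v
        if (Input.GetKeyDown(KeyCode.B)) SceneManager.LoadScene("c5"); // c5: b
        if (Input.GetKeyDown(KeyCode.N)) SceneManager.LoadScene("c6"); // c6: n
        if (Input.GetKeyDown(KeyCode.M)) SceneManager.LoadScene("c7"); // c7: m
    }
}
```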
Toolkits used for Simulation and Interaction:
Sounds:
Correct Selection: https://freesound.org/people/unfa/sounds/154887/
Error Selection: http://soundbible.com/1540-Computer-Error-Alert.html
Consent form
Demographics form (age, gender, handedness, familiarity)
Total training time: 5 minutes (~1 min per condition)
Demographics
Within-subject design (each participant will perform all conditions)
Latin Square for the order of conditions (7)
For each condition, 6 sets of targets
Randomized Width (3) * Randomized Density (2)
In each set, 10 or 20 targets of the same width, depending on the density
Target positions are randomized (half on the right side and half on the left side)
Total study time: 40 minutes (~ 5 min per condition)
Interview form
NASA TLX form
Interview
NASA TLX
For each participant, we will collect 7 log files (one file per condition); each log file has 6 rows (one row per set of targets). For each set, we collect the time, the error count, and the hand used for the selection of each target, along with the side of the target.
The ID of the participant is based on the name of the file (a combination of date and time).
Here is a sample CSV file for 1 participant and 1 condition.
Sample Log File
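For illustration, a minimal sketch of how one row per set could be appended to such a file; the column layout and file naming below are assumptions, not the project's actual format (refer to the sample file above for the real layout).

```csharp
using System.IO;
using UnityEngine;

// Illustrative sketch of per-set logging (column layout and file name are hypothetical).
public class StudyLogger : MonoBehaviour
{
    string filePath;

    void Awake()
    {
        // File name based on date and time, so it also serves as the participant ID.
        string fileName = System.DateTime.Now.ToString("yyyyMMdd-HHmmss") + "_condition.csv";
        filePath = Path.Combine(Application.persistentDataPath, fileName);
    }

    // Called once per completed set of targets (one CSV row per set).
    public void LogSet(int width, int density, float timeMs, int errorCount,
                       string handsUsed, string targetSides)
    {
        string row = $"{width},{density},{timeMs},{errorCount},{handsUsed},{targetSides}";
        File.AppendAllText(filePath, row + "\n");
    }
}
```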
Independent variables
Target width (3 levels) - 10, 20, 30 cm
Target density (2 levels) - 10, 20 targets per set
Target position (2 levels) - right, left
Condition (7 levels)
Dependent variables
Task completion time (ms)
Number of errors
Selection Hand
Total Number of Blocks (set level)
20 participants * 7 conditions * 3 widths * 2 densities = 840
Total Number of Trials (target level)
20 participants * 7 conditions * 3 widths * (10+20) targets = 12600
We will use linear mixed-effects modeling, with participants as a random factor.
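One possible specification of such a model, with a random intercept per participant; the exact set of fixed-effect terms here is an assumption for illustration:

$$\mathrm{Time}_{ij} = \beta_0 + \beta_1\,\mathrm{Width}_{ij} + \beta_2\,\mathrm{Density}_{ij} + \beta_3\,\mathrm{Condition}_{ij} + \beta_4\,\mathrm{Position}_{ij} + u_j + \varepsilon_{ij}$$

where $u_j \sim \mathcal{N}(0, \sigma_u^2)$ is the random intercept for participant $j$ and $\varepsilon_{ij} \sim \mathcal{N}(0, \sigma^2)$ is the residual error; the same structure can be fitted for the other dependent variables.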
HTC VIVE
PC
Initially, we intended to develop this application for the HTC VIVE and possibly CAVE2, which was not possible due to the situation. We developed this application using the VR simulator on a PC; we also did some tests on the HTC VIVE and received feedback. However, the VIVE version needs further testing of how the controller input events are translated before it can be used in our user study. It is possible that the trigger ranges need to be adjusted. At the moment, this application is fully functional on the PC VR simulator.
We aim to add another condition to this experiment, in which users have to touch each item to select it. Naturally, the items will be placed at a height that is reachable by the user.
Initially, we wanted to conduct a pilot study with this application, but due to the COVID-19 outbreak that had to be postponed. In the future, we will conduct a pilot study with 5 participants, gather feedback, and adjust the application for a subsequent user study with more than 20 participants.
Different VR platforms have many similarities, but they can also have very different requirements. We found that an application developed for the VR simulator on a PC does not work properly as-is on another platform such as the VIVE. It is important to adjust your application to each platform you are using, and even to create separate versions for each platform.
Since we did not get to conduct our pilot study, and we did not get a chance to test the application with a headset ourselves, it is too early to report any findings. Once we finish these tasks, we will be able to complete this section.