Our newest major project is underway, with the intention of working with Access Now Inc., an ADA advocacy group that works to support disabled Americans around the country.
The primary idea is a unique, full-body experience that relies on senses other than sight. With this in mind, we have a 16-channel surround sound setup, along with a haptic feedback vest from bHaptics and an Oculus Quest headset.
The suit is the Tactsuit X40, an adjustable vest with 40 vibro-tactile motors in total: 4x5 across the front and 4x5 across the back. bHaptics also sells add-ons for the hands, arms, and feet, but those are a bit overkill for now while we're still doing initial testing.
A more involved setup: 16 speaker channels arranged in a circular array, aimed at a listener in the middle.
Your typical Oculus Quest HMD, along with the bHaptics wireless vest.
Getting the bHaptics vest set up was pretty simple: the vest comes with a tiny USB dongle for wireless connection to the computer, the bHaptics standalone player can be downloaded from their website, and from there it's easy to connect. The standalone player has some nice features, like hooking into your audio output so you can feel your music based on its bass, and it also connects to VR and PC games like Beat Saber, PUBG, and so on. It also offers a "Feedback Test" in which you can draw across a picture of the vest and the motors play back as you mouse over them.
All in all, it was pretty simple to set up and get running right away, and it's even useful out of the box for music and games without needing to get too technical.
Of course, we didn't purchase the bHaptics vest just to feel music in our chests, so fortunately bHaptics also has a great Unity plugin complete with examples, an SDK, prefabs, and "Haptic Clips". Haptic Clips are essentially pre-authored sequences that control the motors; one might be a "scan", where each vertical column of motors vibrates and then turns off as the adjacent column starts, sweeping around the vest to give the feeling of something rolling or scanning across your body. bHaptics provides a large list of pre-made clips like getting hit, getting shot, special interactions, rumbling, and so on. They also have a developer portal for creating and exporting your own clips to use in Unity.
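For reference, playing one of these exported clips from code looks roughly like the sketch below. It leans on the Bhaptics.Tact library that ships with the plugin; how you actually obtain the player instance differs between SDK versions, and the "ScanExample" key and .tact path here are placeholders, not names from our project.

```csharp
using Bhaptics.Tact;

public class HapticClipExample
{
    // Assumes an IHapticPlayer instance provided by the plugin
    // (how you get hold of it depends on the SDK version).
    private readonly IHapticPlayer player;

    public HapticClipExample(IHapticPlayer player)
    {
        this.player = player;
    }

    public void PlayScan()
    {
        // "ScanExample" and the .tact file path are placeholder values
        // for a clip exported from the bHaptics developer portal.
        player.Register("ScanExample", "Assets/HapticClips/ScanExample.tact");
        player.SubmitRegistered("ScanExample");
    }
}
```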
Developing our own clips may be worth exploring more in the future. Unfortunately, I had made an initial test with a "Scan" example that featured a little drone in Unity flying around your body and scanning you, but I seem to have misplaced that project and can't find it, so I can't show it off.
Either way, that was only a simple initial test; our plans involve controlling the motors more finely through C# scripting rather than pre-made clips, so that was our main focus.
To start on that, I opened up the Draw Dot demo scene to examine how the scripts work and access the motors, and moreover how the motors are configured in their layout. As you can see, and as I mentioned above, the front and back contain 20 motors each, in rows of 4 across and 5 down. The motors are addressed by index, counting 0-19 from right to left on the front, and left to right on the back.
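As a quick illustration of what that indexing means in code, the snippet below pulses a single motor on the front of the vest by index. It's a minimal sketch using the DotPoint and Submit calls from the Bhaptics.Tact library; the "TestFrame" key and the intensity/duration values are just placeholder choices.

```csharp
using System.Collections.Generic;
using Bhaptics.Tact;

public static class SingleMotorTest
{
    // Fire one motor on the front of the vest for a short pulse.
    // 'player' is whatever IHapticPlayer instance the plugin gives you.
    public static void Pulse(IHapticPlayer player, int motorIndex)
    {
        // DotPoint takes a motor index (0-19 per side) and an intensity (0-100).
        var points = new List<DotPoint> { new DotPoint(motorIndex, 100) };

        // Submit a one-off frame to the front of the vest for 100 ms.
        player.Submit("TestFrame", PositionType.VestFront, points, 100);
    }
}
```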
Looking at this configuration and the scripting setup, we decided to create a virtual version of the vest in Unity, with markers acting as the spawn points for raycasts; when a ray hits something, it triggers the corresponding motor, and the distance to the object determines how intensely that motor vibrates.
Ideally, this, combined with the 16-channel surround sound setup, could make for a very immersive experience. An audience member sits at the center of the space wearing the vest and the VR headset. Virtual objects with point sound sources in Unity play long- and medium-distance audio through the multi-channel speakers, the audience member also feels them through the haptic vest, and if an object gets really close, its audio can play through the Oculus speakers for the full range.
Here is a simple design I quickly mocked up to act as a visualizer, with all the white dots acting as the source points for the rays to cast from. You can see the basic raycast working here, colliding with the cube; now it's a matter of connecting the script to the motors.
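The raycasting itself is just Unity's Physics.Raycast fired from each marker transform. A minimal sketch of that idea is below; the component and field names are mine, not the final script's.

```csharp
using UnityEngine;

// Casts a ray forward from each marker child and draws it when something is hit.
public class MarkerRaycastVisualizer : MonoBehaviour
{
    [SerializeField] private float maxDistance = 2f;

    private void Update()
    {
        foreach (Transform marker in transform)
        {
            if (Physics.Raycast(marker.position, marker.forward, out RaycastHit hit, maxDistance))
            {
                // Draw a red line from the marker to the hit point for debugging.
                Debug.DrawLine(marker.position, hit.point, Color.red);
            }
        }
    }
}
```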
The full hierarchy of the vest prefab, all of which falls under a root "bHaptic Vest" object. First are "motorsFront" and "motorsBack", which just contain an example script called Bhaptics Dot Point Controller Example; this tells us whether the motors are on the front or back. Underneath those are "VestBack" and "VestFront", which contain the mesh data and our custom raycast script holding all the important code and settings. Finally, all the motors, separated out individually and placed correctly, are listed underneath. These are nothing more than empties with sprite renderers.
Here are the two important scripts for getting the whole thing to work, as mentioned on the left: the Dot Point Controller example they provide, and our raycasting script. The Dot Point script is used to determine which index of motors we're communicating with.
The script is actually quite simple, though it definitely took a bit of trial and error to piece it together correctly. The first part is straightforward: we call into the Bhaptics.Tact classes, which we use to get information about the motors themselves and to turn them on at whatever strength we want.
We set up a few basic variables: an animation curve for strength adjustment, the motorIndex and motorIntensity, the controller and dotPoint variables that bHaptics use, and a maxDistance to control how far our raycasts will travel.
On Awake we just assign the dotPoint using one of their example functions, to set the index and strength of the motors, and assign the controller.
The only other essential part all happens in the Update() function. We set the index to 0 to start and clear the active motors in the controller (a bit of a quirk in how the dot point example is built). We then do a simple foreach over all the child transforms, using each transform to get the origin and direction for our raycasts. When something is hit within maxDistance, we adjust the motorIntensity based on the distance: an inverse lerp to flip the mapping (closer means stronger), shaped by the easeCurve to control the falloff, and finally multiplied by 100, since that's the intensity range the bHaptics motors expect.
From there, we check if the intensity is greater than 0, draw a debug line for visualization, and add the motor to the controller's list to turn it on.
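Putting those pieces together, the core of our raycast script looks roughly like the sketch below. It's a reconstruction rather than the exact script: the class and field names are approximations, and instead of going through the example's controller object it submits DotPoints directly through the Bhaptics.Tact player, which amounts to the same thing.

```csharp
using System.Collections.Generic;
using UnityEngine;
using Bhaptics.Tact;

// Casts a ray from each motor marker and maps hit distance to motor intensity.
public class RaycastHapticDriver : MonoBehaviour
{
    [SerializeField] private PositionType side = PositionType.VestFront;
    [SerializeField] private float maxDistance = 2f;
    [SerializeField] private AnimationCurve easeCurve = AnimationCurve.Linear(0f, 0f, 1f, 1f);

    // However your SDK version exposes the player; assigned elsewhere.
    public IHapticPlayer player;

    private void Update()
    {
        var activeMotors = new List<DotPoint>();
        int motorIndex = 0;

        // Each child is an empty transform placed where a motor sits on the vest.
        foreach (Transform marker in transform)
        {
            if (Physics.Raycast(marker.position, marker.forward, out RaycastHit hit, maxDistance))
            {
                // Closer hits produce stronger vibration: inverse-lerp the distance,
                // shape it with the ease curve, then scale to bHaptics' 0-100 range.
                float t = Mathf.InverseLerp(maxDistance, 0f, hit.distance);
                int motorIntensity = Mathf.RoundToInt(easeCurve.Evaluate(t) * 100f);

                if (motorIntensity > 0)
                {
                    Debug.DrawLine(marker.position, hit.point, Color.red);
                    activeMotors.Add(new DotPoint(motorIndex, motorIntensity));
                }
            }
            motorIndex++;
        }

        // Submitting a fresh frame each Update effectively clears the previous one.
        if (player != null && activeMotors.Count > 0)
        {
            player.Submit("RaycastFrame", side, activeMotors, 100);
        }
    }
}
```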
After finishing up the scripting, testing it, and polishing the default values a bit, we created a sample scene for people to walk around in. Once again using a basic model of our studio space, created with a 360 camera and simple projection texturing, we added some simple meshes to walk around and collide with, and attached our vest underneath the CenterEyeAnchor of the OVR camera prefab so it follows the player throughout the space.
This is something I still need to mess around with. Since I built this project and understand how it's set up, it's intuitive to me that the vest is attached to the VR camera, meaning that as you look around, turning left and right, the vest rotates with your head instead of rotating with your physical body. (This isn't a big deal here, but it becomes one later when I build an invisible maze example.)
The next example I created was a simple point-source audio player that flies around the player/vest at the world origin; it circles a few times before diving in and through the player. The audio is a sort of flapping sound that outputs to our quad speaker setup, allowing the sound to really travel around you in the room. The vest's raycasts collide with the audio source as it flies around, and you feel its strength ramp up as it dives right towards and through you. I'd paste a picture, but since it's really based around movement and audio, I didn't feel the necessity.
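For the curious, the flight path is nothing fancy; a small mover along the lines of the sketch below (names and timing values are illustrative, not the exact ones used) orbits the origin for a few laps and then heads straight through the center of the space.

```csharp
using UnityEngine;

// Moves an object (with a point-source AudioSource attached) in circles around
// the world origin, then dives through the center where the player stands.
// All values here are illustrative placeholders.
[RequireComponent(typeof(AudioSource))]
public class CircleAndDive : MonoBehaviour
{
    [SerializeField] private float radius = 4f;
    [SerializeField] private float height = 1.5f;
    [SerializeField] private float angularSpeed = 90f; // degrees per second
    [SerializeField] private float diveSpeed = 3f;     // meters per second
    [SerializeField] private int lapsBeforeDive = 3;

    private float angle;
    private bool diving;
    private Vector3 diveDirection;

    private void Update()
    {
        if (!diving && angle < lapsBeforeDive * 360f)
        {
            // Orbit the origin at a fixed radius and height.
            angle += angularSpeed * Time.deltaTime;
            float rad = angle * Mathf.Deg2Rad;
            transform.position = new Vector3(Mathf.Cos(rad), 0f, Mathf.Sin(rad)) * radius
                                 + Vector3.up * height;
        }
        else
        {
            if (!diving)
            {
                // Aim straight through the center of the space at head height.
                diving = true;
                diveDirection = (Vector3.up * height - transform.position).normalized;
            }
            transform.position += diveDirection * diveSpeed * Time.deltaTime;
        }
    }
}
```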
To expand on the previous examples and push towards really using the tactile function of the vest, I created a pretty simple maze that uses the full space of the studio but that the player can't see. Instead, they must navigate using only the intensity of the motors on the vest to figure out how to move forward. A few settings had to be adjusted for the vest to work properly at these distances: turning the maxDistance value way down, and adjusting the curve to really ramp up the intensity the closer you are to a wall.
This is where the issue of the vest following the player's head/view versus their body and the way they're standing really comes to a head. Often the player would get confused about where exactly the walls were, because they're constantly rotating their head slightly while mostly moving their body along one forward direction. I can only speculate on how this affects people not used to it, or those unused to VR and technology in general, but it definitely seems to trip people up. It also doesn't help that I haven't locked the X rotation, so the vest also follows the player's head as they look up and down.
We can fix the rotation locking quite simply; getting the vest to track the player's body is trickier. The only methods I can think of would be to tie it through code to the hands/controllers, which generally follow the body better, or to attach a mount to the vest so it can be tracked in QTM, though that seems like overkill.
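The rotation lock itself is just a matter of following only the headset's yaw in LateUpdate. Something like the sketch below would keep the vest level while still turning with the player; the headTarget reference would be the CenterEyeAnchor, and the script and field names are mine, not from the project.

```csharp
using UnityEngine;

// Keeps the virtual vest at the headset's position but only copies its yaw,
// so looking up/down or tilting the head doesn't rotate the vest.
public class VestYawFollow : MonoBehaviour
{
    // Assign the OVR rig's CenterEyeAnchor here.
    [SerializeField] private Transform headTarget;

    private void LateUpdate()
    {
        if (headTarget == null) return;

        transform.position = headTarget.position;

        // Strip pitch and roll: keep only rotation around the world up axis.
        Vector3 flatForward = Vector3.ProjectOnPlane(headTarget.forward, Vector3.up);
        if (flatForward.sqrMagnitude > 0.0001f)
        {
            transform.rotation = Quaternion.LookRotation(flatForward, Vector3.up);
        }
    }
}
```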
We do have access to the Final-IK plugin on the Unity Asset Store, which has a VR example that tracks a character's body using the HMD, the controllers, and player movement. This solution isn't perfect either, but it might be the easiest to set up and get working while being close enough, as the VR-IK is pretty solid. (It's also something we use in our VR-MP-QTM project to let networked players see each other and themselves in VR without much hassle.)
Time on this project was limited, as the President of Access Now was only in Philadelphia for a short period, and a large part of the project also hinged on the 16-channel surround sound setup, which we were not able to get running in time with the appropriate software to route the audio through Unity and then out to our VR headset and the haptic feedback vest. Because of this, the project has temporarily been put on hold until further notice.