Hi, my name is Joseph Kennedy, a recent graduate (as of 2022) of the University of the Arts with a Bachelor of Fine Arts, specializing in Game Arts.
I came to the Center for Immersive Media (CIM) as a work-study student in my senior year, and since graduating I've been awarded a Fellow position to continue my work there for another year.
I act as a generalist here, which is fitting, since that's how I would describe my skills and knowledge of 3D. I mainly create and maintain projects using the equipment at the CIM, under the guidance of Alan Price, the Director of the CIM.
Most of these projects are created in Unity and make use of Meta's Oculus headsets, our motion capture system, or the other technical goodies we have lying around. They are either speculative templates that can serve as a starting base for future projects, or applied projects that we are actively working on against a deadline.
I also help out around the space, whether that means moving or setting up equipment, guiding students who come in with a class or workshop to experience the space and try the equipment, onboarding other work-study students who begin working here, or assisting with projects I'm not actively working on.
Proficiencies include:
Game Engines (Unity & Unreal Engine)
3D Modeling, Texturing, Lighting, Rendering, Simulation & Fluids (Blender & Maya)
Game Asset Optimization for use in Real-Time Engines
Adobe Suite (Photoshop, Illustrator, After Effects, Premiere Pro)
Texturing (Quixel Mixer, Substance Painter)
I also have a working (if tenuous) grasp of programming, mainly C# for Unity, along with past experience in Java, HTML & CSS, Python, and Lua.
More of my work can be found at my:
There you'll see various personal projects and professional work I've developed over the years.
One of my major responsibilities has been to manage and run our 12-camera infrared motion capture system from Qualisys.
Here we would show off our mocap capabilities to interested faculty, external partners, and students. This included both recordings and live demonstrations, since Qualisys can stream skeletal data in real time over our local network, which we could then feed into Unity or Unreal Engine.
Another major, likely long-running project is our own custom solution for standalone networked VR. It will serve as a template and starting point for future projects and games.
The basic setup combines a 3D scan of our studio space, Unity Netcode players, and the Qualisys mocap plugin. With this, we can build standalone projects for the Oculus Quest and have visitors explore a virtual version of our space that aligns with the real space. From there, we have countless ideas for future projects that can expand on this base.
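To give a sense of the alignment step, here's a minimal sketch of the kind of calibration script involved conceptually: it snaps the scanned room model to a measured reference point so the virtual room lines up with the real one. The names here (RoomAligner, the reference transforms) are placeholders for illustration, not the actual project code.

```csharp
using UnityEngine;

// Hypothetical sketch: align the scanned room model to the real space by
// matching a known physical reference point (e.g. a corner of the studio)
// to where that same point sits inside the 3D scan.
public class RoomAligner : MonoBehaviour
{
    [Tooltip("Root of the 3D scan of the studio.")]
    public Transform scannedRoom;

    [Tooltip("Where the physical reference point is reported in tracking space.")]
    public Transform trackedReference;

    [Tooltip("The same reference point as it exists inside the scanned model.")]
    public Transform scanReference;

    public void Align()
    {
        // Rotate the scan so its reference faces the same way as the tracked one
        // (only around the vertical axis, so the floor stays level).
        float yawOffset = trackedReference.eulerAngles.y - scanReference.eulerAngles.y;
        scannedRoom.RotateAround(scanReference.position, Vector3.up, yawOffset);

        // Then translate the scan so the two reference points coincide.
        Vector3 positionOffset = trackedReference.position - scanReference.position;
        scannedRoom.position += positionOffset;
    }
}
```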
This project started off as a kind of offshoot of, and experiment on top of, the Room-Scale Multiplayer VR project.
Essentially, we brought on a work-study student named Aspen, who's in the School of Dance. We dug up an old script that takes a Unity avatar and records the positions and rotations of its joints, which can then be played back later.
With this, we were able to create an experimental tool specifically designed for dancers and performers: they can move around in VR while being mocapped, record their movements, and then watch the playback in VR from any angle.
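As a rough illustration of the recording idea (not the original script, which predates my time here), a minimal sketch might capture each joint's local position and rotation every frame and replay the samples later:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Simplified sketch of the avatar recorder concept: sample each joint's
// local position/rotation every frame, then replay the samples later.
public class AvatarRecorder : MonoBehaviour
{
    public Transform[] joints;          // avatar joints to record
    public bool isRecording;
    public bool isPlayingBack;

    private readonly List<Vector3[]> positions = new List<Vector3[]>();
    private readonly List<Quaternion[]> rotations = new List<Quaternion[]>();
    private int playbackFrame;

    void LateUpdate()
    {
        if (isRecording)
        {
            var framePos = new Vector3[joints.Length];
            var frameRot = new Quaternion[joints.Length];
            for (int i = 0; i < joints.Length; i++)
            {
                framePos[i] = joints[i].localPosition;
                frameRot[i] = joints[i].localRotation;
            }
            positions.Add(framePos);
            rotations.Add(frameRot);
        }
        else if (isPlayingBack && positions.Count > 0)
        {
            for (int i = 0; i < joints.Length; i++)
            {
                joints[i].localPosition = positions[playbackFrame][i];
                joints[i].localRotation = rotations[playbackFrame][i];
            }
            playbackFrame = (playbackFrame + 1) % positions.Count; // loop the playback
        }
    }
}
```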
This is our newest major project underway, created with the intention of working with Access Now Inc., an ADA advocacy group that works to support disabled Americans around the country.
The primary idea is a unique, full-body experience that relies on senses other than sight. With this in mind, we have a 16-channel surround sound setup, along with a haptic feedback vest and an Oculus Quest headset.
This quick project saw us creating a simple animation to be projected on a scrim behind a dance performance. We worked closely with Jesse Zaritt, a dance instructor for the School of Dance here at UArts.
Jesse was inspired by the motion trails created by a performer, which we visualized for him in our QTM software, and came up with an idea based on that aspect: exploring the space and architecture our bodies create as we move around.
The 3D portion of the final animation that I worked on used the motion trails of various performers to create spindly lines that grow and evolve over the course of the video.
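The final animation itself was assembled offline, but the underlying trail idea can be sketched in Unity terms: append a tracked joint's position to a line as it moves, so the line "grows" with the performance. This is an illustrative sketch, not the actual pipeline used for the video.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch of a motion trail: sample a tracked joint's position
// at a fixed interval and feed the growing list of points to a LineRenderer.
[RequireComponent(typeof(LineRenderer))]
public class MotionTrail : MonoBehaviour
{
    public Transform trackedJoint;       // e.g. a mocap-driven hand or foot
    public float sampleInterval = 0.05f; // seconds between samples

    private readonly List<Vector3> points = new List<Vector3>();
    private LineRenderer line;
    private float timer;

    void Start()
    {
        line = GetComponent<LineRenderer>();
    }

    void Update()
    {
        timer += Time.deltaTime;
        if (timer >= sampleInterval)
        {
            timer = 0f;
            points.Add(trackedJoint.position);
            line.positionCount = points.Count;
            line.SetPosition(points.Count - 1, trackedJoint.position);
        }
    }
}
```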
The MovementSDK project is primarily a test project using Oculus's Movement SDK template, combined with the new hardware and software in the recently released Meta Quest Pro, which allows full facial tracking, including the eyes, as well as limited body IK tracking.
The idea is that if we combine this with our work on the Avatar Recorder and similar projects, we will be able to fully track and digitize someone in a virtual space: accurate body and positional tracking, tracked facial expressions, and even a limited form of lip sync.
The "Echo Chamber" is a collaborative project/installation we created with the help of Philadelphia's National Liberty Museum, for an upcoming exhibition they were hosting called "Data Nation: Democracy in the Age of A.I.".
The point of the exhibition is to show off "interactive installations, topical interactives, and provocative artwork that aims to make visitors ponder how rapid advancements in technology impact democratic norms".
At the CIM, our role was to create this "Echo Chamber" concept: a geodesic dome presenting a volume for visitors to enter.
At the center of the chamber is a touch-screen kiosk that lets visitors type in any message, phrase, or "belief", which is then played back in multitudes through speakers placed around the interior of the volume.
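Leaving aside how the typed phrase becomes audio (a speech-synthesis step not covered here), the "multitudes" playback can be sketched as staggering copies of a clip across AudioSources positioned around the dome. The class name and delay values below are illustrative assumptions, not the installation's actual code.

```csharp
using UnityEngine;

// Hypothetical sketch: play one synthesized phrase back "in multitudes" by
// scattering it across the speakers around the dome with small random delays.
public class EchoPlayback : MonoBehaviour
{
    public AudioSource[] domeSpeakers;     // one AudioSource per physical speaker
    public float maxStaggerSeconds = 1.5f;

    public void PlayPhrase(AudioClip phraseClip)
    {
        foreach (AudioSource speaker in domeSpeakers)
        {
            speaker.clip = phraseClip;
            // Each speaker starts at a slightly different time so the phrase
            // overlaps with itself and fills the volume.
            speaker.PlayDelayed(Random.Range(0f, maxStaggerSeconds));
        }
    }
}
```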
We recently acquired a reconfigurable LED wall. Besides essentially being a very large screen with better detail and color than a projector, it opens up a variety of project ideas that wouldn't have been possible before, the biggest example being virtual production, similar to the direction Hollywood is moving its TV and film production workflows. Additionally, since the wall is made of rearrangeable, interlocking panels, we can also create a wide variety of esoteric and abstract setups far more unique than a standard widescreen display.
Further tech specifications can be found inside the project documentation page below.
We had some upcoming classes and workshops to teach the basics of projection mapping, something Alan has a lot of previous experience in. In order to build more content, show off a greater variety of work, and showcase the relative ease of getting started, he taught me the basic workflows he was used to and let me experiment, before we brought on the other work-study students, taught them, and encouraged them to experiment and create their own little pieces.
We started with a simple setup using a small Optoma projector, along with some basic white geometric boxes for shape, and white projection cloth.
I started in After Effects to get the basics down, moved to Blender to create more advanced scenes with 3D lighting (which were ultimately composited in After Effects anyway), and finally built out a new solution in Unity for the greatest ease of use and flexibility: a full 3D workflow, direct output to the projector, real-time adjustment of how the images are mapped, and interactive input.
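For the "direct output to the projector" part, the relevant Unity piece is multi-display support: activate the secondary display (the projector) and point a camera at it. A minimal sketch, assuming the projector shows up as the second display on the machine:

```csharp
using UnityEngine;

// Minimal sketch: send a dedicated camera straight to the projector,
// assuming the projector is connected as the second display.
public class ProjectorOutput : MonoBehaviour
{
    public Camera projectorCamera;   // camera framing the projection-mapped content

    void Start()
    {
        // Secondary displays are inactive by default in built players.
        if (Display.displays.Length > 1)
        {
            Display.displays[1].Activate();
            projectorCamera.targetDisplay = 1;  // 0 = main monitor, 1 = projector
        }
    }
}
```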
With the arrival of our newest piece of tech, a 60-panel, interlocking and interchangeable LED wall, and building on some smaller past experiments with Unity and projectors, we decided to learn virtual production and create a template for it.
Inspired by the many new companies, and the general Hollywood shift toward virtual production workflows and LED/motion-tracked volumes, we figured we had the right technology and tools to accomplish our own version, and could help inspire and teach faculty and students in the School of Film, which seems prudent given the direction film and TV are heading.
We experienced quite a few hiccups and roadblocks, but we now have a decent handle on the entire workflow, and we created a small example short as well as a basic template project for more ambitious work in the future.
For this project, I mainly spent my time in Unity developing and testing various particle systems to be displayed on this hanging sheet cloth.
I also briefly helped with setting up the cloth itself, placing the motion tracking dots, and occasional troubleshooting.
I helped develop some visuals in Unity for the play 'I Want a Country', from the boats the audience would sit in, to animation and slight modifications for a CG Superman, all to be projected onto sheets of cloth during the play.
In collaboration with a "Designing Soundscapes" class, which I also happened to be taking at the time, I was tasked with creating a virtual space in Unity, where each student's final soundscape project would be uploaded.
Anyone could then walk around the real space wearing motion-tracked wireless headphones and move from one soundscape to another with seamless blending.
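A rough sketch of the blending idea, assuming each soundscape is a looping AudioSource placed at its zone's center: each source's volume falls off with the listener's distance from that center, so walking between zones crossfades them. This is illustrative, not the exact script used for the class project.

```csharp
using UnityEngine;

// Illustrative sketch: crossfade soundscapes based on how close the tracked
// listener is to each soundscape's zone center.
public class SoundscapeBlender : MonoBehaviour
{
    public Transform listener;          // driven by the motion-tracked headphones
    public AudioSource[] soundscapes;   // one looping source per zone, placed at its center
    public float zoneRadius = 4f;       // distance over which a zone fades out

    void Update()
    {
        foreach (AudioSource zone in soundscapes)
        {
            float distance = Vector3.Distance(listener.position, zone.transform.position);
            // Full volume at the zone center, silent at zoneRadius and beyond.
            zone.volume = Mathf.Clamp01(1f - distance / zoneRadius);
        }
    }
}
```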
We recently found out that the event planned for later in October is going to be cancelled for now. (This is the event that all the work on the "Square Fade Particles" and looping background over the past few days was for.)
That being said, we've already made a lot of progress, and with Alan having done so much work to set up all the projection screens, it seemed like a waste not to finish.
So we continued putting up the large support structure Alan had built out of pipes, set it up in a proper spot, and aligned the projectors as best we could to the 16x10 rectangular shape Alan mentioned. This shape repeats two more times, essentially creating a large 48x10 screen.
(We later found out that Alan may have made a mistake in the measurements: we were doing 16x10 because that was what he thought the projector could fit, but after another quick measurement we found it may only be a 16:9 aspect ratio.)
Either way, this would be a simple enough fix, so we decided to finish setting up what we could; we may as well see how well this is going to work, and at the very least it can serve as a template for a future project.
I spent two days out of work due to a cold, but today I began the morning by labeling and writing down serial numbers for all the Oculus Quest headsets that the Design for Interaction class will be using soon (I believe next week) for their projects.
All headsets are set up and ready to go, logged in with the CIMuarts3 accounts. Each serial number is written down and saved, and a sticker label was placed on each individual headset and its two controllers, marking which number it is.
We also merged the current work I've done on the fading particles into a master Unity project that Alan is working on, which is the primary deliverable for the event later this October.
He created various floating widgets (as seen below) in the form of colored text bubbles, groupings with connecting lines, and videos; each element is also interactable and will function on a touch screen on the day of the event.
The fading particles and looping background can easily be toggled on and off in the background, and we're waiting on the lead designer to create more visual mockups for us to brainstorm further designs around.
Today we started work on a new project/collaboration.
There are a few moving parts, but as the visual artist, I was first tasked with creating an abstract geometric shape based on a concept image for the project.
On the left is the first image of a presentation that will be given for the project. The black welcome screen will be a digital display projected onto a long sheet of cloth, planned to be 48 feet wide by 10 feet tall. My first goal was to create the white geometric shapes circled in blue, which are a little hard to make out due to the color overlay across the whole image.
Per guidance from Alan, the simplest way to handle this would be a particle system that instantiates a square texture; repeated enough over time, that texture builds up the complex shape seen in the concept.
On the left is the texture I created in Photoshop: a 512 by 512 pixel image with a black background and a square outlined with roughly a 3-4 pixel stroke. This went through a couple of iterations; originally the stroke on the square was much wider, maybe around 20 to 30 pixels, but through testing we decided we wanted the shape to be much thinner.
The settings for the texture and material themselves were pretty simple as well. On the texture, set the Alpha Source to "From Gray Scale" and check the "Alpha is Transparency" box.
For the material, the shader is just an unlit particle shader (lighting isn't used or important for this, so it would be an unnecessary use of resources), with the Rendering Mode set to "Fade", the Color Mode set to "Multiply", Two Sided checked, and the box texture plugged into the Albedo.
As far as materials go, this was good to go.
From here, all that was left was creating an empty Game Object to act as the particle source's parent and controller, with an animation added that moves it back and forth across the widescreen and loops.
Here you can already see the complex shape that is formed when adding a bit of movement, and adjusting the appropriate settings to get the particle system to behave as intended.
On the right is a general overview of all the important settings used to achieve the desired look. The most important general settings are:
Looping - to loop it, obviously
Start Lifetime - to determine how long the particles will last (this is used in conjunction with the Color over Lifetime option using Alpha for a fading effect as well)
Start Speed - set to 0 so the particles themselves don't move
Start Rotation - arguably one of the most important settings, and the one that took up most of my time in testing and iteration. This determines how much each new particle is rotated relative to the last, ultimately achieving the desired spinning/circular effect.
Simulation Space - this just ensures the particles don't move along with the Particle Controller and properly drift away.
At the bottom is the particle curve box, which is how we controlled the "Start Rotation" option. It's a little confusing, but essentially it's a graph with a curve that describes what the rotation value should be over time (essentially its speed of rotation).
Through testing, we found that a straight line with no curve at all actually worked best, with the starting value at 0 and the ending value at 0.5 (out of 1, which in this case means 90 degrees, since the graph's multiplier goes up to 180).
If my explanation here confuses you, it's because I still don't quite understand it myself.
Further settings below:
The emission just determines the spawn rate of the particles; in this case the value is currently 11, as that seems to work best.
The shape was a bit important, as we want each particle to spawn as close to the same point as possible, so we use an extremely tiny sphere.
This setting got tweaked a bit as well for the final polished look: essentially a simple color gradient, from a soft whitish pink to a deeper red.
Most importantly, each particle spawns with 0 alpha (invisible), quickly fades in to 1, then slowly fades back to 0 over its lifetime. This, along with the Start Lifetime, makes the particles softly fade in and out.
This just adds a tiny bit more rotation after the particle spawns, so it spins just a little bit more as it fades out.
This just adds a tiny bit of random noise to the movement of the particles as they drift and fade away.
Very small Strength and Frequency
Update: Another important setting to check is the Random Seed. This prevents Unity from automatically creating a random seed each time it runs, which helps with art direction, since you can guarantee the particle system will run exactly the same way every time as long as the other settings don't change.
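For posterity, here's a rough sketch of the same settings applied from a script instead of the inspector, in case the system ever needs to be configured programmatically. The values mirror the ones described above; treat it as a reference sketch, not the actual project setup.

```csharp
using UnityEngine;

// Sketch: the "Square Fade Particles" settings described above, set from code.
[RequireComponent(typeof(ParticleSystem))]
public class SquareFadeParticlesSetup : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();

        // Fixed seed so the system plays back identically every run.
        ps.Stop();
        ps.useAutoRandomSeed = false;
        ps.randomSeed = 12345u;   // any fixed value works

        var main = ps.main;
        main.loop = true;
        main.startLifetime = 4f;                 // paired with Color over Lifetime for the fade
        main.startSpeed = 0f;                    // the particles themselves don't move
        main.simulationSpace = ParticleSystemSimulationSpace.World;

        // Start Rotation driven by a straight line from 0 to 0.5 of a 180-degree
        // multiplier (i.e. up to 90 degrees), as found in testing.
        var rotationCurve = AnimationCurve.Linear(0f, 0f, 1f, 0.5f);
        main.startRotation = new ParticleSystem.MinMaxCurve(180f * Mathf.Deg2Rad, rotationCurve);

        var emission = ps.emission;
        emission.rateOverTime = 11f;             // spawn rate that looked best

        // Tiny sphere so every particle spawns at nearly the same point.
        var shape = ps.shape;
        shape.shapeType = ParticleSystemShapeType.Sphere;
        shape.radius = 0.01f;

        // Fade alpha in quickly, then out slowly, over a pink-to-red gradient.
        var colorOverLifetime = ps.colorOverLifetime;
        colorOverLifetime.enabled = true;
        var gradient = new Gradient();
        gradient.SetKeys(
            new[] {
                new GradientColorKey(new Color(1f, 0.85f, 0.9f), 0f),  // soft whitish pink
                new GradientColorKey(new Color(0.7f, 0.1f, 0.15f), 1f) // deeper red
            },
            new[] {
                new GradientAlphaKey(0f, 0f),
                new GradientAlphaKey(1f, 0.1f),
                new GradientAlphaKey(0f, 1f)
            });
        colorOverLifetime.color = gradient;

        // A little extra spin and gentle noise as particles fade out.
        var rotationOverLifetime = ps.rotationOverLifetime;
        rotationOverLifetime.enabled = true;
        rotationOverLifetime.z = 0.1f;

        var noise = ps.noise;
        noise.enabled = true;
        noise.strength = 0.05f;
        noise.frequency = 0.2f;

        ps.Play();
    }
}
```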
With the particle system explained and essentially done, the only thing left was to create a nice background. Alan wanted the same colors used in the original reference concept, so Eden created a gradient in Photoshop at the same dimensions as the screen. The full gradient can be seen below; it wraps around at the end, and a duplicate plane is placed on its right side so the pair can be animated to scroll sideways and eventually loop seamlessly back to the beginning. That took a bit of playing around, but it's pretty seamless now.
Eden also created a duplicate version with a bit of noise introduced for variation; this is applied to another plane, which is also animated to travel sideways at the same rate, with a bit of rotation for extra variation. That secondary noise plane also uses a Fade material so its alpha can be adjusted to fade in and out on the timeline.
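The scrolling itself is handled with animation, but for reference the wrap-around loop can also be sketched in a small script: move the gradient planes sideways and snap each one back by the total width once it scrolls past. A rough sketch, assuming each plane is exactly one screen-width wide:

```csharp
using UnityEngine;

// Rough sketch of the seamless scrolling background: identical gradient planes
// move sideways; when one fully exits, it jumps to the back of the line.
public class ScrollingBackground : MonoBehaviour
{
    public Transform[] gradientPlanes;   // the side-by-side gradient planes
    public float planeWidth = 48f;       // width of one plane in world units
    public float scrollSpeed = 1f;       // units per second

    void Update()
    {
        foreach (Transform plane in gradientPlanes)
        {
            plane.localPosition += Vector3.left * scrollSpeed * Time.deltaTime;

            // Once a plane has scrolled a full width off to the left,
            // move it to the right end so the loop is seamless.
            if (plane.localPosition.x <= -planeWidth)
            {
                plane.localPosition += Vector3.right * planeWidth * gradientPlanes.Length;
            }
        }
    }
}
```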
Finally, here we can see the whole scene as of the latest iteration, likely finished outside of some minor tweaking and polishing.
And here we can see the final result rendering in real time, in a very wide aspect ratio, in the Game View after a few moments have passed.
Today I continue finalizing and polishing our new Ambisonic Sound Template in Unity for use in future projects.
Mainly I brought the compass down a little closer to the horizon line, essentially eye level with the VR headset, and switched it to an unlit transparent material so no lighting is needed, saving a little on computation.
I'm also continuing to try and find a look for this blog, getting ready to have it be published and accessible for both internal documentation and any interested parties.
After this VR setup and ambisonic template testing is finished with the Oculus Rift S headsets, we plan to set up our Oculus Quest out in the main room, to take advantage of the space, for a possible future soundscapes project: users would walk around the entire area, broken up into sections, each with a different soundscape that Unity fades into.
Here you can see the main studio space in the CIM; this photo only captures about half of the room, so when cleared out, there's plenty of space to walk around in with the VR headset. All we need to do is disable the guardian boundary feature (it has a maximum size that doesn't let us use the full space), then find the center point of the room and create a custom script that acts as our own guardian boundary.
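That custom boundary script doesn't exist yet, but the idea can be sketched simply: measure the headset's offset from the room's center point and trigger a warning as it approaches the real walls. The names and dimensions here are illustrative placeholders.

```csharp
using UnityEngine;

// Hypothetical sketch of a custom room boundary: warn when the headset
// gets close to the edges of a rectangular play area centered on the room.
public class CustomBoundary : MonoBehaviour
{
    public Transform headset;        // the tracked HMD (e.g. the camera rig's center eye)
    public Transform roomCenter;     // measured center point of the studio
    public Vector2 roomSize = new Vector2(10f, 8f);   // usable width/depth in meters
    public float warningMargin = 0.5f;                // start warning this far from the edge
    public GameObject warningVisual;                  // e.g. a grid or vignette to fade in

    void Update()
    {
        Vector3 offset = headset.position - roomCenter.position;
        float distToEdgeX = roomSize.x * 0.5f - Mathf.Abs(offset.x);
        float distToEdgeZ = roomSize.y * 0.5f - Mathf.Abs(offset.z);
        float nearestEdge = Mathf.Min(distToEdgeX, distToEdgeZ);

        // Show the warning visual whenever we're within the margin of a wall.
        warningVisual.SetActive(nearestEdge < warningMargin);
    }
}
```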
Today marks the start of this ongoing CIM blog-post for posterity. Besides some short information about me and my projects at the top, everything from this point on will be ongoing updates about my personal work done at the CIM.
The image seen on the left is a screencap of a new Unity project that will act as a simple template going forward for those interested in Ambisonic sound in VR.
There is currently a Soundscapes class every Wednesday which, for their current project, is experimenting with creating soundscapes through the capture and editing of field recordings. These are then mixed into a quad-channel format for quad surround sound. Primarily, the class will use the quad speaker setup in the CIM for this, but Alan and I were also interested in creating a simple beginner's template that allows an ambisonic sound file to be played in VR using Unity. This may let the Soundscapes class experiment with and get interested in soundscapes in VR, as well as acting as a template for future ideas and projects.
The scene is quite simple, with only a few objects in it: a black backdrop and a simple compass that can be toggled for the user's orientation in space. The compass itself is a simple cylinder with a compass-style texture painted in Photoshop and aligned correctly. The cylinder is brought into Unity and uses a default material with cutout transparency to hide the rest of the cylinder so only the compass texture shows. It does need to be lit with a point light within the black scene, but that's an easy enough adjustment to make in the future.
All that needs to be done is dropping an ambisonic .WAV file into Unity, assigning it to the audio source, and pressing play.
In the above photo, you can see the Oculus Rift S and its controllers hooked up to the computer running the template project. While the template is finished, it wasn't without issues. The primary one is the Oculus itself: for some reason the software is stuck in a continuous updating loop that never finishes, and, on top of this and maybe related, the Home function in Oculus won't load either. As of now we still haven't quite solved this problem, but at least it hasn't prevented the project from working.
Above is the Oculus Software Update notification of death.
Below are some further notes & reminders for posterity.
Make sure that the OVRCameraRig and LocalAvatar are in the scene.
Make sure the audio file is a .WAV, an ambisonic recording with 4 audio channels, and that "Ambisonic" is checked in its import settings.
No special settings on the audio source itself; just make sure the Spatial Blend is set to 2D.
Make sure the toggle for the compass is attached to a separate object and references the compass through a separate variable, rather than "this.gameObject" (a small sketch of this follows below).
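A minimal sketch of what that toggle looks like conceptually; the field name and input check are placeholders rather than the exact template code:

```csharp
using UnityEngine;

// Sketch of the compass toggle: lives on a separate object and references the
// compass through a public field (not this.gameObject, which would disable
// the toggle script itself).
public class CompassToggle : MonoBehaviour
{
    public GameObject compass;   // assign the compass cylinder in the inspector

    void Update()
    {
        // Placeholder input: any key or controller button mapping could be used here.
        if (Input.GetKeyDown(KeyCode.Space))
        {
            compass.SetActive(!compass.activeSelf);
        }
    }
}
```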
This article was a great help in setting up the project to work with Oculus VR. Initially we couldn't get it to work, which may have been due to missed steps, but even loading the SampleScenes didn't help at first, so who knows.