Today was the first class. I learnt a lot about what the semester ahead will look like.
My GPU system looks like so:
The readings I chose for the discussion included:
Democratizing Virtual Production: Vū Partners with AbelCine - Virtual Producer
Vū Technologies, the world’s largest network of virtual studios, has partnered with cinematic integrator AbelCine to democratize access to virtual production through Vu One, an all-in-one, cost-effective virtual production system. Vu One integrates high-end hardware, proprietary software (including Vu.ai, Scene Forge, and Remote VP), and supports tools like Unreal Engine, making it easier for filmmakers, educators, brands, and enterprises to create high-quality digital content.
Avatar: The Way of Water pushed the limits of filmmaking technology, requiring a production pipeline that could handle 4K HDR at 48fps for stereoscopic viewing. To evaluate content in real-time as close to theatrical quality as possible, Lightstorm built a “projection pod” for live 3D monitoring.
This week we explored some really cool stuff in class. I learnt about the importance of the T-pose while building characters; here is what I came up with on Mixamo, as my Unreal MetaHuman wasn't loading up properly for some reason.
So I had a go at creating a MetaHuman on a device that supported it, and I'm really glad I was finally able to build my MetaHuman!
The second part of the assignment was to explore the mo-cap volume!
As part of the homework, we explored the mo-cap volume and its calibration process. It was a really cool experience, and we got a bit of assistance from people who were hanging around the volume.
We saved the takes as FBX files, and here is a video of our movements that I got to open up in TouchDesigner! It was super cool stuff:
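As a side note, a quick way to peek at the joint hierarchy inside an FBX before dragging it into TouchDesigner is to print the node tree. A minimal sketch, assuming Autodesk's FBX Python SDK (the `fbx` module) is installed; the filename is just a placeholder:

```python
# Print the node (joint) hierarchy of an FBX file as a sanity check.
# Assumes Autodesk's FBX Python SDK is installed; "mocap_take.fbx" is a placeholder.
import fbx

manager = fbx.FbxManager.Create()
manager.SetIOSettings(fbx.FbxIOSettings.Create(manager, fbx.IOSROOT))

importer = fbx.FbxImporter.Create(manager, "")
if not importer.Initialize("mocap_take.fbx", -1, manager.GetIOSettings()):
    raise RuntimeError("Couldn't open the FBX: "
                       + importer.GetStatus().GetErrorString())

scene = fbx.FbxScene.Create(manager, "scene")
importer.Import(scene)
importer.Destroy()

def walk(node, depth=0):
    """Recursively print each node so the joint hierarchy is visible."""
    print("  " * depth + node.GetName())
    for i in range(node.GetChildCount()):
        walk(node.GetChild(i), depth + 1)

walk(scene.GetRootNode())
manager.Destroy()
```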
As we are still exploring virtual worlds, here is the progress on the world I have been creating and working on for the past couple of weeks:
This week, we had to bring the animated silhouettes that we captured through mo-cap into our virtual worlds.
To be quite honest, I really struggled with this assignment and wasn't able to transfer the file into Unreal Engine. I hope to keep working on it and have it reflect more of what I want it to do.
A feeling that constantly overwhelms me while adjusting to and using these technologies: it's important to feel in control of the technology, and not to let it control how you feel about yourself and your progress.
Below are some visuals of us calibrating at the mo-cap volume!
A video of the mo-cap capturing real-time information, as well as some BTS :)
This week we hooked our MetaHumans up to the mo-cap FBX files, and then we also connected our phones to stream in real time and drive our Unreal MetaHumans!
It was super fun and interesting getting everything working, especially seeing the MetaHuman take the same form as what was being observed on the selfie camera.
This week was all about mixing our creations in Blender and having our MetaHumans/Mixamo characters work through retargeting, so it all runs in real time in Unreal.
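Under the hood, retargeting boils down to a correspondence between the two skeletons' bone names. A conceptual sketch of that mapping (illustrative only; Mixamo prefixes its joints with `mixamorig:`, while the UE mannequin uses names like `pelvis` — the IK Retargeter builds its chains on top of a correspondence like this, it is not literally a Python dict inside Unreal):

```python
# Conceptual sketch of the bone mapping at the heart of retargeting.
MIXAMO_TO_UE = {
    "mixamorig:Hips": "pelvis",
    "mixamorig:Spine": "spine_01",
    "mixamorig:Spine1": "spine_02",
    "mixamorig:Spine2": "spine_03",
    "mixamorig:LeftUpLeg": "thigh_l",
    "mixamorig:LeftLeg": "calf_l",
    "mixamorig:RightUpLeg": "thigh_r",
    "mixamorig:RightLeg": "calf_r",
    "mixamorig:LeftArm": "upperarm_l",
    "mixamorig:LeftForeArm": "lowerarm_l",
    "mixamorig:RightArm": "upperarm_r",
    "mixamorig:RightForeArm": "lowerarm_r",
    "mixamorig:Head": "head",
}

def map_bone(mixamo_name):
    """Look up the UE-side bone for a Mixamo joint (None if unmapped)."""
    return MIXAMO_TO_UE.get(mixamo_name)

print(map_bone("mixamorig:Hips"))  # -> pelvis
```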
Honestly, I'm still struggling to get my Mixamo character to connect to my FBX file, and to my Unreal MetaHuman as well... documented here is my struggle with connecting the two.
For some reason, as pictured, I'm unable to get my Mixamo character to connect to MotionBuilder. So I'm still stuck on, like... week 3?
RIP, let's see how this week goes.
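One thing worth trying next week: driving the import from MotionBuilder's own Python console instead of the UI, just to see where it breaks. A rough sketch using the built-in pyfbsdk module; the file path is just a placeholder:

```python
# Run from MotionBuilder's Python Editor. The file path is a placeholder.
from pyfbsdk import FBApplication, FBSystem

app = FBApplication()

# Merge the Mixamo FBX into the current scene
# (False = don't try to match against existing models).
if app.FileImport(r"C:\mocap\mixamo_character.fbx", False):
    # Print everything that arrived, to confirm the skeleton made it in.
    for comp in FBSystem().Scene.Components:
        print(comp.ClassName(), comp.Name)
else:
    print("Import failed -- check the path and the FBX version.")
```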
Midterms!
We captured mo-cap data for our midterm and are planning to recreate the dance scene from Pulp Fiction. Here is some of the BTS, our visualization, and the trajectory of our project.
I'm adding the visuals here so we can build our Unreal world to mimic the scene from Pulp Fiction.
We found the setting for our world and decided to go for a galaxy setting, in contrast to the Pulp Fiction aesthetic the audience would probably expect once they recognized the Uma Thurman and John Travolta characters.
Apart from helping the team retarget the character, I created a ton of FBX files with background characters, skimmed around and found the stage to center the performance on in Unreal, and also found samples of bar set-ups for the bar background in the scene.
https://youtube.com/shorts/gZpNZekW52c?feature=share
Here is a video of the retargeting so far!
Here is a video of the FBX file with the character creation.
Phew, we got week 10 off to recuperate from stressful midterms.
Anyways, I've started the rendering for my final project now, and I want to start prototyping for my thesis project, using TouchDesigner to explore AI and human interaction.
I did a bit of research and digging online into the multiverse of HCI/AI-human interaction and ideated a couple of really cool renders so far. I want to continue this project and prototype it for my final, and also include it as a developing prototype for my thesis.
hand recognition
Pictures of the .tox components so far
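For context on the hand recognition piece: one common way to prototype hand tracking before wiring it into a TouchDesigner network is MediaPipe's hand tracker. A minimal standalone sketch, assuming the `mediapipe` and `opencv-python` packages (not necessarily what my final network will use):

```python
# Minimal webcam hand-tracking sketch with MediaPipe Hands.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV delivers BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for landmarks in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, landmarks, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("hands", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```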
Here is some info about my thesis research, which I want to represent in an artistic way through TouchDesigner:
"My decision to pursue artificial intelligence and machine learning is a strategic one. My vision is to leverage advances in Natural Language Processing (NLP) and reinforcement learning to create adaptive therapeutic tools. Unlike static exercises, these systems could offer real-time phonetic feedback, track cognitive improvements, and dynamically adjust difficulty, personalizing the rehabilitation journey in a way that is impossible to scale with human therapists alone. These tools would be designed specifically to support the intuitive, spontaneous interactions crucial for reintegrating into daily life, helping patients rebuild the cognitive and communicative pathways disrupted by stroke. During my time at NYU, I took a course on AI systems that deepened my understanding of how to design AI-driven products for innovative, human-centered solutions. This experience helped me refine my technical perspective and will help me in developing intelligent systems that address real-world challenges."