Synthesis is an interactive installation built around a simple tree stump that becomes the gateway to a living narrative through a continuous animation. By layering an AR projection onto the stump's surface, the work merges digital and physical mediums to reinterpret the cinematic experience as spatial and participatory rather than confined to a screen. Encouraging the audience to engage with both the stump and the digital projections, it creates an interplay between the two mediums, transforming the set into a three-dimensional space.
Princess Mononoke: A creature of nature is turned into a hateful demon by a human attack that ultimately kills him.
Modern Times: The working class is dehumanized into cogs in a machine.
Wall-E: Humanity's complacent, total reliance on technology has rendered people sedentary and obese.
Art Projects that Inspired Our Work
Chasing Stars in the Shadows: Moon's use of digital projections transforms an empty room into another world.
Hyper-Reality: XR (AR, VR) can enhance perception but can also easily overwhelm it.
As ICAM students, we sought to understand how the intertwining of technology and art can shape both the artistic process and the audience's experience. Interested in technology's place within the art world, we wanted to discover how much the use of technology affects the way art is consumed. In filmmaking, the choice between analog and digital film can be seen as a trade-off between artistic imperfection and technical perfection. While digital cameras offer flexibility in post-production and cost savings, many filmmakers argue that the grain, color depth, and imperfections of film imbue each frame with a unique warmth that can't be replicated with zeros and ones.
Is the artistry of a creation diminished when it is made with a digital camera rather than on film? Why does Miyazaki choose to create his films on paper rather than take the optimized route of digital animation?
Synthesis was influenced by films that tackle this topic: Miyazaki's Princess Mononoke, Charlie Chaplin's Modern Times, and Pixar's Wall-E. Princess Mononoke depicts how technological advancement can overtake the natural world for the sake of progress: the rapidly growing Iron Town decimates the surrounding forest for its resources, and nature's retaliation manifests through demons born from the corpses of animals. Modern Times criticizes industrialization and its dehumanizing nature, and today reads as a stark reminder of AI's implications in a world where human artists are slowly being replaced by actual machines. Finally, Wall-E critiques complacency toward technology, showing how over-reliance ultimately leads to a loss of humanity.
We also drew inspiration from artists such as Joon Moon and Keiichi Matsuda, whose work explores the role technology plays in storytelling. Moon's Chasing Stars in the Shadows casts viewers into a visual landscape, using layered projections to blur the line between physical and digital mediums and demonstrating how technology can deepen immersion rather than detract from it. Matsuda's Hyper-Reality thrusts the audience into a world of dense AR overlays, critiquing how augmented interfaces can both enrich our perception and overwhelm our senses.
Ultimately, these were the factors that led us to create this project.
1995 Coca Cola Ad
2025 AI Coca Cola Ad
This ad came under fire for its blatant use of AI: the 1995 commercial was used as source material to generate the new one. Viewers have compared the two works to examine the differences in the emotions each conveys to its audience.
We were eager to explore the relationship between these two mediums, especially given the increased capabilities of AI. The influence it is bringing to the culture of art was one we could no longer ignore. Driven by this relationship, we asked ourselves, "Does the use of technology devalue the authenticity of the art?"
While some see this technology as a simple tool for artistic expression, others dub it an affront to the creative process. With so many AI creations seeping into the digital world, it seems almost inevitable that AI will take over. We noticed that this pattern of a technology dominating a landscape was not unknown to us. Film cameras, for instance, have been pushed out of the limelight in favor of the ease that digital cameras provide. Animation has also been slowly moving away from its traditional pencil-and-paper roots toward the optimization that digital programs provide. Artists have already lost jobs to AI, and its output now appears on billboards and in Coca-Cola ads.

Digital painting software was once uncommon compared to traditional methods; as a new audience began to adopt these tools, their influence came to dominate the community. Will AI follow this same path? Or does its definition as a "tool" differ from that of the camera and the software? Inspired by these questions, by the cyclical nature of replacement, and by the influence technology poses to the work of artists, we wanted to create a project that reflected this. Our conversations around the topic ultimately fueled our decision to join the discussion ourselves, through our project, Synthesis.
The digital elements were made from start to finish in Blender, where each asset was modeled, UV mapped, and animated. The project required models of trees, a deer head, and buildings, and we modeled all of these and their variations in the program. Once all of the assets were created, we went through many iterations of how we wanted to present the AR.
We first used Adobe Aero to prototype the AR experience, importing our Blender animations and anchoring looping sequences to the real-world stump. Though the animation played, we noticed that as the scene grew more complex, Aero's performance suffered greatly. After testing a few other methods, we ultimately landed on Reality Composer. The animations were exported via FBX into the software, where the AR scene was assembled. In Reality Composer, an anchor was configured to recognize the stump's real-world surface and continually play the imported animation loop. As the viewer circles the stump, virtual cameras sample the animation from slightly shifted angles, ensuring the digital imagery remains locked to the stump's surface.
At the same time, the physical tree stump was built from a lightweight frame of cardboard and foam, layered with a papier-mâché mix of paper, drywall compound, and glue. The surface was sculpted while drying and, once fully dry, painted to mimic a realistic bark texture. This physical sculpture served as the tangible anchor point for our AR scene.
For final deployment, we brought each USDZ file into Reality Composer and used AR Quick Look on a mobile device to launch the AR scene. Finally, we tested on-device to debug and fine-tune tracking stability, brightness levels, and shader parameters so that the AR projection behaved properly.
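One recurring failure mode in a pipeline like this is a malformed USDZ package that loads silently wrong (or not at all) in AR Quick Look. The USDZ format is a plain zip archive whose members must be stored uncompressed and which must contain a USD scene file. The sketch below is a hypothetical sanity-check helper (the name `inspect_usdz` is our own, not part of Reality Composer or any Apple tooling) that uses Python's standard `zipfile` module to flag these two common packaging problems before a file ever reaches the device.

```python
import zipfile

# Per the USDZ specification, a package is a zip archive whose entries
# are stored uncompressed (ZIP_STORED); compressed entries are invalid
# and can cause AR Quick Look to reject the file.
USD_EXTENSIONS = (".usd", ".usda", ".usdc")

def inspect_usdz(source):
    """Return (member_names, issues) for a .usdz package.

    `source` may be a file path or a file-like object. `issues` is a
    list of human-readable problem descriptions; an empty list means
    the two checks below both passed.
    """
    issues = []
    with zipfile.ZipFile(source) as pkg:
        members = pkg.namelist()
        for info in pkg.infolist():
            if info.compress_type != zipfile.ZIP_STORED:
                issues.append(f"{info.filename}: compressed entry (must be stored)")
        if not any(name.endswith(USD_EXTENSIONS) for name in members):
            issues.append("no USD scene file (.usd/.usda/.usdc) in package")
    return members, issues
```

Running a check like this over each export would surface a broken package at the desk rather than as a silent failure during on-device testing.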