PROJECT RUNWAY AR 2.0
The goal of this project is to produce a high-fashion runway event showcasing augmented reality technology. Real-time animated computer graphics will be attached to clothing using image-based augmented reality tracking. The clothing designs and their custom textile patterns stand on their own, but when viewed through a computer-vision-enabled camera the augmentation is revealed.
The show itself will have one main large-format projector and multiple television monitors connected to a tablet to display the augmentations. The audience will also be invited to use their own mobile devices to see the augmented clothing from their point of view. The pipeline for this project is artist-friendly, meaning existing state-of-the-art, off-the-shelf software will be used to produce the results. Project RunwayAR will bring together multiple disciplines: fashion design, graphic design, game development, and computer animation.
Fashion designs by artists Aastha Shah and Peggy Kuo
Aastha Shah is a graduate in Fashion Design from the Academy of Art University in San Francisco. Aastha celebrates her Indian culture through her fashion designs.
Taipei-born, Los Angeles-raised textile designer Peggy Kuo currently resides in San Francisco, where she recently graduated from the Academy of Art University with a BFA in Textile Design.
Will has been involved in state-of-the-art computer graphics for four decades as an artist and software engineer. Will came up with the idea of using augmented reality in a high-fashion runway event. He is designing the augmented reality pipeline as well as creating some of the CG content.
Steven "Goody" Goodale began his gaming career in 1992 at Sega. Steven is Game Design Lead in the Academy of Art University Gaming Department in San Francisco.
Pan is a computer graphics artist who specializes in visual effects. Pan is a graduate of the Academy of Art University in San Francisco and currently works in the game industry. He will be implementing some of the interactive CG content for the project.
Mahith is a graduate of the Academy of Art University in San Francisco and a passionate, witty creator of VFX in the realm of spells, explosions, fire, rain, sub-sonic lasers, world expansions, and many other dynamic effects. Mahith has been working as a freelance VFX and web designer for the past couple of years. He will be creating particle systems and marker images.
The AR Pipeline
The AR pipeline is designed to be artist-friendly. The software applications have been selected so that no source-code programming is required. The CG model and animation assets will be made in Maya and imported into the Unity game engine via FBX files. The AR tracking will be performed with Vuforia, an easy-to-use, state-of-the-art augmented reality development kit for Unity. All textile patterns will go through testing to confirm that they are consistently trackable: the image markers must be asymmetrical, with a high degree of contrast and varying shapes. The artist team will include fashion designers, CG modelers, CG animators, and Unity game developers.
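The two marker properties named above, contrast and asymmetry, can be roughly pre-screened before a textile pattern is printed and sent through Vuforia's own target-quality rating. As a minimal illustration (not part of the Vuforia toolchain; the scoring functions and sample patterns here are hypothetical), a small grayscale patch can be checked like this:

```python
# Hypothetical pre-check for marker candidates. Vuforia assigns its own
# star rating to image targets; this stdlib-only sketch only illustrates
# the two properties the pipeline requires: contrast and asymmetry.
from statistics import pstdev

def contrast_score(gray):
    """Population std-dev of pixel values (0-255); higher = more contrast."""
    pixels = [p for row in gray for p in row]
    return pstdev(pixels)

def asymmetry_score(gray):
    """Mean |difference| between the patch and its 180-degree rotation.
    A perfectly symmetrical pattern scores 0 and would track poorly."""
    rotated = [list(reversed(row)) for row in reversed(gray)]
    diffs = [abs(a - b)
             for row, rrow in zip(gray, rotated)
             for a, b in zip(row, rrow)]
    return sum(diffs) / len(diffs)

# A symmetrical checkerboard vs. an irregular high-contrast pattern:
checker = [[0, 255, 0, 255],
           [255, 0, 255, 0],
           [0, 255, 0, 255],
           [255, 0, 255, 0]]
irregular = [[0, 255, 255, 32],
             [200, 10, 90, 255],
             [0, 0, 180, 60],
             [255, 40, 0, 120]]

print(asymmetry_score(checker))    # 0.0 -- symmetrical, poor marker
print(asymmetry_score(irregular))  # nonzero -- better candidate
print(contrast_score(irregular))   # wide spread = strong contrast
```

A checkerboard looks high-contrast but is identical under a 180-degree rotation, which is exactly why the textile patterns must also be asymmetrical.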
Mock-up of the catwalk and set design for the Project RunwayAR event.
This set will have one main/hero mobile device, with its output broadcast to large monitors strategically placed so the audience can see the computer-generated augmentation. The set requires bright lighting to give the mobile devices a clear view of any markers on the clothing. The HDMI output of the hero iPad will be wirelessly broadcast with Apple TV/AirPlay to the monitors placed around the runway. Some members of the audience will be provided with iPads to view the AR from their own point of view, or to switch to the output of the hero iPad.
As a proof of concept for the proposal, we did a test shoot with one of Aastha's and Peggy's designs. The purpose of the test shoot was to confirm that the technology performs as expected. The shoot isn't just to get footage of models walking toward camera; it's a live simulation of the runway event (in our view, no elaborate set is required for this test). A video camera will record the point of view of whoever is holding the iPad. The iPad screen will be recorded through a USB cable connected to a Mac laptop, or it can be recorded directly on the device through a new iOS 11 feature. The test shoot should not take more than a few hours. The final edited test-shoot video will include clips of someone holding the iPad and clips of the full-screen augmented reality.
An early test that inspired the Project RunwayAR idea
This was all done with C++ programming, using the open-source computer vision library OpenCV, and rendered with the OpenGL graphics library. Printed images were used as tracking markers on which wireframe geometry was rendered.
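At the heart of that early test is a standard piece of planar-tracking math: once the printed marker is located in the camera frame (OpenCV can estimate this mapping, e.g. from matched image features), a 3x3 homography maps points from marker space into camera-image space, and the wireframe is drawn at the projected positions. A minimal stdlib-only Python sketch of just that projection step (the matrix below is a hand-picked example, not one estimated from real footage):

```python
# Sketch of the marker-to-camera mapping behind the early OpenCV test.
# In a real pipeline the homography H is estimated from the video feed;
# here H is an illustrative example (scale by 2, translate by 100, 50).

def apply_homography(H, pt):
    """Map a 2D point through a 3x3 homography via homogeneous coordinates."""
    x, y = pt
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xh / w, yh / w)

H = [[2.0, 0.0, 100.0],
     [0.0, 2.0,  50.0],
     [0.0, 0.0,   1.0]]

# Corners of a unit-square marker in marker space.
marker_corners = [(0, 0), (1, 0), (1, 1), (0, 1)]

# Project each corner into image space; the wireframe overlay is drawn
# by connecting these projected points (OpenGL did the drawing in the test).
projected = [apply_homography(H, c) for c in marker_corners]
print(projected)
# [(100.0, 50.0), (102.0, 50.0), (102.0, 52.0), (100.0, 52.0)]
```

The same idea scales up in the artist-friendly pipeline: Vuforia performs the tracking and hands Unity a transform, so no one on the team has to write this math by hand.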
Unity Particle Animation Tests
In addition to 3D geometry, we can attach particle systems to the clothing.
In the future we may be wearing tight-fitting jumpsuits with distinguishable textile patterns that will be fully replaced with virtual fashion ordered online. This assumes that most of us will be wearing computer vision hardware 24/7. At the current pace of advances in computer vision, there may be no marker patterns required on the jumpsuits at all: the vision algorithms will be capable of tracking human body features directly.