PROGETTI
Tracing and Transforming Space
Using Procreate, draw over your own photographs or work from observation to capture the architecture of Florence.
Draw from observation or trace perspective from your photographs
Extend or alter the structure
Introduce an element that does not exist or could not exist
Output:
2 altered architectural images
Screen:
Perspective
Drawing perspective in Procreate
Artists:
Unfinished: In Progress
Florence is full of bodies that are not whole—figures emerging from stone, torsos without arms, limbs without origin. At places like Galleria dell'Accademia, you encounter the tension between the ideal human form and something perpetually unfinished, caught between material and release.
But this condition extends beyond sculpture. The living bodies here—tourists and residents alike—are in constant flux. Visitors arrive, overlap, and move on; locals repeat gestures until they fuse with the city itself. Travel reshapes perception, posture, and identity—you are not quite who you were, and not yet who you will become.
Collage is already embedded in Florence—layered posters, graffiti, fragments accumulating on walls. Even the collision of humanity in its streets becomes a kind of living collage.
Like Michelangelo's Prisoners, your collages should reflect this emergence: bodies forming, dissolving, interrupted, and transformed mid-experience—held in the unresolved state of becoming.
Focus on the human figure (yourself, classmates, strangers, statues, reflections, fragments in windows, etc.)
Use only your own photographs
Cut, layer, and recombine
Explore emergence / incompleteness
Output:
3 collages (PNG or JPG)
Screen:
Collaging with Procreate:
Artists:
Face as Interface
Create two AR face masks using Reality Composer that transform the face into a constructed, hybrid surface. Combine 2D image elements with built-in 3D assets, experimenting with scale, repetition, and spatial layering. The first mask (pre-Milan) should be exploratory and playful. The second (post-Milan) should draw from your experiences in Milan, showing greater conceptual focus and visual cohesion. You are encouraged to incorporate photogrammetry captures to introduce depth and realism. Export short videos of both masks and upload your .reality files for others to experience.
Mask 1: Experimental (pre-Milan), due Wednesday
Mask 2: Informed by Milan (post-Milan)
Optional:
Incorporate photogrammetry captures
Output:
Short videos of both masks
Upload the .reality or .usdz file to Google Drive for others to experience
Screen:
Reality Composer:
Artists:
The Return of the Medici
Create a site-responsive AR experience using a horizontal image anchor. Your work should draw directly from the historical, artistic, or cultural context of Uffizi Gallery, Palazzo Pitti, or another location in Florence that has meaningfully impacted you during your time here.
Translate a specific reference—an artwork, architectural detail, historical narrative, or spatial experience—into an augmented reality intervention. The goal is not to replicate, but to reinterpret: extend the past into the present.
Your AR experience should:
Use a horizontal image anchor as the entry point
Expand from a flat image into three-dimensional space
Incorporate interaction and/or animation
Demonstrate a clear conceptual link to Florence’s history
Utilize tools such as Polycam or Meshy for asset creation
Think of this as activating history—allowing an image to unfold into a living, spatial idea.
We will review and experience all projects together on the large class table.
Output:
1 AR scene (screenshot/video + working file)
Screen:
Reality Composer:
Artists:
Augmented Urban Intervention
Synthesize your work in perspective, collage, face-based AR, and spatial anchoring into a final, site-specific intervention within Florence. Create a concept-driven AR experience tied to a precise location. Draw inspiration from the Venice Biennale—its contemporary aesthetics and narratives—as well as Venice’s textures and atmosphere. You may use any anchor types available in Reality Composer. The final work will be published in a public iOS app on the Apple App Store, framing this as a real exhibition, not just classwork.
Must respond to a precise, non-transferable location within Florence
Must be conceptually grounded (idea first, not assets)
Must be poetic, intentional, and spatially resolved
Must exist within the defined historic Florence boundary
Must draw inspiration from the Venice Biennale and Venice’s atmosphere (not literal replication)
May use any anchor types available in Reality Composer
Must function reliably in situ (stable anchoring, scale, lighting awareness)
Must be exhibition-ready for public release via the Apple App Store
Project Timeline
Assignment Launch — May 4
Introduction to the final project, expectations, and exhibition framework.
Preliminary Concept Presentations — May 11
Each team must present a clearly defined direction, including:
1–2 proposed sites (specific, documented locations)
Concept summary (2–3 sentences: what, where, why)
Mood board / visual references (aesthetic, tone, influences)
Rough spatial plan (sketch, diagram, or simple mockup)
Proposed interaction or behavior
Initial asset direction (what will be made vs. sourced)
Ideation + Development Workshop — May 11
In-class refinement session with guest artist Mindy McDaniel, focused on selecting the final site and locking in the concept, site specificity, and production plan.
Venice Trip (Biennale Visit)
Gather inspiration from the Venice Biennale—focus on contemporary aesthetics, narrative, and spatial strategies.
Progress Check Meetings — May 18
Each team presents a working, in-progress AR experience in the studio:
Core elements built and functioning
Anchoring method established and tested
Clear spatial behavior and scale
Concept actively realized (not speculative)
Defined path to completion
Note: This is a production checkpoint—not a concept review. Projects should be partially built and functioning.
Final Submission Deadline — May 29
Submit the completed AR experience as a .reality file. All projects will be integrated into the class app for debugging and final testing prior to public release.
Output:
Presented as part of a class-wide, geo-located AR exhibition within a publicly released iOS app on the Apple App Store
Fully functional AR scene integrated into the class app (final .reality file or equivalent)
Documentation:
1 composed screenshot (intentional framing)
1 short video (5–15 seconds, stable and readable in situ)
Short concept statement (3–5 sentences: what, where, why)
All assets and files organized and submitted for app integration