Concepts
Visual:
Depth must parallel visual cues
Gradual changes from dark to light and vice versa
Place objects just below line of sight
Motion:
Visual motion should match vestibular motion
Objects should not interfere with the viewer’s perception of movement
Eye effort:
Objects viewed for a long time should sit below the horizon, near the center of view
Provide natural break points in experience
Concepts:
Spatial Audio
Reverberation on object in room (from spatial audio sources)
Add slight randomization (repeat sounds with small pitch and amplitude changes; see the sketch after this list)
Keep it subtle and use it for timed feedback
General Audio
Should have certain randomization (pitch)
Placed to match object or space
Keep volume low enough not to take over the space
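A minimal RealityKit sketch of these ideas, assuming a bundled "chime.wav"; the entity and file names are illustrative.

```swift
import RealityKit

// Sketch: a spatial audio source with mild per-play randomization so
// repeated sounds don't feel mechanical. "chime.wav" is an assumed asset.
func playChime(on entity: Entity) async throws {
    // Directivity projects the sound in a direction instead of uniformly.
    entity.components.set(SpatialAudioComponent(directivity: .beam(focus: 0.5)))

    let resource = try await AudioFileResource(named: "chime.wav")
    let controller = entity.playAudio(resource)

    // Slight pitch (playback speed) and amplitude (gain) change per repetition.
    controller.speed = Double.random(in: 0.95...1.05)
    controller.gain = Double.random(in: -3...0)   // relative decibels
}
```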
Concepts:
RealityKit
Add interactivity, audio, and animation
Add 3D elements
Audio
Spatial audio: by default, sources sound like they actually exist, using directivity to project sound in a direction
Ambient audio: audio for a surrounding environment
Channel audio: comes straight through with no directionality (it follows your head as you turn); see the sketch below
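A sketch contrasting the three audio component types; the entity roles are illustrative.

```swift
import RealityKit

// Sketch: the three visionOS audio component types, one per entity.
func configureAudio(source: Entity, surroundings: Entity, narration: Entity) {
    // Spatial (the default): sounds like it exists in the room,
    // with directivity projecting sound in a direction.
    source.components.set(SpatialAudioComponent(directivity: .beam(focus: 0.7)))

    // Ambient: enveloping audio for a surrounding environment.
    surroundings.components.set(AmbientAudioComponent())

    // Channel: plays straight through with no directionality,
    // so it follows the listener's head.
    narration.components.set(ChannelAudioComponent())
}
```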
SwiftUI
Defines windows/spaces
Volumetric windows: hold 3D models and can be seen in any space
Immersive/fully immersive spaces: extend content beyond a window (scene setup sketched below)
Coordinate-conversion functions translate between RealityView content coordinates and the surrounding space
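A sketch of how an app declares these scenes; the ids, view names, and immersion style are illustrative.

```swift
import SwiftUI
import RealityKit

// Sketch: one volumetric window plus one immersive space.
@main
struct SpatialApp: App {
    var body: some Scene {
        WindowGroup(id: "model") {
            ModelView()                  // 3D content inside a bounded volume
        }
        .windowStyle(.volumetric)

        ImmersiveSpace(id: "immersive") {
            ImmersiveView()              // content extends beyond any window
        }
        .immersionStyle(selection: .constant(.full), in: .full)
    }
}

struct ModelView: View {
    var body: some View { Text("3D model goes here") }
}

struct ImmersiveView: View {
    var body: some View { RealityView { _ in /* build content */ } }
}
```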
Materials
Shader-graph materials applied from RealityView code (sketched below)
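A sketch of loading a shader-graph material authored in Reality Composer Pro; the material path and file names are illustrative.

```swift
import RealityKit
import RealityKitContent

// Sketch: load a material from the RealityKitContent package and apply it.
func applyGlowMaterial(to model: ModelEntity) async throws {
    let material = try await ShaderGraphMaterial(
        named: "/Root/GlowMaterial",     // path to the material in the scene
        from: "Materials.usda",          // file in the RealityKitContent package
        in: realityKitContentBundle)
    model.model?.materials = [material]
}
```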
Reality Composer Pro
Add, compose, and edit 3D content (USD files)
Input targets
Collision component and shape
Set up gestures such as dragging an entity (the "DragComponent" example; sketched below)
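A sketch of a draggable entity: hit testing needs both an input target and a collision shape. Sizes and names are illustrative.

```swift
import SwiftUI
import RealityKit

// Sketch: an entity that can be dragged with a targeted DragGesture.
struct DraggableBoxView: View {
    var body: some View {
        RealityView { content in
            let box = ModelEntity(mesh: .generateBox(size: 0.2))
            box.components.set(InputTargetComponent())
            box.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
            content.add(box)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Convert the gesture location into the entity's parent space.
                    guard let parent = value.entity.parent else { return }
                    value.entity.position = value.convert(
                        value.location3D, from: .local, to: parent)
                }
        )
    }
}
```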
Concepts:
Xcode
App project template
Initial scene: Window (2D) or Volume (3D)
Immersive Space: None, Mixed, Progressive (partial passthrough), or Full (avoid introducing immersion abruptly)
WindowGroup (top level)
ContentView (main code)
VStack (vertical stack)
RealityView
The update closure runs only when observed state changes
Gesture toggle flips that state (sketched below)
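A sketch mirroring the template's structure: ContentView hosting a RealityView in a VStack, with an update closure driven by toggled state. The "enlarge" state and sphere are illustrative.

```swift
import SwiftUI
import RealityKit

// Sketch of the template's shape: WindowGroup -> ContentView -> RealityView.
struct ContentView: View {
    @State private var enlarge = false

    var body: some View {
        VStack {
            RealityView { content in
                // Runs once: build the initial content.
                let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
                sphere.components.set(InputTargetComponent())
                sphere.generateCollisionShapes(recursive: false)
                content.add(sphere)
            } update: { content in
                // Runs only when observed state (enlarge) changes.
                if let sphere = content.entities.first {
                    let s: Float = enlarge ? 1.4 : 1.0
                    sphere.transform.scale = [s, s, s]
                }
            }
            .gesture(
                TapGesture()
                    .targetedToAnyEntity()
                    .onEnded { _ in enlarge.toggle() }
            )
            Toggle("Enlarge", isOn: $enlarge)
                .toggleStyle(.button)
        }
    }
}
```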
Simulator
Bottom bar for main commands
Xcode Preview
All commands still there
Reality Composer Pro
Develop shapes
USDZ Cloud model for pre-created models
Data
-z points in front of the viewer
+y runs from toes to head
+x points to the right
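A tiny sketch of the convention; the function and placement values are illustrative.

```swift
import RealityKit

// visionOS/RealityKit convention: +x right, +y up, -z in front.
// Units are meters; this places an entity 1.5 m ahead at head height.
func placeMarker(in root: Entity) {
    let marker = Entity()
    marker.position = SIMD3<Float>(0, 1.5, -1.5)
    root.addChild(marker)
}
```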
Immersive Content
ImmersiveSpace
No ".update" needed
Can add ".gesture" with .targetedToAnyEntity
ContentView
@Environment(\.openImmersiveSpace) var openImmersiveSpace
The call is asynchronous (see the sketch below)
ImmersiveSpace starts static
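A sketch of launching the space from a window view; the id "immersive" is illustrative and must match an ImmersiveSpace scene.

```swift
import SwiftUI

// Sketch: opening an immersive space asynchronously from a button.
struct LauncherView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter Immersive Space") {
            Task {
                // Asynchronous: suspends until the system presents the space.
                switch await openImmersiveSpace(id: "immersive") {
                case .opened: break                    // now showing
                case .error, .userCancelled: break     // handle as needed
                @unknown default: break
                }
            }
        }
    }
}
```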
Concepts:
ARKit
Can interact with content (i.e. place components and knock down)
APIs available in both Swift and C
Anchors
Used for position and orientation in relation to real objects
ARKitSession
Multiple ARKit features integrated
Sensors --> ARKit daemon --> ARKit clients (app)
App needs to request authorization (built-in features can trigger this as well)
Hand tracking must be authorized by the user (sketched below)
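A sketch of one ARKitSession running multiple data providers, with the hand-tracking authorization check; provider choices are illustrative.

```swift
import ARKit

// Sketch: sensors feed the ARKit daemon, which feeds the app's session.
// Hand tracking needs user approval (and an Info.plist usage string).
func startTracking() async throws {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()
    let handTracking = HandTrackingProvider()

    // Prompt for hand-tracking authorization before running the provider.
    let result = await session.requestAuthorization(for: [.handTracking])
    guard result[.handTracking] == .allowed else { return }

    try await session.run([worldTracking, handTracking])
}
```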
World Tracking
Anchoring (6 DoF) prevents content from jumping when the Digital Crown is used
Uses WorldTrackingProvider
Location based (can identify where person is)
Use Metal to render objects
Scene Understanding (.wall, .floor, .ceiling, .table, .seat, .window, .door)
Plane detection (PlaneAnchor) for placement and physics (sketched after this list)
Can identify known objects
Scene Geometry
Mesh Anchors - complex object physics
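A sketch of consuming plane anchors and their semantic classification; the alignment choices are illustrative.

```swift
import ARKit

// Sketch: watch detected planes and react to their classification.
func watchPlanes() async throws {
    let session = ARKitSession()
    let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])
    try await session.run([planes])

    for await update in planes.anchorUpdates {
        // Classifications include .wall, .floor, .ceiling, .table,
        // .seat, .window, .door
        if update.anchor.classification == .table {
            // e.g. place content or add physics on the table plane
        }
    }
}
```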
Image Tracking
Used to track known real-world images
Load from an asset catalog group or create from a CGImage (see the sketch below)
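A sketch of image tracking, assuming an "AR Resources" asset catalog group; the group name is illustrative.

```swift
import ARKit

// Sketch: track known 2D images and read their world-space pose.
func trackImages() async throws {
    let images = ReferenceImage.loadReferenceImages(inGroupNamed: "AR Resources")
    let provider = ImageTrackingProvider(referenceImages: images)

    let session = ARKitSession()
    try await session.run([provider])

    for await update in provider.anchorUpdates where update.anchor.isTracked {
        // originFromAnchorTransform gives the image's pose in world space.
        _ = update.anchor.originFromAnchorTransform
    }
}
```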
Hand Tracking
HandTrackingProvider uses HandAnchor
Chirality identifies which hand the anchor represents
The anchor's transform is at the wrist (the skeleton's origin)
Joints reference a parent joint up to a root (tree format); see the sketch below
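A sketch of reading a fingertip pose from hand anchors; joint transforms are relative to the wrist, so they compose with the anchor's origin transform.

```swift
import ARKit

// Sketch: compute a world-space fingertip transform from a HandAnchor.
func watchHands(_ handTracking: HandTrackingProvider) async {
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.chirality == .right,                 // which hand
              let skeleton = anchor.handSkeleton else { continue }

        let tip = skeleton.joint(.indexFingerTip)
        // world = origin_from_anchor * anchor_from_joint
        let worldTransform = anchor.originFromAnchorTransform
            * tip.anchorFromJointTransform
        _ = worldTransform
    }
}
```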
Concepts
Volumetric Windows (.volumetric)
fixed in scale
Model3D
Loading phases (.empty, .success, .failure); see the sketch below
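A sketch of Model3D's AsyncImage-style phase handling; the model name "Globe" is illustrative.

```swift
import SwiftUI
import RealityKit

// Sketch: handle each loading phase of a bundled 3D model.
struct GlobeView: View {
    var body: some View {
        Model3D(named: "Globe") { phase in
            switch phase {
            case .empty:
                ProgressView()                 // still loading
            case .success(let model):
                model.resizable().scaledToFit()
            case .failure(let error):
                Text("Load failed: \(error.localizedDescription)")
            @unknown default:
                EmptyView()
            }
        }
    }
}
```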
TimelineView
for animation and movement
Attachments
annotations or editing UI
live SwiftUI views embedded in 3D (sketched below)
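A sketch of attaching a live SwiftUI label to an entity; ids and names are illustrative.

```swift
import SwiftUI
import RealityKit

// Sketch: a SwiftUI view delivered as an entity and parented to a model.
struct AnnotatedView: View {
    var body: some View {
        RealityView { content, attachments in
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
            content.add(sphere)

            // Attachments arrive as entities; position them like any other.
            if let label = attachments.entity(for: "label") {
                label.position = [0, 0.15, 0]
                sphere.addChild(label)
            }
        } attachments: {
            Attachment(id: "label") {
                Text("A live SwiftUI view")   // stays interactive and updates
                    .padding()
                    .glassBackgroundEffect()
            }
        }
    }
}
```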
Gestures
CollisionComponent for interactions with the hand
MagnifyGesture and RotateGesture track hand movement for scaling and rotation (sketched below)
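A sketch of pinch-driven scaling with MagnifyGesture targeted at entities; sizes and names are illustrative.

```swift
import SwiftUI
import RealityKit

// Sketch: scale an entity with a magnify (pinch) gesture.
struct ScalableBoxView: View {
    @State private var baseScale: SIMD3<Float> = [1, 1, 1]

    var body: some View {
        RealityView { content in
            let box = ModelEntity(mesh: .generateBox(size: 0.2))
            box.components.set(InputTargetComponent())
            box.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
            content.add(box)
        }
        .gesture(
            MagnifyGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    value.entity.scale = baseScale * Float(value.magnification)
                }
                .onEnded { value in
                    baseScale = value.entity.scale   // keep the final scale
                }
        )
    }
}
```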
Concepts:
Features of good games
standalone spatial computing device
high resolution, wide field of view, high refresh rate
spatial audio
world mapping
hand tracking
Vision Pro Spaces
Shared Space, Full Space, or Fully Immersive
2D Window or 3D Volume
Game Types
arcade
casual
tabletop
interactive
transformative
Rendering
Dynamic foveation: renders at full resolution where the player is looking
Realistic blend (custom in MaterialX)
System effects
Depth mitigation: resolves depth conflicts where virtual content overlaps real objects
Near-field vignette: fades content that gets too close to the viewer
Breakthrough: nearby real objects and people show through if too close
Grounding shadows: virtual objects cast shadows on real surfaces for realism
Audio
Play audio through RealityKit
Hand Tracking
Uses ARKit
Request permission for tracking from user
Only hands visible to the cameras are tracked
Fast hand movements can be missed because of tracking latency
Concepts
Working with Augmented Reality
ARKit
RealityKit
Immersive Spaces
Only one immersive space can be open at a time: dismiss the current one before opening another (e.g., switching between ConceptView and ImmersiveView; see the sketch after this list)
Locations are relative to user space
Use anchors to target locations of surface
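A sketch of switching spaces by dismissing the current one first; the space ids reuse the note's view names and are illustrative.

```swift
import SwiftUI

// Sketch: dismiss the active immersive space, then open another.
struct SpaceSwitcher: View {
    @Environment(\.openImmersiveSpace) private var openSpace
    @Environment(\.dismissImmersiveSpace) private var dismissSpace

    var body: some View {
        Button("Switch to ImmersiveView") {
            Task {
                await dismissSpace()                       // close ConceptView's space
                _ = await openSpace(id: "ImmersiveView")   // then open the next one
            }
        }
    }
}
```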
Reality View
Loads asynchronously
Retrieves data for head and hand positions
Spaces (.active or .inactive)
Coordinate conversions occur through ImmersiveSpace
Virtual hands
Useful for matching the look of hands to the space
Concepts
Sliders
Used for range of values
Inputs
any type of variable (booleans, integers, floats, etc.)
ShaderGraphs
Colors calculated by data
Inputs can be taken from the user
Can map/convert one input type to another (e.g., float to color); see the sketch after this list
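A sketch of feeding a Slider's float into a shader-graph parameter; the parameter name "Progress" and the material setup are illustrative assumptions.

```swift
import SwiftUI
import RealityKit

// Sketch: drive a ShaderGraphMaterial input from a SwiftUI Slider.
struct TintedModelView: View {
    @State private var progress: Float = 0.5

    var body: some View {
        VStack {
            RealityView { content in
                // Assumes a model with a ShaderGraphMaterial is added here.
            } update: { content in
                guard let model = content.entities.first as? ModelEntity,
                      var material = model.model?.materials.first as? ShaderGraphMaterial
                else { return }
                // Map the slider's float input onto the material parameter.
                try? material.setParameter(name: "Progress", value: .float(progress))
                model.model?.materials = [material]
            }
            Slider(value: $progress, in: 0...1)
        }
    }
}
```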
Hand tracking (position and orientation of hands)
Movement of object (happiness beam) based on hand location
Looked at Bluetooth integration on iOS
Narrowing scans to a specific target device (e.g., filtering by service UUID); sketched below
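A CoreBluetooth sketch of that filtering idea: scan only for peripherals advertising a given service instead of everything. The service UUID is illustrative.

```swift
import CoreBluetooth

// Sketch: scan for a specific kind of device by service UUID.
final class BluetoothScanner: NSObject, CBCentralManagerDelegate {
    private let targetService = CBUUID(string: "180D")   // e.g. Heart Rate service
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Passing service UUIDs narrows discovery to matching peripherals.
        central.scanForPeripherals(withServices: [targetService])
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        print("Found:", peripheral.name ?? "unknown", RSSI)
    }
}
```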