Before | After | Goal
------ | ----- | ----
2 | 5 | Goal 1: can articulate different AR/VR visualization software tools
1 | 4 | Goal 2: can comparatively analyze AR/VR visualization software tools
1 | 5 | Goal 3: can choose the most appropriate visualization tool for each project
1 | 5 | Goal 4: become familiar with hardware interactions
4 | 5 | Goal 5: can develop a well-planned research strategy aligned with the research question
3 | 5 | Goal 6: become familiar with setting up virtual environments and building projects
1 | 5 | Goal 7: can think critically about what is currently feasible and what needs improvement
3 | 5 | Goal 8: can iteratively adjust and improve designs
3 | 5 | Goal 9: communicate ideas more clearly
3 | 5 | Goal 10: can evaluate my work through self/peers/professor feedback, and identify takeaways
Project 1 Proposal <Google doc>
Project 1 Proposal Presentation <Google Presentation>
Project 1 Progress Presentation <Google Presentation>
Project 1 Final Presentation <Google Presentation>
Project 2 Proposal Presentation <Google Presentation>
Project 2 Progress Presentation <Google Presentation>
Project 2 Final Presentation <Google Presentation>
Flash Talk <Google Presentation>
Final Poster <PDF>
CONTRIBUTION 1 - Created Unity Hand Gesture Tracking page to walk through how to import Meta hand tracking SDK into Unity and how hand gestures are detected based on poses and shapes
CONTRIBUTION 2 - Created Passthrough Hand Gesture Tracking page to walk through how to enable hand gesture tracking in passthrough mode
CONTRIBUTION 3 - Created Enable Passthrough in Unity page to walk through how to enable passthrough mode in Unity
CONTRIBUTION 4 - Created Custom Hand Gesture Creation page to explain how to define gestures using both manual finger parameter configuration and recorded hand capture data
CONTRIBUTION 5 - Created 3D Annotation Line Rendering page to explain how to create a 3D annotation drawing system in Unity that is triggered by a custom hand gesture created in the previous step
CONTRIBUTION 6 - Created Unity Poke Interaction page to explain how to make UI buttons physically pressable with fingers in VR, including the XR Interaction Toolkit approach that didn't pan out and the simpler custom solution that did
CONTRIBUTION 7 - Created Unity Custom VR Keyboard page to walk through building a fully custom VR keyboard from scratch with physical poke interaction
CONTRIBUTION 8 - Added Lazy Follow Effect Section to Unity UI page to explain how to use XR Interaction Toolkit's built-in Lazy Follow component to make a World Space Canvas smoothly track the user's head in VR, including configuration settings and behavior thresholds
CONTRIBUTION 9 - Created Unity Troubleshooting page to document common Unity errors with solutions, and to serve as a shared space where contributors can add their own issues and fixes over time
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Journey Video: https://drive.google.com/file/d/18NqydTzuR0yYj6s7vCG4275IfjC9cpXk/view?usp=sharing
Visiting three places (images in order below):
(1) The house where I grew up / (2) The house I live in at Brown / (3) The place that's significant to me but not famous
Visiting three places again (images in order below)
(1) The house where I grew up / (2) The house I live in at Brown / (3) The place that's significant to me but not famous
Total: 151.5 hours
5/10/26 - 1 Hour
Completed self-evaluation and class learning goals scoring
5/9/26 - 5 Hours
Created Unity Poke Interaction page to explain how to make UI buttons physically pressable with fingers in VR
Created Unity Custom VR Keyboard page to walk through building a fully custom VR keyboard from scratch with physical poke interaction
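For reference, a minimal sketch of the simpler custom poke approach described on that page, assuming a small trigger collider (with a kinematic Rigidbody) tagged "Fingertip" on the index fingertip and a BoxCollider on each button; the names and cooldown value are illustrative, not the page's exact code:
```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal poke interaction: attach to a UI Button that has a BoxCollider
// (IsTrigger = true). A small trigger collider tagged "Fingertip" on the
// index fingertip (with a kinematic Rigidbody so trigger events fire)
// invokes the click when it enters.
[RequireComponent(typeof(Button))]
public class PokeButton : MonoBehaviour
{
    [SerializeField] private float cooldown = 0.3f; // debounce double-triggers
    private Button button;
    private float lastPokeTime;

    private void Awake() => button = GetComponent<Button>();

    private void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Fingertip")) return;
        if (Time.time - lastPokeTime < cooldown) return;
        lastPokeTime = Time.time;
        button.onClick.Invoke(); // reuse the Button's normal click handler
    }
}
```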
5/7/26 - 5 Hours
Prepared Final poster
5/4/26 - 1.5 Hours
Prepared Flash talk slide
4/30/26 - 3 Hours
Prepared Project 2 final presentation (2/2)
4/29/26 - 3 Hours
Analyzed Google survey response data from in-class activity
Prepared Project 2 final presentation (1/2)
4/28/26 - 1 Hour
Uploaded the .apk / Google survey form / Google Doc for the screenshot in the class timeline
4/27/26 - 5 Hours
Prepared in-class activity
Pondered a task for in-class activity
Created Google survey form for in-class activity
Refined UX issues and completed product building
Made all UI buttons and panels auto-hide when three-finger-pinch drawing starts and reappear when the stroke ends, keeping the view clear while annotating (sketched below)
Changed the 'View saved annotation' button below the history panel to read 'Close' after it is poked
Changed the name input field's visuals based on its status (before editing, while editing, after editing)
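A minimal sketch of the auto-hide behavior, assuming the gesture/annotation system calls stroke start/end handlers (the method names and CanvasGroup setup are assumptions, not the project's actual code):
```csharp
using UnityEngine;

// Hides all UI while a three-finger-pinch stroke is in progress.
// Wire OnStrokeStarted/OnStrokeEnded to the gesture system's events.
public class AnnotationUIHider : MonoBehaviour
{
    [SerializeField] private CanvasGroup uiRoot; // parent of all buttons/panels

    public void OnStrokeStarted() => SetVisible(false);
    public void OnStrokeEnded()   => SetVisible(true);

    private void SetVisible(bool visible)
    {
        uiRoot.alpha = visible ? 1f : 0f; // fade instead of SetActive,
        uiRoot.interactable = visible;    // so layout state is preserved
        uiRoot.blocksRaycasts = visible;
    }
}
```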
4/26/26 - 10 Hours
Adjusted panel width and centering, reduced the thumbnail border, added a filename label below each thumbnail (struggled with Mask clipping), and added a red X delete button with VR poke support; started building the Annotation Viewer UI but left it mid-configuration
Built the Annotation Viewer UI as a full-screen overlay panel that opens when a thumbnail is poked, showing the enlarged screenshot with an editable name field and close/delete buttons
Attempted to use the Quest system keyboard for the name field, but it doesn't work in VR, so I built a custom pokeable VR keyboard after multiple fixes to the key-to-input-field wiring (sketched below)
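A sketch of the key-to-input-field wiring the keyboard converged on, assuming TextMeshPro's TMP_InputField for the name field and one pokeable Button per key whose onClick passes its character (names are illustrative):
```csharp
using TMPro;
using UnityEngine;

// Routes pokeable key presses into an input field. Each key's
// Button.onClick calls TypeCharacter with its own character (string
// arguments can be set per-key in the Inspector); Backspace gets its
// own handler.
public class VRKeyboard : MonoBehaviour
{
    [SerializeField] private TMP_InputField target; // the name field

    public void TypeCharacter(string c) => target.text += c;

    public void Backspace()
    {
        if (target.text.Length > 0)
            target.text = target.text.Substring(0, target.text.Length - 1);
    }
}
```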
4/24/26 - 3 Hours
Attempted to implement a horizontally scrollable history panel, but struggled with content anchor and sizing configuration: the thumbnails kept rendering outside the visible area or with negative width. Rebuilt the scroll view from scratch, but the same anchor and Content sizing issues persisted, so I eventually abandoned the scroll view approach entirely and switched to a simpler fixed three-slot panel
The three-slot panel showed the thumbnails correctly, but an empty placeholder slot, left behind in the scene rather than deleted after prefab creation, remained visible on the left; deleting it from the Hierarchy fixed the issue (see the sketch below)
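A sketch of how a fixed three-slot panel like this can be populated, assuming the captures are stored as PNGs under Application.persistentDataPath (the folder name and slot layout are assumptions):
```csharp
using System.IO;
using UnityEngine;
using UnityEngine.UI;

// Fills a fixed three-slot history panel with saved annotation screenshots.
public class HistoryPanel : MonoBehaviour
{
    [SerializeField] private Image[] slots = new Image[3];

    public void Refresh()
    {
        string dir = Path.Combine(Application.persistentDataPath, "annotations");
        string[] files = Directory.Exists(dir)
            ? Directory.GetFiles(dir, "*.png") : new string[0];

        for (int i = 0; i < slots.Length; i++)
        {
            bool hasFile = i < files.Length;
            slots[i].gameObject.SetActive(hasFile); // hide empty slots
            if (!hasFile) continue;

            var tex = new Texture2D(2, 2); // size is replaced by LoadImage
            tex.LoadImage(File.ReadAllBytes(files[i]));
            slots[i].sprite = Sprite.Create(tex,
                new Rect(0, 0, tex.width, tex.height), new Vector2(0.5f, 0.5f));
        }
    }
}
```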
4/23/26 - 3 Hours
Implemented screen capture on Save to capture the current annotations with the passthrough background as a PNG
No log output was found via MQDH, and accessing saved files directly on the Quest was difficult (Android File Transfer is unavailable on Mac, and the MQDH file manager was unresponsive). Couldn't verify what was being captured, so I pivoted to viewing the captures in VR instead (see the capture sketch below)
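A minimal sketch of the end-of-frame capture approach, assuming a Save button calls Save(); note that on Quest the passthrough feed is composited by the system rather than rendered by Unity, so ReadPixels may not include it, which would be consistent with the verification trouble above:
```csharp
using System.Collections;
using System.IO;
using UnityEngine;

// Saves the current rendered frame as a PNG on the headset.
public class AnnotationCapture : MonoBehaviour
{
    public void Save() => StartCoroutine(CaptureEndOfFrame());

    private IEnumerator CaptureEndOfFrame()
    {
        yield return new WaitForEndOfFrame(); // wait until rendering is done

        var tex = new Texture2D(Screen.width, Screen.height,
                                TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        tex.Apply();

        string path = Path.Combine(Application.persistentDataPath,
            $"annotation_{System.DateTime.Now:yyyyMMdd_HHmmss}.png");
        File.WriteAllBytes(path, tex.EncodeToPNG());
        Destroy(tex);
        Debug.Log($"Saved capture to {path}");
    }
}
```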
4/21/26 - 4 Hours
Resolved the UI visibility issue by simply adding the new scene to Build Settings...
Created Save/Delete buttons in Unity and fixed them near the bottom of the view
Added a delayed-follow mechanic so the buttons track head motion smoothly and slowly (sketched below)
Implemented delete functions
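A hand-rolled sketch of the delayed-follow mechanic (the XR Interaction Toolkit also ships a Lazy Follow component that does this, per the wiki contribution above); the thresholds and timing here are illustrative, not the project's tuned values:
```csharp
using UnityEngine;

// Delayed-follow panel: drifts toward a point in front of the camera, but
// only when the target has moved past a threshold, so small head jitters
// don't drag the UI around.
public class LazyFollowPanel : MonoBehaviour
{
    [SerializeField] private Transform head;         // the Main Camera
    [SerializeField] private float distance = 0.6f;  // meters in front
    [SerializeField] private float moveThreshold = 0.15f;
    [SerializeField] private float smoothTime = 0.4f;

    private Vector3 velocity;

    private void LateUpdate()
    {
        Vector3 target = head.position + head.forward * distance;
        if (Vector3.Distance(transform.position, target) > moveThreshold)
            transform.position = Vector3.SmoothDamp(
                transform.position, target, ref velocity, smoothTime);

        // face the user (world-space canvases read correctly this way)
        transform.rotation = Quaternion.LookRotation(
            transform.position - head.position);
    }
}
```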
4/20/26 - 3 Hours
Continued troubleshooting in-headset visibility issues
Attempts made:
Checked whether the Canvas was set to World Space and whether the UI was placed in front of the camera
Confirmed that the Event Camera was assigned to the correct Main Camera
Checked the camera’s Culling Mask to make sure the UI layer was included
Tested whether the issue was specific to the UI by placing a simple cube/object in the scene
Checked whether the UI objects and parent Canvas were active in the Hierarchy
Realized that the correct/new Scene had not been added to Build Settings, so the headset build was running a different or outdated scene
Added the correct Scene to Build Settings, rebuilt the app, and the UI became visible
4/15/26 - 2.5 Hours
Caught up on the 17 steps of the AI analysis activity since I missed class on the 14th
4/11/26 - 2 Hours
Started building Unity UIs (Save/Delete buttons) and troubleshot in-headset visibility issues
4/9/26 - 2 Hours
Looked into UI options and stuck with Unity's native UI tools. I thought about exporting from Figma but ran into a technical hurdle: importing Figma assets as PNGs preserves appearance but limits interactivity (text edits, button states, hover/press feedback)
4/5/26 - 2.5 Hours
Prepared project 2 progress presentation
4/3/26 - 6 Hours
Designed Interaction flows and created UIs (Buttons / Icons / Panels) in Figma
4/2/26 - 2 Hours
Finalized the design scope and direction for annotation storage and retrieval
3/31/26 - 4 Hours
Stabilized three-finger pinch gesture detection reliability (see the sketch below)
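A sketch of one way to stabilize detection with hysteresis and a frame-count debounce, using OVRHand pinch strengths from the Meta SDK; the thresholds are illustrative, not the project's tuned values:
```csharp
using UnityEngine;

// Hysteresis for three-finger pinch (thumb + index + middle): enter the
// gesture only when both pinch strengths are high, exit only when both
// drop well below that, and require the new state to hold for a few
// frames before switching.
public class ThreeFingerPinchDetector : MonoBehaviour
{
    [SerializeField] private OVRHand hand;
    [SerializeField] private float enterThreshold = 0.85f;
    [SerializeField] private float exitThreshold = 0.55f;
    [SerializeField] private int framesToConfirm = 5;

    public bool IsPinching { get; private set; }
    private int stableFrames;

    private void Update()
    {
        float index  = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
        float middle = hand.GetFingerPinchStrength(OVRHand.HandFinger.Middle);

        bool candidate = IsPinching
            ? (index > exitThreshold && middle > exitThreshold)    // stay in
            : (index > enterThreshold && middle > enterThreshold); // enter

        stableFrames = candidate != IsPinching ? stableFrames + 1 : 0;
        if (stableFrames >= framesToConfirm)
        {
            IsPinching = candidate;
            stableFrames = 0;
        }
    }
}
```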
3/17/26 - 2.5 Hours
Brainstormed tentative in-class activity
Prepared Project 1 proposal presentation
Below is Project 1 (Time logged: 80.5 hours)
3/13/26 - 3 Hours
Documented 3D Annotation Line Rendering guide on Wiki
3/12/26 - 4 Hours
Documented Custom Hand Gesture Creation guide on Wiki
Went through the whole process of creating a custom hand gesture again so that anyone coming in without background knowledge can easily follow along.
3/10/26 - 5 Hours
Prepared project 1 final presentation
Analyzed the survey results from the in-class activity (observation-only vs. using annotation, compared on answer accuracy, confidence, completion time, mental effort, overall preference, etc.)
3/5/26 - 4 Hours
Looked into how to set up passthrough without hand tracking, so anyone whose project doesn't involve hand tracking can get it set up - 2h
Documented Enable Passthrough in Unity guide on Wiki - 2h
3/4/26 - 6.5 Hours
Designed experiments for in-class activity - 5h
Explored several experiment ideas (visual analysis of randomly placed color dots, text analysis, etc.) and settled on a visual analysis of color dot clusters, since it's more suited for users to annotate, group, or count them.
Adjusted the experiment difficulty level
Created Google survey form - 1h
Shared instructions & Resources for in-class activity - 0.5h
3/3/26 - 10 Hours
Rendered an annotation line
Created a pen-like object (body / tip)
Wrote a script for the annotation line drawing (see the sketch below)
Went through extensive troubleshooting to connect the custom gesture with the annotation tool
Adjusted the line quality
Fine-tuned the finger values for improved gesture detection (once the connection succeeded, detection reliability had degraded for some reason)
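A minimal sketch of the line-drawing system, assuming BeginStroke/EndStroke are called by the gesture detector and a LineRenderer prefab holds the width/material settings (names and spacing value are illustrative):
```csharp
using System.Collections.Generic;
using UnityEngine;

// Draws a 3D annotation stroke with a LineRenderer while the custom
// three-finger pinch is held; fingertip is the pen-tip transform.
public class AnnotationLine : MonoBehaviour
{
    [SerializeField] private Transform fingertip;     // pen tip to sample
    [SerializeField] private LineRenderer linePrefab; // width/material preset
    [SerializeField] private float minPointSpacing = 0.005f; // meters

    private LineRenderer current;
    private readonly List<Vector3> points = new List<Vector3>();

    public void BeginStroke()
    {
        current = Instantiate(linePrefab);
        points.Clear();
    }

    public void EndStroke() => current = null;

    private void Update()
    {
        if (current == null) return;
        Vector3 p = fingertip.position;
        // skip samples too close together to keep the line light and smooth
        if (points.Count > 0 &&
            Vector3.Distance(points[points.Count - 1], p) < minPointSpacing)
            return;
        points.Add(p);
        current.positionCount = points.Count;
        current.SetPosition(points.Count - 1, p);
    }
}
```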
2/25/26 - 5 Hours
Looked for a better solution for more accurate custom gesture detection: recording the custom hand gesture and importing specific frames from the recording
Adjusted each finger's threshold so the target gesture detection is neither too strict nor too loose
Updated Project progress presentation
2/24/26 - 2 Hours
Enabled Hand Gesture Tracking in Passthrough mode
Created a wiki page documenting Passthrough Hand Gesture Tracking
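For reference, a rough outline of the passthrough setup that page covers, assuming the Meta XR Core SDK; most of it is Inspector configuration, noted here as comments:
```csharp
using UnityEngine;

// Outline of enabling passthrough with the Meta XR SDK.
public class PassthroughSetup : MonoBehaviour
{
    private void Start()
    {
        // 1. On the OVRCameraRig's OVRManager: Quest Features >
        //    "Passthrough Support" = Supported/Required (Inspector setting).
        // 2. Add an OVRPassthroughLayer to the rig (added here in code;
        //    usually done in the Inspector instead).
        if (GetComponent<OVRPassthroughLayer>() == null)
            gameObject.AddComponent<OVRPassthroughLayer>();

        // 3. Make the camera background transparent so the passthrough
        //    feed shows through behind rendered content.
        var cam = Camera.main;
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = new Color(0f, 0f, 0f, 0f);
    }
}
```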
2/23/26 - 2 Hours
Prepared project progress presentation
2/22/26 - 4 Hours
Created a wiki page documenting Unity Hand Gesture Tracking setup and testing
Attempted to customize a three-finger pinch gesture by manually adjusting the input values for each finger
2/21/26 - 5 Hours
Looked into tutorials and documentation on hand tracking
Set up the hand-tracking SDK in Unity
Configured the environment and tested grab interactions in a VR headset
Tested various default hand gestures (e.g., thumbs up/down, open palm up, pointing) using the debugging UI panel
2/18/26 - 0.5 Hours
Installed applications for ParaView and Ben's Volume Rendering tutorial
2/17/26 - 3 Hours
Imported the Meta XR SDK (Hand Tracking) into Unity and configured hand tracking
Troubleshot headset-to-PC connection issues to enable VR testing from Unity
2/15/26 - 1.5 Hours
Worked through Unity tutorials on animations, audio, lighting, and game sharing
2/11/26 - 1.5 Hours
Installed Unity
Began Unity tutorials on the basic interface and 3D fundamentals
2/9/26 - 2.5 Hours
Prepared Project 1 Proposal Presentation here
2/8/26 - 4 Hours
Pivoted and came up with another idea
Updated Project 1 proposal here
2/4/26 - 1 Hour
Created pre-plans for project 1
Went over other classmates' project ideas
2/3/26 - 2 Hours
Installed DinoVR
Installed ShapesXR and played around with it
Installed Google Earth
2/2/26 - 7 Hours
Reviewed the wiki and searched the web for project ideation
Brought up the ideas here
1/29/26 - 3 Hours
Set up Quest 3
Played around with the basic functionalities to get familiar with the device
1/26/26 - 1 Hour
Read Kenny Gruchalla's bio and website and added a question
Explored wiki
1/22/26 - 2 Hours
Read through the course homepage
Joined Slack and introduced myself