2 | 5 | Goal 1: articulate AR/VR visualization software tool goals, requirements, and capabilities;
1 | 4 | Goal 2: construct meaningful evaluation strategies for software libraries, frameworks, and applications; strategies include surveys, interviews, comparative use, case studies, and web research;
1 | 4 | Goal 3: execute tool evaluation strategies;
2 | 4 | Goal 4: build visualization software packages;
1 | 4 | Goal 5: comparatively analyze software tools based on evaluation;
2 | 5 | Goal 6: be familiar with a number of AR/VR software tools and hardware;
3 | 5 | Goal 7: think critically about software;
3 | 5 | Goal 8: communicate ideas more clearly;
1 | 5 | Goal 9: contribute high-quality, lasting content to the VR Software Wiki to aid future researchers and students.
Homework 1:
10 min changes:
Add ProteinVR to the VR Software or general tools sections of the wiki. ProteinVR is a lightweight tool that can serve as an alternative to Nanome.
Add a surgical VR tab to the VR in Medicine section. VR in medicine has many different types of applications, and the section seems overly broad. It would be helpful for medical students or surgeons searching for tools if there were a distinction between visualization tools, surgical planning tools, etc.
Complete the WebVR Tutorials landing page so that it has the relevant summary and links; it is currently empty.
1 hour changes:
Compile a single page for all 3D model and VR-relevant file types. There is already a page that describes some file types for 3D models, but it would be helpful to have one page that covers and describes all of them.
Add a surgical VR tab to the VR in Medicine section. VR in medicine has many different types of applications, and the section seems overly broad. It would be helpful for medical students or surgeons searching for tools if there were a distinction between visualization tools, surgical planning tools, etc.
Add an entry that covers "how to cite VR tools/software." This would be helpful for researchers who aren't very familiar with VR tools but would like to use them and need to cite them in a manuscript.
10 hour changes:
Add a new page for AI in VR. Conduct a literature and general technology review, and add the new and relevant developments in AI and how they are changing, or could change, the way VR/AR is used.
Add a new page for Gaussian Splatting. This is a new rendering technique for VR that represents scenes as millions of translucent, ellipsoidal "splats." It is not covered in the Wiki. It has mostly replaced NeRFs due to its faster rendering speeds.
Add more tools to the VR in Sports section. It currently only includes NBA visualization tools; however, many other tools exist, such as Rezzil (soccer rehab and drills in AR), WIN Reality (baseball training simulations), and STRIVR (QB training), that could be helpful for people to reference.
Homework 2 (Class 1/29):
DinoVR Screenshot:
Google Earth Screenshots and Link:
Project Ideas:
I want to get a high-dimensional dataset like Word2Vec, preprocess it for 3D projection, and implement a VR point-cloud renderer that supports selection. I plan to test sensory navigation interactions to see whether physical movement aids understanding. For a class activity, people can perform an A/B cluster search to evaluate different visualization metaphors. My deliverables will include a comparative wiki entry on density vs. legibility in 3D plots and a reusable Unity/WebXR component for loading CSV embedding data.
I want to get skeletal movement data for correct and incorrect squat forms and develop an overlay system in VR that superimposes the ideal form over the user's movement path. Visual deviation indicators can highlight unnatural movements. In an activity, people will use the tool to guide a peer, analyzing the efficacy of the visual cues. Deliverables will be a case study on immersive spatial feedback versus video playback and a wiki tutorial on importing BVH motion files.
My goal is to capture network traffic data to build a VR visualization where devices are nodes and data packets are animated edges representing bandwidth. I will create filtering tools to isolate connections by grabbing nodes. The class will participate in a heuristic evaluation to identify a simulated security breach and rate the tool's usability. Potential deliverables include a table evaluating 3D graph layout libraries for VR performance and a wiki page on design patterns for selecting nodes in dense graphs.
Project Plan:
Goal: Build a VR point-cloud visualization system for exploring high-dimensional AI embeddings (Word2Vec), with interactive selection and navigation, culminating in a comparative study of density vs. legibility in 3D plots.
Platforms: Unity (Quest) and/or WebXR (A-Frame/Three.js)
Deliverables: Reusable Unity/WebXR component for loading CSV embedding data. Comparative wiki entry on density vs. legibility in 3D point-cloud plots. In-class A/B cluster search activity with evaluation data.
Week 1: Data Acquisition & Preprocessing Pipeline
Monday 2/10 — Dataset Selection & Initial Preprocessing
Download pre-trained Word2Vec embeddings
Write a Python script to extract a meaningful subset (e.g., 1,000–5,000 words from a semantic category like emotions, animals, or places); see the preprocessing sketch after this list
Apply dimensionality reduction (PCA or t-SNE)
Export results to CSV format
Deliverable: CSV file with 3D-projected embeddings ready for visualization
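A minimal preprocessing sketch in Python, assuming the gensim, scikit-learn, and pandas packages are installed; the category word list, subset size, and output filename are placeholders, and t-SNE can be swapped in for PCA.

```python
# Hypothetical preprocessing pipeline: Word2Vec subset -> 3-D projection -> CSV.
import gensim.downloader as api
import pandas as pd
from sklearn.decomposition import PCA

model = api.load("word2vec-google-news-300")  # pre-trained Word2Vec vectors (large download)

# Placeholder word list for one semantic category (e.g., animals); keep only in-vocabulary words.
with open("animal_words.txt") as f:
    words = [w.strip() for w in f if w.strip() in model]

vectors = [model[w] for w in words]

# Project 300-D vectors to 3-D; sklearn.manifold.TSNE(n_components=3) is the t-SNE alternative.
coords = PCA(n_components=3).fit_transform(vectors)

df = pd.DataFrame(coords, columns=["x", "y", "z"])
df["word"] = words
df.to_csv("embeddings_3d.csv", index=False)
```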
Wednesday 2/12 — Visualization Framework Setup
Set up Unity project with XR Interaction Toolkit (experiment with a WebXR development environment)
Review wiki resources: IATK Basics, DinoVR for Point Cloud Data
Write a CSV parser to load the data (see the preview sketch after this list)
Render initial static point cloud (no interaction yet)
Deliverable: Basic point cloud rendering from CSV data
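Before the Unity/WebXR loader is wired up, a quick desktop preview of the CSV can catch parsing or projection problems early. This is an optional sanity check, assuming pandas and matplotlib and the embeddings_3d.csv file from the preprocessing sketch; it is not part of the headset build.

```python
# Optional sanity check: plot the 3-D projection on the desktop before importing it into VR.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("embeddings_3d.csv")

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(df["x"], df["y"], df["z"], s=4)

# Label a small random sample of points so obvious mis-parses stand out.
for _, row in df.sample(min(20, len(df)), random_state=0).iterrows():
    ax.text(row["x"], row["y"], row["z"], row["word"], fontsize=7)

plt.show()
```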
Week 2: Core Interaction Development
Wednesday 2/19 — Selection & Labeling System
Implement point selection via ray-casting (controller or gaze-based)
Display word labels on hover/selection (floating text or tooltip UI)
Add color coding by cluster or semantic category (see the clustering sketch after this list)
Implement basic camera/locomotion controls (teleportation or smooth movement)
Deliverable: Interactive point cloud with selection and labels working
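If the cluster assignments do not come from hand-picked semantic categories, a short clustering pass can generate them. The sketch below assumes scikit-learn, the CSV from the preprocessing step, and an arbitrary cluster count; the renderer then maps the cluster column to point colors.

```python
# Hypothetical clustering pass: append a "cluster" column for color coding in the renderer.
# Clustering the 3-D coordinates is a simple stand-in; the original 300-D vectors work too.
import pandas as pd
from sklearn.cluster import KMeans

df = pd.read_csv("embeddings_3d.csv")
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0)
df["cluster"] = kmeans.fit_predict(df[["x", "y", "z"]])
df.to_csv("embeddings_3d_clustered.csv", index=False)
```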
Week 3: Navigation & Density Experiments
Monday 2/24 — Sensory Navigation Prototype
Implement physical movement mapping (room-scale walking corresponds to data navigation)
Add scaling controls (zoom in/out of point cloud)
Create two visualization modes for A/B testing:
Mode A (Dense): All points visible, smaller point size, full dataset
Mode B (Legible): Filtered points, larger labels, decluttered view with LOD (level of detail); see the filtering sketch after this list
Deliverable: Two toggleable visualization modes ready for comparison
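One way to precompute the Mode B subset offline is to keep only the points nearest each cluster centroid, so the decluttered view still represents every cluster. This is a sketch assuming pandas and the clustered CSV from the sketch above; the keep-count and filenames are placeholders, and the same filtering could instead happen at runtime in Unity/WebXR.

```python
# Hypothetical offline filter for the "Legible" mode: keep the points closest to each
# cluster centroid so every cluster remains visible in the decluttered view.
import pandas as pd

KEEP_PER_CLUSTER = 25  # placeholder density target for Mode B

df = pd.read_csv("embeddings_3d_clustered.csv")

def nearest_to_centroid(group: pd.DataFrame) -> pd.DataFrame:
    center = group[["x", "y", "z"]].mean()
    dist = ((group[["x", "y", "z"]] - center) ** 2).sum(axis=1)  # squared distance per point
    return group.loc[dist.nsmallest(KEEP_PER_CLUSTER).index]

legible = df.groupby("cluster", group_keys=False).apply(nearest_to_centroid)
legible.to_csv("embeddings_3d_legible.csv", index=False)
```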
Wednesday 2/26 — Refinement & Activity Design
Polish interactions and fix bugs from testing
Design in-class activity protocol:
Task: "Find all words related to [category X] as quickly as possible"
Metrics: Time to completion, accuracy, subjective preference
Create simple data collection form (Google Form or in-app logging)
Prepare activity instructions and evaluation questionnaire
Deliverable: Activity protocol document and data collection system
Week 4: In-Class Activity & Analysis
Monday 3/03 — Final Pre-Activity Testing
Deliverable: Fully tested build ready for class activity
Wednesday 3/05 — Begin Wiki Documentation
Deliverable: Wiki entry draft (technical sections complete)
Week 5: In-Class Activity
Activity: A/B Cluster Search Evaluation
Participants split into two groups, each starting with a different visualization mode. Task: find and select all words related to "weather" (or a similar category) within the point cloud. Group A starts with the Dense view and Group B with the Legible view; then the groups swap. Metrics: completion time, accuracy (% of correct selections), NASA-TLX workload rating, and preference ranking.
Week 5–6: Analysis & Final Deliverables
By 3/12–3/13:
Analyze activity data (quantitative + qualitative); see the analysis sketch below
Complete wiki entry with findings and recommendations
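A sketch of the quantitative half of the analysis, assuming each participant completed the task in both modes (the planned swap design) and that results were logged to a CSV with hypothetical columns participant, mode ("dense"/"legible"), time_s, accuracy, and tlx; pandas and scipy are assumed.

```python
# Hypothetical analysis of the A/B activity log.
import pandas as pd
from scipy import stats

df = pd.read_csv("ab_results.csv")

# Per-mode means and standard deviations for each metric.
summary = df.groupby("mode")[["time_s", "accuracy", "tlx"]].agg(["mean", "std"])
print(summary)

# Within-subjects comparison of completion time, since every participant saw both modes.
wide = df.pivot(index="participant", columns="mode", values="time_s")
t, p = stats.ttest_rel(wide["dense"], wide["legible"])
print(f"paired t-test on completion time: t = {t:.2f}, p = {p:.3f}")
```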
Total: 23 hours
1/26/26 - 4 Hours
Joined the Slack and introduced myself
Read through the wiki and researched new VR tech (ProteinVR, Gaussian Splatting)
Added my proposed changes to my Journal
Explored Kenny Gruchalla's bio and research and added my questions to the Gdoc.
1/29/26 - 10 Hours
Worked on project ideas
Completed Google Earth setup and activity.
Completed DinoVR Setup
Completed Virtual Desktop Setup
Completed ShapesVR Lab
2/4/26 - 4 Hours
Completed in-class DinoVR assignment
Drafted and added project proposal and activities to journal
Researched mapping points using Unity/WebXR
2/8/26 - 5 Hours
Completed project proposal
Completed project proposal powerpoint presentation
Researched tools needed for the project (Unity, Python, etc.)