# Word2Vec to Unity VR Pipeline Tutorial
Sanil, Spring 2026
This tutorial walks through converting high-dimensional AI word embeddings into a 3D point cloud for VR visualization in Unity, deployable to Meta Quest 3.
## Overview
Word2Vec embeddings represent words as 300-dimensional vectors where semantic similarity corresponds to geometric proximity. By reducing these to 3D and visualizing in VR, users can physically explore how AI "understands" language relationships.
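The "semantic similarity corresponds to geometric proximity" claim comes down to cosine similarity between vectors. A minimal sketch with toy 4-D vectors (real Word2Vec vectors are 300-D; the numbers here are made up for illustration):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 = same direction
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy "embeddings" -- hypothetical values, not from the real model
rain = np.array([0.9, 0.1, 0.0, 0.2])
snow = np.array([0.8, 0.2, 0.1, 0.3])
lion = np.array([0.1, 0.9, 0.8, 0.0])

print(cosine_similarity(rain, snow))  # high: semantically close
print(cosine_similarity(rain, lion))  # low: unrelated
```

Words that appear in similar contexts end up with high cosine similarity, which is what the clusters in VR will make visible.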
## Prerequisites
- Python 3.8+ with pip
- Unity 2022.3 LTS
- Meta Quest 3 (or Quest 2/Pro)
## Part 1: Python Data Preprocessing
### Step 1: Install Dependencies
```bash
pip install gensim scikit-learn pandas numpy
```
### Step 2: Download Word2Vec Model
Download Google's pre-trained model from the gensim data repository or Google's archive:
```python
from gensim.models import KeyedVectors

# Load pre-trained Word2Vec (this takes a few minutes)
model = KeyedVectors.load_word2vec_format(
    'GoogleNews-vectors-negative300.bin',
    binary=True
)
```
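Step 3 below relies on `model.most_similar`, which ranks the vocabulary by cosine similarity to a query word. To see what that call does without the multi-gigabyte download, here is a minimal numpy re-implementation over a toy five-word vocabulary (hypothetical vectors):

```python
import numpy as np

# Toy vocabulary: 5 words with 3-D vectors (the real model: ~3M words, 300-D)
vocab = ['rain', 'snow', 'storm', 'lion', 'city']
vecs = np.array([
    [0.9, 0.1, 0.1],
    [0.8, 0.2, 0.1],
    [0.7, 0.3, 0.2],
    [0.1, 0.9, 0.1],
    [0.1, 0.1, 0.9],
])

def most_similar(word, topn=2):
    # Normalize rows so dot products equal cosine similarities
    unit = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
    q = unit[vocab.index(word)]
    scores = unit @ q
    # Exclude the query word itself, then take the top-n
    order = [i for i in np.argsort(-scores) if vocab[i] != word][:topn]
    return [(vocab[i], float(scores[i])) for i in order]

print(most_similar('rain'))
```

`gensim` does the same ranking, just over millions of rows with optimized BLAS calls.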
### Step 3: Extract Words by Category
```python
import numpy as np

categories = {
    'weather': ['rain', 'snow', 'sunny', 'cloudy', 'storm', 'wind', 'temperature'],
    'emotions': ['happy', 'sad', 'angry', 'fear', 'joy', 'love', 'anxiety'],
    'animals': ['dog', 'cat', 'bird', 'fish', 'horse', 'lion', 'elephant'],
    'places': ['city', 'mountain', 'beach', 'forest', 'desert', 'ocean', 'river']
}

all_words = []
all_vectors = []
all_categories = []

for category, seeds in categories.items():
    for seed in seeds:
        if seed in model:
            # Include the seed word itself
            if seed not in all_words:
                all_words.append(seed)
                all_vectors.append(model[seed])
                all_categories.append(category)
            # Expand the category with its nearest neighbors
            for word, score in model.most_similar(seed, topn=50):
                if word not in all_words:
                    all_words.append(word)
                    all_vectors.append(model[word])
                    all_categories.append(category)

print(f"Extracted {len(all_words)} words")
```
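One caveat: neighbors returned by the GoogleNews model include Multi_Word_Phrases joined by underscores, Capitalized variants, and tokens containing digits, which clutter the visualization. A simple hedged filter you could apply inside the loop above (the candidate list here is illustrative):

```python
import re

def keep(word):
    # Keep lowercase single tokens only; drop underscore phrases,
    # capitalized variants, and tokens containing digits
    return re.fullmatch(r'[a-z]+', word) is not None

candidates = ['snowfall', 'Snow', 'heavy_rainfall', 'rain2', 'drizzle']
print([w for w in candidates if keep(w)])  # ['snowfall', 'drizzle']
```

Adding `if not keep(word): continue` before appending each neighbor keeps the point cloud readable.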
### Step 4: Dimensionality Reduction with t-SNE
```python
from sklearn.manifold import TSNE

vectors_array = np.array(all_vectors)

# t-SNE parameters tuned for word embeddings
# (note: scikit-learn >= 1.5 renames n_iter to max_iter)
tsne = TSNE(
    n_components=3,
    perplexity=30,    # balance local/global structure
    n_iter=1000,      # iterations for convergence
    random_state=42   # reproducibility
)
coords_3d = tsne.fit_transform(vectors_array)
print("t-SNE complete")
```
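If the word list grows into the thousands, t-SNE on raw 300-D vectors gets slow. A common speed-up is to reduce to ~50 dimensions with PCA first, since most of the variance lives in the top components. A sketch with random stand-in data (the `fake_vectors` array substitutes for the real embeddings):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

rng = np.random.default_rng(42)
fake_vectors = rng.normal(size=(200, 300)).astype(np.float32)

# PCA to 50-D first: cheap, preserves most structure,
# and cuts t-SNE's per-iteration cost substantially
reduced = PCA(n_components=50, random_state=42).fit_transform(fake_vectors)
coords = TSNE(n_components=3, perplexity=30,
              random_state=42).fit_transform(reduced)
print(coords.shape)  # (200, 3)
```

The same two-stage pipeline drops into Step 4 by replacing `vectors_array` with its PCA projection.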
### Step 5: Export to CSV
```python
import pandas as pd

df = pd.DataFrame({
    'word': all_words,
    'x': coords_3d[:, 0],
    'y': coords_3d[:, 1],
    'z': coords_3d[:, 2],
    'category': all_categories
})

# Normalize to a VR-friendly scale (roughly meters)
for col in ['x', 'y', 'z']:
    df[col] = (df[col] - df[col].mean()) / df[col].std() * 3

# IMPORTANT: Unity expects a period as the decimal separator
df.to_csv('word_embeddings_3d.csv', index=False)
print("Saved to word_embeddings_3d.csv")
```
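Before moving to Unity, it is worth sanity-checking the export format, since a wrong decimal separator or column order fails silently in the C# parser. A self-contained round-trip check with tiny stand-in values (hypothetical, not real coordinates):

```python
import io
import pandas as pd

# Tiny stand-in for the real export
df = pd.DataFrame({
    'word': ['rain', 'lion', 'city'],
    'x': [10.0, -5.0, 1.0],
    'y': [2.0, 8.0, -3.0],
    'z': [0.5, 0.5, 4.0],
    'category': ['weather', 'animals', 'places'],
})
for col in ['x', 'y', 'z']:
    df[col] = (df[col] - df[col].mean()) / df[col].std() * 3

buf = io.StringIO()
df.to_csv(buf, index=False)
text = buf.getvalue()

# Column order must match what the Unity parser indexes
print(text.splitlines()[0])  # word,x,y,z,category
# Each axis should be centered at the origin with ~3 m spread
for col in ['x', 'y', 'z']:
    assert abs(df[col].mean()) < 1e-9
    assert abs(df[col].std() - 3) < 1e-9
```

Running the same checks on the real `df` before `to_csv` catches scale mistakes before a build-and-deploy cycle.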
## Part 2: Unity Setup
### Step 1: Create Project
1. Open Unity Hub → New Project → 3D (URP) template
2. Select Unity 2022.3 LTS
3. Name project (e.g., "WordEmbeddingVR")
### Step 2: Install XR Packages
Open Window → Package Manager and install:
- XR Plugin Management
- Oculus XR Plugin
- XR Interaction Toolkit (optional, for controller interaction)
### Step 3: Configure for Quest
Edit → Project Settings → XR Plug-in Management:
- Check "Oculus" under Android tab
- Check "Oculus" under Standalone tab (for editor testing)
Edit → Project Settings → Player → Android:
- Set Minimum API Level to Android 10 (API 29)
- Set Scripting Backend to IL2CPP
- Set Target Architectures to ARM64
### Step 4: Import CSV and Create Parser
Place `word_embeddings_3d.csv` anywhere under `Assets/` (Unity imports `.csv` files as TextAssets), then assign it to the loader's `csvFile` field in the Inspector.
Create `CSVPointCloudLoader.cs`:
```csharp
using UnityEngine;
using System.Globalization;

public class CSVPointCloudLoader : MonoBehaviour
{
    public TextAsset csvFile;
    public Material weatherMat, emotionsMat, animalsMat, placesMat;
    public float pointSize = 0.05f;

    void Start()
    {
        LoadPoints();
    }

    void LoadPoints()
    {
        string[] lines = csvFile.text.Split('\n');
        for (int i = 1; i < lines.Length; i++) // Skip header
        {
            if (string.IsNullOrWhiteSpace(lines[i])) continue;
            string[] values = lines[i].Trim().Split(',');
            if (values.Length < 5) continue;

            // Parse with InvariantCulture so a comma-decimal system locale
            // (e.g. German) doesn't silently reject every coordinate
            if (!float.TryParse(values[1], NumberStyles.Float, CultureInfo.InvariantCulture, out float x)) continue;
            if (!float.TryParse(values[2], NumberStyles.Float, CultureInfo.InvariantCulture, out float y)) continue;
            if (!float.TryParse(values[3], NumberStyles.Float, CultureInfo.InvariantCulture, out float z)) continue;

            string word = values[0].Trim();
            string category = values[4].Trim().ToLower();

            // Create one sphere per word
            GameObject point = GameObject.CreatePrimitive(PrimitiveType.Sphere);
            point.transform.position = new Vector3(x, y, z);
            point.transform.localScale = Vector3.one * pointSize;
            point.name = word;

            // Assign material by category
            Renderer renderer = point.GetComponent<Renderer>();
            switch (category)
            {
                case "weather": renderer.material = weatherMat; break;
                case "emotions": renderer.material = emotionsMat; break;
                case "animals": renderer.material = animalsMat; break;
                case "places": renderer.material = placesMat; break;
            }
        }
    }
}
```
### Step 5: Set Up Materials
Create four unlit materials with distinct colors:
- Weather: Blue (#3498db)
- Emotions: Red (#e74c3c)
- Animals: Green (#2ecc71)
- Places: Orange (#f39c12)
### Step 6: Build and Deploy
1. File → Build Settings → Switch Platform to Android
2. Connect Quest 3 via USB (with Developer Mode enabled)
3. Build and Run