Figure 8 shows the simulation under various codebook and feature vector distributions. Increasing the variance of both the codebook and feature vector distributions from Uniform(-5,5) to Uniform(-10,10) does not meaningfully affect quantization error (0.120 vs. 0.149). Similarly, shifting the distributions toward a higher norm, from Uniform(-5,5) to Uniform(5,15), does not meaningfully affect quantization error (0.120 vs. 0.123). Finally, substantially decreasing the variance of the distributions to Uniform(-0.001, 0.001) does not meaningfully affect quantization error (0.120 vs. 0.143).
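For reference, the quantization error reported above can be computed as the mean squared distance from each feature vector to its nearest codebook vector. The NumPy sketch below measures this quantity for the four distributions at initialization; the dimensions and sample sizes are illustrative assumptions, and the values quoted in the text are measured after training rather than at initialization.

```python
import numpy as np

def quantization_error(features, codebook):
    """Mean squared Euclidean distance from each feature vector
    to its nearest codebook vector."""
    # Pairwise squared distances, shape (n_features, n_codes).
    d2 = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    return d2.min(axis=1).mean()

rng = np.random.default_rng(0)
dim, n_features, n_codes = 16, 1024, 64  # illustrative sizes

for lo, hi in [(-5, 5), (-10, 10), (5, 15), (-0.001, 0.001)]:
    features = rng.uniform(lo, hi, size=(n_features, dim))
    codebook = rng.uniform(lo, hi, size=(n_codes, dim))
    print(f"Uniform({lo}, {hi}): error = "
          f"{quantization_error(features, codebook):.3f}")
```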

For the first two cases, this is because the codebook EMA update pulls codebook vectors toward the feature vectors assigned to them, while the commitment loss pulls feature vectors toward their nearest codebook vectors. For the third case, although points initially have very low quantization error, they spread apart as training moves them toward a local minimum. While each of these four examples presents a very different feature vector distribution, quantization error imparts useful information: a quantitative measure of how close feature vectors are to their corresponding codebook vectors.
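To make the first mechanism concrete, the toy below follows the standard VQ-VAE recipe: an exponential-moving-average update that drifts each codebook vector toward the features assigned to it, and a gradient step on the commitment term that drifts each feature toward its assigned codebook vector. This is a minimal sketch, not the paper's simulation; the hyperparameters (beta, decay, step_size), the sizes, and the omission of reconstruction gradients are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_features, n_codes = 2, 512, 16       # small sizes for a quick run
beta, decay, step_size = 0.25, 0.99, 0.05   # assumed hyperparameters

features = rng.uniform(-5, 5, size=(n_features, dim))
codebook = rng.uniform(-5, 5, size=(n_codes, dim))
ema_count = np.ones(n_codes)
ema_sum = codebook.copy()

for _ in range(500):
    # Assign each feature to its nearest codebook vector.
    d2 = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    assign = d2.argmin(axis=1)
    onehot = np.eye(n_codes)[assign]        # (n_features, n_codes)

    # EMA update: codebook vectors drift toward their assigned features.
    ema_count = decay * ema_count + (1 - decay) * onehot.sum(axis=0)
    ema_sum = decay * ema_sum + (1 - decay) * onehot.T @ features
    codebook = ema_sum / ema_count[:, None]

    # Commitment-loss gradient step: features drift toward their codes.
    features -= step_size * beta * 2.0 * (features - codebook[assign])

d2 = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
print("quantization error after training:", d2.min(axis=1).mean())
```

Because the two updates close the gap from both sides, the final error in this toy is small regardless of the initial spread of the distributions, consistent with the behavior described for the first two cases above.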