Sub-Random Numbers

In numerical analysis, low-discrepancy sequences play a crucial role. They are special sequences of points in a d-dimensional unit hypercube ([0,1]^d) designed to distribute themselves as uniformly as possible within that space.
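A minimal sketch of a classic one-dimensional low-discrepancy sequence, the additive recurrence (Weyl) sequence x_k = frac(k·α) with an irrational α (the golden-ratio conjugate is a common, well-studied choice; the function name and parameters here are illustrative, not from the original code):

```python
import math

def weyl_sequence(n, alpha=(math.sqrt(5) - 1) / 2):
    """Additive recurrence x_k = frac(k * alpha) for irrational alpha.

    With the golden-ratio conjugate, successive points fill [0, 1)
    far more evenly than independent uniform random draws.
    """
    return [(k * alpha) % 1.0 for k in range(1, n + 1)]

pts = weyl_sequence(8)
```

Sorting the first eight points shows the characteristic even spacing: by the three-gap theorem, the gaps between neighbours take at most three distinct values.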

In the code example, from top to bottom: the low-order output bit of a linear congruential generator, a xorshift generator, and an additive recurrence generator, each shown with a 2D plot of its Walsh-Hadamard transform to the right.
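The original code isn't reproduced here, but the three bit streams might be sketched as follows. The LCG constants (Numerical Recipes) and Marsaglia's 32-bit xorshift shifts are assumptions for illustration; the additive recurrence bit is obtained by thresholding the Weyl sequence at 0.5:

```python
def lcg_low_bit(seed, n):
    """Low-order output bit of a 32-bit linear congruential generator.

    The low bit of an LCG is notoriously weak; constants assumed here
    are the common Numerical Recipes pair.
    """
    bits, x = [], seed
    for _ in range(n):
        x = (1664525 * x + 1013904223) & 0xFFFFFFFF
        bits.append(x & 1)
    return bits

def xorshift_low_bit(seed, n):
    """Low-order bit of Marsaglia's 32-bit xorshift (seed must be nonzero)."""
    bits, x = [], seed
    for _ in range(n):
        x ^= (x << 13) & 0xFFFFFFFF
        x ^= x >> 17
        x ^= (x << 5) & 0xFFFFFFFF
        bits.append(x & 1)
    return bits

def additive_recurrence_bit(n, alpha=0.6180339887498949):
    """Bit stream from thresholding the Weyl sequence frac(k * alpha) at 0.5."""
    return [1 if (k * alpha) % 1.0 >= 0.5 else 0 for k in range(1, n + 1)]
```

Feeding each stream through a Walsh-Hadamard transform and plotting the spectrum, as the example describes, makes the structural differences between the three visible.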

Random projections obtained from completely random sign flipping followed by the Walsh-Hadamard transform do not retain positional information about, say, clumped data in the input such as a translated circle. If such a projection feeds a neural network, that gives the network far more work to do. A solution is to use a sub-random projection: a low-discrepancy sequence of sign flips.
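A sketch of such a sub-random projection, assuming the sign pattern comes from thresholding a Weyl additive recurrence (the helper names are illustrative). The core is the standard in-place fast Walsh-Hadamard transform, which requires a power-of-two length:

```python
def fwht(a):
    """In-place fast Walsh-Hadamard transform; len(a) must be a power of two."""
    n, h = len(a), 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

def subrandom_signs(n, alpha=0.6180339887498949):
    """Low-discrepancy +/-1 pattern: threshold the Weyl sequence at 0.5
    (one plausible construction, assumed for illustration)."""
    return [1.0 if (k * alpha) % 1.0 < 0.5 else -1.0 for k in range(1, n + 1)]

def subrandom_projection(x):
    """Sign-flip the input with the sub-random pattern, then transform."""
    return fwht([v * s for v, s in zip(x, subrandom_signs(len(x)))])
```

Because the sign sequence is evenly balanced rather than independently random, nearby input positions receive systematically varied signs, which is what preserves more of the positional structure through the transform.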

It appears that the xorshift generator wins, subject to further experimentation with the various parameters.