The permutahedron on n! vertices is the Cayley graph of the symmetric group S_n (generated by the adjacent transpositions), which in this setting corresponds to an election with n candidates. We can take each possible vote (an element of S_n) and label each vertex of the permutahedron with the number of people who voted for that ranking of the candidates. This is called a signal on the permutahedron. Those familiar with linear algebra can think of this signal as a vector of length n!: fix an ordering of the permutations, and the entries of the vector are the number of people who voted for the corresponding ranking.
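As a minimal sketch of this setup, here is how one might tally a toy set of ballots into a signal vector for n = 3 candidates. The ballot data is invented for illustration; the only real convention is fixing one ordering of the n! permutations.

```python
from itertools import permutations

# Hypothetical toy ballots for n = 3 candidates: each ballot is a full
# ranking, written as (candidate in 1st place, in 2nd place, in 3rd place).
ballots = [(1, 2, 3), (1, 2, 3), (2, 1, 3), (3, 2, 1), (1, 3, 2)]

# Fix an ordering of the n! permutations once and for all.
perms = sorted(permutations([1, 2, 3]))
index = {p: i for i, p in enumerate(perms)}

# The signal: entry i counts the ballots equal to the i-th permutation.
signal = [0] * len(perms)
for b in ballots:
    signal[index[b]] += 1

print(perms)   # the fixed ordering of S_3
print(signal)  # vote counts in that ordering
```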
One of the tools from graph signal processing is to take this signal on the permutahedron and decompose it into a weighted sum of the eigenvectors of the graph Laplacian. These eigenvectors form a basis of R^(n!), so the decomposition is guaranteed to exist. We can then draw conclusions about the original signal from the weights on particular eigenvectors. For example, the eigenvectors with smaller eigenvalues are smooth: their values change very little between adjacent vertices.
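The decomposition can be sketched concretely for n = 3, where the permutahedron is small enough to build by hand. This is an illustrative construction, not the article's own code: vertices are rankings, edges swap candidates in adjacent positions, and `numpy.linalg.eigh` supplies the orthonormal eigenbasis.

```python
import numpy as np
from itertools import permutations

n = 3
perms = sorted(permutations(range(1, n + 1)))
index = {p: i for i, p in enumerate(perms)}
N = len(perms)  # n! = 6

# Adjacency: two rankings are neighbors if they differ by swapping the
# candidates in adjacent positions (an adjacent transposition).
A = np.zeros((N, N))
for p in perms:
    for k in range(n - 1):
        q = list(p)
        q[k], q[k + 1] = q[k + 1], q[k]
        A[index[p], index[tuple(q)]] = 1

L = np.diag(A.sum(axis=1)) - A        # graph Laplacian

# Orthonormal eigenbasis of R^{n!}; eigh sorts eigenvalues ascending,
# so the "smooth" low-frequency eigenvectors come first.
eigvals, eigvecs = np.linalg.eigh(L)

# Any signal decomposes exactly into this basis.
signal = np.array([2, 1, 1, 0, 0, 1], dtype=float)  # toy vote counts
weights = eigvecs.T @ signal
print(np.allclose(eigvecs @ weights, signal))       # exact reconstruction
```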
At right is an orthonormal basis for the 2-dimensional eigenspace corresponding to the eigenvalue 1.2679. Notice that there are patterns in the structure of these vectors, but unfortunately those patterns are hard to interpret. In the next section we explore an alternative way of spanning this space that lends itself to interpretation.
Where the orthonormal eigenvectors generally lack interpretability, we can instead use a frame as our dictionary for the decomposition. Below is the part of the frame corresponding to the eigenspace with eigenvalue 1.2679. These three vectors (drawn on the permutahedron) span the space but do not form a basis, since the space is 2-dimensional and there are 3 frame vectors. As with a basis, however, we can still write the whole signal as a linear combination of the frame vectors, by projecting the signal onto each frame vector and then carefully re-scaling.
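The "projecting and careful re-scaling" step can be sketched with a standard toy example, not taken from the article: three unit vectors spanning a 2-dimensional space, the same three-vectors-in-two-dimensions situation as above. The re-scaling is inversion of the frame operator S = F Fᵀ.

```python
import numpy as np

# Toy frame: three unit vectors at 120-degree spacing spanning R^2
# (a standard example of 3 frame vectors in a 2-dimensional space).
angles = np.pi / 2 + np.array([0, 2 * np.pi / 3, 4 * np.pi / 3])
F = np.stack([np.cos(angles), np.sin(angles)])   # columns = frame vectors

x = np.array([3.0, -1.0])                        # a signal in the 2-d space

# Analysis: project the signal onto each frame vector.
coeffs = F.T @ x

# Synthesis with "careful re-scaling": invert the frame operator S = F F^T.
# (For this tight frame, S is just (3/2) times the identity.)
S = F @ F.T
x_rec = np.linalg.inv(S) @ F @ coeffs

print(np.allclose(x_rec, x))   # the frame coefficients recover the signal
```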
Fortunately, the eigenvectors of the permutahedron correspond to eigenvectors of its Schreier coset graphs. Each vertex of a Schreier graph can be thought of as clustering some subset of the vertices of the permutahedron. The eigenvectors of the Schreier graphs can be "lifted" up to the permutahedron to give us our frame vectors. At right, we can see that the top vertex of the Schreier graph corresponds to all vertices of the permutahedron that have candidates 1 and 2 in positions 3 and 4, as seen on the permutahedron labeled {12|34}, which is in fact one of our frame vectors. Because this is how we find our frame vectors, we can look back to the structure of the Schreier eigenvector for interpretability. In the example at left, the red dots (the positive values of large magnitude) correspond to rankings 1 and 2 being together and rankings 3 and 4 being together (the values in the same row of the tableau). The candidates placed in these positions are how we label the frame vectors. Since the {12|34} frame vector has a relatively large coefficient of 239 when the signal is decomposed, we can infer that it is fairly common for candidates 1 and 2 to be in positions 1 and 2 together, or in positions 3 and 4 together.
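The lifting step can be sketched as follows for n = 4. The coset structure here is an assumption chosen to match the {12|34} picture: a vertex of the Schreier graph is identified with the unordered pair of positions occupied by candidates 1 and 2, and a function on those 6 vertices lifts to the 24 rankings by copying its value onto every ranking in the corresponding coset. The function names are our own.

```python
import numpy as np
from itertools import combinations, permutations

n = 4
perms = sorted(permutations(range(1, n + 1)))     # 24 rankings
cosets = list(combinations(range(n), 2))          # 6 pairs of positions
coset_index = {c: i for i, c in enumerate(cosets)}

def coset_of(p):
    """Positions (0-indexed) holding candidates 1 and 2 in ranking p."""
    return tuple(sorted(p.index(c) for c in (1, 2)))

def lift(f):
    """Copy a function on the Schreier vertices up to the permutahedron."""
    return np.array([f[coset_index[coset_of(p)]] for p in perms])

# E.g. the indicator of the Schreier vertex "positions 3 and 4"
# (0-indexed pair (2, 3)) lifts to a vector that is constant on the
# rankings with candidates 1 and 2 in the last two positions --
# the cluster shown on the permutahedron labeled {12|34}.
f = np.zeros(len(cosets))
f[coset_index[(2, 3)]] = 1.0
lifted = lift(f)
print(int(lifted.sum()))   # 2! * 2! = 4 rankings sit in that coset
```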
This kind of computation gets very time-consuming very quickly. With 10 candidates, we're looking at vectors of length 10!, about 3.6 million, so the dense Laplacian matrix has 10! × 10! entries, and computing (and storing) its eigenvectors is no small feat. Fortunately, this method of data analysis gives us a couple of avenues for computational shortcuts. First, we can compute the frame vectors by lifting the eigenvectors of the Schreier coset graphs, which are generally much smaller.
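A back-of-envelope estimate makes the scaling concrete. Assuming a dense float64 Laplacian (8 bytes per entry), the storage alone is hopeless well before n = 10:

```python
import math

# Naive cost for n candidates: the signal has n! entries and a dense
# Laplacian has (n!)^2 entries; assume 8-byte floats.
for n in (5, 8, 10):
    N = math.factorial(n)
    gb = N * N * 8 / 1e9           # dense Laplacian size in gigabytes
    print(f"n={n:2d}  n!={N:>9,}  dense Laplacian ~ {gb:,.1f} GB")
```

For n = 10 this comes out to roughly a hundred thousand gigabytes, which is why lifting eigenvectors from the much smaller Schreier graphs matters.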
We can also notice that the most substantial contributors to the frame decomposition are the frame vectors corresponding to the eigenspaces with low eigenvalues. We can therefore get a good sense of the composition of the data by computing and examining only these first few frame vectors. Fortunately, these also tend to be the most interpretable. Below are the eigenvectors of the Schreier graph corresponding to 2 positions clustered together. Looking at the first one, we can see that the positive values correspond to high-popularity positions (close to 1 and 2), and the negative values correspond to low-popularity positions (close to 9 and 10). The nice structure of these eigenvectors can make it easier to draw conclusions about very high-dimensional data without even having to compute the whole frame.
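The idea of keeping only the low-eigenvalue contributions can be sketched as a truncated decomposition. This toy version uses the orthonormal Laplacian eigenbasis on the n = 3 permutahedron rather than the frame itself, but the principle is the same: keep only the smoothest modes and accept a small reconstruction error.

```python
import numpy as np
from itertools import permutations

# Build the n = 3 permutahedron (rankings adjacent when they differ by
# swapping candidates in adjacent positions) and its Laplacian.
n = 3
perms = sorted(permutations(range(1, n + 1)))
index = {p: i for i, p in enumerate(perms)}
N = len(perms)

A = np.zeros((N, N))
for p in perms:
    for k in range(n - 1):
        q = list(p)
        q[k], q[k + 1] = q[k + 1], q[k]
        A[index[p], index[tuple(q)]] = 1
L = np.diag(A.sum(axis=1)) - A

eigvals, eigvecs = np.linalg.eigh(L)    # eigenvalues ascending

signal = np.array([2, 1, 1, 0, 0, 1], dtype=float)

k = 3                                   # keep only the 3 smoothest modes
low = eigvecs[:, :k]
approx = low @ (low.T @ signal)         # projection onto low-frequency span

# The truncation error lives entirely in the discarded high-frequency modes.
print(np.linalg.norm(signal - approx))
```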