For evaluation, I started by forming communities of size two; the balance criterion in my partitioner ensures that the partitions it creates are all of roughly that size. I used a method similar to cross-validation in machine learning: I randomly remove some percentage of the edges in the network, re-run the partitioner, and test whether the original communities can still be recovered. I started by removing 10% of the edges, and with such a large number of partitions I was only able to recover 6% of the communities.
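The evaluation protocol above can be sketched roughly as follows. This is a minimal stand-in, not the actual partitioner from this work: `connected_components` (a union-find over the surviving edges) substitutes for the balanced partitioner, and the recovery criterion used here (a community counts as recovered only if exactly the same node set reappears) is an assumption.

```python
import random

def connected_components(nodes, edges):
    # Stand-in "partitioner": plain union-find connected components.
    parent = {v: v for v in nodes}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    comps = {}
    for v in nodes:
        comps.setdefault(find(v), set()).add(v)
    return [frozenset(c) for c in comps.values()]

def recovery_rate(nodes, edges, partition, fraction, seed=0):
    # Remove `fraction` of the edges at random, re-partition the
    # remaining graph, and report the share of the original
    # communities that reappear exactly (same node set).
    rng = random.Random(seed)
    original = set(partition(nodes, edges))
    kept = rng.sample(edges, int(len(edges) * (1 - fraction)))
    recovered = set(partition(nodes, kept))
    return len(original & recovered) / len(original)
```

On a toy graph of two disjoint triangles, removing 10% of the six edges deletes a single edge, which cannot disconnect either triangle, so both communities are recovered. Sweeping `fraction` over 0.1 to 0.5 mirrors the runs described later in this section.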
Next, I reduced the number of partitions and tried slightly larger communities. For communities of size 3, I was still only able to recover 16%. I kept increasing the community size (and thus decreasing the number of communities) until, with 33 communities, I was able to recover 98% of the original communities. Figure 4 tabulates the percentage of recovered communities for different numbers of partitions, and Figure 5 plots the same data.
To evaluate this result further, I removed larger fractions of the edges and re-ran the partitioner to generate 33 communities. In separate runs I removed 10%, 20%, 30%, 40%, and 50% of the edges, and was still able to recover up to 96% of the original communities. Figure 6 summarizes these results.
In summary, the partitioner is not an accurate tool for small communities, but it can predict larger communities with good accuracy.