Graphs and networks are important for modeling structure across disciplines. We demonstrate that graph partitioning in the minimum-cut sense can be improved with an ensemble of low-fidelity eigenvectors, which can outperform a single high-fidelity eigenvector. Such an ensemble is cheaper to compute, and since the individual ensemble members are independent, they can be computed in parallel. This effect arises from the discrepancy between solutions to the continuous relaxation of a discrete optimization problem and the solutions of the discrete problem itself.
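To make the idea concrete, the following is a minimal sketch (not the paper's actual method) of the ensemble approach under some illustrative assumptions: low-fidelity Fiedler-vector approximations are produced by a few power iterations on a shifted Laplacian from independent random starting vectors, each approximation is rounded to a discrete bipartition by its sign pattern, and the partition with the smallest cut is kept. The toy graph (two 5-cliques joined by one bridge edge) and all parameter choices are hypothetical.

```python
import numpy as np

def laplacian(A):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(A.sum(axis=1)) - A

def low_fidelity_fiedler(L, n_iter, rng):
    """Cheap approximation to the Fiedler vector: a few power iterations
    on shift*I - L, deflating the trivial constant eigenvector."""
    n = L.shape[0]
    shift = 2.0 * L.diagonal().max()      # Gershgorin bound keeps shift*I - L PSD
    M = shift * np.eye(n) - L
    ones = np.ones(n) / np.sqrt(n)
    x = rng.standard_normal(n)
    for _ in range(n_iter):
        x = M @ x
        x -= (x @ ones) * ones            # project out the constant component
        x /= np.linalg.norm(x)
    return x

def cut_value(A, labels):
    """Number of edges crossing the bipartition given by boolean labels."""
    return A[labels][:, ~labels].sum()

# Toy graph (hypothetical example): two 5-cliques joined by a single bridge edge.
n = 10
A = np.zeros((n, n))
A[:5, :5] = 1.0
A[5:, 5:] = 1.0
np.fill_diagonal(A, 0.0)
A[4, 5] = A[5, 4] = 1.0
L = laplacian(A)

# Ensemble of low-fidelity solves: independent random starts (parallelizable),
# each rounded to a discrete partition; keep the smallest cut found.
rng = np.random.default_rng(0)
best_cut, best_labels = np.inf, None
for _ in range(8):
    v = low_fidelity_fiedler(L, n_iter=15, rng=rng)
    labels = v > 0                        # sign rounding: continuous -> discrete
    if labels.all() or not labels.any():  # skip trivial partitions
        continue
    c = cut_value(A, labels)
    if c < best_cut:
        best_cut, best_labels = c, labels

print("best cut:", best_cut)
```

The sign rounding is where the continuous/discrete discrepancy mentioned above enters: several crude eigenvector approximations can round to different candidate partitions, and the best of those candidates may beat the rounding of a single, fully converged eigenvector.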
Additional info to come.