An Experimental Look at the Stability of Graph Neural Networks against Topological Perturbations

The Relationship Between Graph Properties and Stability

Abstract

Graph neural networks (GNNs) are a powerful tool for learning tasks on graph-structured data. However, the topology of the graph on which a GNN is trained is often subject to change due to random, external perturbations. This research investigates the relationship between five topological properties of graphs (assortativity, density, edge connectivity, closeness centrality, and diameter) and the stability of GNNs trained on graphs with different values of these properties under various perturbations. The analysis is conducted by first synthetically generating graphs with different topological properties and training a GNN on them. The synthetic graphs are then perturbed, and the relative change in the GNN's output is measured. These results are further supported by repeating the same process on three popular GNN benchmark datasets: the Cora, CiteSeer, and PubMed citation networks. Finally, relationships between the graph properties under investigation and GNN stability are inferred from the results obtained on both the synthetic and real-world datasets.
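The Python sketch below illustrates the kind of perturb-and-measure experiment described above, under assumptions not stated in the abstract: a single untrained GCN-style propagation layer stands in for the trained GNN, random edge deletions stand in for the external perturbation, and stability is proxied by the relative change in the layer's output. The helper names (`gcn_layer`, `perturb_edges`) and all parameter values are illustrative, not taken from the paper.

```python
import numpy as np
import networkx as nx

def gcn_layer(adj, features, weights):
    """One GCN-style propagation step: ReLU(D^{-1/2} (A + I) D^{-1/2} X W)."""
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ a_hat @ d_inv_sqrt @ features @ weights, 0.0)

def perturb_edges(graph, n_removals, rng):
    """Randomly delete edges to simulate an external topological perturbation."""
    perturbed = graph.copy()
    edges = list(perturbed.edges())
    drop = rng.choice(len(edges), size=min(n_removals, len(edges)), replace=False)
    perturbed.remove_edges_from([edges[i] for i in drop])
    return perturbed

rng = np.random.default_rng(0)
g = nx.gnp_random_graph(100, 0.05, seed=0)       # density controlled via edge probability p
x = rng.normal(size=(g.number_of_nodes(), 16))   # random node features
w = rng.normal(size=(16, 8))                     # fixed (untrained) layer weights

out_clean = gcn_layer(nx.to_numpy_array(g), x, w)
out_pert = gcn_layer(nx.to_numpy_array(perturb_edges(g, 20, rng)), x, w)

# Relative change in the GNN output, used here as a stability proxy
rel_change = np.linalg.norm(out_pert - out_clean) / np.linalg.norm(out_clean)
print(f"assortativity={nx.degree_assortativity_coefficient(g):.3f}, "
      f"density={nx.density(g):.3f}, relative output change={rel_change:.3f}")
```

In the full study, this loop would be repeated over graphs generated with systematically varied topological properties (and over the Cora, CiteSeer, and PubMed citation networks), so that the measured relative output change can be related to each property.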