Graph-Time Convolutional Neural Network

Learning from Time-Varying Signals defined on Graphs


Abstract

Time-varying network data are essential in several real-world applications, such as temperature forecasting and earthquake classification. These data are characterized by both spatial and temporal dependencies, and conventional machine learning tools often fail to capture these joint correlations. On the one hand, hybrid models for time-varying network data combine several specialized components, each capturing one type of dependency in isolation, and thus ignore the joint spatio-temporal interactions. On the other hand, state-of-the-art approaches that jointly learn from time-varying network data do not exploit the useful prior provided by a graph-time product graph. This prior structural knowledge can aid learning and improve model performance. For this reason, we propose a novel neural network architecture that learns from time-varying network data using product graphs and graph convolutions, thereby exploiting this prior during learning. In particular, our architecture (i) learns the graph-time product graph that best represents the time-varying network data and models the graph-time interactions; (ii) performs graph convolutions over this product graph to learn the graph-time interactions from data; (iii) employs graph-time pooling to reduce the dimensionality across layers. To the best of our knowledge, no prior work has attempted to capture spatial and temporal dependencies using graph convolutions over product graphs. Results on synthetic and real-world data show that the proposed method is effective for learning from time-varying network data in both regression and classification tasks. For classification, we curate a real-world dataset for earthquake classification and compare the proposed approach with state-of-the-art models; graph-aware models outperform graph-unaware models on this task. For regression, we evaluate the proposed method on two real-world temperature datasets and compare our architecture with graph-aware and graph-unaware models. On the smaller dataset, linear models outperform neural-network models, while on the larger dataset prior knowledge about graph-time interactions appears to be less beneficial when data are abundant. For the synthetic experiments, we find that (i) learning the structure of the graph-time product graph from data improves performance compared to adopting a fixed type of product graph; (ii) learning sparser graph-time product graphs improves performance further; (iii) the proposed graph-time pooling technique contributes to the model's generalization capabilities.
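
To make the core operation concrete, the sketch below builds a graph-time product graph as the Cartesian product of a spatial graph shift operator and a temporal path graph, and then applies a polynomial graph convolution to a flattened time-varying signal. This is a minimal NumPy illustration under assumed conventions (a fixed Cartesian product and hand-picked filter taps h); it is not the proposed architecture, which learns both the product-graph structure and the filter coefficients from data.

    import numpy as np

    def cartesian_product_graph(S_g, S_t):
        # Graph-time product graph: Cartesian product of the spatial shift
        # operator S_g (N x N) and the temporal shift operator S_t (T x T).
        N, T = S_g.shape[0], S_t.shape[0]
        return np.kron(np.eye(T), S_g) + np.kron(S_t, np.eye(N))

    def graph_time_convolution(x, S, h):
        # Polynomial graph convolution y = sum_k h[k] * S^k x over the
        # N*T nodes of the product graph (x is the flattened signal).
        y = np.zeros_like(x, dtype=float)
        Sx = x.astype(float)
        for k, hk in enumerate(h):
            if k > 0:
                Sx = S @ Sx
            y = y + hk * Sx
        return y

    # Toy example: N = 4 sensors observed over T = 3 time steps.
    rng = np.random.default_rng(0)
    A = (rng.random((4, 4)) < 0.5).astype(float)
    S_g = np.triu(A, 1) + np.triu(A, 1).T                    # symmetric spatial adjacency
    S_t = np.diag(np.ones(2), 1) + np.diag(np.ones(2), -1)   # path graph over time
    S = cartesian_product_graph(S_g, S_t)                    # (N*T) x (N*T) product-graph shift
    x = rng.standard_normal(4 * 3)                           # time-varying signal, flattened
    y = graph_time_convolution(x, S, h=[1.0, 0.5, 0.25])     # illustrative filter taps

Each power of the product-graph shift operator mixes information across both neighboring sensors and adjacent time steps, which is the joint spatio-temporal coupling the abstract refers to; the architecture stacks such convolutions with learned coefficients and interleaves them with graph-time pooling.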