Convolutional Graph Neural Networks

Conference Paper (2019)
Author(s)

Fernando Gama (University of Pennsylvania)

Antonio G. Marques (Universidad Rey Juan Carlos)

G. Leus (TU Delft - Signal Processing Systems)

Alejandro Ribeiro (University of Pennsylvania)

Research Group
Signal Processing Systems
Copyright
© 2019 Fernando Gama, Antonio G. Marques, G.J.T. Leus, Alejandro Ribeiro
DOI related publication
https://doi.org/10.1109/IEEECONF44664.2019.9048767
Publication Year
2019
Language
English
Pages (from-to)
452-456
ISBN (electronic)
9781728143002
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Convolutional neural networks (CNNs) restrict the otherwise arbitrary linear operation of neural networks to be a convolution with a bank of learned filters. This makes them suitable for learning tasks based on data that exhibit the regular structure of time signals and images. The use of convolutions, however, makes them unsuitable for processing data that do not exhibit such a regular structure. Graph signal processing (GSP) has emerged as a powerful alternative to process signals whose irregular structure can be described by a graph. Central to GSP is the notion of graph convolutional filters, which can be used to define convolutional graph neural networks (GNNs). In this paper, we show that the graph convolution can be interpreted as either a diffusion or aggregation operation. When combined with nonlinear processing, these different interpretations lead to different generalizations, which we term selection and aggregation GNNs. The selection GNN relies on linear combinations of signal diffusions at different resolutions combined with node-wise nonlinearities. The aggregation GNN relies on linear combinations of neighborhood averages of different depth. Instead of node-wise nonlinearities, the nonlinearity in aggregation GNNs is pointwise on the different aggregation levels. Both of these models particularize to regular CNNs when applied to time signals but are different when applied to arbitrary graphs. Numerical evaluations show different levels of performance for selection and aggregation GNNs.
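
The two readings of the graph convolution described in the abstract can be made concrete with a small sketch. The snippet below is a minimal illustration, not the authors' implementation: it assumes a graph shift operator S (e.g., an adjacency or Laplacian matrix), a single-feature graph signal x, and filter taps h, and shows (i) the graph filter written as a sum of diffusions, (ii) a selection GNN layer as that filter followed by a node-wise ReLU, and (iii) the aggregation front end that collects successive diffusions at one node into a regular, time-like sequence to which a standard CNN could then be applied. All function names, shapes, and values are illustrative assumptions.

    import numpy as np

    def graph_filter(S, x, h):
        # Graph convolution y = sum_k h[k] * S^k x.
        # S: (N, N) graph shift operator (assumed: adjacency or Laplacian)
        # x: (N,) graph signal; h: (K,) filter taps
        y = np.zeros_like(x, dtype=float)
        z = x.astype(float)          # z = S^0 x
        for hk in h:
            y += hk * z              # accumulate h_k S^k x
            z = S @ z                # diffuse one more hop for the next tap
        return y

    def selection_gnn_layer(S, x, h):
        # Selection GNN layer: graph filter followed by a node-wise ReLU.
        return np.maximum(graph_filter(S, x, h), 0.0)

    def aggregation_sequence(S, x, node, K):
        # Aggregation GNN front end: the sequence
        # (x_node, [Sx]_node, ..., [S^{K-1}x]_node) collected at one node.
        # Successive entries summarize neighborhoods of increasing depth,
        # so the sequence has a regular (time-like) structure.
        z = x.astype(float)
        seq = []
        for _ in range(K):
            seq.append(z[node])
            z = S @ z
        return np.array(seq)

    # Example usage on a tiny 4-node cycle graph (illustrative values only)
    S = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
    x = np.array([1.0, 0.0, 0.0, 0.0])
    h = np.array([1.0, 0.5, 0.25])       # K = 3 taps
    y = selection_gnn_layer(S, x, h)     # node-wise nonlinearity on the diffused signal
    seq = aggregation_sequence(S, x, node=0, K=4)

In this sketch the selection path applies the nonlinearity per node after the diffusion sum, while the aggregation path defers nonlinear processing to the regular sequence gathered at a single node, mirroring the distinction drawn in the abstract.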

Files

Convolutional_Graph_Neural_Net... (pdf | 0.716 MB)
Embargo expired on 01-02-2022