Nonlinear State-Space Generalizations of Graph Convolutional Neural Networks

Conference Paper (2021)
Author(s)

Luana Ruiz (University of Pennsylvania)

Fernando Gama (University of California)

Alejandro Ribeiro (University of Pennsylvania)

E. Isufi (TU Delft - Multimedia Computing)
Copyright
© 2021 Luana Ruiz, Fernando Gama, Alejandro Ribeiro, E. Isufi
DOI related publication
https://doi.org/10.1109/ICASSP39728.2021.9414672
Publication Year
2021
Language
English
Pages (from-to)
5265-5269
ISBN (print)
978-1-7281-7606-2
ISBN (electronic)
978-1-7281-7605-5
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Graph convolutional neural networks (GCNNs) learn compositional representations from network data by nesting linear graph convolutions into nonlinearities. In this work, we approach GCNNs from a state-space perspective, revealing that the graph convolutional module is a minimalistic linear state-space model in which the state update matrix is the graph shift operator. We show that this state update may be problematic because it is nonparametric and, depending on the graph spectrum, it may explode or vanish. Therefore, the GCNN has to trade its degrees of freedom between extracting features from data and handling these instabilities. To improve this trade-off, we propose a novel family of nodal aggregation rules that aggregate node features within a layer in a nonlinear, parametric state-space fashion. We develop two architectures within this family, inspired by recurrences with and without nodal gating mechanisms. The proposed solutions generalize the GCNN and provide an additional handle to control the state update and learn from the data. Numerical results on source localization and authorship attribution show the superiority of the nonlinear state-space generalizations over the baseline GCNN.
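The state-space view described in the abstract can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code: a graph convolution y = Σₖ hₖ Sᵏ x is computed via the recursion zₖ = S zₖ₋₁ with the shift operator S as the (nonparametric) state update matrix, and the spectral radius of S indicates whether repeated updates explode or vanish. The graph, filter taps, and signal below are randomly generated placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 5, 4                       # number of nodes, number of filter taps

# Symmetric adjacency matrix used as the graph shift operator S
A = rng.random((n, n))
S = (A + A.T) / 2
np.fill_diagonal(S, 0.0)

h = rng.standard_normal(K)        # filter taps (the learnable parameters in a GCNN)
x = rng.standard_normal(n)        # graph signal, one value per node

# State-space recursion: state z_k = S^k x, output y = sum_k h_k z_k.
# The state update z_k = S z_{k-1} is nonparametric; its growth is governed by S.
z = x.copy()
y = h[0] * z
for k in range(1, K):
    z = S @ z
    y += h[k] * z

# Sanity check: the recursion matches the polynomial-filter form y = sum_k h_k S^k x
y_direct = sum(h[k] * np.linalg.matrix_power(S, k) @ x for k in range(K))
assert np.allclose(y, y_direct)

# The spectral radius of S decides the fate of the state: repeated updates
# explode when rho(S) > 1 and vanish when rho(S) < 1.
rho = np.max(np.abs(np.linalg.eigvalsh(S)))
```

The nonlinear state-space generalizations proposed in the paper replace this fixed update zₖ = S zₖ₋₁ with parametric, possibly gated aggregation rules, giving the network an extra handle on the state dynamics.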

Files

00_rsn_ICASSP21.pdf
(pdf | 0.411 MB)
License info not available