Mixed Integer (Non-) Linear Programming Formulations of Graph Neural Networks

Abstract

Recently, ReLU neural networks have been modelled as constraints in mixed integer linear programming (MILP), enabling surrogate-based optimisation in various domains as well as efficient solution of machine learning verification problems. However, previous works have been limited to multilayer perceptrons (MLPs). The Graph Convolutional Network (GCN) model and the GraphSAGE model can learn efficiently from non-Euclidean data structures. We propose a bilinear formulation for ReLU GCNs and an MILP formulation for ReLU GraphSAGE models. We benchmark our formulations against a Genetic Algorithm (GA), comparing solution times and optimality gaps on a dataset of boiling points of different molecules. Our method guarantees global optimality for optimisation problems with trained GNNs embedded. Of our two formulations, the GraphSAGE neural network achieves similar model accuracy and faster solution times when embedded as a surrogate model in an MILP problem. Finally, we present a computer aided molecular design (CAMD) case study where the formulations of the trained GNNs are used to find molecules with optimal boiling points.
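The MILP encodings referred to above rest on the standard big-M formulation of a single ReLU unit y = max(0, a), where a is the pre-activation and L ≤ a ≤ U are finite bounds. The sketch below (an illustration of this well-known encoding, not code from the paper; the function name and tolerance are hypothetical) checks whether a point satisfies the four big-M inequalities, whose feasible integer points are exactly those with y = max(0, a):

```python
def relu_bigM_feasible(a, y, sigma, L, U, tol=1e-9):
    """Check the big-M MILP encoding of y = max(0, a).

    sigma is the binary indicator (1 = ReLU active, 0 = inactive).
    Requires bounds L <= 0 <= U on the pre-activation a.
    At any feasible point with sigma integral, y equals max(0, a).
    """
    assert L <= 0.0 <= U, "bounds must bracket zero"
    return (
        sigma in (0, 1)
        and y >= a - tol                     # y >= a
        and y >= -tol                        # y >= 0
        and y <= a - L * (1 - sigma) + tol   # y <= a - L(1 - sigma)
        and y <= U * sigma + tol             # y <= U * sigma
    )

# Active unit: a = 0.5 forces y = 0.5 with sigma = 1.
print(relu_bigM_feasible(0.5, 0.5, 1, -1.0, 1.0))   # True
# Inactive unit: a = -0.5 forces y = 0 with sigma = 0.
print(relu_bigM_feasible(-0.5, 0.0, 0, -1.0, 1.0))  # True
# Wrong output value is cut off by the constraints.
print(relu_bigM_feasible(0.5, 0.0, 0, -1.0, 1.0))   # False
```

In a full formulation each hidden neuron of the trained network contributes one such binary variable and four constraints, with a becoming an affine expression of the previous layer's outputs; the quality of the bounds L and U strongly affects solver performance.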