Tensor Network B-splines for high-dimensional function approximation

Abstract

B-splines are basis functions for the spline function space and are used extensively in applications that require function approximation. B-splines are generalized to multiple dimensions through tensor products of their univariate basis functions. The number of basis functions and weights that define a multivariate B-spline surface therefore increases exponentially with the number of dimensions, i.e. B-splines suffer from the curse of dimensionality. Tensor network theory provides a mathematical framework to alleviate this curse of dimensionality by representing the high-dimensional weight tensor as a low-rank approximation. This thesis presents the Tensor Network B-spline (TNBS) model, along with an optimization algorithm that estimates the exponentially large weight tensor directly from data, without ever constructing it explicitly. P-spline regularization is incorporated to induce additional smoothness and to ensure that the B-spline hypersurface generalizes well across the high-volume domain. The developed TNBS framework opens the door to applying B-spline theory to high-dimensional function approximation. This thesis provides an overview of both B-spline and tensor network theory and then uses them to derive the TNBS model. The effectiveness of the model is validated through an application to black-box nonlinear system identification using a NARX approach. An open-source MATLAB implementation of TNBS is made available on GitHub. The work concludes with recommendations for further research on this topic.
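
To illustrate the central idea, the following is a minimal MATLAB sketch (not the thesis code) of how a d-variate B-spline can be evaluated when its weight tensor is kept in tensor-train form: the univariate basis vectors are contracted with the tensor-train cores one dimension at a time, so the full weight tensor with n^d entries is never constructed. The core array G, the basis vectors b, the random placeholder weights, and the use of spcol/augknt (Curve Fitting Toolbox) are illustrative assumptions; the thesis' actual estimation algorithm is not reproduced here.

```matlab
% Sketch: evaluate a d-variate B-spline whose weight tensor is stored
% as a tensor train (TT), avoiding the full n^d weight tensor.

d = 4;              % number of input dimensions
n = 8;              % univariate basis functions per dimension
r = 3;              % TT-rank (boundary ranks are 1)

% Hypothetical TT cores of the weight tensor: G{k} is r_{k-1} x n x r_k.
% Random placeholders stand in for weights estimated from data.
ranks = [1, r*ones(1, d-1), 1];
G = cell(1, d);
for k = 1:d
    G{k} = randn(ranks(k), n, ranks(k+1));
end

% Univariate cubic B-spline bases on [0,1]; b{k} is the 1 x n vector of
% basis values at the k-th coordinate of the evaluation point x.
x = rand(1, d);                           % evaluation point
knots = augknt(linspace(0, 1, n-2), 4);   % cubic (order 4) knot sequence
b = cell(1, d);
for k = 1:d
    b{k} = spcol(knots, 4, x(k));         % 1 x n row vector
end

% Contract dimension by dimension: cost O(d*n*r^2) instead of O(n^d).
v = 1;                                    % 1 x r_0
for k = 1:d
    Gk = reshape(G{k}, ranks(k), n*ranks(k+1));   % r_{k-1} x (n*r_k)
    M  = reshape(v*Gk, n, ranks(k+1));            % n x r_k
    v  = b{k}*M;                                  % 1 x r_k
end
y = v;   % scalar model output at x
```

The same contraction pattern is what makes direct estimation of the cores from data tractable: each step only ever touches one core and one univariate basis vector, so storage and computation scale linearly in the number of dimensions rather than exponentially.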