Batch Bayesian Learning of Large-Scale LS-SVMs Based on Low-rank Tensor Networks

Abstract

Least Squares Support Vector Machines (LS-SVMs) are state-of-the-art learning algorithms widely used for pattern recognition. An LS-SVM is trained by solving a system of linear equations, which has a computational complexity of O(N^3) in the number of training samples N. As datasets grow, solving LS-SVM problems with standard methods becomes burdensome or even infeasible. The Tensor Train (TT) decomposition provides a way to represent data in a highly compressed format without loss of accuracy. By converting vectors and matrices into the TT format, storage and computational requirements can be greatly reduced. In this thesis, we develop a Bayesian learning method in the TT format to solve large-scale LS-SVM problems, which involves the computation of a matrix inverse. The Bayesian formulation allows us to encode what we know about the model parameters in the prior distribution. As a result, we obtain a probability distribution over the parameters, from which confidence levels for the predictions can be constructed. Numerical experiments show that the developed method performs competitively with existing methods.
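To make the O(N^3) bottleneck concrete, the following is a minimal sketch (not the thesis's method) of the standard dense LS-SVM dual formulation: a toy classifier with an RBF kernel, trained by one direct linear solve. All names, data, and hyperparameter values here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and the rows of B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the standard LS-SVM dual linear system
        [ 0   1^T          ] [ b     ]   [ 0 ]
        [ 1   K + I/gamma  ] [ alpha ] = [ y ]
    with a dense direct solver, which costs O(N^3)."""
    N = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((N + 1, N + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(N) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)   # the O(N^3) step
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_train, X_new, b, alpha, sigma=1.0):
    """Kernel expansion f(x) = sum_i alpha_i k(x, x_i) + b."""
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Tiny, well-separated two-class toy problem (illustrative only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
b, alpha = lssvm_fit(X, y)
preds = np.sign(lssvm_predict(X, X, b, alpha))
print("training accuracy:", (preds == y).mean())
```

The dense (N+1)-by-(N+1) solve is exactly the step that becomes infeasible for large N and that the TT-based approach is designed to avoid.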
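The Bayesian side of the abstract, obtaining a distribution over parameters and confidence levels for predictions, can be sketched on a plain linear-in-parameters model with a Gaussian prior (the TT machinery and the LS-SVM specifics are omitted; all names and values below are illustrative assumptions, not the thesis's implementation).

```python
import numpy as np

rng = np.random.default_rng(1)
Phi = rng.normal(size=(50, 3))               # design matrix of features
w_true = np.array([1.0, -2.0, 0.5])          # ground-truth weights (toy)
y = Phi @ w_true + rng.normal(0, 0.1, 50)    # noisy targets

alpha_prior, beta_noise = 1.0, 100.0         # prior precision, noise precision
# Gaussian posterior over weights: N(m, S) with
#   S^{-1} = alpha_prior * I + beta_noise * Phi^T Phi
#   m      = beta_noise * S Phi^T y
S = np.linalg.inv(alpha_prior * np.eye(3) + beta_noise * Phi.T @ Phi)
m = beta_noise * S @ Phi.T @ y

# Predictive distribution at a new input: mean plus a variance that
# combines observation noise and parameter uncertainty.
phi_new = np.array([0.5, 0.5, 0.5])
pred_mean = phi_new @ m
pred_var = 1.0 / beta_noise + phi_new @ S @ phi_new
print("prediction:", pred_mean, "+/-", 1.96 * np.sqrt(pred_var))
```

The posterior covariance S is what makes confidence levels possible; note that computing it requires a matrix inverse, matching the abstract's remark that the Bayesian method involves inverting a matrix.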