Alternating linear scheme in a Bayesian framework for low-rank tensor approximation

Journal Article (2022)
Author(s)

C.M. Menzen (TU Delft - Team Manon Kok)

M. Kok (TU Delft - Team Manon Kok)

K. Batselier (TU Delft - Team Kim Batselier)

Research Group
Team Manon Kok
Copyright
© 2022 C.M. Menzen, M. Kok, K. Batselier
DOI related publication
https://doi.org/10.1137/20M1386414
Publication Year
2022
Language
English
Issue number
3
Volume number
44
Pages (from-to)
A1116-A1144
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Multiway data often occur naturally in a tensorial format that can be approximately represented by a low-rank tensor decomposition. This is useful because it significantly reduces complexity and facilitates the treatment of large-scale data sets. In this paper, we find a low-rank representation for a given tensor by solving a Bayesian inference problem. This is achieved by dividing the overall inference problem into subproblems in which we sequentially infer the posterior distribution of one tensor decomposition component at a time. This leads to a probabilistic interpretation of the well-known iterative algorithm, the alternating linear scheme (ALS). It enables the consideration of measurement noise, the incorporation of application-specific prior knowledge, and the uncertainty quantification of the low-rank tensor estimate. To compute the low-rank tensor estimate from the posterior distributions of the tensor decomposition components, we present an algorithm that performs the unscented transform in tensor train format.
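
The following is a minimal NumPy sketch, not the authors' implementation, of the core idea behind a Bayesian alternating update: when all but one decomposition component are held fixed, the tensor model is linear in the remaining component, so with Gaussian noise and a Gaussian prior its posterior is available in closed form. All names (gaussian_component_update, A, sigma, m0, P0) and the rank-1 toy example are illustrative assumptions, and the sketch uses a two-way outer product rather than the tensor train format treated in the paper.

```python
import numpy as np

def gaussian_component_update(A, y, sigma, m0, P0):
    """Posterior mean and covariance of one vectorized component x under
    y = A @ x + e, e ~ N(0, sigma^2 I), with prior x ~ N(m0, P0)."""
    P0_inv = np.linalg.inv(P0)
    precision = P0_inv + (A.T @ A) / sigma**2
    cov = np.linalg.inv(precision)
    mean = cov @ (P0_inv @ m0 + (A.T @ y) / sigma**2)
    return mean, cov

# Toy usage: a rank-1 two-way "tensor" Y ~ outer(u, v) + noise,
# updated by alternating conditional Gaussian inferences for u and v.
rng = np.random.default_rng(0)
n1, n2, sigma = 5, 4, 0.01
u_true, v_true = rng.normal(size=n1), rng.normal(size=n2)
Y = np.outer(u_true, v_true) + sigma * rng.normal(size=(n1, n2))
y_vec = Y.flatten(order="F")            # column-major vectorization

u, v = rng.normal(size=n1), rng.normal(size=n2)
for _ in range(20):
    # With v fixed, vec(u v^T) = (v kron I) u is linear in u.
    A_u = np.kron(v.reshape(-1, 1), np.eye(n1))
    u, _ = gaussian_component_update(A_u, y_vec, sigma,
                                     np.zeros(n1), 10.0 * np.eye(n1))
    # With u fixed, vec(u v^T) = (I kron u) v is linear in v.
    A_v = np.kron(np.eye(n2), u.reshape(-1, 1))
    v, _ = gaussian_component_update(A_v, y_vec, sigma,
                                     np.zeros(n2), 10.0 * np.eye(n2))
```

Each pass mirrors one ALS sweep, except that every component carries a full Gaussian posterior instead of a point estimate, which is what makes noise modeling, priors, and uncertainty quantification possible.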

Files

20m1386414.pdf
(pdf | 2.55 MB)
License info not available