MERACLE

Constructive Layer-Wise Conversion of a Tensor Train into a MERA

Journal Article (2021)
Author(s)

Kim Batselier (TU Delft - Team Kim Batselier)

Andrzej Cichocki (Skolkovo Institute of Science and Technology)

Ngai Wong (The University of Hong Kong)

Research Group
Team Kim Batselier
Copyright
© 2021 K. Batselier, Andrzej Cichocki, Ngai Wong
DOI
https://doi.org/10.1007/s42967-020-00090-6
Publication Year
2021
Language
English
Issue number
2
Volume number
3
Pages (from-to)
257-279
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

In this article, two new algorithms are presented that convert a given data tensor train into either a Tucker decomposition with orthogonal matrix factors or a multi-scale entanglement renormalization ansatz (MERA). The Tucker core tensor is never explicitly computed but is instead stored as a tensor train, resulting in algorithms that are efficient in both computation and storage. Both the multilinear Tucker-ranks and the MERA-ranks are determined automatically by the algorithms for a given upper bound on the relative approximation error. In addition, an iterative algorithm with low computational complexity, based on solving an orthogonal Procrustes problem, is proposed for the first time to retrieve optimal rank-lowering disentangler tensors, which are a crucial component in the construction of a low-rank MERA. Numerical experiments demonstrate the effectiveness of the proposed algorithms together with the potential storage benefit of a low-rank MERA over a tensor train.
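The orthogonal Procrustes problem referenced in the abstract has a well-known closed-form solution via the singular value decomposition: the orthogonal matrix Q minimizing ||QA - B||_F is Q = UVᵀ, where UΣVᵀ is the SVD of BAᵀ. The sketch below illustrates only this generic subproblem, not the paper's disentangler-retrieval algorithm; all names are illustrative.

```python
import numpy as np

def orthogonal_procrustes(A, B):
    """Return the orthogonal Q minimizing ||Q @ A - B||_F.

    Closed-form solution: Q = U @ Vt, where U, Vt come from
    the SVD of B @ A.T.
    """
    U, _, Vt = np.linalg.svd(B @ A.T)
    return U @ Vt

# Sanity check: recover a known orthogonal transform.
rng = np.random.default_rng(0)
Q_true, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = rng.standard_normal((4, 6))
B = Q_true @ A
Q = orthogonal_procrustes(A, B)
```

In the paper's setting, an analogous least-squares problem over orthogonal (unitary) matrices is solved iteratively to obtain the rank-lowering disentanglers of the MERA.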