Searched for: subject:"tensor trains"
(1 - 7 of 7)
Li, Lingjie (author), Yu, Wenjian (author), Batselier, K. (author)
In recent years, the application of tensors has become more widespread in fields that involve data analytics and numerical computation. Due to the explosive growth of data, low-rank tensor decompositions have become a powerful tool to tame the notorious curse of dimensionality. The main forms of tensor decomposition include CP...
journal article 2022
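As a rough illustration of the tensor-train (TT) format this entry refers to, the sketch below implements the classic TT-SVD construction in NumPy, factoring a dense tensor into a chain of three-way cores by sequential truncated SVDs. The function name tt_svd and the fixed rank cap are illustrative choices, not taken from the paper.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a dense tensor into tensor-train (TT) cores via
    sequential truncated SVDs (the classic TT-SVD construction)."""
    dims = tensor.shape
    cores = []
    rank = 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = min(max_rank, s.size)
        # Core k holds the left singular vectors, folded to 3-way shape.
        cores.append(u[:, :new_rank].reshape(rank, dims[k], new_rank))
        rank = new_rank
        # Carry the remainder to the next mode and refold it.
        mat = (s[:new_rank, None] * vt[:new_rank]).reshape(rank * dims[k + 1], -1)
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

# Example: a 4-way tensor compressed to TT ranks of at most 3.
cores = tt_svd(np.random.rand(4, 5, 6, 7), max_rank=3)
```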
Menzen, C.M. (author), Kok, M. (author), Batselier, K. (author)
Multiway data often naturally occurs in a tensorial format which can be approximately represented by a low-rank tensor decomposition. This is useful because complexity can be significantly reduced and the treatment of large-scale data sets can be facilitated. In this paper, we find a low-rank representation for a given tensor by solving a...
journal article 2022
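To make "finding a low-rank representation for a given tensor" concrete, the hypothetical helper below contracts TT cores back into the full tensor so the quality of an approximation can be checked against the original data. The paper's actual solver is truncated in the snippet above and is not reproduced here.

```python
import numpy as np

def tt_to_full(cores):
    """Contract TT cores back into the full tensor, e.g. to measure
    how well a low-rank representation approximates the original."""
    full = cores[0]                                   # shape (1, n_1, r_1)
    for core in cores[1:]:
        # Contract the trailing rank index with the next core's leading one.
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))                 # drop the boundary ranks

# Example: relative approximation error of a TT representation.
# err = np.linalg.norm(tt_to_full(cores) - tensor) / np.linalg.norm(tensor)
```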
Batselier, K. (author), Cichocki, Andrzej (author), Wong, Ngai (author)
In this article, two new algorithms are presented that convert a given data tensor train into either a Tucker decomposition with orthogonal matrix factors or a multi-scale entanglement renormalization ansatz (MERA). The Tucker core tensor is never explicitly computed but stored as a tensor train instead, resulting in both computationally and...
journal article 2021
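One standard way to see the TT-to-Tucker connection this article exploits: the mode-k fibers of the full tensor lie in the range of the (n_k x r_{k-1} r_k) unfolding of the k-th TT core, so an SVD of each unfolded core yields orthogonal Tucker factor matrices. The sketch below shows only this step; it is not the article's algorithm, which additionally keeps the Tucker core in TT form rather than computing it.

```python
import numpy as np

def tucker_factors_from_tt(cores):
    """Extract orthogonal Tucker factor matrices from TT cores by an
    SVD of each core's mode-2 (physical-index) unfolding."""
    factors = []
    for core in cores:                  # core shape: (r_prev, n_k, r_next)
        r_prev, n_k, r_next = core.shape
        unfolding = core.transpose(1, 0, 2).reshape(n_k, r_prev * r_next)
        u, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(u)               # orthonormal basis for mode k
    return factors
```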
Ko, Ching Yun (author), Batselier, K. (author), Daniel, Luca (author), Yu, Wenjian (author), Wong, Ngai (author)
We propose a new tensor completion method based on tensor trains. The to-be-completed tensor is modeled as a low-rank tensor train, where we use the known tensor entries and their coordinates to update the tensor train. A novel tensor train initialization procedure is proposed specifically for image and video completion, which is demonstrated...
journal article 2020
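Completion schemes of this kind rely on the fact that a single entry of a tensor stored in TT format is cheap to evaluate: one small matrix slice per core, multiplied left to right. A minimal sketch (the helper tt_entry is a hypothetical name, not from the paper):

```python
import numpy as np

def tt_entry(cores, index):
    """Evaluate one entry of a TT-format tensor:
    T[i1, ..., id] = G1[:, i1, :] @ G2[:, i2, :] @ ... @ Gd[:, id, :]."""
    vec = np.ones((1, 1))
    for core, i in zip(cores, index):
        vec = vec @ core[:, i, :]       # running product, shape (1, r_k)
    return vec.item()
```

Each known entry thus touches only d small matrices, which is why updating the TT cores from a sparse set of observed entries scales to large tensors.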
Ko, Ching Yun (author), Chen, Cong (author), He, Zhuolun (author), Zhang, Yuke (author), Batselier, K. (author), Wong, Ngai (author)
Sum-product networks (SPNs) constitute an emerging class of neural networks with clear probabilistic semantics and superior inference speed over other graphical models. This brief reveals an important connection between SPNs and tensor trains (TTs), leading to a new canonical form which we call tensor SPNs (tSPNs). Specifically, we...
journal article 2020
Chen, Cong (author), Batselier, K. (author), Ko, Ching Yun (author), Wong, Ngai (author)
There has been growing interest in extending traditional vector-based machine learning techniques to their tensor forms. The support tensor machine (STM) and support Tucker machine (STuM) are two typical tensor generalizations of the conventional support vector machine (SVM). However, the expressive power of STM is limited due to its rank-one...
conference paper 2019
Gedon, Daniel (author), Piscaer, P.J. (author), Batselier, K. (author), Smith, C.S. (author), Verhaegen, M.H.G. (author)
An extension of the Tensor Network (TN) Kalman filter [2], [3] for large-scale LTI systems is presented in this paper. The TN Kalman filter can handle exponentially large state vectors without constructing them explicitly. In order to have efficient algebraic operations, a low TN rank is required. We exploit the possibility to approximate the...
conference paper 2019
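The key point, that exponentially large state vectors never need to be constructed explicitly, can be illustrated with a Kronecker-structured operator acting core-wise on a TT-formatted state. The sketch below assumes the state tensorizes so that the k-th Kronecker factor acts on the k-th TT mode; it is an illustration of the storage trick, not the filter from the paper.

```python
import numpy as np

def apply_kronecker_to_tt(factors, cores):
    """Apply A = A_1 kron ... kron A_d to a state vector stored as a
    tensor train, without forming the length-prod(n_k) vector: each
    factor acts on the physical index of its own core, so the TT
    ranks are unchanged."""
    return [np.einsum('ji,aib->ajb', A, G) for A, G in zip(factors, cores)]
```

A dense state of length n^d needs n^d numbers, while its TT representation needs only about d * n * r^2 for TT rank r, which is why keeping the rank low is essential for efficient algebraic operations.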