Searched for: subject:"tensor-trains"
(1 - 15 of 15)
Chimmalgi, S. (author), Wahls, S. (author)
Riemann theta functions play a crucial role in the field of nonlinear Fourier analysis, where they are used to realize inverse nonlinear Fourier transforms for periodic signals. The practical applicability of this approach has, however, been limited, since Riemann theta functions are multi-dimensional Fourier series whose computation suffers...
journal article 2023
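A minimal sketch, for orientation only: the Riemann theta function is the multi-dimensional Fourier series theta(z | Omega) = sum over n in Z^g of exp(2*pi*i*(1/2 n^T Omega n + n^T z)), and evaluating it by brute-force truncation requires (2*radius + 1)^g terms, which is presumably the cost the truncated abstract alludes to. The function name riemann_theta, the truncation radius, and the toy inputs below are illustrative assumptions, not taken from the paper.

```python
import itertools
import numpy as np

def riemann_theta(z, Omega, radius=5):
    """Naive evaluation of the Riemann theta function by truncating its
    multi-dimensional Fourier series to the lattice {-radius, ..., radius}^g.

    z     : complex vector of length g
    Omega : symmetric g x g complex matrix whose imaginary part is
            positive definite (required for the series to converge)
    """
    g = len(z)
    total = 0.0 + 0.0j
    for n in itertools.product(range(-radius, radius + 1), repeat=g):
        n = np.asarray(n, dtype=float)
        total += np.exp(2j * np.pi * (0.5 * n @ Omega @ n + n @ z))
    return total

# toy example: for Omega = i*I the function factors into 1-D theta functions
Omega = 1j * np.eye(2)
z = np.array([0.1 + 0.2j, -0.3 + 0.1j])
print(riemann_theta(z, Omega))
```

The exponential growth of the term count in g is the curse-of-dimensionality issue that low-rank tensor formats such as tensor trains are typically used to mitigate.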
Li, Z. (author), Rajan, R.T. (author)
Multiagent systems have been widely researched and deployed in the industry for their potential to collectively achieve goals by distributing tasks to individual agents [1]–[4]. Formation control, one of the many applications of multiagent systems, aims at steering agents into a stable geometric pattern in space [3], [4]. There has been a...
conference paper 2022
de Rooij, S.J.S. (author), Hunyadi, Borbala (author)
Epilepsy is one of the most common neurological conditions, affecting nearly 1% of the global population. It is defined by the seemingly random occurrence of spontaneous seizures. Anti-epileptic drugs provide adequate treatment for about 70% of patients. The remaining 30%, on the other hand, continue to have seizures, which has a significant...
conference paper 2022
Goedemondt, K.S. (author), Yang, J. (author), Wang, Q. (author)
Touchscreens and buttons became a medium for virus transmission during the COVID-19 pandemic. We have seen in our daily life that people use tissues and keys to press buttons inside elevators, on public screens, etc. In the post-COVID world, touch-free interaction with public touchscreens and buttons may become more popular. Motivated by...
conference paper 2022
Li, Lingjie (author), Yu, Wenjian (author), Batselier, K. (author)
In recent years, the application of tensors has become more widespread in fields that involve data analytics and numerical computation. Due to the explosive growth of data, low-rank tensor decompositions have become a powerful tool to harness the notorious curse of dimensionality. The main forms of tensor decomposition include CP...
journal article 2022
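Since the search subject is tensor trains, a compact reference point may be useful: the classical TT-SVD scheme builds a tensor-train decomposition by sequential truncated SVDs of reshaped unfoldings. The sketch below is a generic illustration under that assumption and is not code from the paper.

```python
import numpy as np

def tt_svd(tensor, rel_tol=1e-10):
    """Decompose a d-way numpy array into tensor-train (TT) cores via
    sequential truncated SVDs.  Core k has shape (r_{k-1}, n_k, r_k); with
    rel_tol = 0 the cores reproduce the tensor exactly."""
    dims = tensor.shape
    cores, rank = [], 1
    unfolding = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
        # keep singular values above a relative threshold
        new_rank = max(1, int(np.sum(s > rel_tol * s[0])))
        cores.append(U[:, :new_rank].reshape(rank, dims[k], new_rank))
        # push the remaining factor into the next unfolding
        unfolding = (s[:new_rank, None] * Vt[:new_rank]).reshape(
            new_rank * dims[k + 1], -1)
        rank = new_rank
    cores.append(unfolding.reshape(rank, dims[-1], 1))
    return cores

# toy check on a random 4-way tensor
cores = tt_svd(np.random.rand(4, 5, 6, 7))
print([G.shape for G in cores])
```

Storage drops from the product of all mode sizes to a sum of r_{k-1} * n_k * r_k terms per core, which is the usual argument for using low-rank tensor formats against the curse of dimensionality.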
Menzen, C.M. (author), Kok, M. (author), Batselier, K. (author)
Multiway data often naturally occurs in a tensorial format which can be approximately represented by a low-rank tensor decomposition. This is useful because complexity can be significantly reduced and the treatment of large-scale data sets can be facilitated. In this paper, we find a low-rank representation for a given tensor by solving a...
journal article 2022
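As one concrete and deliberately simple example of approximating multiway data by a low-rank tensor decomposition, the truncated higher-order SVD below computes a Tucker representation with orthogonal factor matrices. It is a generic illustration, not the solution method of this paper.

```python
import numpy as np

def truncated_hosvd(tensor, ranks):
    """Low-rank Tucker approximation via truncated higher-order SVD:
    for each mode k, take the leading ranks[k] left singular vectors of
    the mode-k unfolding as factor U_k, then project the tensor onto all
    factors to obtain the core."""
    factors, core = [], tensor
    for k in range(tensor.ndim):
        # mode-k unfolding: mode k in front, all remaining modes flattened
        unfolding = np.moveaxis(tensor, k, 0).reshape(tensor.shape[k], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :ranks[k]])
        # contract mode k of the running core with U_k; tensordot moves the
        # new mode to the end, so after ndim steps the order is restored
        core = np.tensordot(core, factors[k], axes=([0], [0]))
    return core, factors

core, factors = truncated_hosvd(np.random.rand(6, 7, 8), (3, 3, 3))
print(core.shape, [U.shape for U in factors])
```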
van Klaveren, Pieter (author)
Online video completion aims to complete corrupted frames of a video in an online fashion. Consider a surveillance camera that suddenly outputs corrupted data, where up to 95% of the pixels per frame are corrupted. Real time video completion and correction is often desirable in such scenarios. Therefore, this thesis improves the Tensor-Networked...
master thesis 2021
Batselier, K. (author), Cichocki, Andrzej (author), Wong, Ngai (author)
In this article, two new algorithms are presented that convert a given data tensor train into either a Tucker decomposition with orthogonal matrix factors or a multi-scale entanglement renormalization ansatz (MERA). The Tucker core tensor is never explicitly computed but stored as a tensor train instead, resulting in both computationally and...
journal article 2021
de Rooij, S.J.S. (author)
In streaming video completion one aims to fill in missing pixels in streaming video data. This is a problem that naturally arises in the context of surveillance videos. Since these are streaming videos, they must be completed online and in real-time. This makes the streaming video completion problem significantly more difficult than the related...
master thesis 2020
Ko, Ching Yun (author), Batselier, K. (author), Daniel, Luca (author), Yu, Wenjian (author), Wong, Ngai (author)
We propose a new tensor completion method based on tensor trains. The to-be-completed tensor is modeled as a low-rank tensor train, where we use the known tensor entries and their coordinates to update the tensor train. A novel tensor train initialization procedure is proposed specifically for image and video completion, which is demonstrated...
journal article 2020
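The basic operation behind this kind of TT-based completion is cheap entry access: a single entry of a tensor stored as TT cores is a product of d small matrices, so known entries and their coordinates can be compared against the low-rank model without ever forming the full tensor. The helper below is a hypothetical sketch of that step, not the paper's update algorithm.

```python
import numpy as np

def tt_entry(cores, index):
    """Evaluate one entry of a tensor given in tensor-train form.
    cores : list of d arrays, core k of shape (r_{k-1}, n_k, r_k)
    index : tuple of d integer coordinates"""
    result = cores[0][:, index[0], :]           # shape (1, r_1)
    for core, i in zip(cores[1:], index[1:]):
        result = result @ core[:, i, :]         # shape (1, r_k)
    return result[0, 0]

# toy usage: random TT cores of a 10 x 10 x 10 tensor, TT-ranks (1, 3, 3, 1)
cores = [np.random.rand(1, 10, 3),
         np.random.rand(3, 10, 3),
         np.random.rand(3, 10, 1)]
print(tt_entry(cores, (2, 5, 7)))
```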
Ko, Ching Yun (author), Chen, Cong (author), He, Zhuolun (author), Zhang, Yuke (author), Batselier, K. (author), Wong, Ngai (author)
Sum-product networks (SPNs) constitute an emerging class of neural networks with clear probabilistic semantics and superior inference speed over other graphical models. This brief reveals an important connection between SPNs and tensor trains (TTs), leading to a new canonical form which we call tensor SPNs (tSPNs). Specifically, we...
journal article 2020
Chen, Cong (author), Batselier, K. (author), Ko, Ching Yun (author), Wong, Ngai (author)
There has been growing interest in extending traditional vector-based machine learning techniques to their tensor forms. The support tensor machine (STM) and the support Tucker machine (STuM) are two typical tensor generalizations of the conventional support vector machine (SVM). However, the expressive power of STM is restricted due to its rank-one...
conference paper 2019
Gedon, Daniel (author), Piscaer, P.J. (author), Batselier, K. (author), Smith, C.S. (author), Verhaegen, M.H.G. (author)
An extension of the Tensor Network (TN) Kalman filter [2], [3] for large scale LTI systems is presented in this paper. The TN Kalman filter can handle exponentially large state vectors without constructing them explicitly. In order to have efficient algebraic operations, a low TN rank is required. We exploit the possibility to approximate the...
conference paper 2019
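For readers unfamiliar with the baseline: a Kalman filter alternates a prediction and a measurement update, and the tensor-network variant described in this entry performs that same recursion while keeping the exponentially large state and covariance in low-rank tensor-network form instead of building them explicitly. The numpy sketch below shows only the plain, non-tensor-network recursion and uses made-up toy values.

```python
import numpy as np

def kalman_step(x, P, A, C, Q, R, y):
    """One predict/update step of the standard Kalman filter for the LTI
    model x_{k+1} = A x_k + w, y_k = C x_k + v with cov(w) = Q, cov(v) = R."""
    # prediction
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # measurement update
    S = C @ P_pred @ C.T + R                 # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new

# toy usage: scalar random walk observed in noise (illustrative values only)
A = np.array([[1.0]]); C = np.array([[1.0]])
Q = np.array([[0.01]]); R = np.array([[0.1]])
x, P = np.zeros(1), np.eye(1)
x, P = kalman_step(x, P, A, C, Q, R, y=np.array([0.7]))
print(x, P)
```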
Gunes, Bilal (author)
doctoral thesis 2018
Gunes, Bilal (author), van Wingerden, J.W. (author), Verhaegen, M.H.G. (author)
In this paper, we present a novel multiple input multiple output (MIMO) linear parameter-varying (LPV) state-space refinement system identification algorithm that uses tensor networks. Its novelty mainly lies in representing the LPV sub-Markov parameters, data, and state-revealing matrix condensely and in an exact manner using specific tensor...
journal article 2018