Searched for: +
(1 - 4 of 4)
document
Chen, Cong (author), Batselier, K. (author), Yu, Wenjian (author), Wong, Ngai (author)
Tensors, as multi-dimensional data structures, have recently been exploited in the machine learning community. Traditional machine learning approaches are vector- or matrix-based and cannot handle tensorial data directly. In this paper, we propose a tensor train (TT)-based kernel technique for the first time, and apply it to the conventional...
journal article 2022
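The tensor train (TT) format recurring in these entries factors a d-way tensor into a chain of small 3-way cores. As background, a minimal sketch of the standard TT-SVD decomposition via sequential truncated SVDs — function names, the toy tensor, and the rank cap are illustrative choices, not code from the paper:

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a full tensor into tensor-train (TT) cores via sequential SVDs."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    rank = 1
    mat = tensor.reshape(rank * shape[0], -1)
    for k in range(d - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r_next = min(max_rank, len(S))          # truncate to the rank cap
        cores.append(U[:, :r_next].reshape(rank, shape[k], r_next))
        # Carry the remainder S @ Vt forward and fold in the next mode.
        mat = (np.diag(S[:r_next]) @ Vt[:r_next]).reshape(r_next * shape[k + 1], -1)
        rank = r_next
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])

# Demo: with a rank cap large enough that no truncation occurs,
# the reconstruction is numerically exact.
T = np.random.default_rng(0).random((3, 4, 5))
cores = tt_svd(T, max_rank=20)
print(np.allclose(T, tt_reconstruct(cores)))  # True
```

Truncating `max_rank` below the exact TT-ranks trades reconstruction accuracy for storage: the chain of cores holds far fewer entries than the full tensor when the ranks are small.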
document
Ko, Ching Yun (author), Chen, Cong (author), He, Zhuolun (author), Zhang, Yuke (author), Batselier, K. (author), Wong, Ngai (author)
Sum-product networks (SPNs) constitute an emerging class of neural networks with clear probabilistic semantics and superior inference speed over other graphical models. This brief reveals an important connection between SPNs and tensor trains (TTs), leading to a new canonical form which we call tensor SPNs (tSPNs). Specifically, we...
journal article 2020
document
Chen, Cong (author), Batselier, K. (author), Ko, Ching Yun (author), Wong, Ngai (author)
A restricted Boltzmann machine (RBM) learns a probability distribution over its input samples and has numerous uses like dimensionality reduction, classification and generative modeling. Conventional RBMs accept vectorized data that dismiss potentially important structural information in the original tensor (multi-way) input. Matrix-variate...
conference paper 2019
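The entry above contrasts conventional RBMs, which vectorize their input, with tensor-input variants. The paper's matrix-variate model is not reproduced here; as background only, a minimal conventional binary RBM trained with one-step contrastive divergence (CD-1) — the class, sizes, and toy data are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

class RBM:
    """Minimal binary RBM trained with one-step contrastive divergence (CD-1)."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b = np.zeros(n_visible)  # visible bias
        self.c = np.zeros(n_hidden)   # hidden bias
        self.lr = lr

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_step(self, v0):
        # Positive phase: hidden activations given the data.
        ph0 = self._sigmoid(v0 @ self.W + self.c)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step back to visible, then hidden probs.
        pv1 = self._sigmoid(h0 @ self.W.T + self.b)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = self._sigmoid(v1 @ self.W + self.c)
        # CD-1 gradient approximation.
        batch = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / batch
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)
        return float(np.mean((v0 - pv1) ** 2))  # reconstruction error

# Demo on a toy dataset of two repeated binary patterns (hypothetical data).
data = np.array([[1, 1, 0, 0, 1, 1], [0, 0, 1, 1, 0, 0]] * 8, dtype=float)
rbm = RBM(n_visible=6, n_hidden=4)
for _ in range(100):
    err = rbm.cd1_step(data)
```

Note that the vectorized input `v0` is exactly what the paper argues against: flattening a multi-way input into a vector discards its tensor structure.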
document
Chen, Cong (author), Batselier, K. (author), Ko, Ching Yun (author), Wong, Ngai (author)
There has been growing interest in extending traditional vector-based machine learning techniques to their tensor forms. The support tensor machine (STM) and support Tucker machine (STuM) are two typical tensor generalizations of the conventional support vector machine (SVM). However, the expressive power of STM is restricted due to its rank-one...
conference paper 2019