A Support Tensor Train Machine

Conference Paper (2019)
Author(s)

Cong Chen (The University of Hong Kong)

K. Batselier (TU Delft - Team Jan-Willem van Wingerden, The University of Hong Kong)

Ching Yun Ko (The University of Hong Kong)

Ngai Wong (The University of Hong Kong)

Research Group
Team Jan-Willem van Wingerden
Copyright
© 2019 Cong Chen, K. Batselier, Ching Yun Ko, Ngai Wong
DOI related publication
https://doi.org/10.1109/IJCNN.2019.8851985
Publication Year
2019
Language
English
ISBN (print)
978-1-7281-2009-6
ISBN (electronic)
978-1-7281-1985-4
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

There has been growing interest in extending traditional vector-based machine learning techniques to their tensor forms. The support tensor machine (STM) and the support Tucker machine (STuM) are two typical tensor generalizations of the conventional support vector machine (SVM). However, the expressive power of the STM is restricted by its rank-one tensor constraint, and the STuM is not scalable because of its exponentially sized Tucker core tensor. To overcome these limitations, we introduce a novel and effective support tensor train machine (STTM) that employs a general and scalable tensor train as the parameter model. Experiments validate the superiority of the STTM over the SVM, STM and STuM.
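The abstract's key idea — replacing a rank-one weight tensor (STM) or an exponentially large Tucker core (STuM) with a tensor train (TT) parameter model — can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' implementation; the mode sizes and TT-ranks below are made-up values chosen only to show that the TT stores fewer parameters than the full tensor and supports an inner product without ever materializing the full weight tensor.

```python
import numpy as np

# Hypothetical sketch (not the paper's code): a weight tensor W stored in
# tensor train (TT) format as a chain of 3-way cores G_k of shape
# (r_{k-1}, n_k, r_k), with boundary ranks r_0 = r_d = 1.
np.random.seed(0)
dims  = [4, 4, 4]          # mode sizes n_1, n_2, n_3 (made up)
ranks = [1, 2, 2, 1]       # TT-ranks r_0, ..., r_3 (made up)
cores = [np.random.randn(ranks[k], dims[k], ranks[k + 1]) for k in range(3)]

def tt_to_full(cores):
    """Contract the TT cores into the full tensor (for checking only)."""
    full = cores[0]
    for G in cores[1:]:
        full = np.tensordot(full, G, axes=([full.ndim - 1], [0]))
    return full.squeeze(axis=(0, full.ndim - 1))  # drop the r_0 = r_d = 1 axes

def tt_inner(cores, X):
    """Inner product <W, X> computed core by core, without forming W."""
    T = X[None, ...]                      # prepend the r_0 = 1 boundary axis
    for G in cores:
        # contract the current boundary rank and one data mode at a time
        T = np.tensordot(G, T, axes=([0, 1], [0, 1]))
    return float(T[0])

W = tt_to_full(cores)
X = np.random.randn(*dims)                # a toy input "data tensor"

# Storage: the TT needs sum_k r_{k-1}*n_k*r_k numbers, not prod_k n_k.
tt_params   = sum(G.size for G in cores)
full_params = int(np.prod(dims))
```

The core-by-core sweep in `tt_inner` is what makes a TT parameterization scalable: the cost grows linearly in the number of modes (for fixed TT-ranks), whereas a Tucker core grows exponentially with the tensor order.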

Files

08851985.pdf
(pdf | 0.434 MB)
- Embargo expired in 30-03-2020