Robust Gram Embeddings

Conference Paper (2016)
Author(s)

Taygun Kekec (TU Delft - Pattern Recognition and Bioinformatics)

David Tax (TU Delft - Pattern Recognition and Bioinformatics)

Research Group
Pattern Recognition and Bioinformatics
Publication Year
2016
Language
English
Pages (from-to)
1060-1065

Abstract

Word embedding models learn vectorial word representations that can be used in a variety of NLP applications. When training data is scarce, these models risk losing their generalization ability due to model complexity and overfitting to finite data. We propose a regularized embedding formulation, called Robust Gram (RG), which penalizes overfitting by suppressing the disparity between target and context embeddings. Our experimental analysis shows that the RG model trained on small datasets generalizes better than alternatives, is more robust to variations in the training set, and correlates well with human judgments on a set of word similarity tasks.
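The core idea above, suppressing the disparity between the target and context embedding matrices, can be sketched as a penalty term added to a skip-gram-style objective. The sketch below is an illustrative assumption, not the paper's exact formulation: the toy data, the logistic loss, the penalty strength `lambda_rg`, and the SGD details are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
V, D = 20, 8                             # vocabulary size, embedding dimension
W = 0.1 * rng.standard_normal((V, D))    # target embeddings
C = 0.1 * rng.standard_normal((V, D))    # context embeddings
lambda_rg = 0.1                          # disparity-penalty strength (assumed)
lr = 0.05

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy (target, context, label) triples: label 1 = observed pair,
# label 0 = negative sample. A real corpus would supply these.
pairs = [(int(rng.integers(V)), int(rng.integers(V)), int(rng.integers(2)))
         for _ in range(500)]

for t, c, y in pairs:
    p = sigmoid(W[t] @ C[c])
    g = p - y                            # gradient of the logistic loss
    # The extra lambda_rg terms pull each word's target and context
    # vectors toward one another (the "disparity" penalty).
    grad_w = g * C[c] + lambda_rg * (W[t] - C[t])
    grad_c = g * W[t] + lambda_rg * (C[c] - W[c])
    W[t] -= lr * grad_w
    C[c] -= lr * grad_c

# With the penalty active, the two embedding matrices stay close.
disparity = np.linalg.norm(W - C)
```

The design intuition is that tying the two embedding spaces together reduces the effective number of free parameters, which is what limits overfitting when training data is scarce.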

Metadata only record. There are no files for this record.