On Sensitive Minima in Margin-Based Deep Distance Learning

Journal Article (2020)
Author(s)

R. Serajeh (K.N. Toosi University of Technology)

S. Khademi (TU Delft - Pattern Recognition and Bioinformatics)

Amir Mousavinia (K.N. Toosi University of Technology)

J.C. van Gemert (TU Delft - Pattern Recognition and Bioinformatics)

Research Group
Pattern Recognition and Bioinformatics
Copyright
© 2020 R. Serajeh, S. Khademi, Amir Mousavinia, J.C. van Gemert
DOI
https://doi.org/10.1109/ACCESS.2020.3013560
Publication Year
2020
Language
English
Volume number
8
Pages (from-to)
145067-145076
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

This paper investigates sensitive minima in popular deep distance learning techniques such as Siamese and Triplet networks. We demonstrate that standard formulations may find solutions that are sensitive to small changes and thus do not generalize well. To alleviate sensitive minima, we propose a new approach to regularizing margin-based deep distance learning by introducing stochasticity into the loss, which encourages robust solutions. Our experimental results on HPatches show promise compared to common regularization techniques, including weight decay and dropout, especially for small sample sizes.
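To make the abstract concrete, the sketch below shows a standard margin-based triplet loss alongside a hypothetical stochastic-margin variant. The stochastic version is only an illustration of "introducing stochasticity in the loss"; the perturbation scheme, function names, and parameters (`base_margin`, `noise_std`) are assumptions for this example, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def triplet_loss(anchor, positive, negative, margin):
    """Standard margin-based triplet loss on embedding vectors:
    hinge on (squared) anchor-positive distance minus anchor-negative
    distance plus a fixed margin."""
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    return max(0.0, d_pos - d_neg + margin)

def stochastic_triplet_loss(anchor, positive, negative,
                            base_margin=1.0, noise_std=0.2):
    """Hypothetical stochastic variant: perturb the margin on each call
    so the optimizer is discouraged from settling into a sharp minimum
    that a small margin shift would invalidate (an assumed illustration,
    not the authors' exact method)."""
    margin = base_margin + rng.normal(0.0, noise_std)
    return triplet_loss(anchor, positive, negative, margin)

# Toy embeddings: positive is close to the anchor, negative is far.
a = np.array([0.0, 0.0])
p = np.array([0.0, 1.0])
n = np.array([3.0, 0.0])
print(triplet_loss(a, p, n, margin=1.0))   # triplet already satisfied
print(stochastic_triplet_loss(a, p, n))    # margin resampled per call
```

Because the effective margin varies between evaluations, a solution that barely clears a fixed margin incurs loss under some draws, nudging training toward configurations with a wider separation.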