LAB

Learnable Activation Binarizer for Binary Neural Networks

Conference Paper (2023)
Author(s)

Sieger Falkena (Shell Global Solutions International B.V., Student TU Delft)

H. Jamali-Rad (TU Delft - Pattern Recognition and Bioinformatics, Shell Global Solutions International B.V.)

Jan C. van Gemert (TU Delft - Pattern Recognition and Bioinformatics)

Research Group
Pattern Recognition and Bioinformatics
Copyright
© 2023 Sieger Falkena, H. Jamali-Rad, J.C. van Gemert
DOI related publication
https://doi.org/10.1109/WACV56688.2023.00636
Publication Year
2023
Language
English
Pages (from-to)
6414-6423
ISBN (print)
978-1-6654-9347-5
ISBN (electronic)
978-1-6654-9346-8
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Binary Neural Networks (BNNs) are receiving an upsurge of attention for bringing power-hungry deep learning towards edge devices. The traditional wisdom in this space is to employ sign(.) for binarizing feature maps. We argue and illustrate that sign(.) is a uniqueness bottleneck, limiting information propagation throughout the network. To alleviate this, we propose to dispense with sign(.), replacing it with a learnable activation binarizer (LAB), allowing the network to learn a fine-grained binarization kernel per layer, as opposed to global thresholding. LAB is a novel universal module that can seamlessly be integrated into existing architectures. To confirm this, we plug it into four seminal BNNs and show a considerable accuracy boost at the cost of a tolerable increase in delay and complexity. Finally, we build an end-to-end BNN (coined LAB-BNN) around LAB, and demonstrate that it performs on par with the state of the art on ImageNet. Our code can be found in our repository: https://github.com/sfalkena/LAB.
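The uniqueness bottleneck mentioned in the abstract can be illustrated with a toy example: sign(.) maps every all-positive feature map to the same binary map, whereas a binarizer that filters local context before thresholding can still tell such maps apart. The sketch below is illustrative only and is not the authors' implementation; the fixed Laplacian-style kernel stands in for weights that LAB would learn per layer during training.

```python
def sign_binarize(x):
    """Conventional binarizer: global thresholding of a 2D map at zero."""
    return [[1 if v >= 0 else -1 for v in row] for row in x]

def lab_binarize(x, kernel):
    """Illustrative learnable activation binarizer (hypothetical sketch):
    filter the feature map with a small 3x3 kernel before taking the sign,
    so the binarization decision depends on local context instead of a
    fixed zero threshold.  In LAB proper the kernel is learned per layer;
    here it is just passed in."""
    h, w = len(x), len(x[0])

    def at(i, j):
        # Replicate ("edge") padding at the borders.
        return x[min(max(i, 0), h - 1)][min(max(j, 0), w - 1)]

    out = []
    for i in range(h):
        row = []
        for j in range(w):
            s = sum(kernel[di][dj] * at(i + di - 1, j + dj - 1)
                    for di in range(3) for dj in range(3))
            row.append(1 if s >= 0 else -1)
        out.append(row)
    return out

# Two distinct all-positive feature maps: sign(.) cannot distinguish them,
# which is exactly the uniqueness bottleneck the paper points out.
x1 = [[1.0, 2.0], [3.0, 4.0]]
x2 = [[0.1, 5.0], [2.0, 0.3]]

# Stand-in for a learned kernel (a Laplacian-style high-pass filter).
laplacian = [[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]]
```

With these inputs, `sign_binarize(x1)` and `sign_binarize(x2)` are both the all-ones map, while `lab_binarize` produces different binary maps for the two inputs, preserving information that global thresholding discards.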

Files

LAB_Learnable_Activation_Binar... (pdf, 3.64 MB)
Embargo expired on 06-08-2023
License info not available