Searched for: +
(1 - 4 of 4)
document
Zhu, B. (author), Hofstee, H.P. (author), Lee, Jinho (author), Al-Ars, Z. (author)
The attention mechanism has been regarded as an advanced technique to capture long-range feature interactions and to boost the representation capability of convolutional neural networks. However, we identified two overlooked problems in current attentional activation-based models: the approximation problem and the insufficient capacity problem of the...
conference paper 2021
document
Zhu, B. (author), Al-Ars, Z. (author), Hofstee, H.P. (author)
High-level feature maps of Convolutional Neural Networks are computed by reusing their corresponding low-level feature maps, which fully exploits feature reuse to improve computational efficiency. This form of feature reuse is referred to as feature reuse between convolutional layers. The second type of feature reuse is referred to...
journal article 2020
document
Zhu, B. (author), Al-Ars, Z. (author), Pan, W. (author)
Binary Convolutional Neural Networks (CNNs) can significantly reduce the number of arithmetic operations and the size of memory storage, making the deployment of CNNs on mobile or embedded systems more promising. However, the accuracy degradation of single and multiple binary CNNs is unacceptable for modern architectures and large-scale...
book chapter 2020
document
Zhu, B. (author), Al-Ars, Z. (author), Hofstee, H.P. (author)
Binary Convolutional Neural Networks (CNNs) significantly reduce the number of arithmetic operations and the size of memory storage needed for CNNs, which makes their deployment on mobile and embedded systems more feasible. However, after binarization, the CNN architecture has to be significantly redesigned and refined for two reasons:...
conference paper 2020