AI on Low-Cost Hardware

Software Subgroup

Bachelor Thesis (2023)
Author(s)

H.J. Zheng (TU Delft - Electrical Engineering, Mathematics and Computer Science)

C. van den Berg (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Charlotte Frenkel – Mentor (TU Delft - Electronic Instrumentation)

Justin Dauwels – Mentor (TU Delft - Signal Processing Systems)

F.P. Widdershoven – Mentor (TU Delft - Bio-Electronics)

S. Feld – Graduation committee member (TU Delft - Quantum Circuit Architectures and Technology)

Rob Remis – Graduation committee member (TU Delft - Tera-Hertz Sensing)

Faculty
Electrical Engineering, Mathematics and Computer Science
Copyright
© 2023 Hong Jie Zheng, Christian van den Berg
Publication Year
2023
Language
English
Graduation Date
21-06-2023
Awarding Institution
Delft University of Technology
Programme
Electrical Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Artificial intelligence has become a dominant part of our lives; however, complex AI models tend to demand substantial energy, computationally expensive operations, and large amounts of memory. This excludes an entire class of hardware from their applicability: relatively resource-constrained, low-cost devices. This paper investigates learning methods that are potentially better suited to such devices: the forward-forward algorithm and Hebbian learning rules. The results are compared to backpropagation with equivalent network configurations, training hyperparameters, and internal data types on different types of low-cost hardware. Backpropagation consistently outperformed the other algorithms across the tests, exhibiting higher accuracy, faster training, and faster inference than forward-forward models. Forward-forward models can come close to matching backpropagation's performance, but they suffer from longer training times and degraded performance in multi-layer networks. Additionally, a poorly trained forward-forward model is sensitive to quantization, resulting in a significant drop in accuracy. On the other hand, forward-forward models offer the benefit of training each layer independently, allowing more flexibility in optimizing the training process. Hebbian models were not found to be competitive, performing below the required accuracy threshold; the smaller models needed for MCU and FPGA deployment would likely perform even worse.
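To illustrate the layer-local training property of the forward-forward algorithm mentioned above, the following is a minimal sketch of a single forward-forward layer in Python with NumPy. It follows the general recipe of Hinton's forward-forward algorithm (a "goodness" defined as the sum of squared activations, pushed above a threshold for positive data and below it for negative data, with each layer trained on its own local objective); the class name, hyperparameters, and toy data are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)


def goodness(h):
    # Goodness of a layer's activity: sum of squared activations per sample.
    return (h ** 2).sum(axis=1)


class FFLayer:
    """One forward-forward layer, trained with a purely local objective
    (no gradients flow between layers, unlike backpropagation)."""

    def __init__(self, n_in, n_out, lr=0.03, theta=2.0):
        self.W = rng.normal(0, 1 / np.sqrt(n_in), (n_in, n_out))
        self.lr, self.theta = lr, theta

    def forward(self, x):
        # Normalize the input so only its direction carries information onward;
        # this stops the next layer from trivially reusing this layer's goodness.
        xn = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
        return np.maximum(xn @ self.W, 0.0)  # ReLU activations

    def train_step(self, x_pos, x_neg):
        # Logistic loss on (goodness - theta): positive samples are pushed
        # above the threshold, negative samples below it.
        for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
            xn = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
            pre = xn @ self.W
            h = np.maximum(pre, 0.0)
            g = goodness(h)
            p = 1.0 / (1.0 + np.exp(-sign * (g - self.theta)))
            dg = -sign * (1.0 - p)          # dLoss/dGoodness per sample
            dh = dg[:, None] * 2.0 * h      # dGoodness/dh = 2h
            dpre = dh * (pre > 0)           # ReLU derivative
            self.W -= self.lr * xn.T @ dpre / len(x)


# Toy demo (hypothetical data): "positive" samples cluster along one input
# direction, "negative" samples are isotropic noise.
x_pos = rng.normal(0, 0.1, (64, 20))
x_pos[:, 0] += 3.0
x_neg = rng.normal(0, 1.0, (64, 20))

layer = FFLayer(20, 32)
for _ in range(300):
    layer.train_step(x_pos, x_neg)

g_pos = goodness(layer.forward(x_pos)).mean()
g_neg = goodness(layer.forward(x_neg)).mean()
```

Because each such layer optimizes only its own goodness, layers can be trained one at a time (or with different hyperparameters per layer), which is the flexibility the abstract refers to; the trade-off observed in the thesis is that stacking several of these layers trains more slowly and less accurately than an equivalent backpropagation network.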

Files

BAP_Software_final.pdf
(pdf | 0.895 Mb)
License info not available