Comparative Evaluation of Machine Learning Inference Machines on Edge-class Devices

Conference Paper (2021)
Authors

Petros Amanatidis (International Hellenic University)

George Iosifidis (TU Delft - Embedded Systems)

Dimitris Karampatzakis (International Hellenic University)

Research Group
Embedded Systems
Copyright
© 2021 Petros Amanatidis, G. Iosifidis, Dimitris Karampatzakis
To reference this document use:
https://doi.org/10.1145/3503823.3503843
Publication Year
2021
Language
English
Pages (from-to)
102-106
ISBN (print)
978-1-4503-9555-7
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Computer science and engineering have evolved rapidly over the last decade, offering innovative Machine Learning frameworks and high-performance hardware devices. Executing data analytics at the edge promises to transform the mobile computing paradigm by bringing intelligence next to the end user. However, it remains an open question whether, and to what extent, today's Edge-class devices can support ML frameworks, and which configuration executes tasks most efficiently. This paper provides a comparative evaluation of Machine Learning inference machines on Edge-class compute engines. The testbed consists of two hardware compute engines (i.e., the CPU-based Raspberry Pi 4 and the Google Edge TPU accelerator) and two inference machines (i.e., TensorFlow-Lite and Arm NN). Through an extensive set of experiments on our bespoke testbed, we compared three setups using the TensorFlow-Lite ML framework in terms of accuracy, execution time, and energy efficiency. The results show that an optimized configuration of the workload parameters can increase accuracy by 10%, while the class of the Edge compute engine, in combination with the inference machine, affects execution time by 86% and power consumption by almost 145%.
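
The paper's benchmark code is not reproduced here; the following is a minimal sketch of the kind of latency comparison the abstract describes, assuming the tflite_runtime Python package and the libedgetpu delegate are installed on the device. The model file names, input shape, and run count are placeholders for illustration, not the authors' actual workload.

```python
# Minimal sketch (not the authors' code): timing a quantized TensorFlow-Lite
# classification model on the Raspberry Pi 4 CPU and on a Google Edge TPU.
import time
import numpy as np
import tflite_runtime.interpreter as tflite


def make_interpreter(model_path, use_edgetpu=False):
    """Create a TFLite interpreter, optionally delegating to the Edge TPU."""
    if use_edgetpu:
        # Requires an Edge-TPU-compiled model and the libedgetpu runtime.
        delegate = tflite.load_delegate("libedgetpu.so.1")
        return tflite.Interpreter(model_path=model_path,
                                  experimental_delegates=[delegate])
    return tflite.Interpreter(model_path=model_path)


def mean_latency_ms(interpreter, image, runs=100):
    """Run `runs` inferences and return the mean latency in milliseconds."""
    interpreter.allocate_tensors()
    input_detail = interpreter.get_input_details()[0]
    interpreter.set_tensor(input_detail["index"], image)
    start = time.perf_counter()
    for _ in range(runs):
        interpreter.invoke()
    return (time.perf_counter() - start) / runs * 1000.0


if __name__ == "__main__":
    # Placeholder input matching a typical 224x224 uint8 quantized model.
    dummy = np.random.randint(0, 256, (1, 224, 224, 3), dtype=np.uint8)

    cpu = make_interpreter("mobilenet_v2_quant.tflite")                      # hypothetical file
    tpu = make_interpreter("mobilenet_v2_quant_edgetpu.tflite", True)        # hypothetical file

    for name, interp in [("Raspberry Pi 4 CPU", cpu), ("Edge TPU", tpu)]:
        print(f"{name}: {mean_latency_ms(interp, dummy):.2f} ms per inference")
```

Accuracy and power measurements, as reported in the paper, would require a labeled test set and external power instrumentation, which this sketch does not cover.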

Files

3503823.3503843.pdf
(PDF | 0.678 MB)
- Embargo expired on 26-05-2022
License info not available