Sustainability of Edge AI at Scale

An empirical study on the sustainability of Edge AI in terms of energy consumption

Master Thesis (2024)
Author(s)

S.R. van der Noort (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Luis Cruz – Mentor (TU Delft - Software Engineering)

Silverio Martínez-Fernández – Mentor (Technical University of Catalonia - BarcelonaTech (UPC))

Arie van Deursen – Graduation committee member (TU Delft - Software Engineering)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2024
Language
English
Graduation Date
08-05-2024
Awarding Institution
Delft University of Technology
Programme
Computer Science
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Edge AI is an architectural deployment tactic that brings AI models closer to the user and the data, reducing internet bandwidth usage while providing low latency and privacy. It remains unclear how this tactic performs at scale, since the distribution overhead could impact the total energy consumption. We identify four architectural scalability factors that could impact the energy consumption of AI: environment, optimisation, throughput, and overhead. The latter consists of downloading, verifying, and updating the model over time. This work performs an empirical study on the sustainability of Edge AI compared to Cloud AI at scale in terms of energy consumption. Energy consumption measurement experiments are run on a cloud device and multiple edge devices (environment), with various quantised models (optimisation) and various throughput levels per hour (throughput). We simulate the distribution overhead and combine the results with the measurements to find the holistic energy efficiency of each architectural strategy. We find that all four variables impact energy consumption, but the main contributors are environment, throughput, and overhead. We observe that Edge AI is most energy-efficient in low-distribution, low-demand scenarios, whereas in high-distribution, high-demand scenarios Cloud AI is better optimised and outperforms Edge AI in energy efficiency. This means that developers, depending on their use case and the project's scalability, need to consider these quality attributes to choose the most sustainable architectural solution.
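
To make the trade-off sketched in the abstract concrete, below is a minimal, purely illustrative energy model in Python. It is not the model or the measured data from the thesis: the function names, all parameter values, and the assumption of a single always-on cloud server are hypothetical placeholders, chosen only to show how per-device distribution overhead and request throughput can shift which strategy ends up more energy-efficient.

# Illustrative sketch only: a toy linear energy model contrasting Edge AI and
# Cloud AI over one month. All names and numbers below are assumptions for
# illustration, not results or formulas from the thesis.

def edge_energy_kwh(devices, requests_per_device,
                    e_inference=0.0003,        # kWh per on-device inference (assumed)
                    overhead_per_device=0.2):  # kWh for download/verification/updates (assumed)
    """Edge total: per-device distribution overhead plus on-device inference energy."""
    return devices * overhead_per_device + devices * requests_per_device * e_inference

def cloud_energy_kwh(devices, requests_per_device,
                     idle_kw=0.2, hours=720,   # always-on server baseline for one month (assumed)
                     e_inference=0.0001):      # kWh per cloud inference incl. network transfer (assumed)
    """Cloud total: idle baseline of a single server plus per-request cost.
    Ignores the extra server capacity a real deployment would need at high load."""
    requests = devices * requests_per_device
    return idle_kw * hours + requests * e_inference

if __name__ == "__main__":
    scenarios = [("low distribution, low demand", 100, 50),
                 ("high distribution, high demand", 100_000, 5_000)]
    for label, devices, reqs in scenarios:
        edge, cloud = edge_energy_kwh(devices, reqs), cloud_energy_kwh(devices, reqs)
        print(f"{label}: edge={edge:,.1f} kWh, cloud={cloud:,.1f} kWh")

Under these assumed numbers the toy model reproduces the qualitative conclusion of the abstract: the edge deployment wins in the small scenario (its distribution overhead stays small, while the cloud server's idle baseline dominates), and the cloud deployment wins at scale (its lower per-request energy outweighs the overhead of distributing the model to every device).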

Files

Sustainability_of_Edge_AI_Thes... (pdf)
(pdf | 3.78 MB)
- Embargo expired on 08-05-2024
License info not available