
Authored

1 record found

Multi-model inference on the edge

Scheduling for multi-model execution on resource constrained devices

Deep neural networks (DNNs) are becoming core components of many applications running on edge devices, especially for image-based analysis, e.g., identifying objects, faces, and genders. While very successful in resource-rich environments such as clouds of powerful computers, ...

Contributed

4 records found

Deep neural networks have revolutionized multiple fields within computer science. It is important to have a comprehensive understanding of the memory requirements and performance of deep networks on low-resource systems. While there have been efforts to this end, the effects of s ...
The rapid expansion of the Internet of Things (IoT), along with the convergence of multiple technologies such as next-generation 5G wireless broadband, is creating a paradigm shift from cloud computing towards edge computing. Performing tasks norma ...
The execution of multi-inference tasks on low-powered edge devices has become increasingly popular in recent years for adding value to data on-device. Optimization of such jobs has focused on hardware, neural network architectures, and frameworks to reduce execution ...
Edge devices and artificial intelligence are important and ever-growing fields in technology. Yet their combination lags behind, because the neural networks used in AI are becoming increasingly large and complex while edge devices lack the resources to keep up with these devel ...