Enhanced AI as a Service at the Edge via Transformer Network
- Publication Year:
- 2025
Abstract
- Artificial intelligence (AI) has become a pivotal force in reshaping next-generation mobile networks. Edge computing holds promise for enabling AI as a service (AIaaS) for prompt decision-making by offloading deep neural network (DNN) inference tasks to the edge. However, current methodologies exhibit limitations in efficiently offloading these tasks, leading to possible resource underutilization and wasted mobile-device energy. To tackle these issues, in this paper, we study AIaaS at the edge and propose an efficient offloading mechanism for well-known DNN architectures such as ResNet and VGG16. We model the inference tasks as directed acyclic graphs and formulate a problem that aims to minimize the devices' energy consumption while adhering to their latency requirements and accounting for the servers' capacity. To effectively solve this problem, we utilize a transformer DNN architecture. By training on historical data, we obtain a feasible and near-optimal solution to the problem. Our findings reveal that the proposed transformer model improves energy efficiency compared to established baseline schemes. Notably, when edge computing resources are limited, our model achieves an 18% reduction in energy consumption and significantly fewer task failures compared to existing works.
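- The formulation described in the abstract (minimizing device energy over DAG-structured inference tasks, subject to latency requirements and server capacities) can be sketched as follows. The notation below is assumed for illustration and is not taken from the paper: x_{i,v,s} indicates whether node v of device i's task DAG is placed on edge server s (or executed locally when s = i), E_i and T_i are the resulting energy and end-to-end latency of device i, L_i is its latency requirement, r_{i,v} the resource demand of node v, and C_s the capacity of server s.

  \begin{aligned}
  \min_{\mathbf{x}}\ & \sum_{i \in \mathcal{N}} E_i(\mathbf{x}_i) \\
  \text{s.t. } & T_i(\mathbf{x}_i) \le L_i, && \forall i \in \mathcal{N}, \\
  & \sum_{i \in \mathcal{N}} \sum_{v \in V_i} x_{i,v,s}\, r_{i,v} \le C_s, && \forall s \in \mathcal{S}, \\
  & x_{i,v,s} \in \{0,1\}, \quad \sum_{s \in \mathcal{S} \cup \{i\}} x_{i,v,s} = 1, && \forall i \in \mathcal{N},\ \forall v \in V_i.
  \end{aligned}

  Under this reading, the transformer acts as a learned solver: trained on historical problem instances and their (near-)optimal assignments, it maps task-DAG features and server states to a feasible placement x rather than solving the combinatorial program directly at run time.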
- Subjects:
- Computer Science - Networking and Internet Architecture
Details
- Database:
- arXiv
- Publication Type:
- Report
- Accession number:
- edsarx.2501.14967
- Document Type:
- Working Paper