On-Device LLMs for SMEs: Challenges and Opportunities
- Publication Year :
- 2024
Abstract
- This paper presents a systematic review of the infrastructure requirements for deploying Large Language Models (LLMs) on-device within small and medium-sized enterprises (SMEs), covering both hardware and software perspectives. From the hardware viewpoint, we discuss the utilization of processing units such as GPUs and TPUs, efficient memory and storage solutions, and strategies for effective deployment, addressing the limited computational resources typical of SME settings. From the software perspective, we explore framework compatibility, operating system optimization, and the use of specialized libraries tailored for resource-constrained environments. The review first identifies the unique challenges SMEs face in deploying LLMs on-device, then explores the opportunities that hardware innovations and software adaptations offer to overcome these obstacles. This structured review provides practical insights that strengthen the technological resilience of SMEs integrating LLMs.
- Comment: 9 pages, 1 figure. The work is supported by the SIT-NVIDIA Joint AI Centre.
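As an illustrative aside (not drawn from the paper itself): the kind of resource-constrained, on-device deployment the abstract describes is often achieved with CPU-oriented inference libraries such as llama-cpp-python running a quantized GGUF model. The model path, context size, and thread count below are placeholder assumptions, not values from the paper.

```python
# Minimal sketch: running a quantized LLM fully on-device on commodity SME
# hardware (CPU-only), using llama-cpp-python. Model file and parameters
# are illustrative placeholders, not recommendations from the paper.
from llama_cpp import Llama

# Load a 4-bit quantized GGUF model; tune n_ctx and n_threads to the
# machine's available RAM and physical CPU cores.
llm = Llama(
    model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=2048,      # context window; larger values need more memory
    n_threads=4,     # match the number of physical CPU cores
)

# Run a single completion entirely on-device.
result = llm(
    "Summarize our Q3 sales report in three bullet points.",
    max_tokens=128,
    temperature=0.2,
)
print(result["choices"][0]["text"])
```

On GPU-equipped machines, a common alternative is 4-bit loading via Hugging Face transformers with bitsandbytes, trading higher memory requirements for greater throughput.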
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2410.16070
- Document Type :
- Working Paper