Towards Energy-Efficient Edge AI: A Novel Architecture for On-Device Large Language Models

By: Professor Kai Hansen, Dr. Lena Popova, Mr. John M. Smith, Dr. Isabella Garcia, Dr. Wei Wang

Published: 2025-12-31

Subjects: cs.AI

Abstract

We propose a new architectural design that significantly reduces the computational and energy footprint of large language models (LLMs), enabling their efficient deployment on edge devices. The proposed design enables real-time, privacy-preserving AI applications in mobile computing, IoT, and embedded systems, addressing the critical need for sustainable and decentralized AI inference.

