Energy-Aware Data-Driven Model Selection in LLM-Orchestrated AI Systems
By: Daria Smirnova, Hamid Nasiri, Marta Adamska, Zhengxin Yu, Peter Garraghan
Published: 2025-12-28
Subject: cs.AI
Abstract
This paper addresses energy consumption in AI systems orchestrated by Large Language Models (LLMs) by proposing an energy-aware, data-driven model selection strategy. As LLM usage grows, such work supports the development of more sustainable and efficient AI systems, with direct implications for green computing and operational costs.
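To make the idea of energy-aware, data-driven model selection concrete, here is a minimal illustrative sketch, not the paper's actual method: it routes a request to the lowest-energy candidate model whose historically observed quality clears a threshold. All names, fields, and numbers (ModelProfile, avg_energy_joules, the example profiles) are hypothetical assumptions for illustration only.

```python
from dataclasses import dataclass

# Hypothetical per-model profile built from historical measurements.
# Fields and values are illustrative, not taken from the paper.
@dataclass
class ModelProfile:
    name: str
    avg_energy_joules: float  # measured energy per request (historical average)
    avg_quality: float        # observed task-success rate on past requests

def select_model(candidates: list[ModelProfile], min_quality: float) -> ModelProfile:
    """Pick the lowest-energy model whose observed quality meets the threshold.

    Falls back to the highest-quality model if no candidate qualifies.
    """
    eligible = [m for m in candidates if m.avg_quality >= min_quality]
    if eligible:
        return min(eligible, key=lambda m: m.avg_energy_joules)
    return max(candidates, key=lambda m: m.avg_quality)

if __name__ == "__main__":
    profiles = [
        ModelProfile("small-llm", avg_energy_joules=12.0, avg_quality=0.78),
        ModelProfile("medium-llm", avg_energy_joules=45.0, avg_quality=0.88),
        ModelProfile("large-llm", avg_energy_joules=160.0, avg_quality=0.93),
    ]
    chosen = select_model(profiles, min_quality=0.85)
    print(f"Routing request to: {chosen.name}")  # -> medium-llm
```

In an orchestrated system, a policy like this would sit in front of the model pool, trading off quality against measured energy cost per request; the paper's actual strategy and metrics may differ.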