Sustainable and Green Machine Learning: An Energy-Aware Framework for Carbon-Efficient Training and Inference

Ashmi Saji
Reshmi R
Sivaprakasam M

Abstract

The rapid advancement of machine learning (ML), particularly deep learning, has led to unprecedented computational demands, resulting in significant energy consumption and carbon emissions. Large-scale models such as Transformer architectures and foundation models require extensive training on energy-intensive hardware platforms, contributing to environmental concerns. This paper addresses the research problem: How can ML training and inference processes be optimized to reduce carbon footprint and energy costs without compromising performance? We propose an Energy-Aware Sustainable ML Framework (EASMLF) that integrates model compression, adaptive precision scaling, carbon-aware scheduling, and dynamic resource allocation. The framework monitors real-time energy usage and the carbon intensity of computing environments to optimize training schedules and inference deployment strategies. Techniques such as pruning, quantization, knowledge distillation, and edge-cloud workload balancing are combined with carbon-intensity-aware optimization to ensure environmentally responsible AI deployment. The proposed framework is evaluated using simulated energy consumption metrics and performance benchmarks. Experimental analysis demonstrates that the approach reduces energy usage by 30–40% while maintaining comparable predictive accuracy. The results indicate that sustainable ML practices can significantly lower operational costs and environmental impact. This work contributes to the emerging field of Green AI by providing a structured, scalable, and practical methodology for developing environmentally responsible machine learning systems.
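The carbon-aware scheduling component described above can be illustrated with a minimal sketch: given a forecast of grid carbon intensity, a scheduler delays a deferrable training job to the contiguous window with the lowest average intensity. The forecast values, window length, and function name below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of carbon-intensity-aware scheduling (illustrative only).
# Given an hourly forecast of grid carbon intensity (gCO2/kWh), choose the
# start hour that minimizes the average intensity over the job's duration.

def best_window(intensity, hours_needed):
    """Return the start index of the lowest-carbon contiguous window."""
    if hours_needed > len(intensity):
        raise ValueError("job is longer than the forecast horizon")
    best_start, best_avg = 0, float("inf")
    for start in range(len(intensity) - hours_needed + 1):
        avg = sum(intensity[start:start + hours_needed]) / hours_needed
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start

# Example 24-hour forecast: a midday dip (e.g., solar generation) makes
# hours 10-13 the cleanest window for a 4-hour training job.
forecast = [450, 430, 420, 410, 400, 390, 380, 300,
            250, 200, 150, 120, 110, 130, 180, 260,
            340, 400, 440, 460, 470, 460, 450, 440]
start_hour = best_window(forecast, hours_needed=4)
```

In practice, the intensity forecast would come from a grid-data service rather than a hard-coded list, and the same window-selection logic extends to choosing among data-center regions instead of time slots.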
