Enhancing smart city sustainability with explainable federated learning for vehicular energy control.
Journal:
Scientific Reports
Published Date:
Jul 4, 2025
Abstract
The rise of electric and autonomous vehicles in smart cities poses challenges in vehicular energy management due to unoptimized consumption, inefficient grid use, and unpredictable traffic patterns. Traditional centralized machine learning models and cloud-based Energy Management Systems (EMSs) struggle with real-time adaptability, high-dimensional data processing, and data privacy risks. These issues lead to high costs, excessive energy waste, and regulatory concerns. Federated Learning (FL) offers a decentralized approach in which multiple edge devices collaboratively train models without sharing raw data. This enhances privacy, reduces communication overhead, and is well-suited for smart city applications. However, FL's black-box nature limits interpretability, reducing trust in AI-driven decisions. Explainable AI (XAI) addresses this by enhancing transparency, interpretability, and regulatory compliance. This research introduces Explainable FL (XFL) for optimizing vehicular energy management in smart cities. The proposed XFL framework integrates distributed learning with explainability techniques for interpretable and accountable decision-making. Using a real-world autonomous electric vehicle (AEV) telemetry dataset of 1,219,567 records with features such as speed, energy consumption, and traffic density, it employs a hierarchical FL architecture to ensure secure and decentralized learning. It efficiently analyzes real-time traffic, vehicle energy states, and grid load balancing while preserving privacy. Experimental results show that the proposed Multi-Layer Perceptron (MLP)-based global model achieves superior predictive accuracy, with R² values of 94.73% for energy consumption and 99.83% for traffic density, significantly outperforming previous methods.
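The core FL mechanism the abstract describes, edge clients training locally and contributing to a shared global model without exchanging raw data, can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes a FedAvg-style weighted average of per-client MLP weights, with all names, shapes, and client sizes chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(n_in=4, n_hidden=8, n_out=1):
    """Return a list of weight matrices for a toy 2-layer MLP
    (e.g. inputs: speed, traffic density, etc. -> predicted energy use)."""
    return [rng.normal(size=(n_in, n_hidden)), rng.normal(size=(n_hidden, n_out))]

def fedavg(client_weights, client_sizes):
    """FedAvg: average each layer's weights across clients,
    weighted by each client's local dataset size."""
    total = sum(client_sizes)
    return [
        sum(w[layer] * (n / total) for w, n in zip(client_weights, client_sizes))
        for layer in range(len(client_weights[0]))
    ]

# Three simulated edge clients (vehicles/roadside units) that trained locally
# on different amounts of telemetry; only their weights reach the server.
clients = [init_mlp() for _ in range(3)]
sizes = [1000, 500, 250]
global_model = fedavg(clients, sizes)
print([w.shape for w in global_model])  # [(4, 8), (8, 1)]
```

In a hierarchical architecture like the one the abstract mentions, this aggregation step would simply be applied twice: once at regional edge aggregators over their local vehicles, and once at the cloud over the regional models.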