TGEL-transformer: Fusing educational theories with deep learning for interpretable student performance prediction.
Journal:
PLOS ONE
Published Date:
Jun 30, 2025
Abstract
With the integration of educational technology and artificial intelligence, personalized learning has become increasingly important. However, traditional educational data mining methods struggle to integrate heterogeneous feature data and to represent complex learning interaction processes, while existing deep learning models lack educational theory guidance and therefore offer insufficient interpretability. To address these challenges, this study proposes the TGEL-Transformer (Theory-Guided Educational Learning Transformer) framework, which integrates multiple intelligences theory and social cognitive theory and features three innovations: a dual-channel feature processing module that integrates cognitive, affective, and environmental dimension features; a theory-guided four-head attention mechanism that models educational interaction dynamics; and an interpretable prediction layer that provides theoretical support for educational interventions. On a dataset of 6,608 students, TGEL-Transformer achieved RMSE = 1.87 and R² = 0.75, outperforming existing methods with statistically significant improvements (p < 0.001) ranging from 1.1% over recent state-of-the-art models to 5.6% over transformer baselines. External validation on cross-cultural data (n = 480) demonstrated strong generalizability with R² = 0.683. Attention weight analysis revealed that teacher support (0.15), prior knowledge (0.15), and peer interaction (0.13) are key factors influencing learning outcomes. This study provides a theory-guided framework for educational data mining, offering data-driven support for personalized education and advancing the development of intelligent education.
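The abstract does not include implementation details, but the four-head attention mechanism it names is standard scaled dot-product multi-head attention. The following is a minimal NumPy sketch of that mechanism, not the authors' code: the token layout (one embedding per feature group such as cognitive, affective, or environmental), the dimensions, and the random projection weights are all illustrative assumptions standing in for what TGEL-Transformer would learn during training.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def four_head_attention(X, n_heads=4, seed=0):
    """Toy scaled dot-product attention over a student's feature tokens.

    X: (n_tokens, d_model) — e.g. one token per feature group (cognitive,
    affective, environmental, ...). Projection weights here are random
    stand-ins; a trained model would learn them.
    Returns the attended output and per-head attention weights.
    """
    n_tokens, d_model = X.shape
    assert d_model % n_heads == 0, "d_model must divide evenly across heads"
    d_head = d_model // n_heads
    rng = np.random.default_rng(seed)
    Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                  for _ in range(3))

    def split(M):  # (n_tokens, d_model) -> (n_heads, n_tokens, d_head)
        return M.reshape(n_tokens, n_heads, d_head).transpose(1, 0, 2)

    Q, K, V = split(X @ Wq), split(X @ Wk), split(X @ Wv)
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, tok, tok)
    attn = softmax(scores, axis=-1)                      # each row sums to 1
    out = (attn @ V).transpose(1, 0, 2).reshape(n_tokens, d_model)
    return out, attn

# Example: 6 hypothetical feature tokens (teacher support, prior
# knowledge, peer interaction, ...) with 16-dimensional embeddings.
X = np.random.default_rng(1).standard_normal((6, 16))
out, attn = four_head_attention(X)
```

Averaging `attn` over heads and rows yields a per-token weight vector, which is the kind of quantity the paper's attention weight analysis reports (e.g. 0.15 for teacher support).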