Fairness Over Time: A Nationwide Study of Evolving Bias in Dropout Prediction

Type: Publication
Venue: Proceedings of the Twelfth ACM Conference on Learning@Scale (L@S ’25)

Abstract: “The use of student learning data to predict educational outcomes has been widely studied, both in terms of model performance and fairness. One example of such predictive models is the Early Warning System (EWS), which identifies students at risk of dropping out. EWSs can be used continuously to make predictions, from the time of first enrollment until years into a degree program, to provide timely support. However, changes to student composition and their learning trajectories can alter the performance and group fairness of predictions over time. Using a nationwide higher education dataset, we examine changes in the fairness of a dropout prediction model at various points along the academic calendar. Our findings reveal that fairness is not static but evolves over time: the largest differences in AUC occur at 12 months after enrollment, a common evaluation point for dropout EWS. We discuss implications for the continued assessment of fairness in predictive algorithms in education.”
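The kind of audit the abstract describes, comparing a model's AUC across demographic groups at several prediction horizons, can be sketched as follows. This is an illustrative example on synthetic data, not the paper's dataset or model: the group labels, horizons, and data generator are all assumptions made here for demonstration.

```python
# Sketch: measuring the gap in AUC between two demographic groups at
# several prediction horizons. All data below is synthetic and purely
# illustrative; the real study uses a nationwide higher education dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 4000
group = rng.integers(0, 2, size=n)   # hypothetical binary demographic indicator
horizons = [3, 6, 12]                # months since enrollment (illustrative)

auc_gaps = {}
for h in horizons:
    # Simulate features and dropout labels whose relationship to the
    # features shifts with group membership (a toy source of unfairness).
    x = rng.normal(size=(n, 3))
    logits = x @ np.array([1.0, -0.5, 0.3]) + 0.3 * group + 0.02 * h
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

    model = LogisticRegression().fit(x, y)
    scores = model.predict_proba(x)[:, 1]

    # Group-wise AUC: how well the model ranks dropouts within each group.
    auc = {g: roc_auc_score(y[group == g], scores[group == g]) for g in (0, 1)}
    auc_gaps[h] = abs(auc[0] - auc[1])

for h, gap in sorted(auc_gaps.items()):
    print(f"{h:>2} months: |AUC gap| = {gap:.3f}")
```

Tracking `auc_gaps` across horizons, rather than auditing at a single fixed point, is the core idea: a model that looks fair at one checkpoint may show its largest group AUC gap at another, such as the 12-month mark the paper highlights.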