Work place: Coimbra Institute of Engineering, Polytechnic University of Coimbra, Portugal
E-mail: arborges@isec.pt
ORCID: https://orcid.org/0000-0003-3167-8714
Research Interests: multi-objective optimization, decision support systems, fuzzy sets, business intelligence, and programming education
Biography
Ana Rosa P. Borges, PhD, is a Professor at the Coimbra Institute of Engineering, Polytechnic University of Coimbra, where she has been teaching since 1989. She received her degree in Computer Engineering (1989), her MSc in Systems and Information Technology (1995), and her PhD in Electrical Engineering, with a specialization in Informatics (2005), from the University of Coimbra, Portugal. Her current research areas include multi-objective optimization, decision support systems, fuzzy sets, business intelligence, and programming education.
By Joao P. J. Pires, Jorge F. R. Bernardino, Anabela J. Gomes, Ana Rosa P. Borges, Fernanda M. R. Brito R. Correia
DOI: https://doi.org/10.5815/ijmecs.2026.01.01, Pub. Date: 8 Feb. 2026
Analyzing student performance in Introductory Programming courses in Higher Education is crucial for early intervention and improved academic outcomes. This study investigates the predictive potential of a Programming Cognitive Test in assessing student aptitude and forecasting success in an Introductory Programming course. Data were collected from 180 students, both freshmen and repeating students, enrolled in a Computer Engineering program. The dataset includes the Programming Cognitive Test results, background variables, and final course outcomes. To identify latent patterns within the data, the K-means clustering algorithm was applied, focusing particularly on freshmen students to avoid bias from prior programming exposure. In parallel, six Machine Learning classification models were developed and evaluated to predict students' likelihood of passing the Introductory Programming course: Decision Tree, K-Nearest Neighbor, Naïve Bayes, Random Forest, Support Vector Machine, and Deep Neural Network. Among these, the Deep Neural Network model demonstrated superior performance, achieving the highest values for the key metrics (Accuracy, Recall, and F1-score) and effectively identifying students at risk of underperformance. These findings underscore the potential of this model in educational settings, where timely and accurate detection of struggling students can enable proactive, targeted interventions.
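The abstract does not include the authors' code; the following is a minimal, illustrative sketch of the kind of evaluation pipeline it describes, comparing the six named classifier families on Accuracy, Recall, and F1-score. It assumes scikit-learn, uses a synthetic 180-sample dataset in place of the real cognitive-test and background features, uses an MLPClassifier as a stand-in for the Deep Neural Network, and picks hyperparameters (e.g., three K-means clusters) purely for illustration.

```python
# Illustrative sketch only (not the authors' code): synthetic data, assumed
# hyperparameters, and an MLP as a stand-in for the Deep Neural Network.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, recall_score, f1_score

# Synthetic placeholder for the 180-student dataset: numeric features standing
# in for cognitive-test scores and background variables, binary pass/fail label.
X, y = make_classification(n_samples=180, n_features=10, n_informative=6,
                           weights=[0.4, 0.6], random_state=42)

# Unsupervised step: K-means to look for latent groups (cluster count assumed).
clusters = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(
    StandardScaler().fit_transform(X))

# Supervised step: train and compare the six classifier families.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "K-Nearest Neighbor": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "Support Vector Machine": SVC(kernel="rbf"),
    "Deep Neural Network": MLPClassifier(hidden_layer_sizes=(64, 32),
                                         max_iter=2000, random_state=42),
}

for name, model in models.items():
    pipe = make_pipeline(StandardScaler(), model)  # scale features, then fit
    pipe.fit(X_train, y_train)
    pred = pipe.predict(X_test)
    print(f"{name:24s} Acc={accuracy_score(y_test, pred):.2f} "
          f"Rec={recall_score(y_test, pred):.2f} "
          f"F1={f1_score(y_test, pred):.2f}")
```

On real course data, the same comparison would be run on the actual cognitive-test and background features, ideally with cross-validation given the modest sample size.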
This work contributes to the field by combining cognitive assessment with predictive modelling, offering a novel approach to forecasting programming performance. The models and methods described are adaptable for broader educational applications and may assist educators in refining teaching strategies and improving retention and success rates in programming education.