FocusTrack: Real-Time Student Engagement Monitoring via Facial Landmark Analysis

Author(s)

Vidhya K. 1, T. M. Thiyagu 2, Antony Taurshia 1, Jenefa A. 1,*

1. Karunya Institute of Technology and Sciences, Coimbatore, India

2. Vel Tech Rangarajan Dr. Sagunthala R&D Institute of Science and Technology, Avadi, Chennai, India

* Corresponding author.

DOI: https://doi.org/10.5815/ijmecs.2026.01.05

Received: 30 Nov. 2024 / Revised: 18 May 2025 / Accepted: 28 Jul. 2025 / Published: 8 Feb. 2026

Index Terms

Engagement, Facial Landmarks, FocusTrack, Real-Time Monitoring, Segmentation

Abstract

The growing emphasis on personalized learning has highlighted the need for real-time monitoring of student engagement. Understanding attention levels during instruction helps improve teaching effectiveness and learning outcomes. Existing methods, however, rely on manual observation or periodic assessments, which are subjective, inconsistent, and unable to capture moment-to-moment variations in engagement. Conventional systems based on basic video tracking or facial detection lack robustness under variable lighting, head-pose changes, and classroom dynamics, and offer little timely, actionable insight. This study presents FocusTrack, a real-time engagement monitoring system that uses facial cues and behavioral indicators for accurate classification. The system processes video frames locally and provides continuous engagement feedback. Two annotated datasets were developed to capture diverse learning scenarios: EngageFace (150 hours, classroom-based) and StudyFocus (90 hours, home-based), each labeled for gaze direction, drowsiness, and facial cues. Experimental results show accuracies of 97.0% and 95.5% on the two datasets, outperforming conventional models, while latency remains under 60 ms on CPU-based setups. FocusTrack offers a scalable, privacy-aware solution for continuous engagement monitoring in real-world educational environments and provides instructors with objective feedback for adapting teaching strategies dynamically.
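
As a rough illustration of the kind of facial-landmark heuristic the abstract alludes to, the sketch below computes an eye aspect ratio (EAR) and flags sustained eye closure as a drowsiness cue. This is a minimal sketch under assumed conventions, not the authors' implementation: the landmark ordering, the 0.21 threshold, and the 15-frame window are hypothetical placeholders, and in a real pipeline the landmark coordinates would come from a face-landmark detector rather than being hard-coded.

import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    # eye: (6, 2) array of 2D landmarks for one eye, ordered as
    # [outer corner, upper lid x2, inner corner, lower lid x2] (assumed ordering).
    vertical_1 = np.linalg.norm(eye[1] - eye[5])
    vertical_2 = np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return float((vertical_1 + vertical_2) / (2.0 * horizontal))

def is_drowsy(ear_history, threshold=0.21, window=15):
    # Flag drowsiness when the EAR stays below the threshold for `window` consecutive frames.
    recent = ear_history[-window:]
    return len(recent) == window and all(e < threshold for e in recent)

# Synthetic example: an open eye followed by 15 frames of a nearly closed eye.
open_eye = np.array([[0, 0], [2, 2], [4, 2], [6, 0], [4, -2], [2, -2]], dtype=float)
closed_eye = np.array([[0, 0], [2, 0.3], [4, 0.3], [6, 0], [4, -0.3], [2, -0.3]], dtype=float)
history = [eye_aspect_ratio(open_eye)] * 5 + [eye_aspect_ratio(closed_eye)] * 15
print(is_drowsy(history))  # True: EAR stays below the threshold for the full window

In a deployed system, the per-frame EAR stream would be only one of several cues (alongside gaze direction and head pose) feeding the engagement classifier.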

Cite This Paper

Vidhya K., T. M. Thiyagu, Antony Taurshia, Jenefa A., "FocusTrack: Real-Time Student Engagement Monitoring via Facial Landmark Analysis", International Journal of Modern Education and Computer Science (IJMECS), Vol. 18, No. 1, pp. 76-92, 2026. DOI: 10.5815/ijmecs.2026.01.05
