Workplace: Department of Computer Science, American International University-Bangladesh (AIUB), Dhaka, 1229, Bangladesh
E-mail: nusrat.trisna@aiub.edu
ORCID: https://orcid.org/0009-0003-6827-6791
Research Interests: Big Data, Database Management, Computer Vision
Biography
Nusrat Jahan Trisna is currently a Lecturer in the Department of Computer Science at American International University-Bangladesh (AIUB). She received her bachelor's degree in Computer Science and Engineering from AIUB in 2020 and her master's degree in Computer Science, with a specialization in Information and Database Management, from the same university in 2023. Her research focuses on big data, particularly database management and computer vision.
By Saikat Baul, Md. Ratan Rana, Nusrat Jahan Trisna, Farzana Bente Alam
DOI: https://doi.org/10.5815/ijem.2025.05.04, Pub. Date: 8 Oct. 2025
Accidents caused by drowsy driving have become a significant societal concern, often resulting in severe consequences for victims, including fatalities. Human lives are invaluable and deserve greater protection on the road. Given this urgency, it is essential to develop an effective drowsiness detection system that can identify driver drowsiness and alert the driver before an accident occurs. Facial landmark frameworks such as Dlib and MediaPipe Face Mesh have shown promising results for this task. However, most previous studies have relied solely on blinking patterns to detect drowsiness, while some have combined blinking with yawning patterns. The proposed research develops a straightforward drowsy driver detection system in Python using OpenCV and MediaPipe Face Mesh. The landmark detector provided by MediaPipe Face Mesh locates key facial coordinates, from which the driver's eye aspect ratio, mouth aspect ratio, and head tilt angle are computed from video input. The system is evaluated on standardized public datasets and on real-time video footage, and in both scenarios it exhibits high recognition accuracy. A performance comparison further demonstrates the proposed method's effectiveness. The proposed system has the potential to enhance travel safety and efficiency when integrated with vehicles' supplementary safety features and automation technology.
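The abstract describes computing the eye aspect ratio (EAR) and mouth aspect ratio (MAR) from MediaPipe Face Mesh landmarks using Python and OpenCV. The sketch below illustrates that general idea only; the landmark indices, thresholds, and alert logic are assumptions chosen for illustration and are not taken from the paper (which also incorporates head tilt angle).

import cv2
import mediapipe as mp
import numpy as np

# Commonly used Face Mesh indices for the left eye (p1..p6 of the EAR formula)
# and the mouth (corners, upper lip, lower lip); assumed here for illustration.
LEFT_EYE = [33, 160, 158, 133, 153, 144]
MOUTH = [61, 291, 13, 14]

def dist(a, b):
    # Euclidean distance between two normalized landmarks in the image plane
    return np.hypot(a.x - b.x, a.y - b.y)

def eye_aspect_ratio(lm):
    p1, p2, p3, p4, p5, p6 = (lm[i] for i in LEFT_EYE)
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def mouth_aspect_ratio(lm):
    left, right, top, bottom = (lm[i] for i in MOUTH)
    return dist(top, bottom) / dist(left, right)

face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)
cap = cv2.VideoCapture(0)  # video input; 0 = default camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        ear = eye_aspect_ratio(lm)
        mar = mouth_aspect_ratio(lm)
        # Illustrative thresholds only; the paper's tuned values are not given here.
        if ear < 0.20 or mar > 0.60:
            cv2.putText(frame, "DROWSINESS ALERT", (30, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
    cv2.imshow("Drowsiness monitor", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
        break
cap.release()
cv2.destroyAllWindows()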