Multimodal Emotion Recognition Using EEG and Facial Expressions with Potential Applications in Driver Monitoring



Author(s)

Ch. Raga Madhuri 1, Anideep Seelam 2,*, Fatima Farheen Shaik 2, Aadi Siva Kartheek Pamarthi 2, Mohan Kireeti Krovi 2

1. Department of Computer Science and Engineering, Faculty of Engineering, Siddhartha Academy of Higher Education, Deemed to be University, Vijayawada-520007, Andhra Pradesh, India

2. Department of Computer Science and Engineering, Velagapudi Ramakrishna Siddhartha Engineering College, Vijayawada-520007, Andhra Pradesh, India

* Corresponding author.

DOI: https://doi.org/10.5815/ijigsp.2026.01.10

Received: 10 Jun. 2025 / Revised: 26 Aug. 2025 / Accepted: 18 Dec. 2025 / Published: 8 Feb. 2026

Index Terms

Advanced Driver Safety, Electroencephalography (EEG), Cognitive Monitoring, LSTM, CNN, Fatigue Detection, Transformers, Vision Transformers (ViT)

Abstract

Mental conditions such as fatigue, distraction, and cognitive overload are known to contribute significantly to traffic accidents. Accurate recognition of these cognitive and emotional states is therefore important for the development of intelligent monitoring systems. In this study, a multimodal emotion recognition framework using electroencephalography (EEG) signals and facial expression features is proposed, with potential applications in driver monitoring. The approach integrates Long Short-Term Memory (LSTM) networks and Transformer architectures for EEG-based temporal feature extraction, along with Vision Transformers (ViT) for facial feature representation. Feature-level fusion is employed to combine physiological and visual modalities, enabling improved emotion classification performance compared to unimodal approaches. The model is evaluated using accuracy, precision, recall, and F1-score metrics, achieving an overall accuracy of 96.38%, demonstrating the effectiveness of multimodal learning. Although the experiments are conducted on general-purpose emotion datasets, the results indicate that the proposed framework can serve as a reliable foundation for driver monitoring applications, such as fatigue, distraction, and cognitive state assessment, in intelligent transportation systems.
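As a rough illustration of the feature-level fusion the abstract describes (not the authors' implementation), the outputs of an EEG temporal branch (LSTM/Transformer) and a ViT facial branch can be concatenated per sample and passed to a classification head. All dimensions below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature sizes (assumptions, not from the paper):
# a 128-d EEG temporal embedding and a 768-d ViT face embedding,
# classified into 4 emotion classes.
EEG_DIM, FACE_DIM, N_CLASSES = 128, 768, 4

def fuse_features(eeg_feat, face_feat):
    """Feature-level fusion: concatenate the two modality embeddings."""
    return np.concatenate([eeg_feat, face_feat], axis=-1)

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Toy batch of 8 samples standing in for the two branch outputs.
eeg = rng.standard_normal((8, EEG_DIM))
face = rng.standard_normal((8, FACE_DIM))
fused = fuse_features(eeg, face)  # shape (8, EEG_DIM + FACE_DIM)

# Linear classification head over the fused vector.
W = rng.standard_normal((EEG_DIM + FACE_DIM, N_CLASSES)) * 0.01
probs = softmax(fused @ W)        # per-class probabilities, rows sum to 1
print(fused.shape, probs.shape)
```

In a trained system the random projections above would be replaced by learned branch encoders and a learned head; the point here is only that feature-level fusion operates on concatenated embeddings rather than on per-modality decisions.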

Cite This Paper

Ch. Raga Madhuri, Anideep Seelam, Fatima Farheen Shaik, Aadi Siva Kartheek Pamarthi, Mohan Kireeti Krovi, "Multimodal Emotion Recognition Using EEG and Facial Expressions with Potential Applications in Driver Monitoring", International Journal of Image, Graphics and Signal Processing (IJIGSP), Vol.18, No.1, pp. 163-176, 2026. DOI: 10.5815/ijigsp.2026.01.10

References

[1]Y. Alotaibi and V. A. Vuyyuru, “Electroencephalogram-based face emotion recognition using multimodal fusion and 1-D convolution neural network (ID-CNN) classifier,” AIMS Mathematics, vol. 8, no. 10, pp. 22984–23002, 2023, doi: 10.3934/math.20231169.
[2]N. Saffaryazdi et al., “Using facial micro-expressions in combination with EEG and physiological signals for emotion recognition,” Frontiers in Psychology, vol. 13, p. 864047, 2022, doi: 10.3389/fpsyg.2022.864047.
[3]A. M. Mutawa and A. Hassouneh, “Multimodal real-time patient emotion recognition system using facial expressions and brain EEG signals based on machine learning and log-sync methods,” Biomedical Signal Processing and Control, vol. 91, p. 105942, 2024, doi: 10.1016/j.bspc.2023.105942.
[4]H. Gao, A. Yüce, and J.-P. Thiran, “Detecting emotional stress from facial expressions for driving safety,” in Proc. IEEE Int. Conf. Image Processing (ICIP), Oct. 2014, pp. 5961–5965, doi: 10.1109/ICIP.2014.7026203.
[5]M. Hashemi, A. Mirrashid, and A. Beheshti Shirazi, “Driver safety development: Real-time driver drowsiness detection system based on convolutional neural network,” SN Computer Science, vol. 1, no. 5, p. 289, 2020, doi: 10.1007/s42979-020-00306-9.
[6]Y. Zhou et al., “Cognitive workload recognition using EEG signals and machine learning: A review,” IEEE Transactions on Cognitive and Developmental Systems, vol. 14, no. 3, pp. 799–818, 2021, doi: 10.1109/TCDS.2021.3090217.
[7]M. Jabon, J. Bailenson, E. Pontikakis, L. Takayama, and C. Nass, “Facial expression analysis for predicting unsafe driving behavior,” IEEE Pervasive Computing, vol. 10, no. 4, pp. 84–95, 2010, doi: 10.1109/MPRV.2010.46.
[8]Y. Shang et al., “Driver emotion and fatigue state detection based on time series fusion,” Electronics, vol. 12, no. 1, p. 26, 2022, doi: 10.3390/electronics12010026.
[9]M. Dua, Shakshi, R. Singla, S. Raj, and A. Jangra, “Deep CNN models-based ensemble approach to driver drowsiness detection,” Neural Computing and Applications, vol. 33, pp. 3155–3168, 2021, doi: 10.1007/s00521-020-05209-7.
[10]Y. Yi, H. Zhang, W. Zhang, Y. Yuan, and C. Li, “Fatigue working detection based on facial multifeature fusion,” IEEE Sensors Journal, vol. 23, no. 6, pp. 5956–5961, 2023, doi: 10.1109/JSEN.2023.3239029.
[11]S. Wang, J. Qu, Y. Zhang, and Y. Zhang, “Multimodal emotion recognition from EEG signals and facial expressions,” IEEE Access, vol. 11, pp. 33061–33068, 2023, doi: 10.1109/ACCESS.2023.3263670.
[12]J. Pan et al., “Multimodal emotion recognition based on facial expressions, speech, and EEG,” IEEE Open Journal of Engineering in Medicine and Biology, 2023, doi: 10.1109/OJEMB.2023.3240280.
[13]Y. Huang, J. Yang, P. Liao, and J. Pan, “Fusion of facial expressions and EEG for multimodal emotion recognition,” Computational Intelligence and Neuroscience, vol. 2017, p. 2107451, 2017, doi: 10.1155/2017/2107451.
[14]A. Roshdy, A. Karar, S. A. Kork, T. Beyrouthy, and A. Nait-Ali, “Advancements in EEG emotion recognition: Leveraging multi-modal database integration,” Applied Sciences, vol. 14, no. 6, p. 2487, 2024, doi: 10.3390/app14062487.
[15]Y. Peng et al., “The application of electroencephalogram in driving safety: Current status and future prospects,” Frontiers in Psychology, vol. 13, p. 919695, 2022, doi: 10.3389/fpsyg.2022.919695.
[16]G. S. Thirunavukkarasu, H. Abdi, and N. Mohajer, “A smart HMI for driving safety using emotion prediction of EEG signals,” in Proc. IEEE Int. Conf. Systems, Man, and Cybernetics (SMC), Oct. 2016, pp. 4148–4153, doi: 10.1109/SMC.2016.7844882.
[17]M. Yousaf et al., “Enhancing driver attention and road safety through EEG-informed deep reinforcement learning and soft computing,” Applied Soft Computing, vol. 167, p. 112320, 2024, doi: 10.1016/j.asoc.2024.112320.
[18]S. Bhatlawande, S. Shilaskar, S. Pramanik, and S. Sole, “Multimodal emotion recognition based on the fusion of vision, EEG, ECG, and EMG signals,” International Journal of Electrical and Computer Engineering Systems, vol. 15, no. 1, pp. 41–58, 2024, doi: 10.32985/ijeces.15.1.5.
[19]D. Li, X. Zhang, X. Liu, Z. Ma, and B. Zhang, “Driver fatigue detection based on comprehensive facial features and gated recurrent unit,” Journal of Real-Time Image Processing, vol. 20, no. 2, pp. 19–29, 2023, doi: 10.1016/j.heliyon.2024.e39479.
[20]L. Zhao et al., “Data-driven learning fatigue detection system: A multimodal fusion approach of ECG and video signals,” Measurement, vol. 201, p. 111648, 2022, doi: 10.1016/j.measurement.2022.111648.
[21]R. M. Chandra, G. S. Neelaiahgari, and S. S. Vanapalli, “Enhancing driver safety through sensor-based detection and mitigation of health risks in vehicles,” in Proc. Int. Conf. Algorithms and Computational Theory for Engineering Applications, Cham, Switzerland: Springer Nature, Feb. 2024, pp. 199–204, doi: 10.1007/978-3-031-72747-4_30.
[22]X. Lin, Z. Huang, W. Ma, and W. Tang, “EEG-based driver drowsiness detection based on simulated driving environment,” Neurocomputing, vol. 616, p. 128961, 2025, doi: 10.1016/j.neucom.2024.128961.
[23]H. Jia, Z. Xiao, and P. Ji, “End-to-end fatigue driving EEG signal detection model based on improved temporal-graph convolution network,” Computers in Biology and Medicine, vol. 152, p. 106431, 2023, doi: 10.1016/j.compbiomed.2022.106431.
[24]A. Topic and M. Russo, “Emotion recognition based on EEG feature maps through deep learning network,” Engineering Science and Technology, an International Journal, vol. 24, no. 6, pp. 1442–1454, 2021, doi: 10.1016/j.jestch.2021.03.012.
[25]T. Ergin, M. A. Ozdemir, and A. Akan, “Emotion recognition with multi-channel EEG signals using visual stimulus,” in Proc. Medical Technologies Congress (TIPTEKNO), Oct. 2019, pp. 1–4, doi: 10.1109/TIPTEKNO.2019.8895242.
[26]J. Y. Kim, C. H. Jeong, M. J. Jung, J. H. Park, and D. H. Jung, “Highly reliable driving workload analysis using driver electroencephalogram (EEG) activities during driving,” International Journal of Automotive Technology, vol. 14, no. 6, pp. 965–970, 2013, doi: 10.1007/s12239-013-0106-z.
[27]W. Deng and R. Wu, “Real-time driver-drowsiness detection system using facial features,” IEEE Access, vol. 7, pp. 118727–118738, 2019, doi: 10.1109/ACCESS.2019.2936663.
[28]A. Salbi, M. A. Gadi, T. Bouganssa, A. E. Hassani, and A. Lasfar, “Design and implementation of a driving safety assistant system based on driver behavior,” IAES International Journal of Artificial Intelligence, vol. 13, no. 3, pp. 2603–2613, 2024, doi: 10.11591/ijai.v13.i3.pp2603-2613.
[29]P. Neeraja, R. G. Kumar, M. S. Kumar, K. K. S. Liyakat, and M. S. Vani, “DL-based somnolence detection for improved driver safety and alertness monitoring,” in Proc. IEEE Int. Conf. on Computing, Power and Communication Technologies (IC2PCT), Greater Noida, India, 2024, pp. 589–594, doi: 10.1109/IC2PCT60090.2024.10486714.
[30]D. Kim, H. Park, T. Kim, et al., “Real-time driver monitoring system with facial landmark-based eye closure detection and head pose recognition,” Scientific Reports, vol. 13, p. 18264, 2023, doi: 10.1038/s41598-023-44955-1.
[31]J. Chen, Y. Cui, C. Wei, K. Polat, and F. Alenezi, “Driver fatigue detection using EEG-based graph attention convolutional neural networks: An end-to-end learning approach with mutual information-driven connectivity,” Applied Soft Computing, vol. 186, Part A, p. 114097, 2026, doi: 10.1016/j.asoc.2025.114097.
[32]O. F. Hassan, A. F. Ibrahim, A. Gomaa, et al., “Real-time driver drowsiness detection using transformer architectures: A novel deep learning approach,” Scientific Reports, vol. 15, p. 17493, 2025, doi: 10.1038/s41598-025-02111-x.