Workplace: Department of Electrical and Electronic Engineering, University of Asia Pacific, Dhaka 1205, Bangladesh
E-mail: 22208152@uap-bd.edu
Website: https://orcid.org/0009-0009-6934-3769
Research Interests: Electronics (IoT), Natural Language Processing (NLP), Machine Learning (ML)
Biography
Md Naimur Rahman Khan Sifat was born in Narayanganj, Dhaka, Bangladesh. He is currently in his third year of the Electrical and Electronic Engineering program at the University of Asia Pacific, Dhaka, Bangladesh. He began his undergraduate studies in 2022 and is expected to graduate in 2026. His interests lie in the fields of electronics (IoT), natural language processing (NLP), and machine learning (ML).
By Sayem Shahad Salman Sayeed, Md Naimur Rahman Khan Sifat, Shuvo Biswas, Tishna Sabrina
DOI: https://doi.org/10.5815/ijmecs.2026.01.07, Pub. Date: 8 Feb. 2026
In recent years, Large Language Models (LLMs) have been widely adopted in educational settings, where they serve as content creators, teaching assistants, and interactive conversational agents. However, the responses generated by these models are often monotonous, verbose, and ambiguous, which can hinder their effectiveness in educational contexts. To address these shortcomings, we introduce EduAgent, a multimodal chatbot framework specifically designed to enhance interactive learning in Electrical and Electronics Engineering (EEE) education. EduAgent responds to electronics-related queries with pedagogically enhanced answers, complemented by relevant images and detailed explanations. It is designed to provide complete, concise, step-by-step responses, ensuring that foundational knowledge is clearly presented before advancing to deeper material. To develop EduAgent, we constructed a dataset comprising 596 four-turn conversations and a collection of 118 images covering a wide range of EEE concepts. The conversation dataset was used to fine-tune open-source LLMs and to facilitate in-context learning. The images and their corresponding explanations were integrated into a knowledge base for efficient retrieval. Finally, we evaluated multiple text-generation and image-retrieval methods using both automatic metrics and human assessments, demonstrating the effectiveness and engagement of our approach.
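The abstract describes indexing images together with their textual explanations in a knowledge base so that the best-matching image can be retrieved for a query. The sketch below is a minimal, illustrative Python example of such text-based retrieval; the file names, explanations, and the TF-IDF similarity choice are assumptions for illustration, not the method evaluated in the paper.

```python
# Hypothetical sketch of the image-retrieval step described in the abstract:
# each image's explanation is indexed, and a query returns the best-matching
# image by text similarity. Entries and the TF-IDF choice are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy knowledge base: image file names paired with their explanations (assumed format).
knowledge_base = [
    ("half_wave_rectifier.png", "A half-wave rectifier passes only the positive half-cycle of an AC input."),
    ("zener_regulator.png", "A Zener diode in reverse breakdown holds a nearly constant output voltage."),
    ("rc_low_pass.png", "An RC low-pass filter attenuates frequencies above its cutoff 1/(2*pi*R*C)."),
]

explanations = [text for _, text in knowledge_base]
vectorizer = TfidfVectorizer().fit(explanations)
kb_vectors = vectorizer.transform(explanations)

def retrieve_image(query: str):
    """Return the (image, explanation) pair whose explanation best matches the query."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, kb_vectors)[0]
    best = scores.argmax()
    return knowledge_base[best], scores[best]

if __name__ == "__main__":
    (image, explanation), score = retrieve_image("How does a rectifier convert AC to DC?")
    print(image, round(float(score), 3))
    print(explanation)
```

In the full system, a dense-embedding retriever could replace the TF-IDF index, but the interface (query in, best image plus explanation out) would stay the same.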