An Aligned Assessment Item Authoring Environment based on Interoperability Standards



Muhammad H. Zedan 1,*, Hesham A. Hassan 2

1. Computer & Information Sciences Department, ISSR, Cairo University, Giza, Egypt

2. Computer Science Department, Faculty of Computers & Information, Cairo University, Giza, Egypt

* Corresponding author.


Received: 14 Sep. 2013 / Revised: 5 Oct. 2013 / Accepted: 2 Nov. 2013 / Published: 8 Dec. 2013

Index Terms

IMS QTI, ILOs, Assessment, Interoperability.


Abstract

Standard representations support the sharing of assessment items and learning objects among learning environments, and various standards have been developed to provide interoperable descriptions of all aspects of learning. Designing assessment items, or questions, using standard representations has therefore become a key concern in the learning/teaching domain. This paper proposes an environment for authoring assessment items that links several standards together in order to align the produced questions. The environment consists of a set of tools. The first tool builds the question body using the IMS Question and Test Interoperability (QTI) standard; an extension to IMS QTI was designed to represent a question's Intended Learning Outcomes (ILOs), difficulty degree, assessed concepts, and target groups. The second tool manages the bank of competency definitions used to represent assessment item ILOs. The third tool handles the target groups to whom the questions will be delivered.
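To illustrate the kind of alignment metadata the abstract describes, the sketch below builds a minimal QTI 2.1-style assessment item carrying an extension block for an ILO reference, difficulty degree, assessed concept, and target group. This is not the paper's actual schema: the `ext` namespace URI and the element names `alignment`, `ilo`, `difficulty`, `concept`, and `targetGroup` are illustrative assumptions, not part of the IMS QTI specification.

```python
import xml.etree.ElementTree as ET

QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"
EXT_NS = "http://example.org/qti-alignment"  # hypothetical extension namespace

ET.register_namespace("", QTI_NS)
ET.register_namespace("ext", EXT_NS)

# A QTI-style assessment item element (attributes follow QTI 2.1 conventions).
item = ET.Element(
    f"{{{QTI_NS}}}assessmentItem",
    identifier="q1",
    title="Sample question",
    adaptive="false",
    timeDependent="false",
)

# Hypothetical alignment extension: links the item to an intended learning
# outcome (e.g. a reusable competency definition), a difficulty degree,
# an assessed concept, and a target group.
align = ET.SubElement(item, f"{{{EXT_NS}}}alignment")
ET.SubElement(align, f"{{{EXT_NS}}}ilo", ref="RCD-123")  # competency definition id (made up)
ET.SubElement(align, f"{{{EXT_NS}}}difficulty").text = "medium"
ET.SubElement(align, f"{{{EXT_NS}}}concept").text = "database normalization"
ET.SubElement(align, f"{{{EXT_NS}}}targetGroup").text = "CS-year-2"

# The question body itself, as in a plain QTI item.
body = ET.SubElement(item, f"{{{QTI_NS}}}itemBody")
ET.SubElement(body, f"{{{QTI_NS}}}p").text = (
    "Which normal form removes transitive dependencies?"
)

xml_text = ET.tostring(item, encoding="unicode")
print(xml_text)
```

A tool chain like the one the paper proposes would validate such a document against the QTI schema plus the extension schema, and resolve the `ilo` reference against the competency definitions bank.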

Cite This Paper

Muhammad H. Zedan, Hesham A. Hassan, "An Aligned Assessment Item Authoring Environment based on Interoperability Standards", International Journal of Modern Education and Computer Science (IJMECS), vol.5, no.12, pp.1-9, 2013. DOI: 10.5815/ijmecs.2013.12.01

