Quality of multiple choice question items: item analysis

Authors

  • Ayenew Takele Alemu Department of Public Health, College of Medicine and Health Sciences, Injibara University, Injibara, Ethiopia
  • Hiwot Tesfa Department of Public Health, College of Medicine and Health Sciences, Injibara University, Injibara, Ethiopia
  • Addisu Mulugeta Department of Public Health, College of Medicine and Health Sciences, Injibara University, Injibara, Ethiopia
  • Enyew Tale Fenta Department of Public Health, College of Medicine and Health Sciences, Injibara University, Injibara, Ethiopia
  • Mahider Awoke Belay Department of Public Health, College of Medicine and Health Sciences, Injibara University, Injibara, Ethiopia

DOI:

https://doi.org/10.18203/issn.2454-2156.IntJSciRep20241316

Keywords:

Multiple choice question, DIF, DI, DE, Ethiopia

Abstract

Background: Several examination formats are available for educational assessment, and multiple choice questions (MCQs) are among the most frequently used tools in health education. Ensuring reliability and validity when developing MCQ items is vital, yet educators often struggle to write credible distractors. Poorly constructed MCQ items make an exam too easy or too difficult relative to the intended learning objectives. Checking the quality of MCQ items is often overlooked, and little is known about it in this setting. Therefore, this study aimed to assess the quality of MCQ items using the item response theory model.

Methods: A descriptive cross-sectional study was conducted on MCQ items from public health courses administered to second-year nursing students at Injibara University. A total of 50 MCQ items with 200 alternatives were evaluated by statistical item analysis. Item quality was assessed with the difficulty index (DIF), discrimination index (DI), and distractor efficiency (DE) computed from students' exam responses. Microsoft Excel and SPSS version 26 were used for data management and analysis.
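The three indices named in the methods have conventional post-exam formulas. A minimal sketch follows; the upper/lower 27% grouping and the 5% threshold for a non-functional distractor are standard conventions assumed here, not details taken from this abstract, and the function and variable names are illustrative only.

```python
# Hedged sketch of standard item-analysis formulas (DIF, DI, DE).
# Assumptions (not from the source): upper/lower groups are the top and
# bottom 27% of examinees by total score, and a distractor is "functional"
# if chosen by at least 5% of examinees.

def item_analysis(responses, key, distractors, group_frac=0.27):
    """responses: list of (total_score, chosen_option) pairs, one per student,
    for a single item. Returns (DIF %, DI, DE %)."""
    n = len(responses)
    k = max(1, round(n * group_frac))                 # size of each group
    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    upper, lower = ranked[:k], ranked[-k:]
    h = sum(1 for _, c in upper if c == key)          # correct in high group
    l = sum(1 for _, c in lower if c == key)          # correct in low group
    dif = (h + l) / (2 * k) * 100                     # difficulty index (% correct)
    di = (h - l) / k                                  # discrimination index
    # Distractor efficiency: share of distractors chosen by >= 5% of examinees.
    functional = sum(
        1 for d in distractors
        if sum(1 for _, c in responses if c == d) / n >= 0.05
    )
    de = functional / len(distractors) * 100
    return dif, di, de
```

For example, an item answered correctly by all of the high group and none of the low group yields DI = 1.0 (maximal discrimination), while DIF near 30-70% is commonly treated as an acceptable difficulty range.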

Results: Post-exam item analysis showed that 11 (22%) MCQs fell in the too-difficult range of the difficulty index and 22 (44%) fell in the poor range of the discrimination index. The overall DE was 71.3%; forty (20%) of the distractors were non-functional. Only 8 (16%) MCQs met the recommended criteria for all three parameters (DIF, DI, and DE).

Conclusions: Only a few items satisfied the desirable criteria for all quality parameters, indicating a need for quality improvement. Continuous training is required to improve instructors' skills in constructing quality educational assessment tools.



Published

2024-05-23

Section

Original Research Articles