Analysis of Multiple-Choice Questions (MCQs): Item and Test Statistics from the 2nd Year Nursing Qualifying Exam in a University in Cavite, Philippines
Abstract

Multiple-choice questions (MCQs) are used extensively as a test format in nursing education. However, constructing MCQs remains a challenge for educators. To avoid quality issues, items should undergo item analysis. Thus, this study evaluated item and test quality using the difficulty index (DIF) and discrimination index (DI), together with distractor efficiency (DE); determined reliability using the Kuder-Richardson 20 coefficient (KR-20); and identified which items formed a valid measure. The study was conducted among 41 level-two nursing students in the College of Nursing. The qualifying examination comprised 194 MCQs. Data were entered into Microsoft Excel 2010 and SPSS 22 and analyzed. According to DIF, of the 194 items, 115 (59.3%) had the right difficulty and 79 (40.7%) were difficult. Regarding DI, 17 MCQs (8.8%) were very good at discriminating between low- and high-performing students, while 21 (10.8%), 32 (16.5%), 24 (12.4%), and 100 (51.5%) were good, fair, potentially poor, and potentially very poor items, respectively. As for distractor efficiency, 57 items (29.4%) had 100% DE, while 65 (33.5%), 49 (25.3%), and 23 (11.9%) had 66.6%, 33.3%, and 0%, respectively. The reliability of the test by KR-20 was 0.9, suggesting that the test is highly reliable with good internal consistency. After careful analysis of each item, 55 items (28.35%) were retained without revision. Further, the stems of 24 items (12.37%), the distractors of 66 items (34.02%), and both the stems and distractors of 46 items (23.71%) were modified, and 3 items (1.55%) were removed. The researcher recommends analyzing upper and lower scorers and their relationship to DE. For future study, it would be beneficial to explore other factors such as students' ability, quality of instruction, and number of students in relation to MCQ quality.
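The indices reported above follow standard item-analysis formulas. The sketch below shows how DIF, DI, DE, and KR-20 are typically computed; the helper names are hypothetical, and it assumes four-option items, the common upper/lower 27% split for DI, and the usual 5% cutoff for a "functional" distractor (none of which are stated in the abstract):

```python
import numpy as np

def item_statistics(responses, key, options="ABCD",
                    group_frac=0.27, functional_cutoff=0.05):
    """Per-item difficulty index (DIF), discrimination index (DI),
    and distractor efficiency (DE) for an MCQ exam.

    responses : (n_students, n_items) array of chosen options ('A'..'D')
    key       : length-n_items sequence of correct options
    """
    responses = np.asarray(responses)
    key = np.asarray(key)
    scored = (responses == key).astype(int)        # 1 = correct answer
    totals = scored.sum(axis=1)

    # Upper/lower 27% groups (by total score) for the discrimination index.
    g = max(1, int(round(group_frac * len(totals))))
    order = np.argsort(totals)
    lower, upper = order[:g], order[-g:]

    stats = []
    for j in range(responses.shape[1]):
        dif = scored[:, j].mean()                  # proportion answering correctly
        di = scored[upper, j].mean() - scored[lower, j].mean()
        # A distractor is "functional" if >= 5% of examinees choose it;
        # DE is the percentage of functional distractors (100/66.6/33.3/0).
        distractors = [o for o in options if o != key[j]]
        functional = sum((responses[:, j] == o).mean() >= functional_cutoff
                         for o in distractors)
        de = 100.0 * functional / len(distractors)
        stats.append({"DIF": dif, "DI": di, "DE": de})
    return stats

def kr20(scored):
    """Kuder-Richardson 20 reliability of a 0/1 score matrix."""
    scored = np.asarray(scored, dtype=float)
    k = scored.shape[1]                            # number of items
    p = scored.mean(axis=0)                        # per-item difficulty
    total_var = scored.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1.0 - (p * (1.0 - p)).sum() / total_var)
```

KR-20 is the dichotomous special case of Cronbach's alpha, which is why a value of 0.9 is read as good internal consistency.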
Copyright (c) 2019 Abstract Proceedings International Scholars Conference
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.