:: Volume 12 ::
mededj 2023; 12: 0-0
Quantitative and Qualitative Evaluation of Four-Choice Questions in Shahroud University of Medical Sciences During 2021-2022
Sahar Paryab , Robabeh Zarouj Hosseini , Sepehr Zamani , Masomeh Lashkari , Maryam Yousefi , Kimia Zarouj Hosseini , Omid Garkaz *
Department of Epidemiology, School of Public Health, Shahroud University of Medical Sciences, Shahroud, Iran
Abstract:   (1114 Views)
Background and Objective: As a measurement tool, any test must have sufficient validity and reliability to measure the intended attribute. Multiple-choice tests are the most common type of test in medical education and offer a high degree of reliability. This study was conducted to evaluate, quantitatively and qualitatively, the four-choice questions used at Shahroud University of Medical Sciences in the 2021-2022 academic year.
Methods: This descriptive study covered all general and specialized course examinations of the 2021-2022 academic year at Shahroud University of Medical Sciences. The Hamava system was used for the quantitative evaluation of the questions (difficulty index and discrimination index), and Millman's 14-item checklist was used for the qualitative evaluation.
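The two quantitative indices named above have standard definitions in classical test theory: the difficulty index is the proportion of examinees who answer an item correctly, and the discrimination index compares correct-response rates between the highest- and lowest-scoring examinees (conventionally the top and bottom 27%). The sketch below illustrates these standard formulas only; it is not a description of how the Hamava system itself computes them.

```python
def difficulty_index(item_responses):
    """Proportion of examinees answering the item correctly (1 = correct, 0 = wrong).

    Higher values mean an easier item.
    """
    return sum(item_responses) / len(item_responses)

def discrimination_index(item_responses, total_scores, fraction=0.27):
    """Classical discrimination index D.

    Splits examinees into upper and lower groups (default 27% each) by total
    test score, then returns the difference in the item's correct-response
    counts between the groups, divided by the group size. D near +1 means the
    item separates strong from weak examinees well.
    """
    n = max(1, round(len(total_scores) * fraction))
    # examinee indices ordered from lowest to highest total score
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    low, high = order[:n], order[-n:]
    correct_high = sum(item_responses[i] for i in high)
    correct_low = sum(item_responses[i] for i in low)
    return (correct_high - correct_low) / n

# Hypothetical data: 10 examinees; the item is answered correctly by exactly
# the five highest scorers, so it discriminates perfectly.
item = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
totals = [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]
print(difficulty_index(item))            # 0.5
print(discrimination_index(item, totals))  # 1.0
```

Cut-offs for labeling an item "simple" or "suitably discriminating" (as in the Findings below) vary between institutions, so the thresholds themselves are not encoded here.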
Findings: Of the 5000 questions examined, 2569 (51.4%) had a simple difficulty index and 2500 (50%) had a suitable discrimination index; the mean ± standard deviation was 0.700 ± 0.314 for the difficulty index and 0.625 ± 0.247 for the discrimination index. In the qualitative evaluation, 330 questions (48.5%) were at taxonomy level two and 190 questions (27.9%) were free of defects. The most frequent defects were omission of essential information from the question stem (196 questions), negative wording in the stem (136 questions), and use of an "all of the above" option (121 questions), a "none of the above" option (45 questions), or combined options (34 questions). Structural defects were the most common category, found in 398 questions (58.1%).
Conclusion: To improve test quality and make use of the results of quantitative and qualitative question analysis, practical measures should be adopted: professors should be trained in designing and using appropriate questions, and four-choice question banks should be prepared.
Keywords: Analysis, Quantitative, Qualitative, Four-Option Questions
Full-Text [PDF 560 kb]   (685 Downloads)    
Type of Study: Research | Subject: Special
Received: 2023/04/12 | Accepted: 2023/07/11 | Published: 2023/08/14
Paryab S, Zarouj Hosseini R, Zamani S, Lashkari M, Yousefi M, Zarouj Hosseini K, et al. Quantitative and Qualitative Evaluation of Four-Choice Questions in Shahroud University of Medical Sciences During 2021-2022. mededj 2023; 12.
URL: http://mededj.ir/article-1-455-en.html


Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Medical Education Journal (mededj)