Multiple-choice questions (MCQ)
From SCOME Wiki
In its "Toolbox of Assessment Methods," the Accreditation Council for Graduate Medical Education (ACGME) defines the written examination (MCQ) as follows.
A written or computer-based MCQ examination is composed of multiple-choice questions (MCQs) selected to sample medical knowledge and understanding of a defined body of knowledge, not just factual or easily recalled information. Each question or test item contains an introductory statement followed by four or five options in outline format. The examinee selects one of the options as the presumed correct answer by marking it on a coded answer sheet; only one option is keyed as the correct response. The introductory statement often presents a patient case, clinical findings, or data displayed graphically. A separate booklet can be used to display pictures and other relevant clinical information.

The in-training examinations prepared by specialty societies and boards use MCQ-type test items. A typical half-day examination has 175 to 250 test questions. In computer-based examinations the test items are displayed on a computer monitor one at a time, with pictures and graphical images also displayed directly on the monitor. In a computer-adaptive test fewer test questions are needed, because test items are selected based on statistical rules programmed into the computer to measure the examinee’s ability quickly.
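The adaptive selection rule described above can be sketched as a simple staircase procedure: pick the unused item whose difficulty is closest to the current ability estimate, then adjust the estimate by a shrinking step. This is a toy illustration only; operational computer-adaptive tests use item-response-theory models, and the names `adaptive_test` and `answers_correctly` are hypothetical, not from the source.

```python
def adaptive_test(difficulties, answers_correctly, rounds=5):
    """Toy computer-adaptive test: estimate examinee ability.

    difficulties     -- item difficulties on the same scale as ability
    answers_correctly -- callable simulating the examinee's response
    """
    ability = 0.0
    step = 1.0
    used = set()
    for _ in range(rounds):
        # choose the unused item whose difficulty is closest to the estimate
        idx = min((i for i in range(len(difficulties)) if i not in used),
                  key=lambda i: abs(difficulties[i] - ability))
        used.add(idx)
        # move the estimate up on a correct answer, down on an incorrect one
        if answers_correctly(difficulties[idx]):
            ability += step
        else:
            ability -= step
        step /= 2  # shrink the adjustment as the estimate converges
    return ability

# Example: an examinee whose true ability is 1.5 answers correctly
# whenever the item difficulty is at or below that level.
estimate = adaptive_test([-2.0, -1.0, 0.0, 1.0, 2.0],
                         lambda d: d <= 1.5, rounds=5)
```

Because each wrong answer steers the next item selection, the estimate homes in on the examinee's level with far fewer items than a fixed-form test needs, which is the point made in the definition above.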
Medical knowledge and understanding can be measured by MCQ examinations. Comparing test scores on in-training examinations with national statistics can help identify the strengths and limitations of individual residents so they can improve. Comparing test results aggregated for residents in each year of a program can help identify residency training experiences that might be improved.
For test questions to be useful in evaluating a resident’s knowledge, each test item and the overall exam should be designed to rigorous psychometric standards. Psychometric qualities must be high for pass/fail decisions, but tests used to help residents identify strengths and weaknesses, such as in-training examinations, need not comply with the same rigorous standards. A committee of experts designing the test defines the knowledge to be assessed and creates a test blueprint that specifies the number of test questions to be selected for each topic. When test questions are used to make pass/fail decisions, the test should be pilot tested and statistically analyzed. Higher reliability/reproducibility can be achieved with more test questions per topic. If pass/fail decisions will be made based on test scores, a sufficient number of test questions should be included to obtain a test reliability greater than r = 0.85 (1.00 is perfect reliability). Standards for passing scores should be set by a committee of experts prior to administering the examination (criterion-referenced exams). If the performance of residents is to be compared from year to year, at least 25 to 30 percent of the same test questions should be repeated each year.
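The reliability figure cited above (r greater than 0.85) is typically estimated from item statistics after pilot testing. A minimal sketch using the Kuder-Richardson Formula 20, a standard internal-consistency estimate for right/wrong items, is shown below; the function name and data layout are illustrative assumptions, not from the source.

```python
def kr20(responses):
    """Kuder-Richardson Formula 20 reliability for dichotomous items.

    responses -- one list per examinee, each entry 1 (correct) or 0 (wrong).
    Requires at least two items and some spread in total scores.
    """
    k = len(responses[0])   # number of items
    n = len(responses)      # number of examinees
    # sum of p*q over items, where p is the proportion answering correctly
    pq_sum = 0.0
    for i in range(k):
        p = sum(r[i] for r in responses) / n
        pq_sum += p * (1 - p)
    # population variance of the examinees' total scores
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq_sum / var)

# Example: perfectly consistent response patterns yield reliability 1.0
scores = [[1, 1, 1], [1, 1, 1], [0, 0, 0], [0, 0, 0]]
```

KR-20 is the special case of Cronbach's alpha for dichotomously scored items; a committee would add items per topic until the estimate on pilot data clears the 0.85 threshold mentioned above.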
A committee of physician specialists develops the examination with assistance from psychometric experts. For in-training examinations, each residency program administers an exam purchased from the specialty society or another vendor. Tests are scored by the vendor, and scores are returned to the residency director for each resident, for each topic, and by year of residency training. Comparable national scores are also provided. All 24 ABMS Member Boards use MCQ examinations for initial certification.