Abstract:
Background: Single Best Answer (SBA) questions are widely used in undergraduate and postgraduate medical examinations. Selection of the correct answer in SBA questions may be subject to cueing and therefore might not test the student's knowledge. In contrast to this artificial construct, doctors are ultimately required to perform in a real-life setting that does not offer a list of choices. This professional competence can be tested using Short Answer Questions (SAQs), where the student writes the correct answer without prompting from the question. However, SAQs cannot easily be machine marked and are therefore not feasible as an instrument for testing a representative sample of the curriculum for a large number of candidates. We hypothesised that a novel assessment instrument consisting of very short answer (VSA) questions would be a superior test of knowledge to assessment by SBA.
Methods: We conducted a prospective pilot study on one cohort of 266 medical students sitting a formative examination. All students were assessed both by a novel assessment instrument consisting of VSA questions and by SBA questions. Both instruments tested the same knowledge base. The range of answers provided for each VSA question was reviewed using the filter function of Microsoft Excel, and correct answers were accepted in less than two minutes. Examination results were compared between the two methods of assessment.
Results: Students scored more highly on all fifteen SBA questions than on the corresponding VSA questions, despite both examinations requiring the same knowledge base.
Conclusions: Valid assessment of undergraduate and postgraduate knowledge can be improved by the use of VSA questions. Such an approach will test nascent physician ability rather than the ability to pass exams.
Keywords: Very short answer; Single best answer; Assessment; Testing; Validity; Reliability
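
The rapid marking step described in the Methods (reviewing the range of VSA answers with Excel's filter function) could equally be scripted. The sketch below is a minimal illustration only, assuming a hypothetical CSV file named responses.csv with question_id and answer columns; it collates the distinct free-text answers given for each question so a reviewer can accept or reject each variant, mirroring the filter-based review described in the abstract.

```python
# Minimal sketch, assuming responses are stored in "responses.csv" with columns
# "question_id" and "answer" (file name and column names are hypothetical).
# Groups the distinct free-text VSA answers for each question, with counts, so a
# reviewer can quickly mark each variant correct or incorrect -- the same idea as
# filtering the answer column in Microsoft Excel.
import csv
from collections import Counter, defaultdict


def collate_answers(path="responses.csv"):
    """Return a mapping of question_id -> Counter of normalised answers."""
    per_question = defaultdict(Counter)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            answer = row["answer"].strip().lower()  # simple normalisation
            per_question[row["question_id"]][answer] += 1
    return per_question


if __name__ == "__main__":
    for qid, answers in collate_answers().items():
        print(f"Question {qid}: {len(answers)} distinct answers")
        for text, count in answers.most_common():
            print(f"  {count:4d}  {text}")
```

In practice the distinct-answer list per question is short, which is consistent with the abstract's report that correct answers for each VSA question could be accepted in under two minutes.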