
Article Information

  • Title: Guidelines for Creating Written Clinical Reasoning Exams: Insight from a Delphi Study
  • Authors: Évelyne Cambron-Goulet; Jean-Pierre Dumas; Édith Bergeron
  • Journal: Health Professions Education
  • Print ISSN: 2452-3011
  • Electronic ISSN: 2452-3011
  • Year: 2019
  • Volume: 5
  • Issue: 3
  • Pages: 237-247
  • DOI: 10.1016/j.hpe.2018.09.001
  • Language: English
  • Publisher: Elsevier
  • Abstract:
    Context: Clinical reasoning is an essential skill to be learned by medical students and thus requires assessment. Although written exams are widely used as one of the tools to assess clinical reasoning, there are no specific guidelines to help an exam writer develop good clinical reasoning assessment questions. We therefore conducted a modified Delphi study to identify guidelines for writing questions that assess clinical reasoning.
    Methods: Participants were identified from 1) the literature on clinical reasoning (i.e., authors who have written about clinical reasoning and assessment), 2) the people responsible for assessment in Canadian medical faculties, and 3) a snowball sampling strategy. Thirty-two question-writing guidelines were drawn from the literature and adapted by the team members. Participants were asked to indicate, on a ten-point Likert scale, their perceived importance of each guideline and, starting in the second round, the relevance of each guideline in five assessment contexts. A total of three rounds were conducted.
    Results: Response rates were 24%, 57%, and 62% for the three rounds, respectively. Consensus about the importance of the guidelines (interquartile range < 2.5) was reached for all but four guidelines. Four guidelines were identified as important (median ≥ 9 on the ten-point scale): the question should be based on a clinical case, the question should represent a challenge achievable for the student, the correction scale (i.e., scoring grid) should be explicit, and a panel of experts should revise the questions.
    Conclusion: A large number of guidelines seem relevant for written-exam clinical reasoning assessment questions. We are considering grouping those guidelines into categories to create a simple tool for use by medical educators in the design of written-exam clinical reasoning assessment questions. The next step will be to collect evidence of validity for this tool: does it really help to build questions that assess clinical reasoning?
    Highlights:
    • Literature on guidelines for developing written-exam questions that assess clinical reasoning is sparse.
    • We identified 25 guidelines that our participants considered important when developing clinical reasoning written exams.
    • The relevance of some guidelines appears to vary with the assessment context (some guidelines were deemed less important for preclinical training, while others were less important for postgraduate training).
    • All types of written-exam questions (MCQ, EMQ, short open-ended questions, modified-essay questions, script concordance tests, and key-feature questions) appear equally relevant for assessing clinical reasoning, although certain formats may be more appropriate in certain contexts.
  • Keywords: Clinical reasoning; Assessment; Written-exam questions; Guidelines; UGME