
Article Information

  • Title: Beyond standard checklist assessment: Question sequence may impact student performance
  • Authors: Jeff LaRochelle; Steven J. Durning; John R. Boulet
  • Journal: Perspectives on Medical Education
  • Print ISSN: 2212-2761
  • Electronic ISSN: 2212-277X
  • Publication year: 2016
  • Volume: 5
  • Issue: 2
  • Pages: 95-102
  • DOI: 10.1007/s40037-016-0265-5
  • Publisher: Springer Verlag
  • Abstract: Introduction: Clinical encounters are often assessed using a checklist. However, without direct faculty observation, the timing and sequence of questions are not captured. We theorized that the sequence of questions can be captured and measured using coherence scores that may distinguish between low- and high-performing candidates. Methods: A logical sequence of key features was determined using the standard case checklist for an objective structured clinical examination (OSCE). An independent clinician educator reviewed each encounter to provide a global rating. Coherence scores were calculated based on question sequence and compared with global ratings and checklist scores. Results: Coherence scores were positively correlated with checklist scores and with global ratings, and these correlations increased as global ratings improved. Coherence scores explained more of the variance in student performance as global ratings improved. Discussion: Logically structured question sequences may indicate a higher-performing student, and this information is often lost when only overall checklist scores are used. Conclusions: The sequence in which test takers ask questions can be accurately recorded and is correlated with checklist scores and global ratings. The sequence of questions during a clinical encounter is not captured by traditional checklist scoring and may represent an important dimension of performance.