Article Information

  • Title: Inter-Rater Agreement, Data Reliability, and The Crisis of Confidence in Psychological Research
  • Authors: Button, Cathryn M.; Snook, Brent; Grant, Malcolm J.
  • Journal: Tutorials in Quantitative Methods for Psychology
  • Electronic ISSN: 1913-4126
  • Year: 2020
  • Volume: 16
  • Issue: 5
  • Pages: 467-471
  • DOI: 10.20982/tqmp.16.5.p467
  • Publisher: Université de Montréal
  • Abstract: In response to the crisis of confidence in psychology, a plethora of solutions have been proposed to improve the way research is conducted (e.g., increasing statistical power, focusing on confidence intervals, enhancing the disclosure of methods). One area that has received little attention is the reliability of data. We note that while it is well understood that reliability of measures is essential to replicability, there is a failure to apply some measure of data reliability consistently, or to correct for chance when assessing agreement. We discuss the problem of relying on Percent Agreement between observers as a measure of reliability and describe a dilemma that researchers encounter when assessing contradictory indicators of reliability. We conclude with some pedagogical strategies that might make the need for reliability measures and chance correction more likely to be understood and implemented. By so doing, researchers can contribute to solving some aspects of the crisis of confidence in psychological research.
  • Keywords: reliability; inter-rater agreement; Kappa; Percent Agreement; research methods; confidence intervals
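The abstract's central contrast, between raw Percent Agreement and chance-corrected agreement (Cohen's Kappa), can be illustrated with a small sketch. The ratings below are hypothetical, not data from the article: two raters classify 100 items as "yes"/"no", agreeing on 90 of them, yet because both raters say "yes" most of the time, much of that agreement is expected by chance alone.

```python
from collections import Counter

def percent_agreement(a, b):
    """Proportion of items on which the two raters give the same label."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's Kappa: agreement corrected for chance via the raters' marginals."""
    n = len(a)
    p_o = percent_agreement(a, b)                     # observed agreement
    counts_a, counts_b = Counter(a), Counter(b)
    # Expected chance agreement: product of the raters' marginal proportions,
    # summed over categories.
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(a) | set(b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings: 85 joint "yes", 5 "yes"/"no", 5 "no"/"yes", 5 joint "no".
rater_a = ["yes"] * 90 + ["no"] * 10
rater_b = ["yes"] * 85 + ["no"] * 5 + ["yes"] * 5 + ["no"] * 5

print(percent_agreement(rater_a, rater_b))  # 0.90 -- looks excellent
print(cohens_kappa(rater_a, rater_b))       # ~0.44 -- only moderate
```

Here Percent Agreement is 0.90, but Kappa is only about 0.44, because the expected chance agreement is 0.9 × 0.9 + 0.1 × 0.1 = 0.82. This is the kind of contradictory pair of reliability indicators the abstract refers to.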
Copyright © 国家哲学社会科学文献中心 (National Center for Philosophy and Social Sciences Documentation)