
Article Information

  • Title: Global Times Call for Global Measures: Investigating Automated Essay Scoring in Linguistically-Diverse MOOCs
  • Authors: Erin Dawna Reilly; Kyle M. Williams; Rose E. Stafford
  • Journal: Online Learning
  • Print ISSN: 2472-5749
  • Electronic ISSN: 2472-5730
  • Year: 2016
  • Volume: 20
  • Issue: 2
  • Pages: 1-13
  • DOI: 10.24059/olj.v20i2.638
  • Publisher: Online Learning Consortium
  • Abstract: This paper utilizes a case-study design to discuss global aspects of massive open online course (MOOC) assessment. Drawing from the literature on open-course models and linguistic gatekeeping in education, we position freeform assessment in MOOCs as both challenging and valuable, with an emphasis on current practices and student resources. We report on the findings from a linguistically diverse pharmacy MOOC, taught by a native English speaker, which utilized an automated essay scoring (AES) assignment to engage students in the application of course content. Native English speakers performed better on the assignment overall, across both automated and human graders. Additionally, our results suggest that the use of an AES system may disadvantage non-native English speakers, with agreement between instructor and AES scoring being significantly lower for non-native English speakers. Survey responses also revealed that students often utilized online translators, though analyses showed that this did not detrimentally affect essay grades. Pedagogical and future assignment suggestions are then outlined, utilizing a multicultural lens and acknowledging the possibility of certain assessments disadvantaging non-native English speakers within an English-based MOOC system.
  • Keywords: Massive open online courses; assessment; automated essay scoring; technology-enhanced language learning; multicultural assessment
Copyright © 国家哲学社会科学文献中心 (National Center for Philosophy and Social Sciences Documentation)