Journal: Proceedings of the Canadian Engineering Education Association
Year: 2017
Publisher: The Canadian Engineering Education Association (CEEA)
Abstract: Nearly a decade ago, a large first-year engineering design course moved its collaboratively written design report assignments to an online platform. The switch was made using an existing online word processing tool, Google Drive, which allows for simple sharing and commenting. Students use the online tool to write their assignments, and members of the teaching team use the same tool to coach and grade them. Anecdotally, there was initially significant evaluator resistance to the implementation of the online grading platform. This initial resistance has since been overcome, and the online tool remains in use today. Anecdotal feedback from the teaching team now praises the online grading platform for increasing the quality of feedback, though at the expense of increased marking time. Until recently, the exams in the same course were still written and marked on paper in the traditional style. For the first time, the teaching team has adopted another online grading platform, Crowdmark, which allows for the digitization, online grading, and digital distribution of paper exams. In anticipation of evaluator resistance, this study will explore how use of this system affects the quality of the grading experience for evaluators, including time on task and satisfaction with the process. The study will use a multiphase mixed methods design, with an initial phase of convergent parallel design focusing on quantitative analysis. Time study data measuring evaluators' time on task will be converged with both quantitative and qualitative survey data collected from the evaluators. In the second phase, individual evaluators who struggled with the online grading platform, as indicated either by low marking speed or by direct feedback, will be interviewed. These interviews will be analysed using a qualitative thematic analysis to determine the cause and severity of the issues.