Publisher: Multimedia Educational Resource for Learning and Online Teaching
Abstract: A unique online parsing system that produces partial-credit scoring of students' constructed responses to mathematical questions is presented. The parser is the core of a free college readiness website in mathematics. The software generates immediate error analysis for each student response. The response is scored on a continuous scale, based on its overall correctness and the fraction of correct elements. The parser's scoring was validated against human scoring of 207 real-world student responses (r = 0.91). Moreover, the software generates more consistent scores than teachers in some cases. The parser's analysis of students' errors on 124 additional responses showed that the errors factored into two groups: structural (possibly conceptual) and computational (possibly resulting from typographical errors). The two error groups explained 55% of the variance in students' scores (structural errors: 36%; computational errors: 19%). In contrast, these groups explained only 33% of the variance in teachers' scores (structural: 18%; computational: 15%). There was low agreement among teachers on error classification, and their classifications were only weakly correlated with the parser's error groups. Overall, the parser's total scoring closely matched human scoring, but the machine surpassed humans in systematically distinguishing between students' error patterns.
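The abstract describes scoring on a continuous scale that combines overall correctness with the fraction of correct elements. The paper does not give the actual formula, so the sketch below is a hypothetical illustration of that idea: it blends an exact-match check of the full response with the fraction of reference terms the student produced. The function name, term representation, and weight `w_exact` are all assumptions for illustration, not the system's implementation.

```python
def partial_credit_score(student_terms, correct_terms, w_exact=0.5):
    """Hypothetical partial-credit scorer (not the paper's actual algorithm).

    Blends overall correctness (exact match of the full set of terms)
    with the fraction of correct elements in the student's response.
    Returns a score in [0, 1].
    """
    # Overall correctness: 1 if the response matches the reference exactly.
    exact = 1.0 if sorted(student_terms) == sorted(correct_terms) else 0.0

    # Fraction of reference terms that appear in the student response,
    # matching each reference term at most once.
    remaining = list(correct_terms)
    hits = 0
    for term in student_terms:
        if term in remaining:
            remaining.remove(term)
            hits += 1
    fraction = hits / len(correct_terms) if correct_terms else 1.0

    return w_exact * exact + (1 - w_exact) * fraction

# Example: reference answer 3*x**2 + 2*x - 5; the student flipped one sign.
correct = ["3*x**2", "2*x", "-5"]
student = ["3*x**2", "2*x", "5"]
score = partial_credit_score(student, correct)  # 0.5*0 + 0.5*(2/3)
```

Under this toy scheme a sign error still earns credit for the two correct terms, which mirrors the partial-credit behavior the abstract attributes to the parser; a production system would compare parsed expression trees rather than string tokens.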