Abstract: Mathematics assessments should be designed for all students, regardless of their background or gender. Rasch analysis, developed within Item Response Theory (IRT), is one of the primary tools for analysing the inclusiveness of mathematics assessments. However, mathematics test development has been dominated by Classical Test Theory (CTT). This preliminary study aims to demonstrate the use of Rasch analysis by assessing the appropriateness of a mathematics comprehensive test for measuring students' mathematical understanding. Data were collected from one cycle of the test, involving 48 undergraduate students of a mathematics education department. Rasch analysis was conducted using ACER Conquest 4 software to assess item difficulty and differential item functioning (DIF). The findings show that the item related to geometry was the easiest for students, while the item concerning calculus was the hardest. The test is viable for measuring students' mathematical understanding: a gender-based comparison was drawn for each test item and showed no evidence of DIF, indicating that the assessment was inclusive. Wider application of Rasch analysis should be pursued to create thorough and robust mathematics assessments.
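The dichotomous Rasch model underlying the analysis can be sketched as follows. This is a minimal illustration only: the difficulty and ability values in logits are hypothetical, and the study itself estimated these quantities with ACER Conquest 4 rather than with this code.

```python
import math

def rasch_p(theta: float, b: float) -> float:
    """Probability of a correct response under the dichotomous Rasch model:
    P(X = 1 | theta, b) = exp(theta - b) / (1 + exp(theta - b)),
    where theta is person ability and b is item difficulty (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical difficulties: an easy geometry item (b = -1.5) and a hard
# calculus item (b = 2.0), evaluated for a student of average ability.
theta = 0.0
p_geometry = rasch_p(theta, -1.5)  # easy item -> high success probability
p_calculus = rasch_p(theta, 2.0)   # hard item -> low success probability
print(round(p_geometry, 3), round(p_calculus, 3))  # prints: 0.818 0.119
```

In a DIF analysis, the same item difficulties are compared across groups (here, gender); an item is flagged when its estimated difficulty differs substantially between groups for students of equal ability.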