Title: Değişen Madde Fonksiyonu Belirlemede MTK-Olabilirlik Oranı, Ordinal Lojistik Regresyon ve Poly-SIBTEST Yöntemlerini / Comparison of Likelihood Ratio Test (LRT), Poly-SIBTEST and Logistic Regression in Differential Item Functioning (DIF) Detection Procedures
Journal: e-International Journal of Educational Research
Print ISSN: 1309-6265
Year: 2015
Volume: 6
Issue: 1
DOI: 10.19160/e-ijer.24504
Language:
Publisher: E-International Journal of Educational Research
Abstract: This study examined whether the attitude items in the Mathematics Work Ethic questionnaire of the PISA 2012 student questionnaire show differential item functioning (DIF) by gender. The DIF analyses were carried out with the Item Response Theory Likelihood Ratio (IRT-LR; MTK-OO in the Turkish original), ordinal logistic regression (OLR) and Poly-SIBTEST methods. The study sample consists of 1000 individuals (500 female, 500 male) randomly selected via SPSS from the 3217 individuals who responded to the mathematics work ethic attitude items. Item analyses of the data were conducted within the IRT framework. In the gender-based DIF analyses, item 3 showed high-level DIF in favour of males according to Poly-SIBTEST, moderate DIF according to IRT-LR, and high-level non-uniform DIF according to OLR. Keywords: PISA, differential item functioning, IRT-LR, Poly-SIBTEST, OLR. Extended Abstract. Problem and Purpose: PISA, which is implemented in many countries, identifies the strengths and weaknesses of participating countries' education systems and paves the way for developing future-oriented education policies. For this reason, PISA results must be able to reveal differences among individuals and contain minimal error. In other words, evidence of the reliability and validity of the measurement results must be presented. One important threat to validity is item and test bias (Clauser and Mazor, 1998). Individuals at the same ability level are normally expected to obtain the same score on a test or on its items. However, the probability of a correct answer may differ between subgroups at the same ability level because of testing conditions or certain attributes of an item. This is called differential item functioning (DIF) (Zumbo, 2007; Finch and French, 2007).
DIF refers to a systematic difference among individuals with the same test score and ability level in the probability of answering a given test item (Doğan, Guerrero and Tatsuoka, 2005). Regarding attitude items, Hulin, Drasgow and Parsons (1983) define DIF as differing probabilities of expressing a positive attitude towards a given item among individuals from different subgroups at the same ability level (cited in Johanson and Dodeen, 2003). In attitude scales, the presence of DIF negatively affects the accuracy of measurement results. This study examined, using the Poly-SIBTEST, IRT-LR and OLR methods, whether the items in the PISA 2012 mathematics work ethic questionnaire function differentially by gender; it also investigated whether the items flagged for DIF change depending on the method. Method: The responses to the attitude items in the mathematics work ethic questionnaire of the PISA 2012 student questionnaire were downloaded from the OECD web site (http://pisa2012.acer.edu.au/) in January 2014. In the Turkish sample of the PISA 2012 administration, 3217 15-year-old students responded to the mathematics work ethic attitude items. After cases with missing data were removed from the 3217 participants, 1000 people (500 females and 500 males) were randomly selected from the remaining 3130 individuals using SPSS. The nine items in the questionnaire were checked for DIF by gender. The SIBTEST software was used for the Poly-SIBTEST method, the macro developed by Zumbo (1999) for the OLR method, and the IRTLRDIF software for IRT-LR. In the analyses, males served as the focal group and females as the reference group. Conclusion and Recommendations: According to the DIF analysis results, item 3 shows high-level DIF under Poly-SIBTEST, moderate DIF under IRT-LR, and high-level non-uniform DIF under OLR. While the IRT-LR and Poly-SIBTEST methods detect only uniform DIF, OLR can detect both uniform and non-uniform DIF; OLR therefore provides more detail about the nature of DIF.
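The uniform/non-uniform distinction that OLR draws rests on comparing nested regression models: a baseline model with the matching (ability) score only, a second model adding group membership (a significant improvement signals uniform DIF), and a third adding the ability-by-group interaction (a significant improvement signals non-uniform DIF), with each step evaluated by a likelihood-ratio chi-square test. The sketch below illustrates this nested-model logic on simulated data; it is a simplification under two stated assumptions: the item is binary (Zumbo's procedure applies ordinal logistic regression to polytomous attitude items, but the model-comparison logic is the same), and the simulated DIF effect (0.8 logits in favour of the focal group) is an arbitrary illustrative value, not an estimate from the PISA data.

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Fit logistic regression by Newton-Raphson; return the maximized log-likelihood."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (y - p)                      # score vector
        H = (X * (p * (1 - p))[:, None]).T @ X    # observed information matrix
        beta += np.linalg.solve(H + 1e-9 * np.eye(X.shape[1]), grad)
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    return np.sum(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

# Simulated data: 500 respondents per group with identical ability distributions,
# and uniform DIF of 0.8 logits injected in favour of the focal group (hypothetical values).
rng = np.random.default_rng(42)
n = 500
theta = rng.normal(0, 1, 2 * n)                   # matching variable (ability)
group = np.repeat([0.0, 1.0], n)                  # 0 = reference, 1 = focal
p_true = 1.0 / (1.0 + np.exp(-(1.2 * theta + 0.8 * group)))
y = (rng.random(2 * n) < p_true).astype(float)

ones = np.ones_like(theta)
ll0 = fit_logistic(np.column_stack([ones, theta]), y)                        # ability only
ll1 = fit_logistic(np.column_stack([ones, theta, group]), y)                 # + group: uniform DIF
ll2 = fit_logistic(np.column_stack([ones, theta, group, theta * group]), y)  # + interaction: non-uniform DIF

G_uniform = 2 * (ll1 - ll0)      # chi-square with 1 df; 3.84 is the .05 critical value
G_nonuniform = 2 * (ll2 - ll1)
print(f"uniform DIF chi2 = {G_uniform:.2f}, non-uniform DIF chi2 = {G_nonuniform:.2f}")
```

Because only a main effect of group was injected, the uniform-DIF statistic should be large while the interaction step should stay near zero, mirroring how OLR separates the two forms of DIF that IRT-LR and Poly-SIBTEST cannot distinguish.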
The results show that although the methods flag similar items for DIF, the estimated DIF magnitudes differ. This difference is thought to stem from differences in the value intervals each method uses to classify DIF levels. Asil (2012) points out that DIF in PISA items often arises from translation and adaptation errors. Regarding item bias, Van de Vijver and Tanzer (2003) and Hambleton, Merenda and Spielberger (2005) argue that in questionnaires used for international comparisons, DIF arises especially from inadequate translation, mistyped items and ambiguous translation of items.