DIF analysis using the IRT three-parameter logistic model of State Level Achievement Survey data on Environment & Science for grade VII students in West Bengal, India
A three-parameter Item Response Theory (IRT) model is applied to data from “Utkarsha Abhiyan” (UA), a State Level Achievement Survey in school education in West Bengal, India. The data set contains scores (correct or incorrect) of about
24,000 grade VII students on 40 multiple-choice items in the discipline Environment and Science. There are two sets of items, viz. Set A and Set B, created by shuffling the sequence of items. The relationship between a student's ability and the chance of answering an item correctly is modelled probabilistically through the IRT three-parameter logistic model. Differential item functioning (DIF) analysis is performed separately on the two sets of items, with rural and urban learners taken as the focal and reference groups respectively. A new methodology using Z-scores to measure the extent of DIF in items is described using this data set. The DIF visualised in the Item Characteristic Curves (ICCs) is corroborated by the DIF statistic values. The effect of item shuffling, as observed in the varying nature of DIF, is discussed. Probable explanations for the presence of DIF in some items in both sets are given in the light of item bias and the multidimensionality of the ability space, and these are noted as possible areas of future research. The analysis is carried out with the ltm package in R (version 1.0.136).
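The three-parameter logistic (3PL) model described above has the standard form P(θ) = c + (1 − c) / (1 + exp(−a(θ − b))), with discrimination a, difficulty b, and pseudo-guessing c. A minimal sketch of how uniform DIF appears under this model is given below; the parameter values are purely hypothetical illustrations, not the paper's estimates:

```python
import math

def p_3pl(theta, a, b, c):
    """Probability of a correct response under the 3PL model:
    P(theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))
    a: discrimination, b: difficulty, c: pseudo-guessing lower asymptote."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# Uniform DIF shows up as a shifted ICC for the focal group: the same item
# carries a higher difficulty b for one group at equal ability theta.
# (Parameter values below are hypothetical, for illustration only.)
urban = dict(a=1.2, b=0.0, c=0.2)   # reference group
rural = dict(a=1.2, b=0.5, c=0.2)   # focal group: item functions as harder

for theta in (-1.0, 0.0, 1.0):
    gap = p_3pl(theta, **urban) - p_3pl(theta, **rural)
    print(f"theta={theta:+.1f}  P_ref - P_foc = {gap:.3f}")
```

A positive gap at every ability level, as printed here, is the signature of uniform DIF against the focal group; the paper's Z-score statistic quantifies such gaps, and the ICC plots visualise them.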
 An, X., Yung, Y. F. (2014). Item response theory: what it is and how you can use the IRT procedure to apply it. SAS Institute Inc., Paper SAS364-2014, 1-14. Retrieved from https://support.sas.com/resources/papers/proceedings14/SAS364-2014.pdf
 Camilli, G. (1992). A conceptual analysis of differential item functioning in terms of a multidimensional item response model. Applied Psychological Measurement, 16(2), 129-147. Retrieved from http://purl.umn.edu/93227.
 Cappelleri, J. C., Lundy, J. J., Hays, R. D. (2014). Overview of Classical Test Theory and Item Response Theory for the quantitative assessment of items in developing patient-reported outcomes measures. Clinical Therapeutics, 36(5), 648-662. DOI: 10.1016/j.clinthera.2014.04.006. Retrieved from https://www.ncbi.nlm.nih.gov/pubmed/24811753.
 DeMars, C. E. (2015). Modeling DIF for simulations: Continuous or categorical secondary trait? Psychological Test and Assessment Modeling, 57(3), 279-300.
 Fan, X. (1998). Item response theory and classical test theory: an empirical comparison of their item/person statistics. Educational and Psychological Measurement, 58(3), 357-373.
 Kahraman, N. (2014). An explanatory item response theory approach for a computer-based case simulation test. Eurasian Journal of Educational Research, 54, 117-134. Retrieved from ERIC database (EJ1057314).
 Komsta, L., Novomestky, F. (2011). Moments, cumulants, skewness, kurtosis and related tests. R package version 0.12. Retrieved from http://www.r-project.org, http://www.komsta.net/.
 Li, Y., Jiao, H., Lissitz, R.W. (2012). Applying multidimensional item response theory models in validating test dimensionality: an example of K-12 large-scale science assessment. Journal of Applied Testing Technology, 13(2). Retrieved from http://www.testpublishers.org/assets/applying%20multidimensional%20item.pdf
 Magis, D., Tuerlinckx, F., Boeck, P. D. (2015). Detection of differential item functioning using the Lasso approach. Journal of Educational and Behavioral Statistics, 40(2), 111-135. DOI: 10.3102/1076998614559747.
 Meij, A., Kelderman, H., Flier, H. V. D. (2008). Fitting a mixture Item Response Theory model to personality questionnaire data: characterizing latent classes and investigating possibilities for improving prediction. Applied Psychological Measurement, 32(8), 611-631. DOI: 10.1177/0146621607312613.
 Ozdemir, B. (2015). A comparison of IRT-based methods for examining differential item functioning in TIMSS 2011 mathematics subtest. Procedia - Social and Behavioral Sciences, 174, 2075-2083. DOI: 10.1016/j.sbspro.2015.02.004.
 Pedrajita, J. Q. (2015). Using contingency table approaches in differential item functioning analysis: A comparison. Education Journal, 4(4), 139-148. DOI: 10.11648/j.edu.20150404.11.
 Reckase, M. D. (2009). Multidimensional item response theory. New York: Springer. DOI: 10.1007/978-0-387-89976-3.
 Rizopoulos, D. (2006). ltm: An R package for latent variable modeling and Item Response Theory analyses. Journal of Statistical Software, 17(5), 1-25. Retrieved from http://www.jstatsoft.org/.
 Rodriguez, M. C. (2005). Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, Summer 2005, 3-13.
 Svetina, D., Rutkowski, L. (2014). Detecting differential item functioning using generalized logistic regression in the context of large-scale assessments. Large-scale Assessments in Education, 2:4. Retrieved from http://www.largescaleassessmentsineducation.com/content/2/1/4.
 Tay, L., Huang, Q., Vermunt, J. K. (2015). Item response theory with covariates (IRT-C): assessing item recovery and differential item functioning for the three-parameter logistic model. Educational and Psychological Measurement, 76(1), 22-42. Retrieved from http://journals.sagepub.com/doi/abs/10.1177/0013164415579488.
 Violato, C. (1991). Item difficulty and discrimination as a function of stem completeness. Psychological Reports, 69(3), 739-743. DOI: 10.2466/pr0.1922.214.171.1249.
 Wang, W. C., Chen, H. F., Jin, K. Y. (2015). Item Response Theory models for wording effects in mixed-format scales. Educational and Psychological Measurement, 75(1), 157-178. DOI: 10.1177/0013164414528209.
 Wiberg, M. (2007). Measuring and detecting differential item functioning in criterion-referenced licensing tests: a theoretical comparison of methods. Educational Measurement, EM60, ISSN 1103-2685.
 Zanon, C., Hutz, C. S., Yoo, H., Hambleton, R. K. (2016). An application of item response theory to psychological test development. Psicologia: Reflexao e Critica, 29:18. DOI: 10.1186/s41155-016-0040-x.
Copyrights for articles published in International Journal of Innovative Knowledge Concepts are retained by the authors, with first publication rights granted to the journal. The journal/publisher is not responsible for subsequent uses of the work. It is the author's responsibility to bring an infringement action if so desired by the author.