Analysis of differential item functioning in agricultural science examination across Southwestern Nigeria's senior schools

Authors

  • Oluwaseyi Aina Gbolade Opesemowo University of Johannesburg
  • Temitope Babatimihin Obafemi Awolowo University, Ile-Ife
  • Temitope Sarah Ogungbaigbe Obafemi Awolowo University, Ile-Ife

DOI:

https://doi.org/10.21067/jbpd.v8i2.10204

Keywords:

Differential Item Functioning, Public Examination, One-Parameter Logistic Model, Two-Parameter Logistic Model, Three-Parameter Logistic Model

Abstract

Differential Item Functioning (DIF) poses a threat to the fairness and validity of educational assessments. This study therefore probed the incidence of DIF in the Senior School Certificate Agricultural Science Examination (SSCASE) across school locations in Southwestern Nigeria and determined the magnitude of DIF using item parameters. One-, two-, and three-parameter logistic models (1PLM, 2PLM, 3PLM) were applied to the dichotomous responses of 818 randomly selected students from urban and rural locations. The study adopted an ex-post facto design, using the 2015 National Examinations Council SSCE in Agricultural Science as its instrument (α = 0.887); the population comprised SSCE candidates from Osun, Ondo, and Oyo states. The results showed that the magnitude of DIF in the SSCASE across school locations ranged from moderate to large (1PLM = 0.52, significant DIF; 2PLM = 0.45, moderate DIF; 3PLM = 0.47, moderate DIF). The study concluded that the 1, 2, and 3PLMs can be used to produce fair items in the SSCASE.
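The three logistic models named in the abstract differ only in how many item parameters they estimate. As a minimal illustrative sketch (not the authors' analysis code, and with hypothetical parameter values), the item response functions and a simple uniform-DIF signal — the same item having different estimated difficulty for urban and rural examinees — can be written as:

```python
# Illustrative sketch of the 1PL, 2PL and 3PL item response functions.
# All parameter values below are hypothetical, chosen only to show how a
# between-group difference in item difficulty indicates uniform DIF.
import math

def p_1pl(theta, b):
    """1PL: probability of a correct response given ability theta and difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def p_2pl(theta, a, b):
    """2PL adds a discrimination parameter a."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def p_3pl(theta, a, b, c):
    """3PL adds a lower asymptote c (pseudo-guessing)."""
    return c + (1.0 - c) * p_2pl(theta, a, b)

# Hypothetical difficulty estimates for one item, calibrated separately
# in the urban and rural groups: examinees of equal ability (theta = 0)
# have different success probabilities, which is the DIF signal.
b_urban, b_rural = -0.2, 0.5
dif_magnitude = abs(b_urban - b_rural)

print(round(p_1pl(0.0, b_urban), 3))   # prints 0.55
print(round(p_1pl(0.0, b_rural), 3))   # prints 0.378
print(round(dif_magnitude, 1))         # prints 0.7
```

In practice DIF detection uses formal tests (e.g., likelihood-ratio comparisons of constrained and unconstrained models) rather than a raw difference in difficulty, but the sketch shows the quantity those tests evaluate.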


References

Adedoyin, O. O. (2010). Investigating the Invariance of Person Parameter Estimates Based on Classical Test and Item Response Theories. International Journal of Educational Sciences, 2(2), 107-113. doi:10.1080/09751122.2010.11889987

Adeyemo, E. O., & Opesemowo, O. (2020). Differential testlet functioning (DTLF) in senior school certificate mathematics examination using multilevel measurement modelling. Sumerianz Journal of Education, Linguistics and Literature, 3(11), 249-253.

Akindele, B. (2003). The development of an item bank for selection tests into Nigerian universities: An exploratory study (Unpublished doctoral dissertation). University of Ibadan, Nigeria.

Aybek, E. C. (2023). The Relation of Item Difficulty Between Classical Test Theory and Item Response Theory: Computerized Adaptive Test Perspective. Journal of Measurement and Evaluation in Education and Psychology, 14(2), 118-127. doi:10.21031/epod.1209284

Bauer, D. J. (2023). Enhancing measurement validity in diverse populations: Modern approaches to evaluating differential item functioning. British Journal of Mathematical and Statistical Psychology, 76(3), 435-461. doi:10.1111/bmsp.12316

Bundsgaard, J. (2019). DIF as a pedagogical tool: analysis of item characteristics in ICILS to understand what students are struggling with. Large-scale Assessments in Education, 7(1), 9. doi:10.1186/s40536-019-0077-2

Chalmers, R. P., Counsell, A., & Flora, D. B. (2016). It Might Not Make a Big DIF: Improved Differential Test Functioning Statistics That Account for Sampling Variability. Educational and Psychological Measurement, 76(1), 114-140. doi:10.1177/0013164415584576

Chankseliani, M., Gorgodze, S., Janashia, S., & Kurakbayev, K. (2020). Rural disadvantage in the context of centralized university admissions: a multiple case study of Georgia and Kazakhstan. Compare: A Journal of Comparative and International Education, 50(7), 995-1013. doi:10.1080/03057925.2020.1761294

Effiom, A. P. (2021). Test fairness and assessment of differential item functioning of mathematics achievement test for senior secondary students in Cross River state, Nigeria using item response theory. Global Journal of Educational Research, 20(1), 55-62. doi:10.4314/gjedr.v20i1.6

Española, R. P. (2022, March 5-7). Examining Gender and Urban/Rural School Differences in Empirically-derived Achievement Profiles. Paper presented at the 17th Education and Development Conference.

Faleye, B. A., & Dibu-Ojerinde, O. O. (2006). A Review of the Enrolment and Performance of Male and Female Students in Education / Economics Programme of Obafemi Awolowo University, Ile-Ife, Nigeria. Journal of Social Sciences, 12(2), 143-146. doi:10.1080/09718923.2006.11978383

Faremi, Y. A., & Jimoh, K. (2022). Differential Item Functioning and Implications for Testing in Nigeria Education System. Indonesian Journal of Learning Education and Counseling, 5(1), 1-10. doi:10.31960/ijolec.v5i1.1689

Jang, Y., Pashler, H., & Huber, D. E. (2014). Manipulations of choice familiarity in multiple-choice testing support a retrieval practice account of the testing effect. Journal of Educational Psychology, 106(2), 435-447. doi:10.1037/a0035715

Jerome, A. (2023). Public Examinations in Nigeria: Challenges and Prospects in the 21st Century. Glob Acad J Humanit Soc Sci, 5(1), 10-15. doi:10.36348/gajhss.2023.v05i01.003

Jodoin, M. G., & Gierl, M. J. (2001). Evaluating Type I Error and Power Rates Using an Effect Size Measure With the Logistic Regression Procedure for DIF Detection. Applied Measurement in Education, 14(4), 329-349. doi:10.1207/S15324818AME1404_2

Joo, S., Ali, U., Robin, F., & Shin, H. J. (2022). Impact of differential item functioning on group score reporting in the context of large-scale assessments. Large-scale Assessments in Education, 10(1), 18. doi:10.1186/s40536-022-00135-7

Jumadi, J., Sukarelawan, M. I., & Kuswanto, H. (2023). An investigation of item bias in the four-tier diagnostic test using Rasch model. International Journal of Evaluation and Research in Education (IJERE), 12(2), 622-629. doi:10.11591/ijere.v12i2.22845

Kaya, Y., Leite, W. L., & Miller, M. D. (2016). A comparison of logistic regression models for DIF detection in polytomous items: the effect of small sample sizes and non-normality of ability distributions. International Journal of Assessment Tools in Education, 2(1), 22-39. doi:10.21449/ijate.239563

Khoeruroh, U., & Retnawati, H. (2020). Comparison sensitivity of the differential item function (DIF) detection method. Journal of Physics: Conference Series, 1511(1), 012042. doi:10.1088/1742-6596/1511/1/012042

Kishida, W., Fuchimoto, K., Miyazawa, Y., & Ueno, M. (2023). Item Difficulty Constrained Uniform Adaptive Testing. Cham: Springer.

Kristjansson, E., Aylesworth, R., Mcdowell, I., & Zumbo, B. D. (2005). A Comparison of Four Methods for Detecting Differential Item Functioning in Ordered Response Items. Educational and Psychological Measurement, 65(6), 935-953. doi:10.1177/0013164405275668

Kuzu, Y., & Gelbal, S. (2023). Investigation of differential item and step functioning procedures in polytomus items. Journal of Measurement and Evaluation in Education and Psychology, 14(3), 200-221. doi:10.21031/epod.1221823

Lu, L., Phua, Q. S., Bacchi, S., Goh, R., Gupta, A. K., Kovoor, J. G., . . . To, M.-S. (2022). Small Study Effects in Diagnostic Imaging Accuracy: A Meta-Analysis. JAMA Network Open, 5(8), e2228776-e2228776. doi:10.1001/jamanetworkopen.2022.28776

Martin, J. A., Tarantino, D. M., & Levy, K. N. (2023). Investigating gender-based differential item functioning on the McLean Screening Instrument for Borderline Personality Disorder (MSI-BPD): An item response theory analysis. Psychological Assessment, 35(5), 462-468. doi:10.1037/pas0001229

Metsämuuronen, J. (2023). Seeking the real item difficulty: bias-corrected item difficulty and some consequences in Rasch and IRT modeling. Behaviormetrika, 50(1), 121-154. doi:10.1007/s41237-022-00169-9

Mitana, J. M. V., Muwagga, A. M., & Ssempala, C. (2019). The Influence of National Examinations on Classroom Practice in Primary Schools in Uganda: Case of Kampala and Kabale Districts. International Journal of Educational Research Review, 4(3), 472-480. doi:10.24331/ijere.573954

Ojerinde, D., & Ajeigbe, T. O. (2021). Post-Test Estimates of Item Parameters of Mathematics Multiple-Choice Test of a Public Examination in Nigeria. Journal of Emerging Trends in Educational Research and Policy Studies, 12(6), 253-259. doi:10.10520/ejc-sl_jeteraps_v12_n6_a5

Ojerinde, D., Popoola, O., Onyeneho, P., & Egberongbe, A. (2016). A comparative analysis of pre-equating and post-equating in a large-scale assessment, high stakes examination. Perspectives in Education, 34(4), 79-98. doi:10.18820/2519593X/pie.v34i4.6

Oluwatimilehin, T., & Opesemowo, O. (2019). Investigating the prevalence of cheating. LAP LAMBERT Academic Publishing.

Omale, O., Dike, G. A., & Chibundum, C. (2023). Assessment of Ability Estimate and Model Fit of Students in 2020 NECO Mathematics in Benue State, Nigeria. Journal of Educational Research and Development, 10, 43-52.

Opesemowo, O. A. G., Ayanwale, M. A., Opesemowo, T. R., & Afolabi, E. R. I. (2023). Differential Bundle Functioning of National Examinations Council Mathematics Test Items: An Exploratory Structural Equation Modelling Approach. Journal of Measurement and Evaluation in Education and Psychology, 14(1), 1-18. doi:10.21031/epod.1142713

Perez, A. L., & Loken, E. (2023). Person Specific Parameter Heterogeneity in the 2PL IRT Model. Multivariate Behavioral Research, 1-7. doi:10.1080/00273171.2023.2224312

Roussos, L. A., & Stout, W. F. (1996). Simulation Studies of the Effects of Small Sample Size and Studied Item Parameters on SIBTEST and Mantel-Haenszel Type I Error Performance. Journal of Educational Measurement, 33(2), 215-230. doi:10.1111/j.1745-3984.1996.tb00490.x

Song, C., Gadermann, A., Zumbo, B., & Richardson, C. (2022). Differential Item Functioning of the Center for Epidemiologic Studies Depression Scale Among Chinese Adolescents. Journal of Immigrant and Minority Health, 24(3), 790-793. doi:10.1007/s10903-021-01275-8

Sweeney, S. M., Sinharay, S., Johnson, M. S., & Steinhauer, E. W. (2022). An Investigation of the Nature and Consequence of the Relationship between IRT Difficulty and Discrimination. Educational Measurement: Issues and Practice, 41(4), 50-67. doi:10.1111/emip.12522

Thissen, D., Steinberg, L., & Wainer, H. (2013). Use of item response theory in the study of group differences in trace lines. In Test validity (pp. 147-169). Routledge.

Wallin, G., Chen, Y., & Moustaki, I. (2024). DIF Analysis with Unknown Groups and Anchor Items. Psychometrika, 89(1), 267-295. doi:10.1007/s11336-024-09948-7

Warne, R. T., Yoon, M., & Price, C. J. (2014). Exploring the various interpretations of “test bias”. Cultural Diversity and Ethnic Minority Psychology, 20(4), 570-582. doi:10.1037/a0036503

Wiggins, B. L., Lily, L. S., Busch, C. A., Landys, M. M., Shlichta, J. G., Shi, T., & Ngwenyama, T. R. (2023). Public exams may decrease anxiety and facilitate deeper conceptual thinking. Journal of STEM Education: Innovations and Research, 24(2), 36-48.

Yavuz Temel, G. (2023). A Simulation and Empirical Study of Differential Test Functioning (DTF). Psych, 5(2), 478-496.

Yeng, E., Ali, C. A., & Adzifome, N. S. (2023). Relationship between familiarity and competency in integrating information and communication technology into mathematics instruction. Contemporary Mathematics and Science Education, 4(2), ep23022. doi:10.30935/conmaths/13405

Zhou, S., & Shen, C. (2022). Avoiding Definitive Conclusions in Meta-analysis of Heterogeneous Studies With Small Sample Sizes. JAMA Otolaryngology–Head & Neck Surgery, 148(11), 1003-1004. doi:10.1001/jamaoto.2022.2847

Zhou, Y., & Jia, N. (2023). The Impact of Item Difficulty on Judgment of Confidence—A Cross-Level Moderated Mediation Model. Journal of Intelligence, 11(6), 113. doi:10.3390/jintelligence11060113

Published

2024-07-30

How to Cite

Opesemowo, O. A. G., Babatimihin, T., & Ogungbaigbe, T. S. (2024). Analysis of differential item functioning in agricultural science examination across Southwestern Nigeria's senior schools. Jurnal Bidang Pendidikan Dasar, 8(2), 136–150. https://doi.org/10.21067/jbpd.v8i2.10204

Issue

Section

Articles