Analysis of Item Parameters on Work and Energy Subtest Using Item Response Theory

(1) Universitas Pendidikan Indonesia, Indonesia
(2) Universitas Pendidikan Indonesia, Indonesia
(3) Yogyakarta State University, Indonesia
(4) Yogyakarta State University, Indonesia
(5) SMA Negeri CMBBS, Indonesia

Copyright (c) 2025 Duden Saepuzaman, Haryanto Haryanto, Edi Istiyono, Heri Retnawati, Yustiandi Yustiandi
Abstract
This study aims to describe the item parameters of a Physics test on Work and Energy and to describe students' abilities using the item response theory (IRT) approach with dichotomous scoring. The research is quantitative descriptive. The subjects were 1,175 grade XI high school students in West Java and Banten provinces, consisting of 450 male and 725 female students. The instrument was a Physics test on Work and Energy comprising 25 multiple-choice items with dichotomous scoring. The dichotomously scored student responses were analyzed with the IRT approach using the BILOG-MG program. The results showed that most of the items fit the 2PL model. Subsequent analysis of the items' characteristics indicates that all items have discriminating power and difficulty levels within the good criteria.
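For readers less familiar with the 2PL model mentioned above, the sketch below illustrates how the two item parameters, discrimination (a) and difficulty (b), determine the probability of a correct response as a function of student ability (theta). The parameter values are hypothetical examples for illustration only, not the estimates reported in this study, and the code is a minimal sketch rather than a reproduction of the BILOG-MG analysis.

```python
import numpy as np

def p_correct_2pl(theta, a, b):
    """2PL item characteristic curve: probability of a correct response
    given ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical example parameters (not the study's estimates):
# an item of moderate difficulty (b = 0.5) with good discrimination (a = 1.2).
a, b = 1.2, 0.5
for theta in (-2.0, 0.0, 2.0):
    print(f"theta = {theta:+.1f} -> P(correct) = {p_correct_2pl(theta, a, b):.3f}")

# Conventional screening criteria on the logistic metric treat an item as
# acceptable when its discrimination is positive (roughly 0.5 to 2.0)
# and its difficulty falls between about -2 and +2.
```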
Keywords: item parameters, item response theory, physics test.
DOI: https://dx.doi.org/10.23960/jpmipa/v22i1.pp1-9

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Copyright is reserved to Jurnal Pendidikan MIPA, which is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.