Development of a Numbers Material Test Using the Item Response Theory (IRT) Approach for Elementary School (SD) Students

Ahmad Rustam(1*), Kasmawati Kasmawati(2),

(1) (SINTA ID: 6713882) Universitas Sulawesi Tenggara
(2) Universitas Sulawesi Tenggara
(*) Corresponding Author

Abstract


This research aims to produce a valid and reliable instrument for measuring students' numeracy that can be used in schools and by the general public. The study followed a standard test development design: preparing test specifications, writing the test items, field-testing the items, revising the items, and assembling the final test. The test blueprint was based on the 2013 curriculum syllabus, and the test was administered to elementary school students. Responses were scored dichotomously and analyzed with the two-parameter logistic (2PL) item response theory (IRT) model, whose parameters are item difficulty and item discrimination. Item and ability parameters were estimated using the BILOG-MG program. The analysis yielded 18 items, numbered 1 through 18, that can be used to measure students' numeracy skills; each meets the criteria for a good item, namely an appropriate difficulty level, well-functioning item discrimination, and good validity and reliability.
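The 2PL model described in the abstract gives the probability that a student with ability θ answers an item correctly as a function of the item's discrimination (a) and difficulty (b). A minimal sketch of that item response function (the parameter values below are hypothetical, chosen only for illustration; they are not estimates from this study):

```python
import math

def p_correct(theta: float, a: float, b: float) -> float:
    """2PL item response function: probability that a student with
    ability theta answers correctly an item with discrimination a
    and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical item: discrimination a = 1.2, difficulty b = 0.5.
# An average-ability student (theta = 0) has roughly a 35% chance
# of answering this slightly difficult item correctly.
print(round(p_correct(0.0, 1.2, 0.5), 3))  # → 0.354
```

In practice the item parameters are not set by hand but estimated from the dichotomous response matrix, as the study does with BILOG-MG; the function above only illustrates how difficulty shifts and discrimination steepens the item characteristic curve.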

Keywords


Test development; Numbers material; IRT; Two-parameter logistic (2PL)


References


Embretson, S. E., & Reise, S. P. (2013). Item response theory. Psychology Press.

Hambleton, R. K., & Jones, R. W. (1993). Comparison of classical test theory and item response theory and their applications to test development. Educational Measurement: Issues and Practice, 12(3), 38–47.

Ilahiyah, N., Yandari, I. A. V., & Pamungkas, A. N. (2019). Pengembangan Modul Matematika Berbasis Pakem Pada Materi Bilangan Pecahan Di SD. Jurnal Pendidikan dan Pembelajaran Dasar, 6(1), 49-63.

Irvine, S. H., & Kyllonen, P. C. (2013). Item generation for test development. London: Routledge.

Kean, J., & Reilly, J. (2014). Item response theory. In Handbook for clinical research: Design, statistics and implementation.

Maddalora, A. L. M. (2019). Personalized learning model using item response theory. Int. J. Recent Technol. Eng., 8(1, Special Issue 4), 811–818.

Muhsetyo, G., et al. (2007). Pembelajaran Matematika SD. Jakarta: Universitas Terbuka.

Raykov, T., Dimitrov, D. M., Marcoulides, G. A., & M, H. (2019). On true score evaluation using item response theory modeling. Educ. Psychol. Meas., 79(4), 796–807.

Sarea, M. S., & Ruslan, R. (2019). Karakteristik Butir Soal: Classical Test Theory vs Item Response Theory? Didakt. J. Kependidikan, 13(1), 1–16.

Subali, B., Kumaidi, N., A., S., & Sumintono, B. (2019). Student achievement based on the use of scientific method in the natural science subject in elementary school. J. Pendidik. IPA Indones, 8(1), 39–51.

Tjabolo, S. A., & Otaya, L. G. (2019). Quality of school exam tests based on item response theory. Univers. J. Educ. Res., 7 (10), 2156–2164.

Von Davier, M., Yamamoto, K., Shin, H. J., Chen, H., & Khorramdel, L. (2019). Evaluating item response theory linking and model fit for data from PISA 2000–2012. Assess. Educ. Princ. Policy Pract, 26(4), 466–488.




DOI: https://doi.org/10.31327/jme.v6i2.1585




Copyright (c) 2021 Ahmad Rustam, Kasmawati Kasmawati

