Distance and displacement inventory: Construction, validation and structural analysis

Authors

  • Akhmad Jufriadi Universitas PGRI Kanjuruhan Malang, Indonesia
  • Sutopo Sutopo Universitas Negeri Malang, Indonesia
  • Sentot Kusairi Universitas Negeri Malang, Indonesia
  • Sunaryono Sunaryono Universitas Negeri Malang, Indonesia
  • Dina Asmaul Chusniyah
  • Hena Dian Ayu Universitas PGRI Kanjuruhan Malang, Indonesia

DOI:

https://doi.org/10.21067/mpej.v8i2.9851

Keywords:

assessment, displacement, distances, kinematics, multi-representations

Abstract

This research aims to develop an instrument for assessing understanding of the concepts of distance and displacement. The Distance and Displacement Inventory (DDI) is constructed in multiple representations, including pictorial, tabular, graphical, mathematical, and verbal representations. The instrument uses a multiple-choice format with ten questions to assess students' understanding of distance traveled and displacement. DDI development follows the R&D design of Borg & Gall and the test development flowchart of Beichner. The DDI was administered to 357 students in Indonesia who had taken introductory physics courses. Student answers were analyzed to determine the DDI's psychometric properties and to conduct a structural analysis. The results show that the DDI is adequate for assessing students' understanding of the concepts of distance traveled and displacement. The structural analysis of students' knowledge shows that students perceive each question item differently depending on its representation. Furthermore, students who understand the concepts of distance traveled and displacement in one representation may not understand them correctly in another representation. These findings imply the need to design multi-representation-based learning and to teach students how to translate between representations.
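As a rough illustration of the kind of psychometric analysis described in the abstract, the sketch below computes classical test theory item statistics (item difficulty, corrected item-total discrimination, and KR-20 reliability) for a dichotomously scored ten-item instrument. It is not taken from the article: the response matrix is randomly generated placeholder data, and the actual analysis procedure and criteria used for the DDI are those reported in the full paper.

```python
# Illustrative sketch only: classical test theory item statistics for a
# dichotomously scored, 10-item instrument such as the DDI.
# The response matrix below is hypothetical placeholder data.
import numpy as np

rng = np.random.default_rng(0)
# responses[i, j] = 1 if student i answered item j correctly, else 0
responses = rng.integers(0, 2, size=(357, 10))

n_students, n_items = responses.shape
total = responses.sum(axis=1)

# Item difficulty: proportion of correct answers per item
difficulty = responses.mean(axis=0)

# Item discrimination: point-biserial correlation between each item
# and the total score with that item removed (corrected item-total r)
discrimination = np.array([
    np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
    for j in range(n_items)
])

# KR-20 reliability (Cronbach's alpha for dichotomous items)
p = difficulty
q = 1 - p
kr20 = (n_items / (n_items - 1)) * (1 - (p * q).sum() / total.var(ddof=1))

for j in range(n_items):
    print(f"Item {j + 1}: P = {difficulty[j]:.2f}, r_pb = {discrimination[j]:.2f}")
print(f"KR-20 = {kr20:.2f}")
```

With real data, conventional rules of thumb treat mid-range difficulties and positive corrected item-total correlations as desirable, but the thresholds actually applied to the DDI are those stated in the article itself.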


References

Aiken, L. R. (1985). Three coefficients for analyzing the reliability and validity of ratings. Educational and Psychological Measurement, 45(1), 131–142.

Ainsworth, S. (2006). DeFT: A conceptual framework for considering learning with multiple representations. Learning and Instruction, 16(3), 183–198. https://doi.org/10.1016/j.learninstruc.2006.03.001

Barniol, P., & Zavala, G. (2014). Force, velocity, and work: The effects of different contexts on students’ understanding of vector concepts using isomorphic problems. Physical Review Special Topics - Physics Education Research, 10(2), 020115. https://doi.org/10.1103/PhysRevSTPER.10.020115

Beichner, R. J. (1994). Testing student interpretation of kinematics graphs. American Journal of Physics, 62(8), 750–762. https://doi.org/10.1119/1.17449

Brown, B., & Singh, C. (2021). Development and validation of a conceptual survey instrument to evaluate students’ understanding of thermodynamics. Physical Review Physics Education Research, 17(1), 010104. https://doi.org/10.1103/physrevphyseducres.17.010104

Clement, J. (1982). Students’ preconceptions in introductory mechanics. American Journal of Physics, 50(1), 66–70.

Ding, L., Chabay, R., Sherwood, B., & Beichner, R. (2006). Evaluating an electricity and magnetism assessment tool: Brief electricity and magnetism assessment. Physical Review Special Topics - Physics Education Research, 2, 1–7. https://doi.org/10.1103/PhysRevSTPER.2.010105

Fuchs, E., & Czarnocha, B. (2016). The Creative Enterprise of Mathematics Teaching Research. In B. Czarnocha, W. Baker, O. Dias, & V. Prabhu (Eds.), The Creative Enterprise of Mathematics Teaching Research (1st ed.). SensePublishers. https://doi.org/10.1007/978-94-6300-549-4

Gall, M. D., Gall, J. P., & Borg, W. R. (2007). Educational Research: An Introduction (7th ed.). Pearson Education Inc. https://doi.org/10.4324/9781003008064-1

Halloun, I. A., & Hestenes, D. (1985). The initial knowledge state of college physics students. American Journal of Physics, 53(11), 1043–1055. https://doi.org/10.1119/1.14030

He, Z., & Schonlau, M. (2020). Automatic Coding of Text Answers to Open-Ended Questions: Should You Double Code the Training Data? Social Science Computer Review, 38(6), 754–765. https://doi.org/10.1177/0894439319846622

Hestenes, D., & Wells, M. (1992). A mechanics baseline test. The Physics Teacher, 30(3), 159–166. https://doi.org/10.1119/1.2343498

Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher, 30(3), 141–158. https://doi.org/10.1119/1.2343497

Hughes, D. J. (2018). Psychometric validity: Establishing the accuracy and appropriateness of psychometric measures. In P. Irwing, T. Booth, & D. J. Hughes (Eds.), The Wiley Handbook of Psychometric Testing: A Multidisciplinary Reference on Survey, Scale and Test Development (pp. 751–779). Wiley Blackwell. https://doi.org/10.1002/9781118489772.ch24

Jufriadi, A., & Andinisari, R. (2020). JITT with assessment for learning: Investigation and improvement of students understanding of kinematics concept. Momentum: Physics Education Journal, 4(2), 94–101. https://doi.org/10.21067/mpej.v4i2.4669

Jufriadi, A., Sutopo, S., Kusairi, S., & Sunaryono, S. (2023). Assessment of kinematic concepts comprehension: A systematic review. International Journal of Evaluation and Research in Education (IJERE), 12(3), 1449–1459. https://doi.org/10.11591/ijere.v12i3.24546

Kaltakci-Gurel, D., Eryilmaz, A., & McDermott, L. C. (2015). A Review and Comparison of Diagnostic Instruments to Identify Students’ Misconceptions in Science. EURASIA Journal of Mathematics, Science and Technology Education, 11(5), 989–1008. https://doi.org/10.12973/eurasia.2015.1369a

Kaltakci-Gurel, D., Eryilmaz, A., & McDermott, L. C. (2017). Development and application of a four-tier test to assess pre-service physics teachers’ misconceptions about geometrical optics. Research in Science & Technological Education, 35(2), 238–260. https://doi.org/10.1080/02635143.2017.1310094

Klein, P., Müller, A., & Kuhn, J. (2017). Assessment of representational competence in kinematics. Physical Review Physics Education Research, 13(1), 010132. https://doi.org/10.1103/PhysRevPhysEducRes.13.010132

Lichtenberger, A., Wagner, C., Hofer, S. I., Stern, E., & Vaterlaus, A. (2017). Validation and structural analysis of the kinematics concept test. Physical Review Physics Education Research, 13(1), 1–13. https://doi.org/10.1103/PhysRevPhysEducRes.13.010115

Mainali, B. (2021). Representation in teaching and learning mathematics. International Journal of Education in Mathematics, Science and Technology, 9(1), 1–21. https://doi.org/10.46328/ijemst.1111

Pulgar, J., Spina, A., Ríos, C., & Harlow, D. B. (2020). Contextual details, cognitive demand and kinematic concepts: exploring concepts and characteristics of student-generated problems in a university physics course. Physics Education Research Conference 2019. https://doi.org/10.1119/perc.2019.pr.pulgar

Rimoldini, L. G., & Singh, C. (2005). Student understanding of rotational and rolling motion concepts. Physical Review Special Topics - Physics Education Research, 1(1), 010102. https://doi.org/10.1103/PhysRevSTPER.1.010102

Rosenblatt, R., & Heckler, A. F. (2011). Systematic study of student understanding of the relationships between the directions of force, velocity, and acceleration in one dimension. Physical Review Special Topics - Physics Education Research, 7(2), 020112. https://doi.org/10.1103/PhysRevSTPER.7.020112

Schonlau, M., Gweon, H., & Wenemark, M. (2021). Automatic Classification of Open-Ended Questions: Check-All-That-Apply Questions. Social Science Computer Review, 39(4), 562–572. https://doi.org/10.1177/0894439319869210

Sullivan, G. M. (2011). A Primer on the Validity of Assessment Instruments. Journal of Graduate Medical Education, 3(2), 119–120. https://doi.org/10.4300/jgme-d-11-00075.1

Sumintono, B., & Widhiarso, W. (2013). Aplikasi Model Rasch untuk Penelitian Ilmu-Ilmu Sosial [Application of the Rasch model for social science research] (B. Trim, Ed.; 1st ed.). TrimKom Publishing House.

Tan, S., Clivaz, S., & Sakamoto, M. (2022). Presenting multiple representations at the chalkboard: bansho analysis of a Japanese mathematics classroom. Journal of Education for Teaching, 48(5), 1–18. https://doi.org/10.1080/02607476.2022.2150538

Thornton, R. K., & Sokoloff, D. R. (1998). Assessing student learning of Newton’s laws: The Force and Motion Conceptual Evaluation and the Evaluation of Active Learning Laboratory and Lecture Curricula. American Journal of Physics, 66(4), 338–352. https://doi.org/10.1119/1.18863

Truijens, F. L., Cornelis, S., Desmet, M., De Smet, M. M., & Meganck, R. (2019). Validity beyond measurement: Why psychometric validity is insufficient for valid psychotherapy research. Frontiers in Psychology, 10, 1–13. https://doi.org/10.3389/fpsyg.2019.00532

Young, J. C., Rose, D. C., Mumby, H. S., Benitez‐Capistros, F., Derrick, C. J., Finch, T., Garcia, C., Home, C., Marwaha, E., Morgans, C., Parkinson, S., Shah, J., Wilson, K. A., & Mukherjee, N. (2018). A methodological guide to using and reporting on interviews in conservation science research. Methods in Ecology and Evolution, 9(1), 10–19. https://doi.org/10.1111/2041-210X.12828

Published

2024-05-10

How to Cite

Jufriadi, A., Sutopo, S., Kusairi, S., Sunaryono, S., Chusniyah, D. A., & Ayu, H. D. (2024). Distance and displacement inventory: Construction, validation and structural analysis. Momentum: Physics Education Journal, 8(2), 293–303. https://doi.org/10.21067/mpej.v8i2.9851

Issue

Vol. 8 No. 2 (2024)

Section

Articles