Completed project phase 2014-2017
National Educational Panel Study (NEPS) 2014-2017
For the domains of reading, mathematics, science, and ICT literacy, which are assessed repeatedly in the longitudinal NEPS, changes to the measurement instruments resulting from computerization were explored psychometrically on the basis of combined mode-effect and link studies as well as experimental mode variation (see, e.g., Buerger, Kroehne & Goldhammer, 2016). To this end, procedures for quantifying and correcting mode effects were investigated and applied in order to enable the introduction of computer-based competency testing in NEPS. Research and development in this project phase focused on exploiting the properties of technology-based testing for the further development and optimization of the NEPS competency tests (e.g., testing multiple highlighting as a response format).
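One common way to quantify mode effects at the item level is a logistic-regression DIF-style analysis: each scored item response is regressed on an ability proxy (the rest score) plus a mode indicator, and the mode coefficient captures how much easier or harder the item is under computer-based administration. The following is a minimal sketch of that idea, not the project's actual psychometric procedure; the data, variable names, and simulated example are hypothetical.

```python
# Illustrative sketch: logistic-regression check for item-level mode effects,
# assuming examinees were randomly assigned to paper-based (mode = 0) or
# computer-based (mode = 1) testing. All names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def mode_effect_per_item(responses: pd.DataFrame, mode: pd.Series) -> pd.DataFrame:
    """For each item, regress the scored response (0/1) on the rest score and a
    mode indicator; the 'mode' coefficient quantifies the mode effect."""
    results = []
    total = responses.sum(axis=1)
    for item in responses.columns:
        rest = total - responses[item]  # rest score as a simple ability proxy
        X = sm.add_constant(pd.DataFrame({"rest": rest, "mode": mode}))
        fit = sm.Logit(responses[item], X).fit(disp=0)
        results.append({"item": item,
                        "mode_effect": fit.params["mode"],
                        "p_value": fit.pvalues["mode"]})
    return pd.DataFrame(results)

# Simulated example (ten dichotomous Rasch items, no true mode effect,
# so the estimated coefficients should be close to zero):
rng = np.random.default_rng(0)
n, k = 500, 10
mode = pd.Series(rng.integers(0, 2, n), name="mode")
theta = rng.normal(size=n)
difficulty = rng.normal(size=k)
p = 1 / (1 + np.exp(-(theta[:, None] - difficulty[None, :])))
responses = pd.DataFrame((rng.random((n, k)) < p).astype(int),
                         columns=[f"item_{i+1}" for i in range(k)])
print(mode_effect_per_item(responses, mode))
```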
For in-depth research on mode and setting effects, a procedure for log data collection in paper-based testing was developed at TBA and has been used in selected NEPS studies (see, e.g., Kroehne & Goldhammer, 2018). In this approach, digital ballpoint pens are used to answer the paper-administered questions in test booklets printed with a special dot pattern (see, among others, Dirk et al., 2017 for a description). While the entries made with these digital ballpoint pens are visible to panelists in the test booklet as if they had been made with an ordinary ballpoint pen, the coordinates and timestamps of all responses are additionally recorded via a Bluetooth-connected computer. This data collection method allows the analysis of answering processes, such as the comparison of processing times between paper-based and computer-based testing (see, e.g., Kroehne, Hahnel, & Goldhammer, 2019).
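The following is a minimal, hypothetical sketch of how such pen-stroke log data (timestamps and coordinates per stroke, mapped to item regions) could be aggregated into item-level processing times and contrasted with response times from computer-based administration. The file names and column layout are assumptions for illustration, not the actual NEPS data format.

```python
# Hypothetical sketch: collapse raw pen strokes into item-level processing
# times and compare them with computer-based response times per item.
# CSV layouts and file names are assumptions, not the real NEPS format.
import pandas as pd

def item_processing_times(strokes: pd.DataFrame) -> pd.DataFrame:
    """One processing time per person and item: time elapsed between the first
    and the last stroke recorded in that item's answer region."""
    grouped = strokes.groupby(["person_id", "item_id"])
    times = (grouped["t_end"].max() - grouped["t_start"].min()).rename("processing_time")
    return times.reset_index()

# strokes_paper.csv: person_id, item_id, t_start, t_end, x, y  (timestamps in ms)
paper = item_processing_times(pd.read_csv("strokes_paper.csv"))
# response_times_cba.csv: person_id, item_id, processing_time
cba = pd.read_csv("response_times_cba.csv")

# Mean processing time per item, side by side for the two modes.
comparison = (
    pd.concat([paper.assign(mode="paper"), cba.assign(mode="computer")])
      .groupby(["item_id", "mode"])["processing_time"]
      .mean()
      .unstack("mode")
)
print(comparison)
```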
Selected Publications

- Kroehne, U., Gnambs, T., & Goldhammer, F. (2019). Disentangling setting and mode effects for online competence assessment. In H.-P. Blossfeld & H.-G. Roßbach (Eds.), Education as a lifelong process (2nd ed.). Wiesbaden, Germany: Springer VS. doi: 10.1007/978-3-658-23162-0
- Buerger, S., Kroehne, U., Köhler, C., & Goldhammer, F. (2019). What makes the difference? The impact of item properties on mode effects in reading assessments. Studies in Educational Evaluation, 62, 1–9. doi: 10.1016/j.stueduc.2019.04.005
- Kroehne, U., Hahnel, C., & Goldhammer, F. (2019). Invariance of the response processes between gender and modes in an assessment of reading. Frontiers in Applied Mathematics and Statistics, 5:2. doi: 10.3389/fams.2019.00002
- Kroehne, U., Buerger, S., Hahnel, C., & Goldhammer, F. (2019). Construct equivalence of PISA reading comprehension measured with paper-based and computer-based assessments. Educational Measurement: Issues and Practice, 38(3), 97–111. doi: 10.1111/emip.12280
- Dirk, J., Kratzsch, G. K., Prindle, J. P., Kroehne, U., Goldhammer, F., & Schmiedek, F. (2017). Paper-based assessment of the effects of aging on response time: A diffusion model analysis. Journal of Intelligence, 5(2), 12. doi: 10.3390/jintelligence5020012
- Buerger, S., Kroehne, U., & Goldhammer, F. (2016). The transition to computer-based testing in large-scale assessments: Investigating (partial) measurement invariance between modes. Psychological Test and Assessment Modeling, 58(4), 597–616.
- Goldhammer, F., & Kroehne, U. (2014). Controlling individuals’ time spent on task in speeded performance measures: Experimental time limits, posterior time limits, and response time modeling. Applied Psychological Measurement, 38(4), 255–267. doi: 10.1177/0146621613517164