A scientific software tool that supports the workflow of multidimensional adaptive testing, in order to stimulate and simplify the development, application, and use of computer-based adaptive tests.

One of the main objectives of TBA in computer-based testing is to increase measurement efficiency compared to paper-based tests. Compared to tests with a fixed test assembly, such as conventional linear paper-based tests, computer-based adaptive tests offer considerably higher measurement efficiency. Generalising from unidimensional to multidimensional adaptive tests makes further gains in measurement efficiency possible. With multidimensional adaptive tests, for instance, mathematical and scientific skills can be measured simultaneously with very high efficiency. With the “Multidimensional Adaptive Testing Environment (MATE)”, developed in the context of the DFG priority programme “Competence Models for Assessing Individual Learning Outcomes and Evaluating Educational Processes”, TBA provides software that supports the entire workflow of a multidimensional adaptive test.

What is MATE about?

MATE aims at both stimulating and simplifying the development and use of computer-based adaptive tests for the assessment of competencies. MATE is the first open-access software for scientific purposes that makes it possible to create, configure, and administer multidimensional and unidimensional adaptive tests via an intuitive point-and-click user interface. In addition, pre-operational simulations can be run to predict the performance of adaptive tests and to determine the optimal specification of the adaptive algorithm.
The use of MATE is described in detail in the user manual, which is integrated directly into the application.

How does MATE work?

Using MATE requires an existing item pool of automatically scorable items. Different response formats can be used, e.g. multiple choice, semi-open text with clearly specified correct answers, and complex multiple choice. Templates for the computerisation can be created with common programmes such as Microsoft PowerPoint or Microsoft Word and are then computerised efficiently by MATE.
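
As a purely hypothetical illustration (not MATE code), the following Python sketch shows what “automatically scorable” means in practice: each response format is reduced to a scoring rule that maps a raw response to a score without human judgement.

```python
# Hypothetical illustration of automatic scoring rules (not MATE code).

def score_multiple_choice(response: str, key: str) -> int:
    """Single-choice item: 1 point if the selected option matches the key."""
    return int(response == key)

def score_complex_multiple_choice(responses: list[bool], key: list[bool]) -> int:
    """Complex multiple choice: 1 point only if every option is marked correctly."""
    return int(responses == key)

def score_semi_open_text(response: str, correct_answers: set[str]) -> int:
    """Semi-open text: 1 point if the normalised answer is one of the
    clearly specified correct answers."""
    return int(response.strip().lower() in correct_answers)

# Example usage
print(score_multiple_choice("B", key="B"))                          # 1
print(score_complex_multiple_choice([True, False], [True, True]))   # 0
print(score_semi_open_text(" 42 ", {"42", "forty-two"}))            # 1
```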

While formatting the items, coloured markers can be used to indicate correct and incorrect answers. The computerised items can then be used to administer tests with MATE, as illustrated in the following graphic:

[Graphic: MATEGrafik1 – test administration with MATE]

After the item material has been computerised, the planned test assembly can be configured. MATE offers various settings that allow it to be tailored to specific assessment requirements.

The main focus of MATE is adaptive testing. In this mode of testing, the participants' answers are used to optimise the selection of subsequent items during test administration, which avoids presenting items that are individually too easy or too difficult. To make adaptive item selection possible, the items have to be calibrated beforehand with an item response theory model. The item parameters resulting from this calibration (e.g. the item difficulties) can easily be imported into MATE and assigned to the corresponding items.
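
As background, the following Python sketch illustrates one common way such an adaptive selection step can be implemented for a unidimensional two-parameter logistic (2PL) model. It is a simplified illustration, not MATE's actual algorithm, and the item parameters below are hypothetical: after each response the ability estimate is updated, and the next item is the unused item with maximum Fisher information at the current estimate.

```python
import numpy as np

# Hypothetical calibrated item pool (2PL): a = discrimination, b = difficulty.
a = np.array([1.2, 0.8, 1.5, 1.0, 0.9, 1.3])
b = np.array([-1.0, -0.5, 0.0, 0.5, 1.0, 1.5])

def prob_correct(theta, a, b):
    """2PL probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = prob_correct(theta, a, b)
    return a ** 2 * p * (1.0 - p)

def select_next_item(theta_hat, administered):
    """Maximum-information criterion: pick the unused item that is most
    informative at the current ability estimate."""
    info = item_information(theta_hat, a, b)
    info[list(administered)] = -np.inf        # exclude items already given
    return int(np.argmax(info))

def update_theta(responses, items, grid=np.linspace(-4, 4, 161)):
    """EAP ability estimate with a standard normal prior."""
    prior = np.exp(-0.5 * grid ** 2)
    likelihood = np.ones_like(grid)
    for j, x in zip(items, responses):
        p = prob_correct(grid, a[j], b[j])
        likelihood *= p if x == 1 else (1.0 - p)
    posterior = prior * likelihood
    return float(np.sum(grid * posterior) / np.sum(posterior))

# Simulated administration of a short adaptive test for one test taker
rng = np.random.default_rng(1)
true_theta, theta_hat, items, responses = 0.3, 0.0, [], []
for _ in range(4):
    j = select_next_item(theta_hat, items)
    x = int(rng.random() < prob_correct(true_theta, a[j], b[j]))
    items.append(j)
    responses.append(x)
    theta_hat = update_theta(responses, items)
print("administered items:", items, "theta estimate:", round(theta_hat, 2))
```

In a multidimensional adaptive test the same basic idea applies, with the information computed from a multidimensional item response model instead of a unidimensional one.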

To ensure the performance of computerised adaptive tests, it is recommended to run pre-operational simulations with different specifications. In this way it can be checked, for example, which combination of test length and item selection algorithm can be expected to yield the highest measurement accuracy for a given item pool. Accordingly, tests with simulated responses can be created in MATE, evaluated with regard to different criteria, and displayed graphically.
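
The following Python sketch (a simplified stand-in, not MATE's simulation module) shows the basic idea of such a pre-operational simulation under the Rasch model: responses are simulated for a sample of simulees with known abilities, an adaptive test of a given length is run for each of them, and the accuracy of the resulting ability estimates is summarised, e.g. as a root mean squared error. The 60-item pool and all settings below are hypothetical.

```python
import numpy as np

# Simplified pre-operational simulation under the Rasch model (hypothetical pool).
rng = np.random.default_rng(42)
b = np.sort(rng.uniform(-2.5, 2.5, 60))   # item difficulties of the pool
grid = np.linspace(-4, 4, 161)            # ability grid for EAP estimation
prior = np.exp(-0.5 * grid ** 2)          # standard normal prior

def run_simulated_cat(true_theta, test_length):
    """Administer one simulated adaptive test; return the final EAP estimate."""
    used, likelihood, theta_hat = [], np.ones_like(grid), 0.0
    for _ in range(test_length):
        p = 1.0 / (1.0 + np.exp(-(theta_hat - b)))
        info = p * (1.0 - p)              # Rasch item information at theta_hat
        info[used] = -np.inf              # do not reuse administered items
        j = int(np.argmax(info))
        used.append(j)
        # Simulate the response from the (known) true ability
        p_true = 1.0 / (1.0 + np.exp(-(true_theta - b[j])))
        x = int(rng.random() < p_true)
        # Update the EAP ability estimate
        p_grid = 1.0 / (1.0 + np.exp(-(grid - b[j])))
        likelihood *= p_grid if x == 1 else (1.0 - p_grid)
        posterior = prior * likelihood
        theta_hat = float(np.sum(grid * posterior) / np.sum(posterior))
    return theta_hat

# Evaluate one specification (test length 20) across 200 simulated test takers
true_thetas = rng.normal(0.0, 1.0, 200)
estimates = np.array([run_simulated_cat(t, test_length=20) for t in true_thetas])
rmse = np.sqrt(np.mean((estimates - true_thetas) ** 2))
print(f"RMSE of the ability estimate for a 20-item adaptive test: {rmse:.3f}")
```

Repeating such a run with different test lengths or selection rules corresponds to comparing specifications before the operational test is fixed.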

The following example shows the standard error of the ability estimate (theta) as determined by an a-priori simulation of an adaptive test:

[Graphic: MATEGrafik2 – standard error of the ability estimate from an a-priori simulation]

The graphic shows that the standard error (y-axis) of the simulated adaptive test remains at a comparable level across the whole ability range (true theta, x-axis). The simulated adaptive test therefore measures with comparable precision over the examined ability range.
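
This flat pattern follows from the relationship between the standard error and the test information: the standard error of the ability estimate is approximately the inverse square root of the summed item informations at theta, and adaptive selection gives every test taker items that are informative near his or her own ability level. The small hypothetical Rasch-model calculation below illustrates how this targeting affects the standard error.

```python
import numpy as np

# Hypothetical: standard error from the difficulties of the administered items
# (Rasch model). SE(theta) = 1 / sqrt(sum of item informations at theta).
def standard_error(theta, administered_b):
    p = 1.0 / (1.0 + np.exp(-(theta - np.asarray(administered_b))))
    return 1.0 / np.sqrt(np.sum(p * (1.0 - p)))

# Items targeted at the test taker's ability yield a smaller standard error
# than the same number of poorly targeted items.
print(standard_error(1.5, [1.2, 1.4, 1.6, 1.8, 2.0]))        # well targeted: lower SE
print(standard_error(1.5, [-2.0, -1.5, -1.0, -0.5, 0.0]))    # poorly targeted: higher SE
```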

Based on such simulation results, a test that is optimally specified for a specific target population can be created. MATE can then be used to administer this test to real test takers. The results can be saved and analysed with common statistics software such as SPSS®, SAS®, or R.

Where is MATE used?

The software for administering and simulating computer-based tests was developed by TBA within the DFG priority programme on competence models, during the third funding period of the DFG project “Multidimensional Adaptive Testing (MAT)”, in Jena and Frankfurt/Main.

MATE is used, for example, in the joint research project MaK-adapt within the initiative “Technology-based Assessment of Skills and Competencies in VET” (ASCOT) to measure student competencies in reading, mathematics, and science.

How to obtain MATE?

MATE is open-access software for scientific purposes. If you are interested in using MATE, please outline a) the scholars and, if applicable, the projects involved, b) the research objectives, c) the intended sample, and d) the planned procedure.
Please note that we do not offer any support for the MATE software.


Funding: DFG

Cooperation: Prof. Dr. Andreas Frey, Friedrich-Schiller-Universität Jena

Duration: Since 2010

Status: running

Project management: Ulf Kröhne (TBA)

Contact: Ulf Kröhne (TBA), Andreas Frey (FSU Jena)

Links: MAT-Projekt FSU; MAT-Project DIPF