How many instructors revise their multiple-choice tests based on their students’ responses? If one wishes to make a better multiple-choice test, a statistical analysis of these responses could be helpful.
We have developed a freely available web-based program to analyze multiple-choice tests, and it requires no programming experience: just your student response data and the answer key in two Excel spreadsheets.
The program outputs a PDF report that goes beyond the usual scoring to include many other metrics, which can help the instructor revise the test and improve its reliability.
The outputs include the difficulty index, discrimination index, Cronbach's alpha, item response theory estimates, and a distractor analysis. The report also indicates whether each question should be kept or removed based on four different criteria. The source code is openly available, and we invite others to improve it.
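To give a feel for what some of these metrics mean, here is a minimal, self-contained sketch of how the difficulty index, the upper-lower discrimination index, and Cronbach's alpha are commonly computed from a scored (1 = correct, 0 = incorrect) response matrix. This is an illustration of the standard formulas, not the program's actual implementation, and the response data below is hypothetical.

```python
from statistics import pvariance

# Hypothetical scored responses: rows = students, columns = items (1 = correct).
scored = [
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]

def difficulty(rows, j):
    """Difficulty index: proportion of students who answered item j correctly."""
    col = [row[j] for row in rows]
    return sum(col) / len(col)

def discrimination(rows, j, frac=0.27):
    """Upper-lower discrimination index for item j: difficulty among the
    top-scoring ~27% of students minus difficulty among the bottom ~27%."""
    g = max(1, round(frac * len(rows)))
    ranked = sorted(rows, key=sum, reverse=True)  # rank students by total score
    return difficulty(ranked[:g], j) - difficulty(ranked[-g:], j)

def cronbach_alpha(rows):
    """Cronbach's alpha: internal-consistency reliability of the whole test."""
    k = len(rows[0])  # number of items
    item_vars = sum(pvariance([row[j] for row in rows]) for j in range(k))
    total_var = pvariance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - item_vars / total_var)

print(round(difficulty(scored, 0), 3))      # 0.667: an easy-to-moderate item
print(round(discrimination(scored, 0), 3))  # 0.5: strong students do better
print(round(cronbach_alpha(scored), 3))     # low alpha: only 4 items, 6 students
```

A question with a very high or very low difficulty index, or a near-zero (or negative) discrimination index, is a typical candidate for revision or removal, which is the kind of judgment the program's four criteria automate.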
A description of the program is available at https://www.garrickadenbuie.com/project/mc-test-analysis/
The program can be accessed at https://apps.garrickadenbuie.com/mctestanalysis/
Instructions for preparing the input Excel files are at http://www.eng.usf.edu/~kaw/MCTestAnalysis/MCTestAnalysis_input.pdf
An example report for a hypothetical test is at https://www.garrickadenbuie.com/project/2017-07-06-mc-test-analysis_files/MCTestAnalysis_Example-Report.pdf
The source code for developers is at https://github.com/gadenbuie/mctestanalysis
You can use our free scanning program (https://blog.autarkaw.com/2021/09/29/a-multiple-choice-question-response-reader/) to give multiple-choice tests in the classroom. The scanning program generates the input files needed by the mctestanalysis program.
This post is brought to you by
- Holistic Numerical Methods Open Course Ware:
- the textbooks on
- the Massive Open Online Courses (MOOCs) available at