11-16-2023, 02:48 PM
Published in the high-impact Proceedings of the National Academy of Sciences and edited by Henry L. Roediger III, a giant of cognitive psychology:
Unproctored online exams provide meaningful assessment of student learning (Jason C. K. Chan and Dahwi Ahn, PNAS, 120(31), 2023). (Open access under CC BY 4.0.)
Quote:The meta-analytic correlation between [unproctored] online and [invigilated] in-person exams [in 18 courses at a U.S. Midwest public university] was strongly positive (r = 0.59, see Fig. 1). Despite substantial heterogeneity in the data, Q = 89.16, I² = 81%, a positive correlation was observed for every course, even those with small enrollments. Moderator analyses showed that the correlation between in-person and online exam scores did not vary significantly by types of questions asked on the exams, the field of study, the course level, exam duration, and enrollment. We also investigated whether score inflation for online exams relative to in-person exams (defined as the difference score between online and in-person exams measured in Hedges' g), if one existed, reduced the in-person/online exam score correlation in a meta-regression—it did not. All of the moderator results are shown in Table 1. In sum, scores from the unproctored online exams closely resembled those from the invigilated in-person exams, and this correlation was robust against a host of factors relevant to exam design and administration.
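For readers unfamiliar with the statistics in the quote, here is a minimal Python sketch of the two quantities mentioned, I² and Hedges' g, using their standard textbook definitions (this is not code from the paper; the example inputs to hedges_g below are made up):

```python
def i_squared(Q, k):
    """Higgins' I^2: percentage of total variation across k studies
    attributable to heterogeneity rather than chance.
    Q is Cochran's Q statistic; its degrees of freedom are k - 1."""
    df = k - 1
    return max(0.0, (Q - df) / Q) * 100.0

def hedges_g(mean1, mean2, sd1, sd2, n1, n2):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    df = n1 + n2 - 2
    s_pooled = (((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df) ** 0.5
    j = 1 - 3 / (4 * df - 1)  # small-sample correction factor
    return j * (mean1 - mean2) / s_pooled

# The paper pools 18 courses, so Q has 17 degrees of freedom;
# Q = 89.16 reproduces the quoted I^2 of 81%.
print(round(i_squared(89.16, 18)))  # -> 81
```

The reproduced I² value is a nice sanity check that the quoted Q and I² are internally consistent: (89.16 − 17) / 89.16 ≈ 0.81.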