Modelling the pre-assessment learning effects of assessment: Evidence in the validity chain

Francois Cilliers, Lambertus Schuwirth, Cees Van der Vleuten

    Research output: Contribution to journal › Article › peer-review

    11 Citations (Scopus)

    Abstract

    Objectives: We previously developed a model of the pre-assessment learning effects of consequential assessment and started to validate it. The model comprises assessment factors, mechanism factors and learning effects. The purpose of this study was to continue the validation process. For stringency, we focused on a subset of assessment factor-learning effect associations that featured least commonly in a baseline qualitative study. Our aim was to determine whether these uncommon associations were operational in a broader but similar population to that in which the model was initially derived.

    Methods: A cross-sectional survey of 361 senior medical students at one medical school was undertaken using a purpose-made questionnaire based on a grounded theory and comprising pairs of written situational tests. In each pair, the manifestation of an assessment factor was varied. The frequencies at which learning effects were selected were compared for each item pair, using an adjusted alpha to assign significance. The frequencies at which mechanism factors were selected were calculated.

    Results: There were significant differences in the learning effect selected between the two scenarios of an item pair for 13 of this subset of 21 uncommon associations, even when a p-value of <0.00625 was considered to indicate significance. Three mechanism factors were operational in most scenarios: agency, response efficacy and response value.

    Conclusions: For a subset of uncommon associations in the model, the role of most assessment factor-learning effect associations and the mechanism factors involved were supported in a broader but similar population to that in which the model was derived. Although model validation is an ongoing process, these results move the model one step closer to the stage of usefully informing interventions. The results also illustrate how factors not typically included in studies of the learning effects of assessment could confound the results of interventions aimed at using assessment to influence learning.

    Original language: English
    Pages (from-to): 1087-1098
    Number of pages: 12
    Journal: Medical Education
    Volume: 46
    Issue number: 11
    Publication status: Published - 1 Nov 2012

