Gathering Validity Evidence on an Internal Medicine Clerkship Multistep Exam to Assess Medical Student Analytic Ability

Dario M. Torre, Paul A. Hemmer, Steven J. Durning, Ting Dong, Kimberly Swygert, Deanna Schreiber-Gregory, William F. Kelly, Louis N. Pangaro

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Construct: The definition of clinical reasoning may vary among health professions educators. For the purpose of this paper, however, clinical reasoning is defined as the cognitive processes involved in the steps of gathering information, representing the problem, generating a differential diagnosis, providing a diagnostic justification to arrive at a leading diagnosis, and formulating diagnostic and management plans.

Background: Expert performance in clinical reasoning is essential for success as a physician, yet it has been difficult for clerkship directors to observe and quantify in a way that fosters the instruction and assessment of clinical reasoning. The purpose of this study was to gather validity evidence for the Multistep exam (MSX) format used by our medicine clerkship to assess analytical clinical reasoning abilities; we did this by examining the relationship between scores on the MSX and other external measures of clinical reasoning ability. This analysis used dual process theory as the main theoretical framework of clinical reasoning, as well as aspects of Kane's validity framework, to guide the selection of validity evidence for the investigation. We hypothesized that there would be an association between the MSX (a three-step clinical reasoning tool developed locally) and the USMLE Step 2 CS, as they share similar concepts in assessing students' clinical reasoning. We examined the relationship between overall scores on the MSX and the Step 2 CS Integrated Clinical Encounter (ICE) score, in which the student articulates their reasoning for simulated patient cases, while controlling for examinees' internal medicine clerkship performance measures, namely the NBME subject exam score and the Medicine clerkship OSCE score.

Approach: A total of 477 of 487 (97.9%) medical students, representing the graduating classes of 2015, 2016, and 2017, who took the MSX at the end of each medicine clerkship (2012–2016) and Step 2 CS (2013–2017), were included in this study. Correlation analysis and multiple linear regression analysis were used to examine the impact of the primary explanatory variable of interest (MSX) on the outcome variable (ICE score) when controlling for baseline variables (Medicine OSCE and NBME Medicine subject exam).

Findings: The overall MSX score had a significant, positive correlation with the Step 2 CS ICE score (r = .26, P < .01). The overall MSX score was a significant predictor of the Step 2 CS ICE score (β = .19, P < .001), explaining an additional 4% of the variance in ICE beyond the NBME Medicine subject score and the Medicine OSCE score (adjusted R² = 13%).

Conclusion: The stepwise format of the MSX provides a tool to observe clinical reasoning performance, which can be used in an assessment system to provide feedback to students on their analytical clinical reasoning. Future studies should focus on gaining additional validity evidence across different learners and multiple medical schools.
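The incremental-variance analysis described above (fitting a baseline regression on the control variables, then adding the MSX score and comparing R²) can be sketched as follows. This is a minimal illustration with simulated, hypothetical data — the variable names and effect sizes are assumptions for demonstration, not the study's actual data or code:

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary-least-squares fit (intercept column prepended)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot

# Hypothetical standardized scores for n = 477 students (simulated).
rng = np.random.default_rng(0)
n = 477
nbme = rng.normal(size=n)                      # NBME Medicine subject exam
osce = rng.normal(size=n)                      # Medicine clerkship OSCE
msx = 0.3 * nbme + rng.normal(size=n)          # Multistep exam (MSX)
ice = 0.3 * nbme + 0.2 * osce + 0.2 * msx + rng.normal(size=n)  # Step 2 CS ICE

# Baseline model: ICE ~ NBME + OSCE; full model adds MSX.
r2_base = r_squared(np.column_stack([nbme, osce]), ice)
r2_full = r_squared(np.column_stack([nbme, osce, msx]), ice)
print(f"Delta R^2 attributable to MSX: {r2_full - r2_base:.3f}")
```

The reported "additional 4% of the variance" corresponds to this ΔR² quantity: the gain in explained variance when the MSX score enters the model after the baseline covariates.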

Original language: English
Number of pages: 8
Journal: Teaching and Learning in Medicine
Early online date: 11 Apr 2020
Publication status: E-pub ahead of print, 11 Apr 2020
Externally published: Yes

Keywords

  • assessment
  • clinical reasoning
  • validity

