Development and evaluation of a paramedic taxonomy and progress test

James Thompson, Don Houston

Research output: Contribution to conference › Abstract › peer-review

Abstract

How a student performs on the day of a test is influenced by a myriad of variables. Examples of student results being inflated through guessing and chance, and conversely being hindered by poorly constructed questions or other pressures, challenge the reliability of many tests as tools to judge students' true levels of understanding. Further concerns relate to the capacity of a test to sample adequate breadth and depth of student understanding of a curriculum, as well as to binge-learning efforts to optimise scores, which are known to lead to short-term rather than sustainable learning.
Responding to local concerns linked to the existing use of high-stakes final exams, a holistic paramedic knowledge and skills taxonomy was first constructed and validated through a collaborative process involving a range of stakeholders, including academics, industry partners, and recent graduates. Items identified in this taxonomy were then reorganised to reflect the integrated relationships of differing concepts, before being developed into multiple-choice question items and reviewed again by the consulting stakeholders.
A major component of the assessment in a final-semester undergraduate capstone paramedic subject was redesigned to trial the taxonomy and test instrument. We introduced three connected progress test (PT) events across a single semester. The first two, spaced 10 weeks apart, used the same multiple-choice test, with negative marking applied as a deterrent to guessing behaviours. The third was a final oral viva exam based on each student's results in the second progress test.
Methods. We studied the patterns of results of the 103 enrolled students, including correct, incorrect, and don't-know responses across the two MCQ tests, together with final oral viva performance results. Additionally, we examined qualitative findings on student experiences and the perceived value of the educational approach. Results. Mean total student scores increased by 65% between the two MCQ progress tests, with a mean 24% increase in correct responses, a mean 9% decline in incorrect responses, and a mean 15% decline in don't-know responses. The mean class score for the final viva was 76%. Students indicated 89.7% broad agreement on the value of the approach to their learning, with qualitative data showing perceptions that this approach had been the most challenging, yet most beneficial, learning of their undergraduate experience.
Conclusion. Favourable results in performance, student learning experience, and improved assessment validity and reliability support the use of progress testing within paramedicine.
Original language: English
Number of pages: 4
Publication status: Published - 18 Apr 2021
Event: College of Paramedics International Education Conference - Online, London, United Kingdom
Duration: 12 Apr 2021 – 18 Apr 2021
https://collegeofparamedics.co.uk/InternationalConf

Conference

Conference: College of Paramedics International Education Conference
Country/Territory: United Kingdom
City: London
Period: 12/04/21 – 18/04/21
Other: This conference was an opportunity to bring together paramedic educators from around the world to share knowledge, enhance learning and promote innovation.
The College of Paramedics, the Australasian College of Paramedicine and partners in Canada worked together to create this virtual, online conference. Each day had a specific, education-based theme and consisted of live online content supplemented by a range of pre-recorded supporting resources. The aim was to offer an interactive and engaging forum for attendees to participate in.

Keywords

  • programmatic assessment
  • progress test
  • paramedic education
