

Simulation curriculum evaluation and development in a postgraduate emergency medicine programme: a 2-year logic model follow-up
  1. Jared Baylis1,
  2. Kelsey Innes1,
  3. Justin Roos1,
  4. Chantal McFetridge1,
  5. Connor McSweeney2,
  6. Nicolle Holm1
  1. 1 Department of Emergency Medicine, The University of British Columbia, Vancouver, British Columbia, Canada
  2. 2 School of Medicine, The University of British Columbia Faculty of Medicine, Vancouver, British Columbia, Canada
  1. Correspondence to Dr Jared Baylis, Department of Emergency Medicine, The University of British Columbia, Kelowna, BC V1V 3B8, Canada; jbaylis{at}alumni.ubc.ca


Introduction

Simulation is an educational tool that is most valuable when implemented by trained individuals.1 Regular simulation-based education (SBE) activities lead to skill acquisition that transfers to real-life situations.2 Emergency medicine (EM) residents at the University of British Columbia (UBC) have a variety of SBE opportunities across the four training sites (Vancouver, Fraser Valley, Victoria and Kelowna).

We previously completed step two of Kern’s six-step model for curriculum development: a formal learner-targeted needs assessment.3 That assessment identified a desire for increased SBE and concerns about prebrief inconsistency, which may have contributed to the 19% of respondents reporting a lack of psychological safety. This project was the second stage in an iterative curricular improvement process using a logic model.4

We chose a logic model because it allowed us to analyse our current programme and how it relates to the outcomes we aim to achieve. In doing so, we constructed a theory of change by mapping our logical assumptions about how resource inputs into our programme result in deliverable outputs.5 The main advantage of this approach was gaining a high-level understanding of our simulation programme so that we could target specific inputs as a way to modify outputs. Our inputs into the logic model (figure 1) included simulation facilitators, technologists and labs, a recently developed centralised bank of peer-reviewed cases, and our curriculum based on the Royal College of Physicians and Surgeons of Canada (RCPSC) competencies for EM.6 Our logic model outputs included monthly laboratory-based SBE, a procedural skills day and in situ simulation.

Figure 1

Logic model inputs, outputs, and aims
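
For readers less familiar with logic models, the mapping in figure 1 can be thought of as a structured set of inputs, outputs and aims. The sketch below is a minimal Python illustration, assuming paraphrased labels from the text and figure caption (the aims shown are inferred from our stated goals rather than quoted); it does not represent any actual programme software or dataset.

```python
# A minimal, illustrative representation of the logic model described above.
# Labels paraphrase the text and figure 1 caption; the aims listed are
# inferred from the stated goals of this project, not copied from the figure.
logic_model = {
    "inputs": [
        "simulation facilitators",
        "simulation technologists and labs",
        "centralised bank of peer-reviewed cases",
        "RCPSC competency-based EM curriculum",
    ],
    "outputs": [
        "monthly laboratory-based SBE",
        "procedural skills day",
        "in situ simulation",
    ],
    "aims": [
        "increased SBE frequency",        # inferred from the needs assessment
        "improved psychological safety",  # inferred from our hypothesis
    ],
}

def theory_of_change(model: dict) -> str:
    """Summarise the assumption that resource inputs yield deliverable outputs in service of the aims."""
    return (
        f"{len(model['inputs'])} inputs -> {len(model['outputs'])} outputs, "
        f"targeting {len(model['aims'])} aims"
    )

print(theory_of_change(logic_model))
```

Laying the model out this way makes explicit which inputs can be targeted when we wish to modify a given output.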

We hypothesised that the interval curricular development would address some of the initially raised concerns regarding psychological safety.

Methods

Our initial online needs assessment survey was refined by UBC EM simulation educators. The survey included questions on the duration and frequency of SBE, followed by Likert scale ratings of perceptions of simulation across various procedural, crisis resource management and clinical topics. Finally, we explored time spent on the prebrief, scenario and debrief, and overall psychological safety, and provided a free-text section.

The survey was available for 2.5 weeks in September 2019 to all UBC RCPSC EM residents from postgraduate year (PGY) 2–5 at all four sites. PGY 1 residents were excluded as they had not yet participated in UBC simulation.
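
To show how individual responses translate into the summary figures reported in the Results (mean component times and the proportion of respondents reporting that a component was omitted), the sketch below outlines one plausible analysis in Python; the field names and example records are hypothetical and do not reflect the actual survey instrument or data.

```python
from statistics import mean

# Hypothetical response records; the field names are illustrative only and do
# not reflect the actual UBC survey instrument or dataset.
responses = [
    {"prebrief_objectives_s": 95,  "psych_safety_concern": False},
    {"prebrief_objectives_s": 0,   "psych_safety_concern": True},   # 0 s = component omitted
    {"prebrief_objectives_s": 110, "psych_safety_concern": False},
]

# Mean time spent on the component, among respondents who received it.
component_times = [r["prebrief_objectives_s"] for r in responses
                   if r["prebrief_objectives_s"] > 0]
mean_time_s = mean(component_times)

# Proportion of respondents reporting that the component was omitted entirely.
pct_omitted = 100 * sum(r["prebrief_objectives_s"] == 0 for r in responses) / len(responses)

# Proportion reporting a perceived lack of psychological safety (binary question).
pct_unsafe = 100 * sum(r["psych_safety_concern"] for r in responses) / len(responses)

print(f"Mean time {mean_time_s:.0f} s; omitted by {pct_omitted:.0f}% of respondents; "
      f"lack of psychological safety reported by {pct_unsafe:.0f}%")
```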

Results

Data were collected from 48 RCPSC-EM UBC residents (65% response rate), with 56% in PGY 2–3 and 44% in PGY 4–5. The reported number of simulation sessions per year was 14 (up from 12 previously), and the ideal frequency was identified as one session every 2 weeks.

The average duration of a simulation scenario was 13 min 19 s (14 min 45 s previously). The average time spent on each prebrief component was:

  • Clarification of objectives 1 min 38 s (1 min 10 s previously) with 9% of responses reporting no clarification (reduced from 23%).

  • Description of scenario timeline 1 min 20 s (44 s previously) with 9% of responses reporting no description (reduced from 15%).

  • Orientation 2 min 23 s (1 min 50 s previously) with one resident reporting no orientation (increased from 0).

  • Establishment of fiction contract 60 s (44 s previously) with 3% reporting no fiction contract (decreased from 18%).

  • Reassurance of confidentiality 1 min 8 s (42 s previously) with 6% reporting no reassurance (same as previous).

  • Discussion of the evaluation process 51 s (same as previous) with 19% reporting that this was not discussed (decreased from 38%).

The average time spent on each debrief component was:

  • Reaction to scenario 2 min 49 s (2 min 19 s previously).

  • Description of scenario 2 min 5 s (similar to previous).

  • Analysis of scenario 6 min 14 s (4 min 24 s previously).

  • Summary 3 min 2 s (2 min 23 s previously).

Several impediments to learning in a simulation environment were explored and yielded a range of results (figure 2).

Figure 2

Perceived impediments to learning in simulation

Residents were polled in a binary (yes/no) fashion, and 26% (up from 19%) perceived a lack of confidentiality and/or psychological safety.

Similar to the previous survey, 97% of respondents rated their overall satisfaction with the simulation programme as three or higher on a five-point Likert scale, where one was ‘does not meet’ and five was ‘completely meets’.

Discussion

Since the initial needs assessment, residents report improvement in several key components of the simulation curriculum. The prebrief and debrief frameworks are being used more often and for longer, although there was a slight reduction in simulation scenario duration. The impact of this reduction is difficult to assess, as residents continue to report a high level of overall satisfaction.

Creating a psychologically safe environment is an inherent challenge and, despite drastic changes to our curriculum, lack of psychological safety remains a key theme. Previously, we postulated that using a standardised framework delivered by simulation-trained individuals would enhance psychological safety. However, reported lack of psychological safety was most often attributed to the debriefer and was actually higher at sites where simulation-trained individuals were more likely to lead and to use a standardised framework. In particular, learners cited the ‘reactions’ phase as causing feelings of vulnerability. Explicitly stating that the purpose of the reactions phase is to gain insight into the learner’s lens may add helpful perspective. Since our last paper, our residency programme’s curriculum has shifted to competence by design (CBD).6 Several entrustable professional activities (EPAs), and their associated milestones, can be assessed using simulation, which may have contributed to the simulation environment feeling less psychologically safe. Creating a psychologically safe environment is a key priority of the simulation curriculum at UBC, and further exploration of contributory factors will be important to pursue.

Limitations of this project include its reliance on aggregate data from residents within a single programme, albeit across four distributed sites with distinct simulation curricula. The data could be skewed because just under 50% of residents belong to a single site. Moreover, our survey population has changed over the intervening 2 years, which limits direct comparison. Additionally, a resident who has had poor simulation experiences in the past may report the need for simulation more sparingly, despite curricular changes. Finally, the increase in perceived ideal frequency of SBE may reflect residents’ desire to achieve EPAs in the wake of our curricular change.

This project will guide future UBC EM simulation curricular improvement. An iterative process of ongoing review, assessment and correction will assist in establishing a comprehensive and learner-centred simulation curriculum in the evolving era of CBD.

References

Footnotes

  • Twitter @baylis_jared

  • Contributors All authors contributed to the design, implementation, writing and editing of this study and manuscript.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Provenance and peer review Not commissioned; externally peer reviewed.
