Interprofessional Simulation Olympics as a platform to assess team work knowledge acquisition
  1. Jill Steiner Sanko1,
  2. Ilya Shekhter2,
  3. Mary Mckay1,
  4. Karina Gattamorta1,
  5. David J Birnbach2
  1. University of Miami School of Nursing and Health Studies, Coral Gables, Florida, USA
  2. University of Miami, UM-JMH Center for Patient Safety, Miami, Florida, USA

  Correspondence to Dr Jill Steiner Sanko, University of Miami School of Nursing and Health Studies, 5030 Brunson Drive, Coral Gables, FL 33124, USA; j.sanko@miami.edu

Abstract

Introduction Several years ago, an on-stage competition called SimWars was introduced to the simulation community. We adapted this concept for a patient safety course, as a way to further engage students, and named it Sim Olympics. We sought to evaluate it as a platform for assessing learning in students who participated as audience members.

Methods A non-equivalent groups design was used to assess whether students could be taught to recognise features of effective teamwork and to score team performance as competently as a pair of expert raters. One-way repeated measures analysis of variance was used to compare students’ attitudes toward interprofessional education (IPE), teamwork and simulation before and after the course.

Results Student scores showed good agreement with expert scores. For team 1, there was no statistically significant difference (M=19.58, SD=4.34 for the students; M=17.50, SD=2.12 for the experts), t(192)=1.26, p=0.264. There was likewise no difference for team 2 (M=15.173, SD=5.52 for the students; M=19.50, SD=3.53 for the experts), t(173)=0.863, p=0.354. A premeasure and postmeasure of students’ attitudes towards IPE, teamwork and simulation also showed a significant time effect, p<0.001.

Conclusions Medical and nursing students were able to demonstrate their learning of teamwork dynamics by discerning differences between great teamwork and good teamwork as proficiently as seasoned experts. Findings of this study may further support the use of observation as a method to evaluate learning.

  • Sim Olympics
  • Assessment
  • Simulation
  • Teamwork
  • Interprofessional teams

Introduction

Interprofessional team training has been shown to promote acquisition of patient safety knowledge and skills, as well as improve teamwork of healthcare students who take part in such training.1 Courses that provide interactive educational experiences, such as simulation encounters, are reported to have the most impact.1

The simulation-based interprofessional patient safety course at the University of Miami consists of 5 days of interactive, experiential learning activities, with several lectures throughout the week to anchor concepts. The target learners for this course are second semester accelerated option nursing studentsi and beginning third year medical students.ii Topics covered in the course address core interprofessional education (IPE) competencies and include teamwork, communication, roles and responsibilities, and values and ethics.2 In addition, leadership skills, delegation, calling for help, development of a shared mental model and visual thinking strategies are woven throughout the curriculum.

Measuring the application of knowledge and acquisition of skills requires a significant time commitment.3 At present, the most widely used method of evaluating learning outcomes following simulation-based courses is direct observation of participants’ performance, either individually or as part of a team. Other approaches use a pre–post survey or questionnaire to examine changes in knowledge, but this method does not adequately measure the ability of students to demonstrate what they have learned. Seeking a way to evaluate a large cohort of students in a short period of time, we devised a plan to gauge the feasibility of Sim Olympics as an approach to measuring the demonstration of knowledge and skill attainment. A pre–post measure of attitudes and perceptions of teamwork, IPE and simulation as a learning modality was used alongside Sim Olympics as a parallel measure of learning achieved during the course.

In 2008, Okuda et al4 introduced SimWars to the healthcare simulation community. It is an on-stage competition used to highlight teamwork, communication and patient care skills.4 Several years ago, the concept of SimWars was incorporated into our course. Titled Sim Olympics, this activity serves as a way to further engage students and to provide another simulation-based learning opportunity. Rather than volunteering to compete, as is done in SimWars, student teams in Sim Olympics are selected based on their performance in a simulation encounter earlier in the course.

Sim Olympics has been well received as a student engagement tool, but we hypothesised that it could also be used as a way to measure acquisition of learning. We sought to evaluate it as a platform for assessment of the students participating in the activity as audience members. A number of studies have demonstrated that learning as an observer can be almost as effective as actively participating in simulation encounters.1,3,5 Robertson et al1 successfully used video vignettes to assess acquisition of patient safety competencies such as teamwork, collaboration, leadership and problem solving. Their study showed that simulation and a modified TeamSTEPPS curriculum improved students’ communication skills, attitudes toward working together, and ability to identify effective teamwork behaviours.

We examined the use of Sim Olympics as a platform to evaluate observers’ (students participating in the activity as audience members) mastery of learned concepts, and as a way to carry out large-scale assessments of learning without resorting to the time-intensive and resource-demanding process of observing each student engaging in simulation encounters. To triangulate the findings related to using Sim Olympics as a large-scale evaluation, a pre–post measure and a participant satisfaction tool were also administered. Successful implementation of Sim Olympics as an evaluative approach could demonstrate a new application of simulation for both teaching and assessing acquisition of knowledge and skills. Further, the act of providing a peer evaluation (the audience members evaluated teams of their peers selected to compete in Sim Olympics) may offer additional learning opportunities for both the providers and the recipients of the feedback.

Methods

The study was carried out following institutional review board approval. Two hundred and twenty students from the schools of medicine and nursing participated in a week-long interprofessional simulation-based patient safety course. To facilitate learning teamwork, a major objective of the course, the 220 students were divided into 36 teams made up of four to five medical students and two nursing students. With the exception of lectures and one standardised patient encounter, all course activities were team-based. A non-equivalent groups design was used to assess whether students could be taught to recognise features of effective teamwork and to assign objective scores to teams of participants encountering a simulated medical case as competently as a pair of expert raters. One-way repeated measures analysis of variance was used to compare students’ attitudes toward IPE, teamwork and simulation before and after the course. Finally, descriptive statistics were used to examine course satisfaction and perceptions of the course as a whole.

ATTITUDES questionnaire

Before and after the course, all students were asked to complete a modified version of the KidSIM Attitude Towards Teamwork in Training Undergoing Designed Educational Simulation (ATTITUDES) questionnaire6 via an electronic link. The KidSIM ATTITUDES questionnaire is a five-factor tool developed to measure attitudes and perceptions of teamwork, IPE and simulation as a learning modality.6 Psychometric evaluation of the tool completed by Sigalet et al demonstrated excellent internal reliability, α=0.95. The 30-item measure taps into five domains (relevance of IPE, relevance of simulation, communication, situational awareness, and roles and responsibilities). Each item is answered using a five-point Likert scale from ‘strongly agree’ to ‘strongly disagree’. The modified version of the tool had a total of 28 items; two items were dropped because they were not adequately covered in our course, but the tool was otherwise identical to the original version.
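
For readers who wish to reproduce this kind of reliability analysis, the sketch below computes Cronbach’s α and corrected item-total correlations for a Likert-item instrument. It is a minimal illustration in Python with randomly generated stand-in data; the study itself used SPSS, and the data frame layout and variable names here are assumptions rather than the study’s actual dataset.

```python
# Minimal sketch: Cronbach's alpha and corrected item-total correlations
# for a 28-item, five-point Likert instrument. The data are random
# stand-ins, so the printed alpha will be near zero; real, correlated
# responses would yield values like those reported in this article.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def corrected_item_total(items: pd.DataFrame) -> pd.Series:
    # Correlation of each item with the sum of all remaining items.
    return pd.Series({
        col: items[col].corr(items.drop(columns=col).sum(axis=1))
        for col in items.columns
    })

# Hypothetical layout: 161 respondents x 28 items scored 1-5.
rng = np.random.default_rng(0)
responses = pd.DataFrame(
    rng.integers(1, 6, size=(161, 28)),
    columns=[f"item_{i}" for i in range(1, 29)],
)
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
print(corrected_item_total(responses).round(2))
```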

Performance measurement mid-week

As part of the course, a mid-week performance measure of all teams was undertaken to identify the top performing teams that would compete on stage during the Sim Olympics activity. All teams were scored by a trained faculty evaluator using a crisis resource management (CRM) tool originally developed by Kim et al.7 The tool was modified by our team of patient safety faculty members and used to generate objective performance scores for each team (UM-CRM tool) during the mid-week simulation encounter. The UM-CRM tool is a 13-item, single-factor measure that evaluates team performance in the areas of problem solving, situational awareness and resource utilisation at the team level. Team leadership is also evaluated in the areas of delegation, decisiveness, maintenance of a calm demeanour, and the ability to keep a big-picture view of the situation (figure 1). Based on each team’s performance scores as measured using the UM-CRM tool, two of the 36 teams were selected to compete in the final activity of the course, Sim Olympics.

Sim Olympics

Prior to scoring the competing teams, audience members were instructed to bring a computer or tablet and were given a brief orientation on the use of the UM-CRM tool to score the teams. Specific exemplars of desired team behaviours were not included in the brief orientation; however, during the course of the week, exemplars were reinforced through activities, lectures and debriefings. Qualtrics,iii a web-based survey system, was used to produce an electronic version of the UM-CRM tool. The electronic UM-CRM tool was accessible through a link on the course’s Blackboardiv site and facilitated real-time evaluation of the teams. In addition to scoring teams using the tool provided, the audience was also asked to vote for the winning team, based either on the score they assigned or on their overall impression of which team handled the case better.

Each of the selected teams was given an identical case, which presented a simulated patient introduced to them earlier in the course during other simulation encounters. To avoid giving the second team an advantage, its members were sent out of the room and beyond earshot during the first team's scenario. Concepts taught throughout the week, as well as new distracters and patient complications, were incorporated into the Sim Olympics scenario, to challenge the competing teams and present new learning opportunities. Each scenario lasted 8 min, and the simulations took place on the stage of a large auditorium. Immediately following each team's performance, the audience members were asked to score each team using the electronic UM-CRM tool.

Expert rating procedure

Each Sim Olympics case was video recorded. These recordings were later viewed by two blinded expert raters: one nurse and one physician. Both were familiar with the course and the measurement tool, but were not present on the day of the event; and neither knew which of the two teams had won or how they had performed. Each expert rater was provided with the recordings to view, as well as access to the electronic tool for scoring. Scores provided by the audience were compared to those assigned by the expert raters. SPSS V.19 (IBM Corp, Armonk, New York, USA) was used to perform the data analyses.8

Post-course evaluation

The post-course evaluation tool was administered to all students enrolled in the course. The post-course evaluation is a 21-item, comprehensive appraisal of the students’ satisfaction with, and perceptions of, each of the activities encountered during the course, and of how well the course as a whole met expectations. A four-point Likert scale was used to establish agreement with each of the items, except for the last three items, which were open-ended questions asking the students what they learned from, about, and with each other during the week-long course.

Results

ATTITUDES Questionnaire

Of the 186 students who responded to the 28-question survey, 161 completed the survey at both assessment points and consented to having their data included. Excellent reliability was found, with a Cronbach α=0.98, and internal consistency was good, with item-total correlations all at or above 0.57 (table 1). A significant time effect was found: on average, scores increased significantly from pretest to post-test, F(1, 157)=40.95, p<0.001, η²=0.21. The average pretest score was 4.4, whereas the average post-test score was 4.8. There were no significant differences by gender, F(1, 157)=0.07, p=0.79, or intended profession, F(1, 157)=0.65, p=0.421, except for questions in the situational awareness category, where nursing students scored higher than medical students on the pretest (p<0.05); this difference was not observed on the post-test.

Table 1

Item-total statistics for ATTITUDES Survey
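
As an illustration of the time-effect analysis reported above, the sketch below runs a one-way repeated measures ANOVA and derives partial η² from the F statistic. It is a hedged reconstruction in Python using statsmodels with simulated pre/post scores; the study’s actual analysis was performed in SPSS, and the variable names and data here are assumptions.

```python
# Minimal sketch: one-way repeated measures ANOVA on pre/post scores.
# Scores are simulated around the reported means (pre ~4.4, post ~4.8);
# they are not the study's data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
n = 161
pre = rng.normal(4.4, 0.4, n)
post = rng.normal(4.8, 0.3, n)

# Long format: one row per student per time point.
long = pd.DataFrame({
    "student": np.tile(np.arange(n), 2),
    "time": np.repeat(["pre", "post"], n),
    "score": np.concatenate([pre, post]),
})

fit = AnovaRM(long, depvar="score", subject="student", within=["time"]).fit()
print(fit)

# Partial eta squared derived from the F statistic: F*df1 / (F*df1 + df2).
row = fit.anova_table.loc["time"]
f, df1, df2 = row["F Value"], row["Num DF"], row["Den DF"]
print(f"partial eta^2 = {f * df1 / (f * df1 + df2):.2f}")
```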

Sim Olympics

A total of 190 students scored team 1’s performance and 170 students scored team 2’s performance (outliers more than 2 SDs from the mean were excluded from analysis, n=4). Prior to hypothesis testing, psychometric evaluation of the UM-CRM tool was undertaken. Good reliability was found (α=0.88). Additionally, good internal consistency was demonstrated, with all item-total correlations at or above 0.40 (table 2).

Table 2

Expert-student comparisons and UM-CRM tool total-item correlations

Student scores showed good agreement with expert scores. A Student’s t test was used to examine the differences in total scores awarded by the experts and the students. For team 1, there was no statistically significant difference (M=19.58, SD=4.34 for the students; M=17.50, SD=2.12 for the experts), t(192)=1.26, p=0.264. There was similarly no statistically significant difference for team 2 (M=15.173, SD=5.52 for the students; M=19.50, SD=3.53 for the experts), t(173)=0.863, p=0.354. Item-level comparisons were also made; the descriptive statistics are reported in table 2.
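
To make the comparison concrete, the sketch below shows an independent samples Student’s t test of the kind reported here, comparing audience scores against the two expert scores. It uses scipy with values simulated around the reported team 1 statistics; the arrays are hypothetical stand-ins, and the study’s analysis was run in SPSS.

```python
# Minimal sketch: Student's t test comparing audience and expert total
# scores on the UM-CRM tool. Values are simulated around the reported
# team 1 statistics (students: M=19.58, SD=4.34; experts: M=17.50).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
student_scores = rng.normal(19.58, 4.34, 190)  # 190 audience raters
expert_scores = np.array([16.0, 19.0])         # two expert raters, M=17.5

t, p = stats.ttest_ind(student_scores, expert_scores, equal_var=True)
print(f"t = {t:.2f}, p = {p:.3f}")
```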

Post-course evaluation

From the perspective of the students, both the course as a whole and the Sim Olympics activity were well received. Students were asked about their opinions of the course, as well as about the Sim Olympics experience itself. The overwhelming majority (96%) of students agreed (57%) or strongly agreed (39%) that the course provided opportunities to gain knowledge and skills. Ninety-eight per cent of the students agreed (28%) or strongly agreed (70%) that the activities in which they participated as part of the course were valuable for their professional development. Nearly all (99%) of the students agreed (13%) or strongly agreed (86%) that the simulation activities delivered during the course provided valuable learning opportunities on patient safety concepts. Finally, specific to Sim Olympics, 83% of the students found the activity to be a useful way to learn (strongly agree 40%, agree 43%).

Discussion

The findings of this study further support the use of observation of simulation encounters as a feasible way to assess the ability of individual learners to recognise good teamwork. Not only were the students able to recognise teamwork behaviours and characteristics as adeptly as the experts, but they also reported the experience to be worthwhile. Moreover, assessing students’ capability to evaluate teamwork, communication and interprofessional collaboration using Sim Olympics as a platform, alongside a measure that demonstrated a pre–post shift in attitudes and perceptions of teamwork, communication and IPE, allowed for parallel assessment of learning using two measures. Finally, we were able to triangulate the positive findings from Sim Olympics as an evaluative method with statistically significant improvements in perceptions of, and attitudes toward, IPE, teamwork and simulation as a teaching modality, as well as with positive perceptions of the activity itself and the course as a whole. Thus, favourable findings demonstrated across multiple measures support our hypothesis.

Despite these successes, however, there were some limitations to this study. First, our findings may not be generalisable to all other academic settings. In fact, observing an activity may not always be a good substitute for active participation, especially when it comes to experiential learning; recognising teamwork behaviours in others is not equivalent to being able to execute those behaviours. Second, not all observed behaviours may lend themselves to such an assessment format. Teamwork behaviours are well defined, and the UM-CRM tool directed the students towards an explicit list of teamwork characteristics to assess. Without a specific list of desired behaviours to look for, students may have had difficulty assessing the teams. Third, only two expert raters reviewed and scored the recorded cases to generate the reference scores, so it is possible that findings would have been different had more expert raters been used. Finally, the expert raters assessed the teams’ performance post hoc using video recordings, which could have limited the accuracy of scoring, as recordings may not fully capture all the actions that were observable during the live encounters. Ideally, having the experts score the live action alongside the audience would have equalised the scoring conditions, allowing for a more balanced comparison between the students’ and experts’ scores.

Conclusion

Medical and nursing students were able to exhibit acquisition of patient safety knowledge by demonstrating the ability, as a group, to discern differences between great teamwork and good teamwork as proficiently as seasoned experts. Findings of this study may also support the use of observation of simulation scenarios as a method to evaluate learning. Our findings are consistent with the conclusions of Robertson et al1 and other studies,3,5 and provide additional support for using observation, such as watching and scoring teams engaging in simulation, to assess attainment of knowledge and the ability to recognise desired patient safety behaviours. These findings provide further evidence that simulation encounters may be used for teaching, and possibly assessment, even when learners participate only as observers; however, a more rigorous controlled study would be needed to solidly demonstrate these conclusions. Lastly, this study presented a novel use of the concept of SimWars in an interprofessional educational format.

Mastering the skill of reviewing and scoring peer teams in the clinical setting could play an important role in promoting patient safety. Acquiring such skills during the educational process could create the foundation for establishing the habit of paying attention to team dynamics and engaging in purposeful debriefing following patient encounters. Furthermore, the practice of observing peer teams and then providing peer-generated evaluations in clinical settings could profoundly improve the recognition and correction of team-related errors. Teaching future healthcare providers to recognise, think about and discuss effective teamwork will improve their ability to be effective and collaborative team members, consequently enabling them to provide higher quality and safer patient care. Looking to the future, a plan to study the impact of self-debriefing is under way. The combination of these novel teaching approaches could have wide-reaching impacts on both the education of healthcare students and their patient safety practices as they enter the workforce.

References

Footnotes

  • Twitter Follow Jill Sanko at @jillsanko

  • Competing interests None declared.

  • Ethics approval University of Miami IRB approval.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • i The accelerated option nursing program comprises students who have completed a bachelor’s degree in another field and are now completing the coursework to become bachelor’s degree-prepared registered nurses. This program of study generally totals three semesters.

  • ii The medical students participating in this study were in a four year program at a US-based medical school. In the third year, these medical students begin seeing patients on the wards while rotating through their core clinical rotations.

  • iii Qualtrics is a cloud-based platform used for survey distribution and research.

  • iv Blackboard is a web-based educational platform used to develop and to house course content.
