- Jessica Hernandez1,
- Alise Frallicciardi2,
- Nur-Ain Nadir3,
- M David Gothard4,
- Rami A Ahmed5 (http://orcid.org/0000-0001-8518-6083)
- 1 Emergency Medicine, UT Southwestern Medical, Dallas, Texas, USA
- 2 Emergency Medicine, University of Connecticut School of Medicine, Farmington, Connecticut, USA
- 3 Emergency Medicine, Kaiser Permanente Central Valley, Modesto, California, USA
- 4 Biostats, Inc, East Canton, Ohio, USA
- 5 Emergency Medicine, Indiana University School of Medicine, Indianapolis, Indiana, USA
- Correspondence to Dr Rami A Ahmed, Emergency Medicine, Indiana University School of Medicine, Indianapolis, IN 46202-5114, USA
Introduction One critical aspect of successful simulation facilitation is the development of written scenarios. However, no validated assessment tool dedicated to the evaluation of written simulation scenarios is available. Our aim was to develop a tool with content validity to evaluate the quality of written simulation scenarios.
Methods A comprehensive literature search did not yield a validated assessment tool dedicated to the evaluation of written simulation scenarios. A subsequent search yielded six published templates for written simulation scenario design. From these templates, critical scenario elements were identified to create an evaluation instrument comprising six components of scenario quality, each with corresponding anchors and a rating scale. Subsequently, a national group of simulation experts was engaged via survey methodology to rate the content of the proposed instrument. Ultimately, a modified two-round Delphi approach was implemented to demonstrate consensus on the final assessment tool.
Results 38 responses were obtained in round 1, and 22 complete responses were obtained in round 2. Round 1 kappa values ranged from 0.44 to 1.0, indicating moderate to almost perfect rater agreement for inclusion of the six proposed components. Kappa values specifically regarding the scale and anchors ranged from 0 to 0.49. After revisions, there was a significant level of agreement (p<0.05) for all items of the proposed assessment tool in the second-round survey except item 10. Of note, all initial respondents indicated that they had never evaluated written scenarios with an assessment tool.
Conclusions The Simulation Scenario Evaluation Tool, developed through a national consensus of content experts, is an instrument with demonstrated content validity that assesses the quality of written simulation scenarios. The tool provides a basis for structured feedback on the quality of written simulation scenarios.
- simulation evaluation
- simulation scenario
- Delphi method
- simulation fellow
Contributors JH provided substantial contributions to concept and design, acquisition of data, analysis and interpretation of data, and drafting of the manuscript. AF provided substantial contributions to concept and design, drafting of the manuscript and critical manuscript revision. N-AN provided substantial contributions to concept and design, drafting of the manuscript and critical manuscript revision. MDG provided substantial contributions to concept and design, acquisition of data, analysis and interpretation of data, drafting of the manuscript and statistical expertise. RAA provided substantial contributions to concept and design, analysis and interpretation of data, and critical revision of the manuscript for important intellectual content. JH agrees to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. This work served as JH’s thesis at Johns Hopkins University.
Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
Competing interests None declared.
Ethics approval The institutional review board at the University of Texas Southwestern Medical Center and Johns Hopkins University reviewed this study protocol, and it was categorised as ‘not human subject research’ and was deemed exempt from formal ethics review.
Provenance and peer review Not commissioned; externally peer reviewed.
Data availability statement All data relevant to the study are included in the article or uploaded as supplementary information.