- Nick Sevdalis
- Correspondence to Dr Nick Sevdalis, Department of Surgery and Cancer, Imperial College London, St Mary's Hospital, Wright Fleming Building, 5th Floor (Room 507), Norfolk Place, London W2 1PG, UK;
In the past two decades, the field of simulation in healthcare has expanded dramatically. Numerous simulators, simulation-based training curricula and programmes now exist across a range of healthcare specialties. Many healthcare and education practitioners have built their careers in simulation-based education and training. In a number of countries, basic and advanced courses and qualifications can be obtained to demonstrate knowledge and mastery of simulation as an approach to education and training. The overall investment in simulation, in terms of human and organisational resources as well as financial commitment, is significant. A wide range of peer-reviewed outlets now exist, publishing scientific investigations of simulation-based applications. Multiple conferences led by learned societies specialising in simulation take place across the globe every year, where the latest studies and nascent developments are shared and discussed. In addition, simulation has now permeated mainstream general and specialty-specific clinical meetings (eg, in surgery, nursing, anaesthesia and obstetrics, to name but a few), such that simulation studies are also regularly presented to clinical audiences at large.
This expansion has been driven by a range of inter-related factors. These include ever-increasing pressure to shorten clinical trainees' learning curves; the quest to ensure the highest levels of safety at the point of care; technological developments (eg, virtual reality), including industry initiatives; and cross-industry influences from sectors where simulation-based training and development have long been part of operational training or of personnel selection and appraisal.1 More recently, there has also been a parallel expansion in the range of modern training technologies available to clinicians and educators, including virtual reality and web-based learning,2,3 alongside the more traditional apprenticeship-style learning that has taken place in healthcare and the caring professions for centuries.
BMJ Simulation & Technology Enhanced Learning (BMJ STEL) is undoubtedly a reflection of this wider expansive environment. BMJ STEL focuses on simulation and related modern learning technologies. The Oxford English Dictionary definition of simulation as “the technique of imitating the behaviour of some situation or process (whether economic, military, mechanical, etc.) by means of a suitably analogous situation or apparatus, especially for the purpose of study or personnel training” fully applies to the way simulation is conceptualised within the journal.4 Clarifying this definition from the start is important, as it implies that the focus of BMJ STEL is not on ‘kit’ but on the wider practice of simulation, which certainly involves ‘kit’ (however simple and inexpensive, or complex and costly) but, importantly, revolves around a human and an organisational element. We are keen to understand when and how simulation-based and other modern training modalities within health and social care work, and perhaps even more importantly why they do not always work. Basing studies entirely on simulators and their properties may be necessary when those simulators are being developed and evaluated; however, we regard this level of investigation as rather basic. It is not uncommon to find early simulation studies presenting self-report data from a small number of attendees at a single hospital, showing great satisfaction with a simulation-based training session that they would certainly recommend to their colleagues; or studies showing that first-year surgical trainees took significantly longer, and made more errors, in carrying out a laparoscopic suturing task than minimally invasive surgeons who have been at Consultant/Attending level for 5–10 years.
In the first example, it is more interesting and consequential for training, and also for practice, to examine how exactly attendees of a training course have improved (perhaps in their knowledge, skills or attitudes); whether the improvement is lasting (ie, when does it start to decay?); whether it is transferable to clinical practice; and whether it is transferable to other tasks or procedures. Likewise, in the second example, it is more useful to be able to benchmark trainee and Consultant/Attending performance on a suturing task, so that we can compare where individuals are in relation to their peers, construct performance norms and then identify reasons for variable performance, or detect performance detriments early.
These examples bring us to the concept of validity—a much debated concept in education and psychology, with a theoretical basis of its own.5 Although validation studies have in many ways been the ‘bread and butter’ of the simulation field as a whole, the theory underpinning validity is yet to be fully appreciated within the wider simulation community. Numerous studies are published reporting ‘validation’ of a simulator, of a simulation-based curriculum, or of assessment instruments and metrics of skill and performance. There are equally numerous references in the literature to ‘validated’ simulators, curricula, or assessments. Such references are, unfortunately, inaccurate and far too simplistic. Our scientific understanding of validity, just like our understanding of, say, laparoscopy, is not static; it evolves as the evidence base changes. The current state of the art in validity theory describes validation as an ongoing process, one that depends heavily on what we want to do with the simulator, curriculum or assessment instrument in question. BMJ STEL aspires to help elucidate the concept of validity in the simulation and education evidence base—Korndorffer et al6 offer an excellent brief introduction to the concept and further reading.
BMJ STEL intends to be an outlet for well-designed, impactful studies within the broad field of simulation and technology enhanced learning. Alongside other specialist and clinical publications hosting simulation-related work, we hope to publish high-quality primary research in the field. We welcome studies across all research traditions—including quantitative investigations, qualitative explorations and mixed methods approaches. The key criterion for publication in BMJ STEL is the quality and appropriateness of the design and metrics. Alongside primary research studies, we welcome evidence syntheses carried out qualitatively or quantitatively (eg, meta-analyses)—we are seeking well-conceptualised reviews of relevance to educational and clinical policy-making. We are also open to well-articulated, thorough policy analysis articles, tackling issues such as the implementation of simulation and technology enhanced training in curricula or in hospitals/healthcare systems; curricular integration, for instance between medical and nursing schools; and educational policy. The focus of the research, synthesis or analysis can be the individual learner/health or social care provider; health and social care teams; or entire organisations. There is a wealth of simulation research activity across all of these fields—with organisational-level learning and improvement increasingly becoming a key priority topic for the modern hospital or social care institution7 alongside the more traditional simulation studies on learning curves, validation, or team skills training.1
As a new venture, we anticipate that BMJ STEL will climb a steep early learning curve—without the benefit of simulation-based rehearsal beforehand! We invite you all, as readers and contributors, to join us in this new and exciting journey. The journal will work only if it addresses your needs and provides materials that are scientifically solid and clinically and educationally useful. We welcome contributions from all medical, surgical, nursing and social care specialties as well as veterinary medicine—and also from psychology, education, sociology and learning technology. Simulation and learning are interdisciplinary fields—this is what we aspire to reflect in our publications, contributors and readership.
1. Stefanidis D, Sevdalis N, Paige J, et al.
2. Karacapilidis N.
3. Michael M, Abboudi H, Ker J, et al.
4. Oxford English Dictionary. 2014. Accessed electronically at http://www.oed.com.
5. Joint Committee on Standards for Educational and Psychological Testing of the American Psychological Association, American Educational Research Association and National Council on Measurement in Education. Standards for educational and psychological testing. Washington, DC: AERA Publications, 2014.
Funding NS is affiliated with the Imperial Patient Safety Translational Research Centre (http://www.cpssq.org), which is funded by the UK's National Institute for Health Research.
Competing interests NS delivers regular teamwork and safety training on a consultancy basis in the UK and internationally.