
‘It is a different type of learning’. A survey-based study on how simulation educators see and construct observer roles
  1. Stephanie O’Regan1,2,
  2. Elizabeth Molloy3,
  3. Leonie Watterson1,
  4. Debra Nestel2
  1. 1 Sydney Clinical Skills and Simulation Centre, Northern Sydney Local Health District, St. Leonards, NSW, Australia
  2. 2 Monash Institute for Health and Clinical Education, Faculty of Medicine Nursing and Health Sciences, Monash University, Clayton, Victoria, Australia
  3. 3 Department of Medical Education, Faculty of Medicine Dentistry and Health Sciences, The University of Melbourne, Melbourne, Victoria, Australia
  1. Correspondence to Stephanie O’Regan, Sydney Clinical Skills and Simulation Centre, Level 6 Kolling Building, Royal North Shore Hospital, Northern Sydney Local Health District, Reserve Road, St. Leonards, NSW 2065, Australia; Stephanie.oregan{at}health.nsw.gov.au

Abstract

Background Simulation is reported as an appropriate replacement for a significant number of clinical hours in pre-graduate programmes. To increase access for learners, educators have looked to understanding and improving learning in observer roles. Studies report equivalent learning outcomes and less stress in observer roles. However, reports on the prevalence, use and perceived value of observer roles from the educator’s perspective are lacking.

Methods An exploratory survey for Australian and New Zealand (ANZ) simulation educators based on literature findings was developed and piloted with a small sample (n=10) of like subjects for language, clarity, skip logic and completion time. The final survey comprised 36 questions. Quantitative data were analysed using Pearson’s chi-squared test, Welch’s ANOVA and exploratory factor analysis. Select qualitative data were analysed using content analysis and summarised with frequency counts and categorisation.

Results Two hundred and sixty-seven surveys were completed, with 221 meeting criteria for analysis. The observer role is widely used in ANZ and most learners experience both hands-on and observer roles. The location of observers is dependent upon several factors including facility design, learner immersion, scenario design and observer involvement. Verbal briefings and/or other guides are provided to 89% of observers to direct their focus and 98% participate in the debrief. Educators value observer roles but tend to believe the best learning is hands-on.

Conclusions The learning in observer roles is less valued by educators than hands-on roles. Focused observation provides opportunities for noticing and attributing meaning, an essential skill for clinical practice. Learning spaces require consideration of scenario design and learning objectives. Scenario design should include objectives for observer roles and incorporate the observer into all phases of simulation. Attention to these areas will help promote the value of the different type of learning available in observer roles.

  • Simulation based education
  • Simulation based learning
  • Simulation faculty
  • Simulation in healthcare


INTRODUCTION

In 2004, David Gaba, a pioneer of contemporary simulation, wrote of his vision for the future: “not all learning requires direct participation…some learning can take place merely by viewing a simulation involving others, as one can readily imagine being in the shoes of the participants. A further step is to involve the remote viewers either in the simulation itself or in debriefings about what transpired”.1 Fifteen years later, that future is now, with simulation reported as an appropriate replacement for a significant number of clinical hours in nursing and physiotherapy undergraduate programmes.2 3 In an effort to increase access for learners, educators have looked to understanding and augmenting learning in observer roles.4–9 Some,5–7 10–12 but not all,13 14 studies report equivalent learning outcomes and/or reduced stress in observer roles. However, reports on the prevalence, use and perceived value of observer roles from the educator’s perspective are lacking, a gap this paper aims to address.

Understanding and effectively using observer roles in simulated activities are important for several reasons. First, simulation is resource intensive, so optimising learning for all participants, hands-on or observers, is economically sound. Second, as simulation is increasingly used to augment or replace clinical placements,2 3 15 it is important to ensure that all learners benefit. Third, learning how to observe or notice effectively is a skill essential to clinical practice.16 Finally, by understanding the experience and benefits of observer roles, educators can critically appraise their simulation sessions and incorporate strategies to ensure that learning is optimised across all roles.

In a systematic review of manikin and simulated participant (SP; patient) scenario-based simulations, the authors defined observer roles in two broad contexts.17 First, an observer can watch from an external or within-room, non-participatory position. Second, an observer can be active within the scenario in a role which is not congruent with their professional role; for example, playing a relative of the patient, a doctor playing a nurse, or a senior practitioner playing a student. This second context was referred to as an ‘in-scenario’ observer role. Other terms include ‘student confederate’10 and ‘response-based role’.18 The addition of observer tools, checklists or briefings further defined these observers as ‘directed’ observers.17 This directed observation activates the learner, provides focus on learning objectives and increases observer satisfaction.17 19 20 The location of the observer may affect engagement.21

Aim and research question

This study aims to provide a deeper understanding of observer roles through the experience of simulation educators in Australia and New Zealand (ANZ). The questions guiding this research are:

  1. What are the types and prevalence of observer roles?

  2. What value do educators place on observer roles in simulation?

  3. Are educator factors associated with differences in the value placed on observer roles in simulation?

The findings contribute to a series of recommendations for maximising learning in observer roles in simulation.

METHODS

An exploratory survey was developed, based on literature findings17 and reviewed by subject matter experts as identified by the research team. This was piloted with a small sample (n=10) of like subjects for language, clarity, understanding, ease of online completion, skip logic and length of time for completion.22 Following refinements to wording and response choices, the final survey comprised 36 questions.

Recruitment during April to June 2016 was through a targeted snowball email campaign with an online link to the survey. Online recruitment and survey completion reduce costs, reduce data transcription and afford respondents the flexibility and time to complete the survey at their convenience.23 Using a cluster sampling approach,24 known simulation facilities, professional associations of simulation educators, a national simulation educator/technician programme (NHET-Sim) in Australia and simulation interest groups in ANZ were asked to forward the survey link to their members, and survey respondents were asked to forward it on to their colleagues. This secondary snowball recruitment process was designed to capture respondents not associated with any of the professional organisations targeted and to broaden the sample size. No incentives were offered.

The population of healthcare simulation educators in ANZ is self-reported and not clearly defined through registration, certification or other means. Eligible respondents were self-identified simulation educators, working with manikins or SPs and residing in ANZ. Respondents provided consent before progressing to the survey, and anonymity was protected by the SurveyMonkey software during the data collection process. Two hundred and sixty-seven responses were collected. Following the elimination of those not meeting ANZ location or work criteria (n=21) and incomplete responses (n=25), 221 surveys were included for analysis.

The survey items elicited educators’ current utilisation of observer roles, including types of observers, location while observing and their view of learning within these roles. Respondents were asked to provide demographic data and to respond to closed and open-ended questions using categorical, Likert-anchored and free-text responses, with free text used to explain specific answers. Free comments data are not reported.

This mixed methods study provided quantitative and textual data. Quantitative data were analysed using SPSS (v24.0, IBM, USA). Statistical tests used were Pearson’s chi-squared test, exploratory factor analysis and Welch’s analysis of variance (ANOVA), with significance set at p<0.05. The explanatory text response data were analysed using content analysis, then summarised with frequency counts and categorised as positive, negative or neutral responses.25 26 These data were used to provide colour and substance to the quantitative data. Data analysis was undertaken through a social learning theory lens and against a contextual background of simulation education providers and researchers.
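As an illustration only, since the study analysis was run in SPSS, the following minimal Python sketch shows the kind of chi-squared comparison reported throughout the Results (eg, observer-role use by clinical background); the contingency counts and group labels are hypothetical and do not come from the study data.

  # Illustrative sketch only: the study analysis was performed in SPSS v24.
  # Hypothetical cross-tabulation of clinical background (rows) against
  # frequency of using observer roles (never/rarely, sometimes, usually/always).
  import numpy as np
  from scipy.stats import chi2_contingency

  observed = np.array([
      [5, 30, 88],   # nursing/midwifery (invented counts)
      [4, 18, 37],   # medicine
      [2,  8, 10],   # allied health
  ])

  chi2, p, dof, expected = chi2_contingency(observed)
  print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
  if p < 0.05:                      # significance threshold used in the study
      print("Association between background and observer-role use")
  else:
      print("No significant association at alpha = 0.05")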

The professions with very small numbers (perfusion technicians, anaesthetic technicians and dentists) were combined with the allied health group to support further analysis. Respondents without clinical backgrounds were excluded when data were sub-grouped by clinical background for comparative analysis, as numbers were too few. Registered nurses and midwives were combined into a single nursing group.

Ethics

The Monash University Human Research Ethics committee approved the research study (CF16/414-2016000193). This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.

RESULTS

Demographics

The respondents were nurses and midwives (56%), medical doctors (27%), allied health across six professions (9%), paramedicine (6%) and other educators (2%). This is a proportional distribution similar to that reported for more than 5000 Australian educators enrolled in the National Health Education and Training in Simulation (NHET-Sim) programme.27 The primary locations were higher education (34%), hospital (32%), specialised simulation centre (6%), in situ or workplace simulation (27%) and other spaces (1.5%) (table 1).

Table 1

Demographic data in study population

Learners were primarily practising clinicians (51%), higher and vocational education pre-graduate students (33%) and a mixture of both (16%). The primary learner professions identified were 38% nursing and midwifery, 26% medicine, 16% multi-professions, 8.5% interprofessional, 5% allied health, 4.5% ambulance and 1% other. Based on position description, most respondents identified as educators (52%), followed by clinicians (38.5%), simulation coordinators (3%), facility managers (3%), academics (1.5%), researchers (1%) and one person as an SP (0.5%).

Thirty-nine per cent of respondents had a higher education qualification in education (27%) or simulation (12%). Most had completed a short simulation training course (79%) and ‘on the job training’ (84%). Thirteen people (6%) were Certified Healthcare Simulation Educators (CHSE). Nurses/midwives were more likely to have a postgraduate certification in education (p=0.013) or simulation (p=0.006) (table 2).

Table 2

Further education or training in health or simulation-based training by clinical background

Teaching experience using simulation ranged from 1 to 30 years (M=8.12, SD=6.23). Data from the other (n=4) and paramedicine (n=13) groups were removed due to extreme variance and small numbers. There was a significant difference in the years of experience between groups (p=0.001), with 12% of the variation in experience explained by clinical background. Doctors had more years of experience in simulation education than nurses and allied health educators, and nurses were significantly more experienced than allied health educators, with moderate effect sizes for all comparisons (table 3).
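For readers wishing to reproduce this style of analysis, a minimal sketch of a Welch’s ANOVA with Games-Howell post hoc comparisons is shown below using the pingouin package; the group sizes, distributions and variable names are assumptions for illustration only and do not correspond to the study dataset, which was analysed in SPSS.

  # Illustrative sketch only (simulated data, not the study dataset).
  # Welch's ANOVA of years of simulation-teaching experience by clinical
  # background, followed by Games-Howell pairwise comparisons with
  # Hedges' g effect sizes.
  import numpy as np
  import pandas as pd
  import pingouin as pg

  rng = np.random.default_rng(0)
  df = pd.DataFrame({
      "background": ["medicine"] * 55 + ["nursing"] * 120 + ["allied_health"] * 20,
      "years": np.concatenate([
          rng.normal(11, 7, 55),    # doctors: assumed longer experience
          rng.normal(8, 6, 120),    # nurses/midwives
          rng.normal(5, 4, 20),     # allied health
      ]).clip(1, 30),
  })

  welch = pg.welch_anova(data=df, dv="years", between="background")
  print(welch[["F", "p-unc", "np2"]])      # np2 ~ proportion of variance explained

  posthoc = pg.pairwise_gameshowell(data=df, dv="years", between="background")
  print(posthoc[["A", "B", "diff", "pval", "hedges"]])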

Table 3

Experience in simulation education measured in years by clinical backgrounds

Prevalence of observer roles

The observer role is widely used in ANZ, with 67% of educators reporting usually or always having observers in their simulation practice. Only 6% reported rarely or never having learners in observer roles (table 4). There were no differences in the use of observer roles related to educators’ job description (p=0.54), clinical background (p=0.28), primary simulation location (p=0.20) or type of learners (p=0.15). Educators who never have observers (n=4) reported small group sessions of only active learners, or one-on-one sessions for assessment purposes.

Table 4

Allocation and application of observer roles

Of the group who use observers, 66% report that observers always or usually have a hands-on or active role in another scenario. Twenty-two per cent sometimes have an active role. Only 12% never or rarely have an active role, that is, they only ever observe the simulation. Once again, there was no difference in observers also being hands-on related to educators’ job description (p=0.82), clinical background (p=0.20), primary simulation location (p=0.47) or primary learner groups (p=0.49).

Fifty per cent of respondents allocated learners to in-scenario observer roles, that is, active roles within the scenario that are not congruent with their normal professional role. The use of the in-scenario role was unrelated to educator job description (p=0.71), clinical background (p=0.08), primary location (p=0.12), job description (p=0.72) or learner groups (p=0.63).

Location of observers

Observers most commonly watched from within the simulation room (38%), with 30% observing from a separate area; the remaining 31% of respondents were flexible about the location. Nurses were significantly more likely to have observers located in the simulation room. Respondents were also asked to provide commentary on the location choice. Factors influencing the choice of location for observers were facility design (46%), the type of scenario (21%), hands-on immersion (18%), observer involvement (10%) and resources (5%). Facility design and observer involvement were significantly associated with locating observers in the same room, and hands-on immersion with locating observers in a separate viewing space. Flexible location of observers was attributed to scenario design and resources (table 5).

Table 5

Location of observers during scenario

Table 6

Summary of exploratory factor analysis of observer role value (n=216)

Reasons for choice of location were categorised as positive (22%), negative (43%) or neutral (35%). Negative responses connected observers with impacting hands-on learning and a lack of choice in observer location. Positive comments targeted how the learning for observers might be improved through location and scenario design. Negative responses included observers being “removed from the clinical area so they cannot influence the scenario or interfere with immersion”, “sim needs to be as realistic as possible, having a group of peers observing is a major blocker to this”, “learner can feel intimidated if observers are in the room” and “best viewing without interrupting scenario”. Location choice comments included “it’s the only option”, “limited by facilities”, “insitu forces in room location” and “forced due to facility available”. Conversely, positive observer role responses included “we felt that the learning would be enhanced and more focused by being in the room rather than observing from a different room”, “they (observers) experience some of the emotional response to patients situation, can see everything that is occurring and witness other student responses, patient response” and “it is easy for them to observe when they are in the room”.

Observation guides and tools

Most educators provide observers with some form of instruction (89%). These instructions include verbal briefings (76%), written learning objectives (24%), an observation guide (22%) or an observation checklist (19%). While there was no difference in verbal briefings between groups, doctors were less likely to provide checklists, written learning objectives or observation guides. Some educators use more than one type of observer instruction process or tool.

Ninety per cent of observers are involved in providing feedback to the active participants. This feedback may be written (5%), given verbally directly to the scenario participants (66%), provided through facilitator-collated and mediated checklists (6%) or given within the debrief (52%). Thirty-four per cent of observers received feedback from an educator on their observations, with a small percentage (2%) having their ratings or checklists calibrated with other observers. Even if not providing or receiving feedback, 98% of observers participate in the debrief. One educator indicated that observers are debriefed separately, and three educators did not have observers involved in a debrief. There were no differences between the professional groups.

Eighty per cent of in-scenario observers received a verbal role briefing, 31% a written role brief including parameters of the role, and 45% a role script including specific answers or cues to be provided to other participants during the scenario. Nurse and allied health educators were more likely to provide a script for in-scenario observer roles. Similarly, 98% of the in-scenario observers participated in the debrief.

The value of the observer role

Twenty-five Likert-anchored, subjective value statements about observer roles were organised into four categories: learning during observation, learner preferences in simulation, observer tools and observers in the debrief. An exploratory principal factor analysis was conducted on the 25 items using the process described by Field.28 The initial correlation matrix showed five items (2, 14, 15, 21 and 24 R) with no correlations above 0.30, so these were removed from analysis. Sampling adequacy was verified by the Kaiser–Meyer–Olkin (KMO) measure, KMO=0.732, with the lowest individual KMO 0.64 (range 0.64–0.84). Five factors had eigenvalues over 1, explaining 47.67% of variance. One item (7) loaded across two factors and was retained in both. Two items (4 R and 13) did not reach the loading criterion of 0.40 on any factor and were removed. Cronbach’s alpha reliability coefficients ranged from 0.74 to 0.78 (table 6).

The five factors identified across 18 items were:

  1. Observers contributing to the debrief

  2. Focused observation

  3. Value of observer roles

  4. Learning in observer roles

  5. Observing first
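
As an illustrative sketch of the factor analysis steps described above, which were run in SPSS, the following Python code applies the same sequence (sampling adequacy, eigenvalue inspection, principal-factor extraction, a 0.40 loading cut-off and Cronbach’s alpha) using the factor_analyzer and pingouin packages; the file name and item labels are placeholders, not the study instrument.

  # Illustrative sketch only: hypothetical Likert-item responses, one row per
  # respondent and one column per item (eg, item_1 ... item_25).
  import pandas as pd
  import pingouin as pg
  from factor_analyzer import FactorAnalyzer, calculate_kmo

  ratings = pd.read_csv("observer_value_items.csv")   # placeholder file name

  kmo_per_item, kmo_total = calculate_kmo(ratings)
  print(f"Overall KMO = {kmo_total:.3f}")             # >0.70 commonly taken as adequate

  fa = FactorAnalyzer(n_factors=5, method="principal", rotation="oblimin")
  fa.fit(ratings)

  eigenvalues, _ = fa.get_eigenvalues()
  print("Factors with eigenvalue > 1:", sum(eigenvalues > 1))

  loadings = pd.DataFrame(fa.loadings_, index=ratings.columns)
  print(loadings[loadings.abs().max(axis=1) >= 0.40])  # items meeting the 0.40 cut-off

  # Internal consistency for the items loading on the first factor
  factor_1_items = loadings[loadings[0].abs() >= 0.40].index
  alpha, ci = pg.cronbach_alpha(data=ratings[factor_1_items])
  print(f"Cronbach's alpha = {alpha:.2f}")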

Welch’s one-way repeated measures ANOVA compared 216 participants’ ratings of the five factors identified. Responses were significantly skewed in factor three and kurtotic in factors three and five; however, studies have shown Welch’s ANOVA to be robust for highly skewed data.29

Pairwise comparisons showed significant mean differences across factors, with observer perspective in the debrief (M=4.21, 95% CI 4.14 to 4.27) rated highest, followed by value of observer roles (M=4.16, 95% CI 4.09 to 4.24), observing first (M=4.03, 95% CI 3.95 to 4.11) and focus in observer roles (M=3.88, 95% CI 3.80 to 3.95); learning in observer roles was rated significantly lower than the other four factors (M=2.81, 95% CI 2.73 to 2.89). Value of observer roles was not significantly different from observer perspective in the debrief or observing first (table 7).
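The Welch’s repeated measures procedure above was run in SPSS; as a rough analogue only, the sketch below generates hypothetical factor ratings, reports each factor’s mean with a 95% CI, and runs Bonferroni-corrected paired comparisons of the ‘learning in observer roles’ factor against the other four. All values, names and distributions are invented for illustration.

  # Illustrative sketch only (invented ratings, not the study data).
  import numpy as np
  import pandas as pd
  from scipy import stats

  rng = np.random.default_rng(1)
  n = 216
  factors = pd.DataFrame({
      "debrief_contribution": rng.normal(4.2, 0.5, n),
      "value_of_roles":       rng.normal(4.2, 0.5, n),
      "observing_first":      rng.normal(4.0, 0.6, n),
      "focused_observation":  rng.normal(3.9, 0.6, n),
      "learning_in_role":     rng.normal(2.8, 0.6, n),
  }).clip(1, 5)

  # Mean and 95% CI for each factor score
  for name, col in factors.items():
      ci = stats.t.interval(0.95, n - 1, loc=col.mean(), scale=stats.sem(col))
      print(f"{name}: M={col.mean():.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")

  # Bonferroni-corrected paired comparisons against the 'learning' factor
  others = [c for c in factors.columns if c != "learning_in_role"]
  for other in others:
      t, p = stats.ttest_rel(factors["learning_in_role"], factors[other])
      print(f"learning_in_role vs {other}: t={t:.2f}, "
            f"p(corrected)={min(p * len(others), 1):.4f}")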

Table 7

Welch’s one-way repeated measures analysis of variance for ratings of the five factors of observer role value

The influence of various educator factors on how observer roles were valued was examined. Nurses scored learning in observer roles significantly higher than doctors (p<0.001). Completion of a short course in simulation, gender and educator as a primary job description were also associated with a significantly higher mean score for learning. The most experienced educators (by years) scored lower on focus and learning in observer roles. The location of the observer influenced scores for focus and observing first, while the use of observer tools influenced the focus, learning and value scores (table 8).

Table 8

Influence of educator factors on observer role value ratings (p values)

DISCUSSION

This study reports that observer roles are common assignments in simulation-based education in ANZ; the overwhelming majority of educators report assigning these roles, sometimes more commonly than hands-on roles. Other studies demonstrate that learners have also reported finding value in these roles, particularly for noticing the big picture, noticing detail and reducing stress.6 10

Our study suggests that educators design these observer roles deliberately, not just as a way to keep people busy while those in the ‘hands-on’ role gain vital experience. The observer role is seen to offer a different type of learning, in that educators reported it gave opportunities for learners to notice and make judgements about the quality of work and interactions they were witnessing. Noticing is vital to clinical practice, and simulation is an excellent way to provide well-structured, planned opportunities to notice.16 Translation of skills and lessons learnt to clinical practice also includes the skill of noticing. By providing feedback about and discussion of observations made, educators are supporting and refining this skill.

Some study participants reported using observation guides, checklists and written learning objectives, with most providing verbal instructions to focus observation. In our study, formalisation of observer role activities and learning objectives into the scenario design and pre-briefings, specific to observers and hands-on learners, was deemed important to increase the learning in and value of observer roles. The evaluative judgement necessary for effective observation requires an understanding of what is being evaluated, what good looks like, and how to share that with peers.30 ‘What good looks like’ includes standards of practice and expectations of the actions and interventions to be undertaken by their peers during the scenario. This information can be provided to observers as part of their briefing. In many ways, the observer requires similar information to the debriefer in order to watch a scenario for meaning.8

The allocation of in-scenario observer roles was also common (50%). These are roles not congruent with the learners’ professional role; for example, playing another profession, a patient or a patient’s relative. In the literature, these roles have also been referred to as ‘student confederate’10 and ‘response-based roles’.18 Learners have reported value in these roles for experiencing the perspective of others,10 31 but have also reported distraction and cognitive overload.10 14 Explicit screening and instruction are required for role clarity and psychological safety.17 32 In our study, 80% of survey participants using in-scenario observers provided verbal briefings and 45% role scripts. Roles allocated to learners require specific learning objectives for that role and attention to this in the debrief. For example, the role of the relative could have a communication learning objective attached. This would possibly exclude some roles currently allocated simply as a way of providing more hands-on roles. Embedded participant roles designed to aid control or provide information during a scenario may be better suited to trained faculty or, where available, SPs. Poor or off-script acting has been found to derail a scenario.33

The finding that facility design influences the location of observers is an important one. Purpose-built facilities with external viewing space were seen as an opportunity both to provide an immersive learning experience for hands-on learners and to allow observers to watch remotely. Language like “interfere”, “intimidated” and “interrupting” in some responses indicated a higher value being placed on learners in the scenario compared with those observing. An external location was also important for some educators, who welcomed the opportunity to encourage conversation and engagement with observers during the scenario.

Educators expressed frustration when facility design offered no choice in the location of observers. This compromise was reflected in language like “forced”, “limited” and “it’s the only option”. The language choice may indicate that educators felt the simulation was of a lower quality than could be achieved with an external viewing space. With 27% of educators indicating they conduct simulations primarily in situ in the clinical workplace, observer location has a potential negative impact on observer role value in this context.

In contrast, having observers in the room was seen as an advantage for observer learning. Being close to the action was seen to enable an emotional connection, and therefore more potent learning. Being close is supported by Nyström et al,21 who found that observers at a distance in a classroom-type setting were less engaged than those observing from the control room, who were exposed to the behind-the-scenes theatre and interaction with educators.

Overall, it was reported that a separate location for observers was significantly better for hands-on learner immersion, and that having observers in the same room as the simulation was significantly better for observer involvement or engagement. In this study, comments favoured hands-on immersion over observer involvement as a reason for observer location by a 2:1 ratio. This result may reflect role values or may simply reflect a predominance of immersive-style scenario design. Scenarios which incorporate pauses for reflection or a tag-team design purposely involve observers.34 Flexible learning spaces provide the opportunity to do both immersive and paused scenarios. Other spaces may require consideration of the scenario objectives and the most appropriate design for learners.

The mean value of learning in observer roles in this survey was 2.69 (maximum=5, 95% CI 2.60 to 2.78), significantly lower than the other four factors identified. So, while educators stated they highly valued observer roles (mean 4.16), they did not believe that learning as an observer was as valuable. Observers, however, by the nature of the assignment, have a very different perspective and are well placed to see what participants immersed within a situation cannot. Rooney et al’s35 observational study reports observers commenting on physical positioning, missed cues and missteps in assessment not noticed by the immersed learners. Noticing the choreography of teamwork requires a mental bandwidth not always available in the heat of the moment. This way of “noticing and attributing meaning”, producing “agile learners”, is important for clinical practice.35 Harnessing the potential of observer roles to produce agile learners who can observe purposefully in clinical practice is worthwhile in its own right, making observer roles a significant and not merely secondary assignment.

The value of observer perspectives in the debrief was recognised as important and had the highest mean score (4.20, 95% CI 4.14 to 4.26). Sustainable feedback is where learners understand what good looks like and actively work to achieve it, and observers watch actively to notice it.30 Our survey respondents indicated that learners are positioned as both seekers and users of feedback, with peer observers able to provide a perspective not available to learners in the moment, and vice versa. Other studies have shown that observers are able to relate to the actions of their peers and evaluate relevance to their own practice.19 This process has been reported to be enhanced by the presence of an educator to guide and mentor the learning, as occurs in a facilitated debrief.36

If observation is important in and of itself, why do educators remain unsure whether the learning in this role is as valuable as in the experiential role? The answer may be found in traditional teachings around simulation. In 2004, in his vision paper, Gaba wrote that simulation is “an educational technique that replaces or amplifies real experiences with guided experiences that evoke or replicate substantial aspects of the real world in a fully interactive manner”.1 There is no mention of a hands-on requirement. This definition has subsequently been adopted by the Society for Simulation in Healthcare in its simulation dictionary.37 Yet in 2013, Pasquale wrote in a healthcare simulation textbook, “Simulation is a hands-on educational modality acknowledged by adult learning theories to be more effective than learning that is not experiential in nature. Simulation offers the learner opportunities to become engaged in experiential learning”.38 Further, in the same year, Chiniara et al wrote in a theory-based paper that observation is “irrelevant to simulation, since trainees have no hands-on experience and do not interact with the situation”.39 This more recent focus on a hands-on experience may unconsciously influence the value that educators place on learning in observer roles. The results from our study suggest that there is a synergy that can be achieved through cultivating a hands-on experience combined with ‘noticing’ through active and directed observation.

Limitations

As simulation practitioners and work-based educators/researchers with a strong interest in optimising learning for observers, we interpret these results with this inherent bias. This bias extends to the phrasing and the actual questions asked, as well as the design of the survey tool. We view these data through the lens of social learning theory, and our conclusions reflect this stance.

Estimating the sample size for the survey was challenging as the population is not easily defined. There are no required qualifications for simulation education in ANZ; rather, simulation educators are self-identified. A number of postgraduate qualifications feature simulation education either as a discrete qualification or as a major in a broader qualification. Short courses vary in content and quality. CHSE certification is not widely sought in ANZ. Hours spent in simulation-based education vary. In a recent survey of self-identified simulation professionals in NSW, two-thirds of respondents spent less than 20% of their average working week engaged in simulation activities, with only 5% spending 100% of their time in healthcare simulation.40

Established simulation educator professional societies and networks were used to target those who identified as simulation educators, biasing results to members and potentially excluding a larger unknown cohort. Web-based distribution and distribution of links through third parties prevent us from reporting a response rate. Electronically distributed surveys often have lower response rates than mailed surveys; however, specifically targeted distribution may make the responses received more representative of the desired population.22 Self-reported data mean we cannot be certain that respondent-reported practices are their actual or usual practices.

CONCLUSIONS

Observer roles are widely used in ANZ and most learners experience both hands-on and observer roles. While educators value the role, they tend to believe the best learning is achieved through hands-on experiences. Strategies aimed at increasing the engagement of observers by incorporating them in professionally incongruent hands-on roles are likely to tax mental bandwidth, whereas focused observation provides opportunities for noticing and attributing meaning not available in the heat of the moment. Peer feedback by observers during reflective phases provides a perspective not available to those immersed in the case and can encourage the observers to engage more deeply in understanding standards of practice, or what good work looks like. Lack of flexible learning spaces in in situ or ad hoc locations may require reconsideration of scenario design and learning objectives. Scenario design should include objectives for observer roles and incorporate the observer into the various phases of simulation. Attention to these areas will help to embed and promote the value of the different type of learning available in the role of observer.

What this paper adds

What is already known on this subject

  • Several studies report equivalent learning outcomes and/or reduced stress in observer roles.

  • Providing guidance or observer tools activates and focuses observers to the objectives of the scenario and increases learner satisfaction.

  • The location of the observer may affect learner engagement.

What this study adds

  • Observer roles, both external and in-scenario roles that are not congruent with learners’ professional designation, are very common role assignments, often more common than hands-on roles, across all professional groups in Australia and New Zealand.

  • This study suggests that the learning in observer roles is less valued by educators than the learning in hands-on roles, even though educators reported highly valuing observer roles as a whole and appreciated the perspective that observer roles provide.

  • Educators report that a separate location for observers was significantly better for hands-on learner immersion and that having observers in the same room as the simulation was better for observer involvement or engagement; however, facility design had a significant impact on the choice of observer location.

REFERENCES

Footnotes

  • Twitter Debra Nestel @DebraNestel.

  • Contributors SO devised the study, drafted the survey, conducted analysis and wrote the manuscript (80%). EM and LW advised on and reviewed the study design, reviewed the survey, advised on and reviewed the analysis and contributed to the writing of manuscript (both 5%). DN supervised the study design, reviewed the survey, advised on and reviewed the analysis and contributed to the writing of manuscript (10%).

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data availability statement Data are available upon reasonable request.
