Table 2

Comparison of expert debriefing evaluators' and debriefers' perceptions of debriefing quality

Differences are computed as the expert debriefing evaluator's rating minus the debriefer's rating: 0 indicates matching perceptions, a difference of 1 point (in either direction) slightly differing perceptions, and a difference of 2 or more points remarkably differing perceptions.

| OSAD element | ≤−2 points, n (%) | −1 point, n (%) | 0 points, n (%) | +1 point, n (%) | ≥+2 points, n (%) |
|---|---|---|---|---|---|
| Approach (n=41) | 0 (0) | 10 (24.4) | 20 (48.8) | 9 (22.0) | 2 (4.9) |
| Learning environment (n=41) | 31 (75.6) | 9 (22.0) | 1 (2.4) | 0 (0) | 0 (0) |
| Engagement of learners (n=40) | 2 (5.0) | 5 (12.5) | 22 (55.0) | 9 (22.5) | 2 (5.0) |
| Reaction (n=41) | 24 (58.5) | 10 (24.4) | 5 (12.2) | 2 (4.9) | 0 (0) |
| Reflection (n=41) | 0 (0) | 7 (17.1) | 23 (56.1) | 8 (19.5) | 3 (7.3) |
| Analysis (n=41) | 1 (2.4) | 16 (39.0) | 12 (29.3) | 11 (26.8) | 1 (2.4) |
| Diagnosis (n=41) | 0 (0) | 12 (29.3) | 20 (48.8) | 8 (19.5) | 1 (2.4) |
| Application (n=41) | 0 (0) | 7 (17.1) | 25 (61.0) | 9 (22.0) | 0 (0) |
| Total (n=327) | 58 (17.74) | 76 (23.24) | 128 (39.14) | 56 (17.13) | 9 (2.75) |
  • Debriefers perceive the quality of debriefing more favourably than expert debriefing evaluators: 134 (40.98)
  • Debriefers' and expert debriefing evaluators' perceptions of the quality of debriefing match: 128 (39.14)
  • Debriefers perceive the quality of debriefing less favourably than expert debriefing evaluators: 65 (19.88)
  • OSAD, Objective Structured Assessment of Debriefing.
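The summary counts above are simple aggregations of the table's difference-score columns (negative differences mean the evaluator rated lower than the debriefer, i.e. the debriefer perceived the debriefing more favourably). As a sketch, the arithmetic can be checked like this; the dictionary below simply re-enters the per-element counts from Table 2:

```python
# Per-element frequency counts from Table 2, ordered by rating difference
# (evaluator minus debriefer): <=-2, -1, 0, +1, >=+2.
counts = {
    "Approach":               [0, 10, 20, 9, 2],
    "Learning environment":   [31, 9, 1, 0, 0],
    "Engagement of learners": [2, 5, 22, 9, 2],
    "Reaction":               [24, 10, 5, 2, 0],
    "Reflection":             [0, 7, 23, 8, 3],
    "Analysis":               [1, 16, 12, 11, 1],
    "Diagnosis":              [0, 12, 20, 8, 1],
    "Application":            [0, 7, 25, 9, 0],
}

# Sum each difference-score column across the eight OSAD elements.
column_totals = [sum(col) for col in zip(*counts.values())]
total = sum(column_totals)  # 327 ratings in all

# Negative differences -> debriefers more favourable; 0 -> match;
# positive differences -> debriefers less favourable.
more_favourable = column_totals[0] + column_totals[1]
matching = column_totals[2]
less_favourable = column_totals[3] + column_totals[4]

print(total, more_favourable, matching, less_favourable)
print(round(100 * more_favourable / total, 2))  # 40.98
```

This reproduces the summary rows: 134 (40.98%) more favourable, 128 (39.14%) matching, 65 (19.88%) less favourable.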