Background The General Medical Council (UK) has called for greater patient and public involvement in the assessment of medical students.1 In the US, simulated patients (SPs) perform summative assessments of communication skills as part of medical licensing examinations.2 The use of SPs and real patients (RPs) has been shown to provide adequate reliability for OSCEs and structured long cases.3,4 This project aimed to evaluate the reliability of SP and RP ratings in the Integrated Structured Clinical Examination (ISCE) at the Peninsula College of Medicine and Dentistry (PCMD).5 This undergraduate summative assessment requires students to perform a complex combination of skills, simulating real patient encounters, across six stations, each representing a different patient presentation.
Methodology This is a quantitative evaluation of SP, RP and physician ratings of students' communication skills and overall performance, based on six-point rating scales, using assessment data collected over three yearly iterations of the ISCE at PCMD. The students were in either the second or fourth year of a five-year programme. Data from more than 300 student assessments were used. Inter-rater reliability of patient and physician ratings was measured using the intraclass correlation coefficient (ICC).
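The ICC used in the analysis can be sketched as follows. This is a minimal illustration of a two-way random-effects, absolute-agreement ICC(2,1) for a subjects × raters matrix of scores; the abstract does not state which ICC variant was used, so the form below is an assumption for illustration only.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: array-like of shape (n_subjects, k_raters), e.g. each row
    holds one student's score from each rater (SP, RP, physician).
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Partition total sum of squares into subject, rater, and error terms.
    ss_total = ((ratings - grand_mean) ** 2).sum()
    ss_rows = k * ((row_means - grand_mean) ** 2).sum()
    ss_cols = n * ((col_means - grand_mean) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols

    msr = ss_rows / (n - 1)            # mean square: subjects
    msc = ss_cols / (k - 1)            # mean square: raters
    mse = ss_err / ((n - 1) * (k - 1)) # mean square: error

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical toy data: four students rated by two raters on a 6-point scale.
scores = [[2, 3], [4, 4], [5, 6], [1, 2]]
print(icc_2_1(scores))
```

Values near 1 indicate strong agreement between raters (as for the 'communication in difficult circumstances' station), while values near 0 indicate poor agreement (as for the 'Endocrine' station).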
Results Preliminary analyses demonstrate good inter-rater reliability in most stations, particularly those focusing on communication, including 'communication in difficult circumstances' (ICC = 0.653, p < 0.001). Some stations showed poor inter-rater reliability, such as 'Endocrine' (ICC = 0.152, p = 0.148).
Conclusion The reliability of SP assessments during undergraduate ISCEs is context dependent. SP ratings of communication skills during the ISCE are reliable measures at undergraduate level and could replace physician ratings in communication-based stations.
Impact This study has important implications for the ISCE and other clinical skills examinations. The findings have financial and strategic dimensions: substituting SP ratings for physician ratings in communication-based stations could reduce the burden on the institution without reducing the reliability of the assessment of communication skills.
1. General Medical Council. Patient and public involvement in undergraduate medical education: advice supplementary to Tomorrow's Doctors (2009). May 2011.
2. www.usmle.org/step-2-cs/ (accessed 6.6.14).
3. Wass V, Jones R, van der Vleuten C. Standardised or real patients to test clinical competence? The long case revisited. Medical Education 2001;35:321–325.
4. Wallace J, Rao R, Haslam R. Simulated patients and the objective structured clinical examination: review of their use in medical education. Advances in Psychiatric Treatment 2002;8:342–350.
5. Mattick K, Dennis I, Bradley P, Bligh J. Content specificity: is it the full story? Statistical modelling of a clinical skills examination. Medical Education 2008;42:589–599.
- Category: Course or curriculum evaluation/innovation/integration