Gasping for air: measuring patient education and activation skillsets in two clinical assessment contexts
  1. Jeffrey A Wilhite^1
  2. Harriet Fisher^1
  3. Lisa Altshuler^1
  4. Elisabeth Cannell^2
  5. Khemraj Hardowar^1
  6. Kathleen Hanley^1
  7. Colleen Gillespie^1
  8. Sondra Zabar^1

  ^1 Department of Medicine, New York University School of Medicine, New York, New York, USA
  ^2 Department of English, Bowdoin College, Brunswick, Maine, USA

  Correspondence to Jeffrey A Wilhite, Department of Medicine, New York University School of Medicine, New York, NY 10016, USA; jeffrey.wilhite@nyulangone.org

Abstract

Objective structured clinical examinations (OSCEs) provide a controlled, simulated setting for competency assessment, while unannounced simulated patients (USPs) measure competency in situ, in real-world settings. This exploratory study describes differences in primary care residents’ skills when caring for the same simulated patient case in an OSCE versus a USP encounter. The data reported describe a group of residents (n=20) who were assessed after interacting with the same simulated patient case in two distinct settings: an OSCE and a USP visit at our safety-net clinic, from 2009 to 2010. In both scenarios, the simulated patient presented as an asthmatic woman with limited understanding of illness management. Residents were rated on a behaviourally anchored checklist at visit completion. Summary scores (mean % well done) were calculated by domain and compared using paired sample t-tests. Residents performed significantly better with USPs on 7 of 10 items and in two of three aggregate assessment domains (p<0.05). The OSCE structure may impede assessment of patient activation and treatment planning skills, which are better assessed in real-world settings. This exploration of outcomes from two assessments of the same clinical case lays a foundation for future research on variation in situated performance. Using both assessment modalities during residency will provide a more thorough understanding of learner competency.
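
To make the analysis described above concrete, the sketch below illustrates a paired sample t-test on per-resident summary scores. It is a minimal illustration only: the scores are hypothetical placeholders (not the study’s data), and the use of Python with scipy.stats.ttest_rel is an assumption, not the authors’ actual analysis pipeline.

    # Minimal sketch of the paired comparison described above.
    # Scores are HYPOTHETICAL placeholders, not the study's data.
    # Assumes one summary score (mean % well done) per resident
    # per setting, paired by resident.
    from scipy import stats

    # Index i is the same resident in both settings.
    osce_scores = [55, 60, 48, 72, 65, 58, 63, 50, 67, 61]
    usp_scores  = [70, 74, 62, 80, 71, 66, 75, 64, 78, 69]

    # Paired sample t-test: each resident serves as their own control,
    # which is what distinguishes this from an independent-samples test.
    t_stat, p_value = stats.ttest_rel(usp_scores, osce_scores)

    print(f"mean OSCE = {sum(osce_scores) / len(osce_scores):.1f}%")
    print(f"mean USP  = {sum(usp_scores) / len(usp_scores):.1f}%")
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 -> significant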

  • clinical skills practice
  • graduate medical education
  • education
  • assessment
  • contextual
  • simulation-based medical education

Footnotes

  • Contributors JAW served as primary author of the manuscript. HF, EC and KhH contributed extensively to data preparation and written revision. LA, KaH, CG and SZ contributed to study planning, writing and revision throughout. All authors have contributed to and have approved the final manuscript.

  • Funding This work was supported by the Agency for Healthcare Research and Quality (AHRQ 5R18HS021176-02) and the Health Resources & Services Administration (HRSA 15-A0-00-004497).

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Ethics approval Individual residents’ consent to allow these assessment data to be used for research was requested during orientation as part of a New York University Institutional Review Board (IRB)-approved resident research registry (#i06-683).

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data availability statement Data for this project were made available through an IRB-approved medical education research registry. Only those who consented to participate are included. No data are available, as the data presented were de-identified prior to analysis.
