CDEM CORD 2026 Special Education Issue



Penn State Health Emergency Medicine

About Us: Penn State Health is a multi-hospital health system serving patients and communities across central Pennsylvania. We are the only medical facility in Pennsylvania to be accredited as a Level I pediatric trauma center and Level I adult trauma center. The system includes Penn State Health Milton S. Hershey Medical Center, Penn State Health Children’s Hospital and Penn State Cancer Institute based in Hershey, Pa.; Penn State Health Hampden Medical Center in Enola, Pa.; Penn State Health Holy Spirit Medical Center in Camp Hill, Pa.; Penn State Health Lancaster Medical Center in Lancaster, Pa.; Penn State Health St. Joseph Medical Center in Reading, Pa.; Pennsylvania Psychiatric Institute, a specialty provider of inpatient and outpatient behavioral health services, in Harrisburg, Pa.; and 2,450+ physicians and direct care providers at 225 outpatient practices. Additionally, the system jointly operates various healthcare providers, including Penn State Health Rehabilitation Hospital, Hershey Outpatient Surgery Center and Hershey Endoscopy Center.

We foster a collaborative environment rich with diversity, share a passion for patient care, and have a place for those who share our spark for innovative research. Our health system is expanding, and we have opportunities in both academic and community hospital settings.

Benefit highlights include:

• Competitive salary with sign-on bonus

• Comprehensive benefits and retirement package

• Relocation assistance & CME allowance

• Attractive neighborhoods in scenic central Pennsylvania

Orlando, FL | March 26 - 29 | Pre-Day: March 25

Western Journal of Emergency Medicine: CDEM/CORD 2026 Special Education Issue

A Note from the Editors:

We are excited to publish the 11th issue of the Western Journal of Emergency Medicine (WestJEM) Special Issue in Educational Research & Practice (Special Issue). Over a decade ago, a unique relationship was formed between WestJEM, the Council of Residency Directors in Emergency Medicine, and the Clerkship Directors in Emergency Medicine to develop a publication that disseminates educational scholarship that impacts our communities while promoting the growth of our junior faculty as authors. The structure of the Special Issue provides a diversity of submission categories, such as original research, educational advances, best practices, reviews, and scholarly perspectives. This selection provides an opportunity for all scholars and scholarly approaches to have a voice.

A successful Special Issue requires the courage of the authors to submit their work for peer review. In turn, we do our best to provide detailed feedback regardless of the final decision. Publication of the issue requires the commitment and hard work of the publication staff, the leadership of the organizations, the editors, and the peer reviewers. We want to thank them all for their efforts and professionalism. The topics of this year’s education issue reflect many of the current issues in medical education today.

We have begun receiving and reviewing submissions for next year’s Special Issue. The editorial staff review every submission on a rolling basis. Once accepted, articles are made available on PubMed through an expedited process, and there are no processing fees for articles accepted to the Special Issue. This is a great opportunity to submit your educational scholarship, thereby enhancing your professional development while disseminating your work to others. We are delighted that this initiative has flourished and look forward to seeing your work on display in this, our 11th issue.

Jeffrey Love, MD

Georgetown University School of Medicine

Co-Editor of Annual Special Issue on Education Research and Practice

Douglas Ander, MD

Emory University

Co-Editor of Annual Special Issue on Education Research and Practice

The Western Journal of Emergency Medicine: Integrating Emergency Care with Population Health would like to thank The Clerkship Directors in Emergency Medicine (CDEM) and the Council of Residency Directors in Emergency Medicine (CORD) for helping to make this collaborative special issue possible.

Western Journal of Emergency Medicine:

Integrating Emergency Care with Population Health

Indexed in MEDLINE, PubMed, and Clarivate Web of Science, Science Citation Index Expanded

JOURNAL FOCUS

Emergency medicine is a specialty which closely reflects societal challenges and consequences of public policy decisions. The emergency department specifically deals with social injustice, health and economic disparities, violence, substance abuse, and disaster preparedness and response. This journal focuses on how emergency care affects the health of the community and population, and conversely, how these societal challenges affect the composition of the patient population who seek care in the emergency department. The development of better systems to provide emergency care, including technology solutions, is critical to enhancing population health.

Table of Contents

Best Practices

1 Resident-as-Teacher Curriculum: An Evidence-based Guide to Best Practices from the Council of Residency Directors in Emergency Medicine

J Jordan, M Gottlieb, M Estes, ME Parsons, K Goldflam, A Grock, BJ Long, S Natesan

Original Research

10 A Qualitative Study of Senior Residents’ Strategies to Prepare for Unsupervised Practice

M Griffith, A Garrett, BK Watsjold, J Jauregui, M Davis, JS Ilgen

19 Characteristics and Educational Support Resources Available to Emergency Medicine Core Faculty: A National Survey

J Jordan, LR Hopson, F Gallahue, JA Cranford, JC Burkhardt, KE Kocher, DL Robinett, M Weizberg, T Murano

27 Substantial Variation Exists in Clinical Exposure to Chief Complaints Among Residents Within an Emergency Medicine Training Program

CM Jewell, AT Hummel, DJ Hekman, BH Schnapp

33 Harder, Better, Faster, Stronger? Residents Seeing More Patients Per Hour See Lower Complexity

CM Jewell, G (Anthony) Bai, DJ Hekman, AM Nicholson, MR Lasarev, R Alexandridis, BH Schnapp

40 The Effect of Hospital Boarding on Emergency Medicine Residency Productivity

P Moffett, A Best, N Lewis, S Miller, G Hickam, H Kissel-Smith, L Barrera, S Huang, J Moll

49 Leadership Perceptions, Educational Struggles and Barriers, and Effective Modalities for Teaching Vertigo and the HINTS Exam: A National Survey of Emergency Medicine Residency Program Directors

M McLean, J Stowens, R Barnicle, N Mafi, K Shah

57 Push and Pull: What Factors Attracted Applicants to Emergency Medicine and What Factors Pushed Them Away Following the 2023 Match

M Kiemeney, J Morris, L Lamparter, M Weizberg, A Little, B Milman

67 Emergency Medicine Residency Website Wellness Pages: A Content Analysis

A Sappington, B Milman

74 Inequities in the National Clinical Assessment Tool for Medical Students in the Emergency Department

BZ Amin, CJ Dine, ER Tabakin, M Trotter, JK Heath

Policies for peer review, author instructions, conflicts of interest and human and animal subjects protections can be found online at www.westjem.com.


Table of Contents continued

Brief Research Reports

84 Program Director Perspectives on the Impact of the Proposed 48-Month Emergency Medicine Residency Requirement: A National Survey

R Austin, C Patel, K Delfino, S Kim

90 Virtual Interviews Correlate with Home and In-State Match Rates at One Emergency Medicine Program

C Motzkus, C Frey, A Humbert

Educational Advances

95 Development of a Reliable, Valid Procedural Checklist for Assessment of Emergency Medicine Resident Performance of Emergency Cricothyrotomy

DE Loke, AM Rogers, ML McCarthy, MK Leibowitz, ET Stulpin, DH Salzman

Brief Educational Advances

101 A Taste of Our Own Medicine: Fostering Empathy in Medical Learners Through Patient Simulation

RP Peña, W Weber

105 Effectiveness of a Collaborative, Virtual Outreach Curriculum for 4th-Year EM-bound Students at a Medical School Affiliated with a Historically Black College and University

C Brown, R Carter, N Hartman, A Hammond, E MacNeill, L Holden, A Pierce, L Campbell, M Norman

Editorial

111 A 30-year History of the Emergency Medicine Standardized Letter of Evaluation

JS Hegarty, CB Hegarty, JN Love, A Pelletier-Bui, S Bord, MC Bond, SM Keim, K Hamilton, EF Shappell


2025 Gold Standard Reviewers

The WestJEM Special Issue in Educational Research & Practice could not exist without our many reviewers. To all, we wish to express our sincerest appreciation for their contributions to this year’s success. Each year a number of reviewers stand out for their (1) detailed reviews, (2) grasp of the tenets of education scholarship, and (3) efforts to provide feedback that mentors authors on how to improve. This year’s “Gold Standard” includes:

• Max Berger (UCLA Medical Center)

• April Choi (Rutgers New Jersey Medical School)

• Anna Darby, Jeff Riddell (Keck School of Medicine-USC)*

• Zoe Fisher, Jenna Paul Schultz, Linda Regan (Johns Hopkins)*

• Max Griffith (University of Washington)

• Kirlos Haroun, Katie Lorenz, Kathryn Ritter, Linda Regan (Johns Hopkins)*

• Arman Hussain, Claudia Ranniger (George Washington University)*

• Carlos Jaquez, Daniela Ortiz (Baylor College of Medicine)*

• Corlin Jewell (University of Wisconsin)

• Chelsea Johnson, Anne Messman (Wayne State University)*

• Justine McGiboney (University of Alabama at Birmingham)

• Vivek Medepalli, Avirale Sharma, Larissa Valez (UT Southwestern)*

• Joe-Ann Moser (University of Wisconsin)

• Collyn Murray (University of North Carolina)

• Elspeth Pearce (University of Kansas Medical Center)

• Jessica Pellitier (University of Missouri)

• Monica Shah, Patrick Felton, Bryanne McDonald, Lucienne Lutfy-Clayton (University of Massachusetts)*

• Emily Straley, Vicki Zhou, Richard Bounds (University of Vermont)*

• Neelou Tabatabai, Samuel O Clarke (UC Davis)*

• Thadeus Schmitt (Medical College of Wisconsin)

• Juhi Varshney, Michael Zdradzinski (Emory University)*

• Kalen Wright, Eric Shappell (Harvard-Massachusetts General Hospital)*

• Chris Yang, Tim Koboldt, Chelsea Broomhead, Margaret Goodrich (Missouri Health)*

• Ivan Zvonar, Jon Ilgen (University of Washington)*

• Ivan Zvonar (University of Washington)

*Mentored Peer Reviews from Emergency Medicine Education Fellowship Programs

CDEM/CORD Guest Consulting Editors

We would also like to recognize our guest consulting editors who assisted with pre-screening submissions during our initial peer-review stages.

Thank you for all of your efforts and contributions.

CDEM

• Christine Stehman

• Eric Shappell

• Sharon Bord

• Andrew Golden

CORD

• Jenna Fredette

• Danielle Hart

• William Soares III

• Jamie Jordan

• Anne Messman

• Logan Weygandt

Consulting Statistician/Psychometrician

• David Way


Jeffrey N. Love, MD, Guest Editor Georgetown School of Medicine- Washington, District of Columbia

Danielle Hart, MD, MACM, Associate Guest Editor Hennepin Healthcare-Minneapolis, Minnesota

Benjamin Schnapp, MD, MEd, Associate Guest Editor University of Wisconsin-Madison, Wisconsin

Wendy Macias-Konstantopoulos, MD, MPH, Associate Editor Massachusetts General Hospital- Boston, Massachusetts

Danya Khoujah, MBBS, Associate Editor University of Maryland School of Medicine- Baltimore, Maryland

Patrick Joseph Maher, MD, MS, Associate Editor Icahn School of Medicine at Mount Sinai- New York, New York

Yanina Purim-Shem-Tov, MD, MS, Associate Editor Rush University Medical Center-Chicago, Illinois

Gayle Galletta, MD, Associate Editor University of Massachusetts Medical School-Worcester, Massachusetts

Section Editors

Behavioral Emergencies

Bradford Brobin, MD, MBA Chicago Medical School

Marc L. Martel, MD Hennepin County Medical Center

Ryan Ley, MD Hennepin County Medical Center

Cardiac Care

Sam S. Torbati, MD Cedars-Sinai Medical Center

Rohit Menon, MD University of Maryland

Elif Yucebay, MD Rush University Medical Center

Mary McLean, MD AdventHealth

Climate Change

Gary Gaddis, MBBS University of Maryland

Clinical Practice

Casey Clements, MD, PhD Mayo Clinic

Murat Cetin, MD

Behçet Uz Child Disease and Pediatric Surgery Training and Research Hospital

Patrick Meloy, MD Emory University

Carmine Nasta, MD Università degli Studi della Campania “Luigi Vanvitelli”

David Thompson, MD University of California, San Francisco

Tom Benzoni, DO Des Moines University of Medicine and Health Sciences

Critical Care

Christopher “Kit” Tainter, MD University of California, San Diego

Joseph Shiber, MD University of Florida-College of Medicine

David Page, MD University of Alabama

Quincy Tran, MD, PhD University of Maryland

Dan Mayer, MD, Associate Editor Retired from Albany Medical College- Niskayuna, New York

Julianna Jung, MD, Associate Guest Editor Johns Hopkins Hospital, Baltimore, Maryland

Douglas Franzen, MD, Associate Guest Editor Harborview Medical Center, Seattle, Washington

Gentry Wilkerson, MD, Associate Editor University of Maryland

Michael Gottlieb, MD, Associate Editor Rush Medical Center-Chicago, Illinois

Sara Krzyzaniak, MD, Associate Guest Editor Stanford University-Palo Alto, California

Susan R. Wilcox, MD, Associate Editor Massachusetts General Hospital- Boston, Massachusetts

Donna Mendez, MD, EdD, Associate Editor University of Texas-Houston/McGovern Medical School- Houston, Texas

Taku Taira, MD, EdD, Associate Guest Editor LAC + USC Medical Center-Los Angeles, California

Antonio Esquinas, MD, PhD, FCCP, FNIV Hospital Morales Meseguer

Dell Simmons, MD Geisinger Health

Disaster Medicine

Andrew Milsten, MD, MS UMass Chan Medical Center

John Broach, MD, MPH, MBA, FACEP University of Massachusetts Medical School UMass Memorial Medical Center

Christopher Kang, MD Madigan Army Medical Center

Scott Goldstein, MD Temple Health

Education

Danya Khoujah, MBBS University of Maryland School of Medicine

Jeffrey Druck, MD University of Colorado

Asit Misra, MD University of Miami

Cameron Hanson, MD The University of Kansas Medical Center

ED Administration, Quality, Safety

Gary Johnson, MD Upstate Medical University

Brian J. Yun, MD, MBA, MPH Harvard Medical School

Laura Walker, MD Mayo Clinic

León D. Sánchez, MD, MPH Beth Israel Deaconess Medical Center

Robert Derlet, MD

Founding Editor, California Journal of Emergency Medicine University of California, Davis

Tehreem Rehman, MD, MPH, MBA Beth Israel Deaconess Medical Center

Anthony Rosania, MD, MHA, MSHI Rutgers University

Neil Dasgupta, MD, FACEP, FAAEM Nassau University Medical Center

Emergency Medical Services

Daniel Joseph, MD Yale University

Douglas S. Ander, MD, Guest Editor Emory University School of Medicine-Atlanta, Georgia

Edward Ullman, MD, Associate Guest Editor Harvard University-Cambridge, Massachusetts

Abra Fant MD, MS, Associate Guest Editor

Northwestern Medicine-Chicago, Illinois

Kendra Parekh, MD, MS, Associate Guest Editor Vanderbilt University-Nashville, Tennessee

Matthew Tews, DO, MS, Associate Guest Editor Indiana University School of Medicine, Augusta, Georgia

Rick A. McPheeters, DO, Associate Editor Kern Medical- Bakersfield, California

Niels K. Rathlev MD, MS, Associate Editor Tufts University School of Medicine-Boston, Massachusetts

Shahram Lotfipour, MD, MPH, Managing Associate Editor University of California, Irvine School of Medicine- Irvine, California

Mark I. Langdorf, MD, MHPE, Editor-in-Chief University of California, Irvine School of Medicine- Irvine, California

Joshua B. Gaither, MD University of Arizona, Tucson

Julian Mapp University of Texas, San Antonio

Shira A. Schlesinger, MD, MPH Harbor-UCLA Medical Center

Tiffany Abramson, MD University of Southern California

Jason Pickett, MD University of Utah Health

Geriatrics

Stephen Meldon, MD Cleveland Clinic

Luna Ragsdale, MD, MPH Duke University

Health Equity

Cortlyn W. Brown, MD Carolinas Medical Center

Faith Quenzer

Temecula Valley Hospital San Ysidro Health Center

Victor Cisneros, MD MPH Eisenhower Health

Sara Heinert, PhD, MPH Rutgers University

Naomi George, MD MPH University of New Mexico

Sarah Aly, DO Yale School of Medicine

Lauren Walter, MD University of Alabama

Infectious Disease

Elissa Schechter-Perkins, MD, MPH Boston University School of Medicine

Ioannis Koutroulis, MD, MBA, PhD

George Washington University School of Medicine and Health Sciences

Stephen Liang, MD, MPHS Washington University School of Medicine

Injury Prevention

Mark Faul, PhD, MA Centers for Disease Control and Prevention

Wirachin Hoonpongsimanont, MD, MSBATS Eisenhower Medical Center

International Medicine

Heather A. Brown, MD, MPH Prisma Health Richland

Taylor Burkholder, MD, MPH Keck School of Medicine of USC

Christopher Greene, MD, MPH University of Alabama

Chris Mills, MD, MPH Santa Clara Valley Medical Center

Shada Rouhani, MD

Brigham and Women’s Hospital

Legal Medicine

Melanie S. Heniff, MD, JD Indiana University School of Medicine

Statistics and Methodology

Shu B. Chan MD, MS Resurrection Medical Center

Soheil Saadat, MD, MPH, PhD University of California, Irvine

James A. Meltzer, MD, MS Albert Einstein College of Medicine

Monica Gaddis, PhD University of Missouri, Kansas City School of Medicine

Emad Awad, PhD University of Utah Health

Musculoskeletal

Juan F. Acosta DO, MS Pacific Northwest University

Neurosciences

Rick Lucarelli, MD Medical City Dallas Hospital

William D. Whetstone, MD University of California, San Francisco

Antonio Siniscalchi, MD Annunziata Hospital, Cosenza, Italy

Pediatric Emergency Medicine

Muhammad Waseem, MD Lincoln Medical & Mental Health Center

Cristina M. Zeretzke-Bien, MD University of Florida

Jabeen Fayyaz, MD The Hospital for Sick Children

Available in MEDLINE, PubMed, PubMed Central, CINAHL, SCOPUS, Google Scholar, eScholarship, Melvyl, DOAJ, EBSCO, EMBASE, Medscape, HINARI, and MDLinx Emergency Med. Members of OASPA. Editorial and Publishing Office: WestJEM/Department of Emergency Medicine, UC Irvine Health, 3800 W Chapman Ave Ste 3200, Orange, CA 92868, USA. Office: 1-714-456-6389; Email: Editor@westjem.org.

Volume 27, No. 1.1: January 2026


This open access publication would not be possible without the generous and continual financial support of our society sponsors, department and chapter subscribers.

Professional Society Sponsors

American College of Osteopathic Emergency Physicians

California ACEP

Academic Department of Emergency Medicine Subscriber

Alameda Health System-Highland Hospital Oakland, CA

Ascension Resurrection Chicago, IL

Arnot Ogden Medical Center Elmira, NY

Atrium Health Wake Forest Baptist Winston-Salem, NC

Baylor College of Medicine Houston, TX

Baystate Medical Center Springfield, MA

Beth Israel Deaconess Medical Center Boston, MA

Brigham and Women’s Hospital Boston, MA

Brown University-Rhode Island Hospital Providence, RI

Carolinas Medical Center Charlotte, NC

Cedars-Sinai Medical Center Los Angeles, CA

Cleveland Clinic Cleveland, OH

Desert Regional Medical Center Palm Springs, CA

Eisenhower Health Rancho Mirage, CA

State Chapter Subscriber

Arizona Chapter Division of the American Academy of Emergency Medicine

California Chapter Division of the American Academy of Emergency Medicine

Florida Chapter Division of the American Academy of Emergency Medicine

International Society Partners

Emergency Medicine Association of Turkey

Lebanese Academy of Emergency Medicine

Emory University Atlanta, GA

Franciscan Health Carmel, IN

Geisinger Medical Center Danville, PA

Healthpartners Institute/ Regions Hospital Minneapolis, MN

Hennepin Healthcare Minneapolis, MN

Henry Ford Hospital Detroit, MI

Henry Ford Wyandotte Hospital Wyandotte, MI

Howard County Department of Fire and Rescue Marriottsville, MD

Icahn School of Medicine at Mt Sinai New York, NY

Indiana University School of Medicine Indianapolis, IN

INTEGRIS Health Oklahoma City, OK

Kaweah Delta Health Care District Visalia, CA

Kent Hospital Warwick, RI

Kern Medical Bakersfield, CA

Mediterranean Academy of Emergency Medicine

Loma Linda University Medical Center Loma Linda, CA


Louisiana State University Shreveport Shreveport, LA

Massachusetts General Hospital/ Brigham and Women’s Hospital/ Harvard Medical School Boston, MA

Mayo Clinic in Florida Jacksonville, FL

Mayo Clinic College of Medicine in Rochester Rochester, MN

Mayo Clinic in Arizona Phoenix, AZ

Medical College of Wisconsin Affiliated Hospital Milwaukee, WI

Mount Sinai Medical Center Miami Beach Miami Beach, FL

Mount Sinai Morningside New York, NY

New York University Langone Health New York, NY

North Shore University Hospital Manhasset, NY

NYC Health and Hospitals/ Jacobi New York, NY

Ochsner Medical Center New Orleans, LA

Great Lakes Chapter Division of the American Academy of Emergency Medicine

Tennessee Chapter Division of the American Academy of Emergency Medicine

Norwegian Society for Emergency Medicine

Sociedad Argentina de Emergencias

Ohio State University Wexner Medical Center Columbus, OH

Oregon Health and Science University Portland, OR

Penn State Milton S. Hershey Medical Center Hershey, PA

Poliklinika Drinkovic Zagreb, Croatia

Prisma Health/ University of South Carolina SOM Greenville Greenville, SC

Rush University Medical Center Chicago, IL

Rutgers Robert Wood Johnson Medical School New Brunswick, NJ

St. Luke’s University Health Network Bethlehem, PA

Southern Illinois University School of Medicine Springfield, IL

Stony Brook University Hospital Stony Brook, NY

SUNY Upstate Medical University Syracuse, NY

Temple University Philadelphia, PA

Texas Tech University Health Sciences Center El Paso, TX

Uniformed Services Chapter Division of the American Academy of Emergency Medicine

Virginia Chapter Division of the American Academy of Emergency Medicine

Sociedad Chileno Medicina Urgencia

Thai Association for Emergency Medicine

Becoming a WestJEM departmental sponsor waives the article processing fee and provides print copies for all faculty, electronic access for faculty and residents, free CME, and free faculty/fellow position advertisement space. To subscribe, please go to http://westjem.com/subscribe or contact:

Stephanie Burmeister

WestJEM Staff Liaison

Phone: 1-800-884-2236

Email: sales@westjem.org


Academic Department of Emergency Medicine Subscriber

The University of Texas Medical Branch Galveston, TX

UT Health Houston McGovern Medical School Houston, TX

Touro University College of Osteopathic Medicine Vallejo, CA

Trinity Health Muskegon Hospital Muskegon, MI

UMass Memorial Health Worcester, MA

University at Buffalo Program Buffalo, NY

University of Alabama, Birmingham Birmingham, AL

University of Arizona College of Medicine-Tucson Tucson, AZ

University of Arkansas for Medical Sciences Little Rock, AR

University of California, Davis Medical Center Sacramento, CA

University of California San Francisco General Hospital San Francisco, CA

University of California San Francisco Fresno Fresno, CA

University of Chicago Chicago, IL

University of Cincinnati Medical Center/ College of Medicine Cincinnati, OH

University of Colorado Denver Denver, CO

University of Florida, Jacksonville Jacksonville, FL

University of Illinois at Chicago Chicago, IL

University of Iowa Hospitals and Clinics Iowa City, IA

University of Kansas Health System Kansas City, KS

University of Louisville Louisville, KY

University of Maryland School of Medicine Baltimore, MD

University of Miami Jackson Health System Miami, FL

University of Michigan Ann Arbor, MI

University of North Dakota School of Medicine and Health Sciences Grand Forks, ND

University of South Alabama Mobile, AL


University of Southern California Los Angeles, CA

University of Vermont Medical Center Burlington, VT

University of Virginia Health Charlottesville, VA

University of Washington - Harborview Medical Center Seattle, WA

University of Wisconsin Hospitals and Clinics Madison, WI

UT Southwestern Medical Center Dallas, TX

Franciscan Health Olympia Fields Olympia Fields, IL

WellSpan York Hospital York, PA

West Virginia University Morgantown, WV

Wright State University Boonshoft School of Medicine Fairborn, OH

Yale School of Medicine New Haven, CT



Resident-as-Teacher Curriculum: An Evidence-based Guide to Best Practices from the Council of Residency Directors in Emergency Medicine

Jaime Jordan, MD, MAEd*†

Michael Gottlieb, MD‡

Molly Estes, MD§

Melissa E. Parsons, MD||

Katja Goldflam, MD#

Andrew Grock, MD*

Brit J. Long, MD¶

Sree Natesan, MD**

David Geffen School of Medicine at University of California Los Angeles, Department of Emergency Medicine, Los Angeles, California

Oregon Health & Science University, Department of Emergency Medicine, Portland, Oregon

Rush University Medical Center, Department of Emergency Medicine, Chicago, Illinois

Northwestern University, Department of Emergency Medicine, Chicago, Illinois

University of Florida College of Medicine, Department of Emergency Medicine, Jacksonville, Florida

Yale School of Medicine, Department of Emergency Medicine, New Haven, Connecticut

Brooke Army Medical Center, Department of Emergency Medicine, San Antonio, Texas

Duke University, Division of Emergency Medicine, Durham, North Carolina

Section Editor: Kendra Parekh, MD, MHPE

Submission history: Submitted December 14, 2024; Revision received May 17, 2025; Accepted May 17, 2025

Electronically published September 24, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem

DOI: 10.5811/westjem.41493

Improving resident teaching skills is an expectation of training. Despite the recognized importance of resident-as-teacher (RaT) curricula, variability indicates the need for evidence-based guidelines to inform best practices. This paper outlines expert guidelines for the development, implementation, and evaluation of RaT curricula from the members of the Council of Residency Directors in Emergency Medicine Best Practices Subcommittee, based on a critical review of the literature. It is important to perform a needs assessment prior to creating and implementing a RaT curriculum. The RaT curricula should include instruction on adult learning theory, feedback, and classroom and bedside teaching techniques. Outcomes of RaT curricula should be assessed using multiple sources including direct observation and incorporate both knowledge and skill retention, as well as acquisition. [West J Emerg Med. 2025;26(5)1135–1143.]

BACKGROUND

Training future physicians to be teachers is an important curricular component of residency programs and supported by the Accreditation Council for Graduate Medical Education (ACGME), which states that residents are expected to participate in the education of patients, families, students, residents, and other health professionals and should be encouraged to teach using a scholarly approach.1 Resident-as-teacher (RaT) curricula hold the potential to provide numerous benefits to residents, medical students, and patients by enhancing teaching skills that allow for transfer of knowledge.2-17 Benefits of RaT programs across medical specialties include improved teaching skills, self-reflection, self-efficacy in teaching, and improved educational outcomes for both residents and their learners, as well as better outcomes for patient care.2-17

Despite recommendations to provide this training in residency and a substantial body of literature on the topic, there is no standard approach to RaT curricula.1 This deficit can lead to variability in education skill development for resident trainees. It also leaves education leaders uncertain about how to best provide this important training in their programs. While a few prior reviews have sought to address this topic, they include only a small number of papers, are narrow in scope (focusing on the benefits and effectiveness of RaT curricula rather than how to best deliver this type of instruction), and may be outdated and not reflect the current literature available.6,9,13,17 Therefore, a critical need exists to develop best practices and evidence-based guidelines to optimize RaT curricular content, implementation, and evaluation in graduate medical education training programs.

Based on the best available evidence through a critical review of the literature, we offer expert guidelines on RaT curricular content, implementation, and evaluation from members of the Council of Residency Directors in Emergency Medicine (CORD) Best Practices Subcommittee. This paper provides readers with recommendations on the content, educational strategies, curricular implementation, and program evaluation for RaT curricula.

CRITICAL APPRAISAL

This is the 11th paper in a series of evidence-based best practice reviews from the CORD Best Practices Subcommittee.18-27 The author group consists of expert emergency medicine (EM) educators and education researchers with experience in residency program education and leadership. We conducted a literature search in conjunction with a medical librarian using MEDLINE with a combination of Medical Subject Heading terms and keywords focused on RaT curricula searching for papers published from inception to December 31, 2023 (Supplemental Appendix 1). We also reviewed the bibliographies of all included papers. Two authors (JJ and SN) independently screened and included papers that addressed RaT curricula development, implementation and evaluation. We excluded papers that were not related to RaT curricula development, implementation, or evaluation. We also excluded papers that were not in English, were abstracts only, or did not have full text available. Papers were included based on agreement of the two screeners. The two screeners resolved discrepancies through in-depth discussion and negotiated consensus.

The search yielded 1,486 papers, of which 89 were deemed directly relevant to this review (Supplemental Appendix 2). The author group derived their best practice recommendations based on the literature review and discussion among the expert author group. The level and grade of evidence for each best practice statement were assigned using the Oxford Centre for Evidence-Based Medicine criteria (Tables 1 and 2).28 When supporting data were not available, recommendations were made based upon the authors' combined experience and consensus opinion. Prior to submission, the manuscript was reviewed by the CORD Best Practices Subcommittee and posted to the CORD website for two weeks for peer review by the entire CORD medical education community. Upon completion of the review period, there was general agreement, and no substantial changes to the guideline were recommended.

Population Health Research Capsule

What do we already know about this issue? Resident-as-teacher (RaT) curricula are an important part of residency training and have many potential benefits.

What was the research question? What are best practices for RaT curricular content, implementation, and evaluation in graduate medical education training programs?

What was the major finding of the study? This paper offers expert recommendations for best practices on RaT curricular content, implementation, and evaluation.

How does this improve population health? Improving teaching skills ultimately leads to better education outcomes for residents and better care of their patients.

Table 1. Oxford Centre for Evidence-Based Medicine levels of evidence.28

Level 1a: Systematic review of homogeneous RCTs
Level 1b: Individual RCT
Level 2a: Systematic review of homogeneous cohort studies
Level 2b: Individual cohort study or a low-quality RCT*
Level 3a: Systematic review of homogeneous case-control studies
Level 3b: Individual case-control study**
Level 4: Case series/qualitative studies or low-quality cohort or case-control study***
Level 5: Expert/consensus opinion

*Defined as <80% follow-up; **includes survey studies and cross-sectional studies; ***defined as studies without clearly defined study groups.

RCT, randomized controlled trial.

Table 2. Oxford Centre for Evidence-Based Medicine grades of recommendation.28

Grade A: Consistent Level 1 studies
Grade B: Consistent Level 2 or 3 studies or extrapolations* from Level 1 studies
Grade C: Level 4 studies or extrapolations* from Level 2 or 3 studies
Grade D: Level 5 evidence, or troublingly inconsistent or inconclusive studies of any level

*"Extrapolations" refers to data being used in a situation that has potentially clinically important differences from the original study situation.

RESIDENT-AS-TEACHER CURRICULAR CONTENT AND EDUCATIONAL STRATEGIES

Of the reviewed papers, few included a formal needs assessment beyond a review of the literature. Residents' responsibility to teach students, other residents, and other staff is well recognized, as is the need to provide training that prepares residents for their roles as teachers.29 Reasons for implementing RaT curricula include the following: to teach a skill important to the resident role; to meet residents' desire for formal training in education; to address regulatory requirements; and to prepare trainees for future career roles.1,30 General curricular goals included improving residents' formal and informal teaching skills in both classroom and clinical settings and increasing resident confidence in teaching.29,31 The RaT curricula reviewed contain diverse components. The topics most consistently included in RaT curricula were adult learning theory, creating a positive learning environment and setting objectives, clinical or bedside teaching techniques, classroom teaching techniques, and how to give feedback.10-12,14,15,29-60

Adult learning theory—which describes how adults learn best when material is problem-centered, relevant to their work, and when they are involved in the planning and evaluation of their instruction—was a major component of RaT curricula, both as a framework for curricular development and as a topic of instruction for learners.10,29,31-42 Adult learning theory was often considered in how RaT curricula were delivered.34,35,61 For example, RaT leaders factored it in when determining the length, frequency, and format of the educational sessions within their curricula.34,35,61

Many curricula also included adult learning principles as part of their educational content.11,29,33,35-42,49,56 Berger et al provided a primer for anesthesiology residents on adult learning principles by having the learners discuss effective and ineffective teaching moments that they remembered from their own education.11 They also had learners review literature on adult learning principles and watch a video demonstration.11 Similarly, Chee et al had residents identify effective and ineffective teaching strategies observed in video clips to better understand adult learning theory.35 Chokshi et al had learners review two papers on adult learning theory to better understand adult education principles.36 Another group used formal lectures on adult learning theory followed by debriefing.29 Tang Girdwood et al revised a previous curriculum by removing the PowerPoint lecture on adult learning theory and instead having residents teach the principles of adult learning theory to one another with a faculty facilitator present.42

Many RaT curricula sought to teach residents how to set the stage for learning.11,12,15,31,32,34-36,38,43-49 Curricular content included how to create a positive learning environment and how to recognize behaviors that can lead to an environment of harassment or learner mistreatment.12,31,35,43,44 Understanding how to set goals and expectations with learners to facilitate knowledge and skill acquisition was also an important topic in RaT curricula.11,15,31,32,34,36,38,44-49

Clinical or bedside teaching techniques and tools were another commonly included topic in RaT curricula.10,14,15,29-31,33,37-42,44,46-48,50-55 One survey study in EM found that 84% of programs reported bedside teaching to be a major focus of their educational curriculum.32 One of the most frequently included teaching tools was the One-Minute Preceptor.31,32,37,44,47,51,52,54,57,62 Ahn et al found that 45% of RaT programs in a single specialty incorporated training on the One-Minute Preceptor.32 In another example curriculum, learners were asked to describe the elements of this model, apply the model to a simulated learner's patient presentation, and use the model to assess the learner's knowledge level and identify educational points.31 Content specific to procedural teaching was included in many curricula.5,10,11,15,29,32,33,35-37,41,46,53,55,57,59

In addition to the clinical setting, many RaT curricula also sought to prepare residents for teaching in the classroom by including content on didactic, small group, and case-based instruction.11,15,31,32,38,41,42,45,46,48,50,53,54,56-58 While these content areas were often listed as topics or titles of educational sessions, the included studies provided little additional description of what they comprised. Many curricula also included content on the use of simulation in education.14,32,42,53,55,57,63

Feedback was also consistently included in RaT curricula.8,10-12,14,15,29,31-38,40-44,46-48,50,52,53,55,59,60 One study found that 96% of EM residency programs with RaT curricula included feedback as a major focus.32 Specific content areas related to feedback included techniques and components of effective feedback, optimizing the environment for feedback, and how to receive feedback.33 Curricula often included interactive activities during which learners could practice feedback interactions via role-play and then debrief with the other learners.30,33 Other RaT curricular content included material to augment teaching, such as communication skills, professionalism, and how to handle difficult learning situations.32,46,50,57,58 Some curricula also included content to help prepare residents as education professionals, such as mentorship and role modeling, curricular design, time management, and learner assessment.15,32,40,46,57,60,64 We provide a summary of RaT curricular content and educational strategies in Tables 3 and 4.

Resident-as-Teacher Curriculum: Evidence-based Guide to Best Practices

Table 3. Summary of content in resident-as-teacher curricula.

Curricular Content (number of papers): References
Adult learning theory (14): 8, 10, 29, 31, 33-42
Assessment of learners (1): 60
Case-based instruction (7): 8, 11, 42, 46, 54, 57, 58
Clinical/bedside instruction (23): 10, 14, 15, 29, 30, 31, 33, 37-42, 44, 46-48, 50-55
Communication skills (4): 8, 46, 57, 58
Creating a positive learning environment (5): 12, 31, 35, 43, 44
Curriculum design (2): 8, 46
Didactic instruction (10): 11, 15, 31, 38, 42, 45, 48, 50, 53, 54
Difficult learning situations (2): 8, 50
Feedback (28): 8, 10-12, 14, 15, 29, 31, 33-38, 40-44, 46-48, 50, 52, 53, 55, 59, 60
Mentorship (2): 8, 40
Procedural instruction (16): 5, 8, 10, 11, 15, 29, 33, 35-37, 41, 46, 53, 55, 57, 59
Professionalism and role modeling (5): 8, 15, 46, 57, 64
Setting goals and expectations (13): 8, 11, 15, 31, 34, 36, 38, 44-49
Simulation instruction (7): 8, 14, 42, 53, 55, 57, 63
Small group instruction (7): 8, 41, 46, 50, 53, 56, 57
Time management (2): 8, 46

Table 4. Summary of educational strategies in resident-as-teacher curricula.

Educational Strategy (number of papers): References
Didactic lectures (22): 5, 7, 8, 12, 17, 29, 33, 37, 38, 42, 47, 57, 58, 64, 71, 75, 86, 87, 93, 95, 103, 104
Direct observation and feedback (13): 29, 31, 49, 56, 57, 59, 66, 69, 71, 87, 88, 94, 104
Iterative reminders/staged repetition (4): 29, 31, 61, 71
Simulation/role playing (12): 12, 14, 31, 37, 57, 64, 69, 70, 75, 87, 88, 91
Small groups (6): 12, 36, 37, 56, 69, 93
Virtual sessions/electronic handouts (7): 5, 41, 43, 61, 71, 86, 97
Workshops (21): 4, 15, 17, 31, 33, 37, 38, 44, 48, 52, 54, 57, 59, 62, 64, 65, 66, 83, 87, 98, 104

Best Practices Recommendations

Resident-as-teacher curricula should include the following:

1. Teaching techniques applicable to both classroom and bedside settings (Level 1, Grade A).

2. Techniques for providing effective feedback to learners (Level 1, Grade B).

3. Adult learning theory as part of the framework of the curriculum and its delivery, as well as an educational component of the curriculum (Level 2, Grade B).

RESIDENT-AS-TEACHER CURRICULAR LOGISTICS AND IMPLEMENTATION

Timing, duration, and frequency of interventions varied greatly among studies and specialties, with no overarching consensus on ideal approaches. The most common approach included single interventions, usually early in intern year or during residency orientation, with most one-day curricula ranging from 4-8 hours.44,47,59,60,62,65-67

According to a landmark paper published by Morrison et al in 2004, the average total time for a RaT curriculum was 11 hours, with their published institutional curriculum comprising 13 cumulative hours of longitudinal instruction.68,69 Some longitudinal curricula had longer durations, including those that spanned the entire length of resident training.12,29,31,38,42,49,54,55

Staffing of the educational sessions was largely by general residency faculty who participated in didactics, mentorship, or evaluations of resident teaching.56,58,59,62,69,70 Sometimes faculty with additional training or specialization in education led or designed the curricula; these included "educational experts," designated education faculty, and education fellows.12,15,70 Additionally, residents themselves often contributed, including chief residents and teach-the-teacher models.47,58

Several barriers were identified in the implementation of RaT curricula, with the most frequently mentioned being the balance of workload on faculty and residents.48,58,71 Noted challenges included both the total time required for participation and instruction and the real-time balancing of patient-care responsibilities with teaching while working clinically.37,57,72,73 Additionally, many residents felt it was challenging to teach topics with which they themselves did not yet feel fully familiar, even for the sake of experiential learning.37,43,57 Lastly, despite supportive ACGME program requirements, some program directors felt that RaT curricula were not a priority among other competing educational demands.1,58,74

A needs assessment before creating and implementing a RaT curriculum can help confirm interest, elucidate clear, specific program goals for participants, and secure buy-in from faculty and leadership.37,56,58,75 Buy-in from residents was less challenging: many residents confirmed that they lacked self-confidence in their own teaching abilities, wanted mentorship in this area, and were willing to spend time to gain this experience.5,59,76 Medical students, who, along with junior residents, were frequently the recipients of RaT teaching, identified residents as more approachable than faculty and appreciated near-peer teaching.73,77-79

Administration of RaT curricula may be challenging due to the resources required for successful implementation. These include the number of faculty needed, time for residents to participate in curricular sessions, and time to learn and practice these skills while working clinically.58,70 Through an online survey of 47 residency programs and iterative expert consensus building, McKeon et al proposed the following key components of a successful RaT curriculum: required trainee participation; evaluations and feedback of resident teaching; recognition of excellence through teaching awards; and faculty teaching evaluations linked to annual faculty review but not to salary or promotion.80 Finally, a RaT curriculum should be iteratively refined to ensure optimization of its content.42

Best Practices Recommendations:

1. General residency faculty can teach, provide mentorship, and evaluate participants in RaT curricula (Level 2a, Grade B).

2. Perform a needs assessment prior to implementing a RaT curriculum (Level 3a, Grade B).

3. Identify and address barriers such as time limitations for residents and faculty when implementing a RaT curriculum (Level 4, Grade C).

RESIDENT-AS-TEACHER CURRICULAR OUTCOMES/EVALUATION

When evaluating a RaT program, it is critical to use a robust model that accounts for various inputs, outputs, and outcomes. Examples of relevant program evaluation frameworks include the Kirkpatrick framework, the Logic Model, and CIPP (Context, Input, Process, Products).81,82 Despite this, most studies did not explicitly state the program evaluation framework they used.

While individual studies often included only one or a limited number of outcome measures, across the literature as a whole a wide range of outcomes was assessed (Table 5). The most common form of learner assessment was self-survey of perceived effectiveness after a RaT program.4-7,9-12,14,31,33,35,39,42,44,48,49,51,54-56,58,61,62,65,66,71,74,83-97 A few studies also conducted delayed self-assessments at 3-12 months following RaT course completion.11,44,51 One study assessed differences in attitude toward teaching after the course, while others performed knowledge assessment tests.36,43,44 Another study assessed actual use of the skills in subsequent teaching.52

Skill assessments were performed using either direct observation or structured assessments in a simulated environment. Several studies directly observed resident teaching, while others video-recorded resident teaching for delayed assessment.6,49,71,72,74,87 Other measures included end-of-shift teaching evaluations completed by faculty.56,58,74 The most common simulation-based assessment was the Objective Structured Teaching Exercise (OSTE).6,9,36,38,41,46,49,54,63,71,74,84,87,92,96,98 Reporting of OSTE details was often incomplete; when described, OSTEs typically comprised 6-8 stations and lasted 2-4 hours. One study used the Debriefing Assessment for Simulation in Healthcare (DASH) instrument instead of the OSTE.14 Another assessed both initial and delayed OSTEs as part of a randomized trial.75

Additional measures were obtained from learners (eg, students, junior residents). Learner assessments used a variety of measures of teaching effectiveness, although most had limited validity evidence.4,6,9,10,14,39,47,48,62,66,71,74,84-89,97,99,100 One study used the Stanford Faculty Development Program—a 25-item tool assessing learning climate, control of teaching sessions, communicating goals, promoting understanding and retention, evaluation, feedback, and promoting self-directed learning.47 Another study evaluated the effect of the intervention by comparing course/rotation evaluations from students.48

One study focused on feasibility, to inform broader implementation.38 A few other studies assessed organizational changes and broader outcomes. Two studies found that the RaT program led to substantive changes, resulting in residency programs converting to this model going forward.60,101 Others assessed downstream effects on student learning by comparing student Objective Structured Clinical Examination (OSCE) or Objective Structured Assessment of Technical Skills (OSATS) performance between students taught by residents who completed the RaT program and those taught by residents who did not.63,102

Table 5. Summary of methods of outcome assessments in resident-as-teacher curricula.

Assessment Method (number of papers): References
Observed Structured Teaching Evaluation (12): 14, 15, 17, 37, 41, 49, 54, 69, 70, 75, 87, 98
Survey of faculty (4): 7, 17, 54, 58
Survey of learners (36): 4, 7, 8, 12, 14, 17, 29, 31, 33, 37, 38, 42, 44, 47, 48, 51, 54, 56, 57, 58, 61, 62, 65, 66, 71, 73, 83, 86, 91, 94, 96, 97, 100, 103-105
Semi-structured interview (1): 59

RaT, resident as teacher.

Best Practices Recommendations:

1. RaT outcomes should be assessed using multiple sources of data (Level 1b, Grade B).

2. Use OSTE or direct observation to directly assess RaT outcomes (Level 1b, Grade B).

3. Incorporate delayed assessment for skill retention (Level 1b, Grade B).

4. Use higher level outcome assessments, such as learner evaluations or assessments (Level 3b, Grade B).

RaT, resident as teacher; OSTE, Observed Structured Teaching Evaluation.

LIMITATIONS

Although we performed a comprehensive search guided by a medical librarian, supplemented by bibliographic review and expert consultation, we used a single database, and it is possible that we missed some pertinent papers. Where high-quality evidence was limited or lacking, we relied upon expert opinion and group consensus for the best practice recommendations. While our author group possesses experience in research and scholarship in both RaT curricula and medical education, bias may have been introduced during this process. We therefore also sought peer review from the CORD Best Practices Subcommittee and posted the manuscript online for open review and feedback by the CORD community.

CONCLUSION

Resident-as-teacher curricula are a vital component of graduate medical education training programs. This paper provides guidance on best practices for developing, implementing, and evaluating RaT curricula.

ACKNOWLEDGEMENTS

The authors would like to thank the members of the Council of Residency Directors in Emergency Medicine (CORD) and the members of the CORD Best Practice Committee for their review and feedback on this manuscript. The authors would also like to acknowledge Samantha Kaplan, PhD, Medical Librarian, Duke University, Durham, NC, for her contributions.

Address for Correspondence: Jaime Jordan, MD, MAEd, Oregon Health & Science University, Department of Emergency Medicine, 3181 SW Sam Jackson Park Road, Portland, OR 97239. Email: jaimejordanmd@gmail.com.

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Jordan et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/ licenses/by/4.0/

REFERENCES

1. Accreditation Council for Graduate Medical Education. ACGME Common Program Requirements (Residency). 2022. Accessed December 5, 2024. Available at: https://www.acgme.org/globalassets/pfassets/programrequirements/cprresidency_2023.pdf

2. Bordley DR, Litzelman DK. Preparing residents to become more effective teachers: a priority for internal medicine. Am J Med 2000;109(8):693‐696.

3. Julian KA, O’Sullivan PS, Vener MH, Wamsley MA. Teaching residents to teach: the impact of a multi‐disciplinary longitudinal curriculum to improve teaching skills. Med Educ Online 2007;12(1):4467.

4. Nejad H, Bagherabadi M, Sistani A, Dargahi H. Effectiveness of resident as teacher curriculum in preparing emergency medicine residents for their teaching role. J Adv Med Educ Prof 2017;5(1):21‐25.

5. Kobritz M, Demyan L, Hoffman H, Bolognese A, Kalyon B, Patel V. “Residents as teachers” workshops designed by surgery residents for surgery residents. J Surg Res. 2022;270:187-194.

6. Wamsley MA, Julian KA, Wipf JE. A literature review of "resident-as-teacher" curricula: Do teaching courses make a difference? J Gen Intern Med. 2004;19(5 Pt 2):574-581.

7. Ratan BM, Johnson GJ, Williams AC, Greely JT, Kilpatrick CC. Enhancing the teaching environment: 3‐year follow‐up of a resident‐led residents‐as‐teachers program. J Grad Med Educ 2021;13(4):569‐575.

8. Ahn J, Golden A, Bryant A, Babcock C. Impact of a dedicated emergency medicine teaching resident rotation at a large urban academic center. West J Emerg Med. 2016;17(2):143‐148.

9. Post RE, Quattlebaum RG, Benich JJ 3rd. Residents‐as‐teacher curricula: a critical review. Acad Med. 2009;84(3):374‐380.

10. Geary AD, Hess DT, Pernar LI. Efficacy of a resident-as-teacher program (RATP) for general surgery residents: an evaluation of 3 years of implementation. Am J Surg. 2021;222(6):1093-1098.

11. Berger JS, Daneshpayeh N, Sherman M, et al. Anesthesiology residents-as-teachers program: a pilot study. J Grad Med Educ 2012;4(4):525-528.

12. Santini VE, Wu CK, Hohler AD. Neurology Residents as Comprehensive Educators (Neuro RACE). Neurologist 2018;23(5):149-151.

13. Busari JO, Scherpbier AJ. Why residents should teach: a literature review. J Postgrad Med. 2004;50(3):205‐210.

14. Miloslavsky EM, Sargsyan Z, Heath JK, et al. A simulation-based resident-as-teacher program: the impact on teachers and learners. J Hosp Med. 2015;10(12):767-772.

15. Morrison EH, Rucker L, Boker JR, et al. A pilot randomized, controlled trial of a longitudinal residents-as-teachers curriculum. Acad Med. 2003;78(7):722-729.

16. Snell L. The resident‐as‐teacher: It’s more than just about student learning. J Grad Med Educ. 2011;3(3):440‐441.

17. Hill AG, Yu T, Barrow M, Hattie J. A systematic review of resident‐as‐teacher programmes. Med Educ. 2009;43(12):1129‐1140.

18. Chathampally Y, Cooper B, Wood DB, et al. Evolving from morbidity and mortality to a case-based error reduction conference: Evidence-based Best Practices from the Council of Emergency Medicine Residency Directors. West J Emerg Med. 2020;21(6):231–41.

19. Wood DB, Jordan J, Cooney R, et al. Conference didactic planning and structure: an Evidence-based Guide to Best Practices from the Council of Emergency Medicine Residency Directors. West J Emerg Med. 2020;21(4):999–1007.

20. Parsons M, Bailitz J, Chung AS, et al. Evidence-based interventions that promote resident wellness from the Council of Emergency Residency Directors. West J Emerg Med. 2020;21(2):412–22.

21. Parsons M, Caldwell M, Alvarez A, et al. Physician pipeline and pathway programs: an evidence-based guide to best practices for diversity, equity, and inclusion from the Council of Residency Directors in Emergency Medicine. West J Emerg Med 2022;23(4):514–24.

22. Davenport D, Alvarez A, Natesan S, et al. Faculty recruitment, retention, and representation in leadership: an Evidence-Based Guide to Best Practices for Diversity, Equity, and Inclusion from the Council of Residency Directors in Emergency Medicine. West J Emerg Med. 2022;23(1):62–71.

23. Gallegos M, Landry A, Alvarez A, et al. Holistic review, mitigating bias, and other strategies in residency recruitment for diversity, equity, and inclusion: an Evidence-based Guide to Best Practices from the Council of Residency Directors in Emergency Medicine. West J Emerg Med. 2022;23(3):345–52.

24. Natesan S, Bailitz J, King A, et al. Clinical teaching: An Evidence-based Guide to Best Practices from the Council of Emergency Medicine Residency Directors. West J Emerg Med. 2020;21(4):985–98.

25. Estes M, Gopal P, Siegelman JN, et al. Individualized Interactive Instruction: a Guide to Best Practices from the Council of Emergency Medicine Residency Directors. West J Emerg Med. 2019;20(2):363–8.

26. Gottlieb M, King A, Byyny R, et al. Journal club in residency education: an Evidence-based Guide to Best Practices from the Council of Emergency Medicine Residency Directors. West J Emerg Med. 2018;19(4):746–55.

27. Natesan S, Jordan J, Sheng A, et al. Feedback in medical education: An Evidence-based Guide to Best Practices from the Council of Residency Directors in Emergency Medicine. West J Emerg Med 2023;24(3):479-494.

28. Phillips R, Ball C, Sackett D. Oxford Centre for Evidence-Based Medicine: Levels of Evidence. CEBM: Centre for Evidence-Based Medicine. 2021. Accessed December 5, 2024. Available at: https:// www.cebm.ox.ac.uk/resources/levels-of-evidence/ocebm-levels-ofevidence

29. Nguyen S, Cole KL, Timme KH, Jensen RL. Development of a residents-as-teachers curriculum for neurosurgical training programs. Neurosurg Focus. 2022;53(2):E6.

30. Al Achkar M, Hanauer M, Morrison EH, Davies MK, Oh RC. Adv Med Educ Pract. 2017;8:299-306.

31. Rowat J, Johnson K, Antes L, White K, Rosenbaum M, Suneja M. Successful implementation of a longitudinal skill-based teaching curriculum for residents. BMC Med Educ. 2021;21(1):346

32. Ahn J, Jones D, Yarris LM, Fromme HB. A national needs assessment of emergency medicine resident-as-teacher curricula. Intern Emerg Med. 2017;12(1):75-80.

33. Anderson MJ, Ofshteyn A, Miller M, Ammori J, Steinhagen E. “Residents as teachers” workshop improves knowledge, confidence, and feedback skills for general surgery residents. J Surg Educ 2020;77(4):757-764.

34. Bensinger LD, Meah YS, Smith LG. Resident as teacher: the Mount Sinai experience and a review of the literature. Mt. Sinai J Med 2005;72(5):307-311.

35. Chee YE, Newman LR, Loewenstein JI, Kloek CE. Improving the teaching skills of residents in a surgical training program: results of the pilot year of a curricular initiative in an ophthalmology residency program. J Surg Educ. 2015;72(5):890-897.

36. Chokshi BD, Schumacher HK, Reese K, et al. A "resident-as-teacher" curriculum using a flipped classroom approach: Can a model designed for efficiency also be effective? Acad Med. 2017;92(4):511-514.


37. Cullimore AJ, Dalrymple JL, Dugoff L, et al. The obstetrics and gynaecology resident as teacher. J Obstet Gynaecol Can 2010;32(12):1176-1185.

38. Friedman S, Moerdler S, Malbari A, Laitman B, Gibbs K. The Pediatric Resident Teaching Group: the development and evaluation of a longitudinal resident as teacher program. Med Sci Educ 2018;28(4):619-624.

39. Langer AL, Bernard S, Block BL. Two-week resident-as-teacher program may improve peer feedback and online evaluation completion. Med Sci Educ. 2018;28(4):633-637.

40. Mendoza D, Peterson R, Ho C, Harri P, Baumgarten D, Mullins ME. Cultivating future radiology educators: development and implementation of a clinician-educator track for residents. Acad Radiol. 2018;25(9):1227-1231.

41. Ricciotti HA, Freret TS, Aluko A, McKeon BA, Haviland MJ, Newman LR. Effects of a short video-based resident-as-teacher training toolkit on resident teaching. Obstet Gynecol. 2017;130:36S-41S.

42. Tang Girdwood S, Treasure J, Zackoff M, Klein M. Implementation, evaluation, and improvement of pediatrics residents-as-teachers elective through iterative feedback. Med Sci Educ. 2019;29(2):375-378.

43. Bettendorf B, Quinn-Leering K, Toth H, Tews M. Teaching when Time Is Limited: a Resident and Fellow as Educator Video Module. Med Sci Educ. 2019;29(3):631-635.

44. Tipton AE, Ofshteyn A, Anderson MJ, et al. The impact of a “residents as teachers” workshop at one year follow-up. Am J Surg. 2022;224(1 Pt B):375-378.

45. Gaba ND, Blatt B, Macri CJ, Greenberg L. Improving teaching skills in obstetrics and gynecology residents: evaluation of a residents-as-teachers program. Am J Obstet Gynecol. 2007;196(1):87.e1-7.

46. Messman A, Kryzaniak SM, Alden S, Pasirstein MJ, Chan TM. Recommendations for the development and implementation of a residents as teachers curriculum. Cureus. 2018;10(7):e3053.

47. Moser EM, Kothari N, Stagnaro-Green A. Chief residents as educators: an effective method of resident development. Teach Learn Med. 2008;20(4):323-328.

48. Ostapchuk M, Patel PD, Hughes Miller K, Ziegler CH, Greenberg RB, Haynes G. Improving residents’ teaching skills: a program evaluation of residents as teachers course. Med Teach. 2010;32(2):e49-e56.

49. Zackoff M, Jerardi K, Unaka N, Sucharew H, Klein M. An Observed Structured Teaching Evaluation demonstrates the impact of a resident-as-teacher curriculum on teaching competency. Hosp Pediatr. 2015;5(6):342-347.

50. Achkar MA, Davies MK, Busha ME, Oh RC. Resident-as-teacher in family medicine: a CERA survey. Fam Med. 2015;47(6):452-458.

51. Burgin S, Zhong CS, Rana J. A resident-as-teacher program increases dermatology residents’ knowledge and confidence in teaching techniques: A pilot study. J Am Acad Dermatol 2020;83(2):651-653.

52. Burke S, Schmitt T, Jewell C, Schnapp B. A novel virtual emergency medicine residents-as-teachers (RAT) curriculum. J Educ Teach Emerg Med. 2021;6(3).

53. Farrell SE, Pacella C, Egan D, et al. Resident-as-teacher: a suggested curriculum for emergency medicine. Acad Emerg Med. 2006;13(6):677-679.

54. Liang JF, Cheng HM, Huang CC, Yang YY, Chen CH. Lessons learned from a novel 3-year longitudinal stepwise "residents-as-teachers" program. J Chin Med Assoc. 2023;86(6):577-583.

55. Seelig S, Bright E, Bod J, et al. Educating future educators-resident distinction in education: a longitudinal curriculum for physician educators. West J Emerg Med. 2021;23(1):100-102.

56. Frey-Vogel A. A resident-as-teacher curriculum for senior residents leading morning report: a learner-centered approach through targeted faculty mentoring. MedEdPORTAL. 2020;16:10954.

57. Fromme HB, Whicker SA, Paik S, et al. Pediatric resident-as-teacher curricula: a national survey of existing programs and future needs. J Grad Med Educ. 2011;3(2):168-175.

58. Pien LC, Taylor CA, Traboulsi E, Nielsen CA. A pilot study of a “resident educator and life-long learner” program: using a faculty train-the-trainer program. J Grad Med Educ. 2011;3(3):332-336.

59. McKinley SK, Cassidy DJ, Sell NM, et al. A qualitative study of the perceived value of participation in a new department of surgery research residents-as-teachers program. Am J Surg 2020;220(5):1194-1200.

60. Roberts KB, DeWitt TG, Goldberg RL, Scheiner AP. A program to develop residents as teachers. Arch Pediatr Adolesc Med 1994;148(4):405-410.

61. Watkins AA, Gondek SP, Lagisetty KH, et al. Weekly e-mailed teaching tips and reading material influence teaching among general surgery residents. Am J Surg. 2017;213(1):195-201.e3.

62. Ofshteyn A, Bingmer K, Tseng E, et al. Effect of “residents as teachers” workshop on learner perception of trainee teaching skill. J Surg Res. 2021;264:418-424.

63. York-Best C, Bengtson J, Stagg A. A Simulation-Based Resident as Surgical Teacher (RAST) program. J Grad Med Educ. 2017;9(3):382384.

64. Patocka C, Meyers C, Delaney JS. Residents-as-teachers: a survey of Canadian emergency medicine specialty programs. CJEM 2010;12(3):249.

65. Aiyer M, Woods G, Lombard G, Meyer L, Vanka A. Change in residents’ perceptions of teaching: following a one day “residents as teachers” (RasT) workshop. South Med J. 2008;101(5):495-502.

66. Ryg PA, Hafler JP, Forster SH. The efficacy of residents as teachers in an ophthalmology module. J Surg Educ. 2016;73(2):323-328.

67. Wipf JE, Pinsky LE, Burke W. Turning interns into senior residents: preparing residents for their teaching and leadership roles. Acad Med. 1995;70(7):591-596.

68. Morrison EH, Friedland JA, Boker J, Rucker L, Hollingshead J, Murata P. Residents-as-teachers training in U.S. residency programs and offices of graduate medical education. Acad Med. 2001;76(10 Suppl):S1-4.

69. Morrison EH, Rucker L, Boker JR, et al. The effect of a 13-hour curriculum to improve residents’ teaching skills: a randomized trial. Ann Intern Med. 2004;141(4):257-263.

70. Ricciotti HA, Dodge LE, Head J, Atkins KM, Hacker MR. A novel

Jordan et al. Resident-as-Teacher Curriculum: Evidence-based Guide to Best Practices

resident-as-teacher training program to improve and evaluate Obstet Gynecol resident teaching skills. Med Teach. 2012;34(1):e52-7.

71. Geary A, Hess DT, Pernar LIM. Resident-as-teacher programs in general surgery residency - a review of published curricula. Am J Surg. 2019;217(2):209-213.

72. Ilgen JS, Takayesu JK, Bhatia K, et al. Back to the bedside: the 8-year evolution of a resident-as-teacher rotation. J Emerg Med 2011;41(2):190-195.

73. Kaji A, Moorehead JC. Residents as teachers in the emergency department. Ann Emerg Med. 2002;39(3):316-318.

74. Bree KK, Whicker SA, Fromme HB, Paik S, Greenberg L. Residentsas-teachers publications: What can programs learn from the literature when starting a new or refining an established curriculum? J Grad Med Educ. 2014;6(2):237-248.

75. Dunnington GL, DaRosa D. A prospective randomized trial of a residentsas-teachers training program. Acad Med. 1998;73(6):696-700.

76. Benè KL, Bergus G. When learners become teachers: a review of peer teaching in medical student education. Fam Med. 2014;46(10):783-7.

77. Minor S, Poenaru D. The in-house education of clinical clerks in surgery and the role of housestaff. Am J Surg. 2002;184(5):471-5.

78. Weisgerber M, Flores G, Pomeranz A, Greenbaum L, Hurlbut P, Bragg D. Student competence in fluid and electrolyte management: the impact of various teaching methods. Ambul Pediatr. 2007;7(3):220–225.

79. Moore J, Parsons C, Lomas S. A resident preceptor model improves the clerkship experience on general surgery. J Surg Educ 2014;71(6):e16-8.

80. McKeon BA, Ricciotti HA, Sandora TJ, et al. A consensus guideline to support resident-as-teacher programs and enhance the culture of teaching and learning. J Grad Med Educ. 2019;11(3):313-318.

81. Frye AW, Hemmer PA. Program evaluation models and related theories: AMEE guide no. 67. Med Teach. 2012;34(5):e288-e299.

82. Hosseini S, Yilmaz Y, Shah K, et al. Program evaluation: an educator’s portal into academic scholarship. AEM Educ Train 2022;6(Suppl 1):S43-S51.

83. Donovan A. Radiology resident teaching skills improvement: impact of a resident teacher training program. Acad Radiol. 2011;18(4):518-524.

84. Gill DJ, Frank SA. The neurology resident as teacher: evaluating and improving our role. Neurology. 2004;63(7): 1334-1338.

85. Johnson KM, Rowat J, Suneja M. A 3-year rolling teaching skills curriculum for all residents in the ambulatory block. J Gen Intern Med. 2018;33(2):675.

86. Tischendorf JS, MacDonald M, Harer MW, Pittner-Smith CA, Zelenski AB, Johnson SK. Bridging undergraduate and graduate medical education: a resident-as-educator curriculum embedded in an internship preparation course. Wis Med J. 2020;119(4):278-281.

87. Dewey CM, Coverdale JH, Ismail NJ, et al. Residents-as-teachers programs in psychiatry: a systematic review. Can J Psychiarty. 2008;53(2):77-84.

88. Chochol MD, Gentry M, Hilty DM, McKean AJ. Psychiatry Residents

as Medical Student Educators: a Review of the Literature. Acad Psychiatry. 2022;46(4):475-485.

89. James MT, Mintz MJ, McLaughlin K. Evaluation of a multifaceted “resident-as-teacher” educational intervention to improve morning report. BMC Med Educ. 2006;6:20.

90. Haghani F, Eghbali B, Memarzadeh M. Effects of “teaching method workshop” on general surgery residents’ teaching skills. J Educ Health Promot. 2012;1:38.

91. Humbert AJ, Pettit KE, Turner JS, Mugele J, Rodgers K. Preparing emergency medicine residents as teachers: clinical teaching scenarios. MedEdPORTAL. 2018;14:10717.

92. York-Best C, Bengtson J, Stagg A. Laparoscopic salpingectomy: a simulation-based resident as surgical teacher (RAST) program. Obstet Gynecol. 2016;128:63S.

93. Katzelnick DJ, Gonzales JJ, Conley MC, Shuster JL, Borus JF. Teaching psychiatric residents to teach. Acad Psychiatry 1991;15(3):153-159.

94. Marcus CH, Newman LR, Winn AS, et al. TEACH and repeat: Deliberate practice for teaching. Clin Teach. 2020;17(6):688-694.

95. Grady-Weliky TA, Chaudron LH, Digiovanni SK. Psychiatric residents’ self-assessment of teaching knowledge and skills following a brief “psychiatric residents-as-teachers” course: a pilot study. Acad Psychiatry. 2010;34(6):442-444.

96. Dannaway J, Ng H, Schoo A. Literature review of teaching skills programs for junior medical officers. Int J Med Educ. 2016;7:25-31.

97. Geary AD, Hess DT, Pernar LIM. Resident-as-teacher programs in general surgery residency: context and characterization. J Surg Educ. 2019;76(5):1205-1210.

98. Zackoff MW, Real FJ, DeBlasio D, et al. Objective assessment of resident teaching competency through a longitudinal, clinically integrated, resident-as-teacher curriculum. Acad Pediatr 2019;19(6):698-702.

99. Hill AG, Srinivasa S, Hawken SJ, et al. Impact of a Resident-asteacher workshop on teaching behavior of interns and learning outcomes of medical students. J Grad Med Educ. 2012;4(1):34-41.

100. Loo BKG, Thoon KC, Tan JHY, Nadua KD, Chow CCT. Supporting paediatric residents as teaching advocates: changing students’ perceptions. Asia Pacific Scholar. 2020;5(3):62-70.

101. Litzelman DK, Stratos GA, Skeff KM. The effect of a clinical teaching retreat on residents’ teaching skills. Acad Med. 1994;69:433–4.

102. Thomas PS, Harris P, Rendina N, Keogh G. Residents as teachers: outcomes of a brief training programme. Educ Health. 2002;15:71–8.

103. Hoffman LA, Furman DT Jr, Waterson Z, Henriksen B. A novel resident-as-teacher curriculum to improve residents’ integration into the clinic. PRiMER. 2019;3:9.

104. Mann KV, Sutton E, Frank B. Twelve tips for preparing residents as teachers. Med Teach. 2007;29(4):301-306.

105. Fakhouri Filho SA, Feijo LP, Augusto KL, Nunes M do PT. Teaching skills for medical residents: Are these important? A narrative review of the literature. Sao Paulo Med J. 2018;136(6):571-578.

A Qualitative Study of Senior Residents’ Strategies to Prepare for Unsupervised Practice

Max Griffith, MD*

Alexander Garrett, MD*

Bjorn K. Watsjold, MD, MPH*

Joshua Jauregui, MD, MHPE*

Mallory Davis, MD, MPH†

Jonathan S. Ilgen, MD, PhD*

Section Editor: Abra Fant, MD

*University of Washington, Department of Emergency Medicine, Seattle, Washington
†University of Michigan, Department of Emergency Medicine, Ann Arbor, Michigan

Submission history: Submitted July 11, 2025; Revision received October 10, 2025; Accepted October 13, 2025

Electronically published November 26, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem DOI 10.5811/westjem.48914

Introduction: As emergency medicine (EM) residents prepare for the transition into unsupervised practice, their focus shifts from demonstrating competencies within familiar training environments to anticipating their new roles and responsibilities as attending physicians, often in unfamiliar settings. Using the self-regulated learning framework, we explored how senior EM residents proactively identify goals and enact learning strategies leading up to the transition from residency into unsupervised practice.

Methods: In this study we used a constructivist grounded theory approach, interviewing EM residents in their final year of training at two residency programs. Using the self-regulated learning framework as a sensitizing concept for analysis, we conducted inductive, line-by-line coding of interview transcripts and grouped codes into categories. Theoretical sufficiency was reached after 12 interviews, with four subsequent interviews producing no divergent or disconfirming examples.

Results: We interviewed 16 senior residents about their self-regulated learning approaches to preparing for unsupervised practice. Participants identified two types of gaps that they sought to address prior to entering practice: knowledge/skill gaps and autonomy gaps. They employed specific workplace learning strategies to address each type of gap, which we have termed cherry-picking, case-based hypotheticals, parachuting, and making the call, and used reflection on both internal and external sources of feedback to assess the effectiveness of these learning strategies. This study presents participants’ identification of gaps in their residency training, their learning strategies, and their reflections as cyclical processes of self-regulated learning.

Conclusion: In their final months of training, EM residents strategically leverage learning strategies to bridge gaps between their self-assessed capabilities and those they anticipate needing to succeed in unsupervised practice. These findings show that trainees have agency in how they use goal setting, strategic actions, and ongoing reflection to prepare themselves for unsupervised practice. Our findings also suggest tailored approaches whereby programs can support learning experiences that foster senior residents’ agency when preparing for the challenges of future practice. [West J Emerg Med. 2025;26(6):1510–1518.]

INTRODUCTION

Competency-based medical education frameworks provide scaffolding and accountability to ensure that emergency medicine (EM) trainees develop the necessary knowledge and skills for unsupervised practice.1,2 While competency-based medical education frameworks provide a roadmap for residents to deliberately practice the core elements of EM, graduates of EM training programs often lament the inevitability of encountering new challenges when entering practice.3,4 This suggests that the training experiences that advance residents’ competencies (what a resident does to demonstrate their abilities) must be done in conjunction with efforts to advance residents’ capabilities (the things they can think or do in future practice).5 While competencies are often embedded in the tools that training programs use to assess residents,1,6,7 capability development requires trainees to engage in dynamic self-assessment8 to consider what they can work on now to prepare themselves for future transitions. A capability approach looks beyond training residents who are simply competent, aiming instead to develop trainees who can self-diagnose their future learning needs and enact learning strategies to achieve their goals.9–11

Self-regulated learning (SRL) provides a framework to study how senior residents approach workplace learning to prepare for their transitions into unsupervised practice.12 The SRL theory proposes that individuals are “metacognitively, motivationally, and behaviorally active participants in their own learning process.”13 This provides a structure to consider how residents might assess their abilities and modulate their activities during training.14 These SRL behaviors are often depicted as a cycle whereby individuals set goals, employ learning strategies to attain these goals, and reflect on their progress.14 This cycle is context-dependent, shaped by learner characteristics (eg, knowledge, prior experiences, emotions, and confidence) as well as by the learning environment (structure, supports, and cultural expectations).15 Learner-related factors such as autonomy, efficacy, and accumulated experience have been shown to support engagement with SRL,16 suggesting that residents in the final months of training have nuanced and mature self-regulated learning habits.

The end of residency training is a compelling period to examine SRL as it relates to capability development. As residents prepare for the transition into unsupervised practice, their focus shifts from demonstrating advanced competencies within familiar training environments7 to anticipating their performance with new roles and responsibilities as attending physicians, often in unfamiliar practice environments.17–21 In recent work exploring how senior EM residents conceptualized their preparedness for unsupervised practice, we found that trainees were cognizant of the inevitable mismatch between what they learned in training and what they would be expected to do in unsupervised practice.3

We were struck by trainees’ sense of agency in their reflections,22 particularly by how they set goals and leveraged their learning environments to create learning strategies that addressed their anticipated future practice needs. Recognizing that these findings have not been described previously in the literature, we returned to our data using the lens of SRL to explore how senior residents proactively identified goals and enacted learning strategies specific to their transitions from residency into unsupervised practice. By elaborating these strategies, we hope to provide insights that educators can use to tailor their support for senior trainees during these important transition periods.

Population Health Research Capsule

What do we already know about this issue? Emergency medicine (EM) residency graduates are often anxious about the unfamiliar clinical problems that they will encounter in unsupervised practice.

What was the research question? What workplace learning strategies do EM senior residents use to prepare themselves for unsupervised practice?

What was the major finding of the study? We describe self-regulated learning strategies: cherry picking, case-based hypotheticals, parachuting, and making the call.

How does this improve population health? These learning strategies can improve new physicians’ preparedness to treat patients without supervision in a variety of clinical settings.

METHODS

We chose a constructivist grounded theory approach for this qualitative study, a methodology appropriate to study a complex social process about which relatively little is already known.23 We assembled a research team with a range of expertise and experiences, recognizing the importance of subjectivity in our processes of building theory through analyses of our participants’ narratives.24 The author group consisted of emergency clinician educators from both participating institutions, all of whom regularly supervise senior residents. We approached this study with an understanding of the challenges and affordances of the emergency department (ED) learning environment. At the time of data collection, three of the authors (MG, AG, MD) were each one year removed from residency, which allowed them to reflect on their recent training experiences as well as the challenges of working as new attending physicians.

Conceptual Framework

This study is part of a larger program of research about how senior residents prepare themselves for unsupervised practice. In prior work,3 we described how senior residents adopted a future-oriented, capability approach to workplace learning,5 using past training experiences as starting points to engage with unforeseen problems in practice. Our participants recognized that uncertainty and unfamiliar problems were inevitabilities of future practice, and defined preparedness in terms of the skills and approaches that would enable them to capably adapt to unforeseen challenges. This understanding led us to consider how senior residents might proactively use their final months of training to further these goals of adaptability25,26 and capability development.5 For this study, we used the SRL framework as a sensitizing concept for additional analysis, focusing on how participants described their dynamic processes of setting goals, strategizing for workplace learning, and monitoring their progress.13,14,27,28

Setting, Population and Sampling Strategy

We recruited EM residents in their final year of training at two four-year residency programs, each housed within large, academic healthcare centers with Level I trauma designation and rotations between multiple clinical sites. Fourth-year trainees in each program work a combination of “pre-attending” shifts, in which they supervise junior trainees with an attending physician also present, and primary shifts, during which they manage patients directly with attending supervision. We chose this cohort because of their proximity to their transition into unsupervised practice as well as their familiarity with their residency learning environments accumulated over three prior years of training. We sampled from geographically distinct areas (Midwest and Western United States) to account for local practice patterns and workplace cultures.

Interviews occurred between April–June 2023, when participants were still immersed in training but had solidified their immediate post-residency career plans. Email invitations were sent to all 28 eligible residents, with assurances that data would be deidentified before analysis and that participation would have no bearing on their standing within their programs. Interviews were scheduled in the order that residents responded. Participants were reimbursed with a $100 gift card. This study was reviewed and deemed exempt by institutional review boards at both sites.

Procedures

The principal investigator (MG) used videoconferencing software (Zoom Video Communications, Inc., San Jose, CA) to conduct individual virtual interviews. We piloted the interview guide (Supplemental material) with two senior trainees not participating in this study, and modified questions for clarity. We then conducted semi-structured interviews, asking participants about their career plans after residency, what it means to be “prepared” for unsupervised practice, and what challenges they anticipated as they entered work in new contexts. We probed how residents developed learning goals and enacted specific workplace learning strategies to prepare themselves for the experiences they anticipated in unsupervised practice. We used a professional transcription service (Rev.com, Inc., Austin, TX) to transcribe recordings, which MG then deidentified and checked for accuracy prior to analysis by the group.

Analysis

The entire research team coded four initial interview transcripts line-by-line to inductively develop a preliminary codebook, after which we agreed on a focused set of codes for the remaining data. Two investigators (MG and AG) then coded all transcripts with Dedoose (SocioCultural Research Consultants, LLC, Los Angeles, CA), using memos to keep track of conflicting examples or ideas requiring more exploration, and meeting frequently to discuss coding discrepancies. The entire group met periodically to resolve differing interpretations of the data, discuss relationships between codes, and group codes into categories. Drawing from these categories, we constructed theory as defined by Charmaz: to “present arguments about the world and the relationships within it.”28,29(p128) Our coding framework sufficiently captured our construct of interest after 12 interviews. Finding no divergent or disconfirming examples in four subsequent interviews, we deemed our dataset sufficient for the study’s aims.30

RESULTS

We interviewed 16 EM senior residents (ten from one residency program and six from the other; nine women and seven men). These residents had accepted positions to work clinically at a variety of community, academic, county, and community-academic hybrid sites, often splitting time between multiple clinical practice settings. Three participants were entering EM fellowships but with clinical roles as attending physicians. One participant was slated to start work as a critical care fellow, albeit with opportunities to work unsupervised shifts in the hospital’s ED. Across these interviews, participants shared a view that the resources, practice patterns, and pathologies characteristic of their training sites did not reflect clinical practice in most other settings. This perception shaped their learning goals for the final months of training, motivating them to develop learning strategies that bridged gaps between the capabilities they had developed in their existing training contexts and the skills they anticipated needing in unsupervised practice.

We identified learning strategies in our initial coding of participants’ stories. We then applied SRL as a theoretical lens, which allowed us to arrange those strategies into cycles involving an initial planning phase, in which participants self-identified gaps, an action phase where they deployed learning strategies to address these gaps, and a phase of reflection on their progress. Finally, we divided these cycles as we recognized that they addressed two types of gaps—knowledge/skill gaps and autonomy gaps—to represent how our participants engaged in SRL cycles as a response to their impending clinical transitions.

Gaps in Participants’ Knowledge and Skills

In reflecting on their readiness to enter unsupervised practice, participants identified crucial gaps in their abilities to understand and manage unfamiliar clinical problems. These perceived gaps stemmed from limitations in the pathologies and patient complaints that they were exposed to during training, due to the time-limited nature of training as well as the affordances and limitations of working at an academic healthcare center. Many participants anticipated managing conditions with less input from specialists than they did at their current academic healthcare centers and worried they might be insufficiently prepared to manage these problems on their own. As Participant 11 reflected, “we just have so many consultants … we take a backseat on a lot of things.” Participants also worried that structures within their training environments—such as the tendency for nurse practitioners or physician assistants to care for patients with low-acuity complaints—buffered them from the realities of community practice:

I feel like we often are protected from the urgent care complaints… just because we have great mid-levels, and also it just doesn’t feel like that type of community medicine comes in that much. (Participant 6)

Learning Strategies: Cherry-picking and Case-based Hypotheticals

Participants adopted two strategies to address perceived knowledge and skill gaps in their training. For each, they strategically leveraged the resources of their training environments to build confidence that they could handle the anticipated challenges of their future practice. First, participants described acts of cherry-picking, selectively engaging with clinical tasks that addressed gaps in their knowledge and skills. They seemed to view their last months of training as an opportunity to seek out pathologies and procedures in areas where they felt inadequately prepared, at times prioritizing these over tasks that they viewed as less educationally enriching. For example, after self-identifying a need for more experience with orthopedic injuries, Participant 15 described an instance in which they intentionally sought out orthopedic experiences on shift.

“I knew there was a bunch of ortho stuff going into the department, and so I just said to [my attending] … ‘you won’t see me for a while. I’m going to spend the next few hours doing ortho stuff.’ And, so, I just went along and properly learned some better techniques.”

Yet residents often discussed the difficult balance between cherry-picking and expectations of patient throughput.

Several participants noted that they had not felt empowered to focus on gaps in their learning until the final years of their training. They attributed this greater sense of agency to their familiarity with the clinical workflow, their comfort with their clinical supervisors, and the sense of urgency imparted by the upcoming transition to unsupervised practice. As Participant 6 explained:

I was able to recognize that, after I don’t know how many years … if the patient needs to be seen because they’re super sick, then happily I’ll see them and I’ll see them fast. But you know, the eighth abdominal pain can sit for 20 minutes while I focus on the critical care or the pathology that I don’t really understand or recognize yet, because that’s more important.

A second strategy that participants used to address their gaps in knowledge and skills was case-based hypotheticals, moments when they deliberately slowed down and stretched a clinical experience to consider alternative approaches or dimensions they might face. Participant 7 used the expression “mental war-gaming” to describe their process of thinking through a range of case-specific “if this happens, what would I do?” hypotheticals with the help of their supervisor. Another participant elaborated on how considering hypotheticals with trusted supervisors helped them to feel more confident tackling novel problems in unsupervised practice:

Just trying to go through every line of how this [case] could turn out, so that when it does turn out that way [next year], I have a good frame of reference of what the attending would do … I think that just doubles your number. You’re essentially creating a new patient in your mind, right? … It’s not the unknown anymore. (Participant 6)

Reflection: Programmatic Feedback

Participants often struggled with the poor alignment of external formative feedback sources—such as procedure logs, competency metrics, standardized tests, and evaluations from attendings—with their self-assessed knowledge and skills gaps. Despite this, they relied on programmatic assessment for feedback on these gaps and used it to calibrate their self-assessments. Participant 8 described the trust they placed in their program’s assessment processes:

I think you just have to really rely on the system that’s in place, and you have to trust the program leadership and the attendings to call you out when they think you are not ready in something.

Although these were recognized to be imperfect sources of feedback, many felt that nationally recognized milestone assessments and standardized tests were the best available substrate to reflect on their abilities to apply knowledge and skills in unsupervised practice.

I think to be a practicing emergency physician...you have to be able to pass the boards… Do you have enough information in your head that you’re not going to miss something glaringly obvious because you don’t know it? (Participant 14)

Gaps in Participants’ Autonomy

As part of their transition to the attending physician role, participants anticipated a major leap in autonomy, with expectations of practicing independently and bearing the ultimate responsibility for decisions. Working without supervisory guidance was an anticipated source of stress and anxiety, as Participant 13 reflected:

I am going to be the adult in the room making all of these decisions. I don’t necessarily have the attending to say, ‘I’m not sure, let me go ask them.’ …Every call ... I’ll be the final one making it.

Confidence was frequently mentioned as an attribute needed for unsupervised practice, on par with any knowledge or skill set. Thus, participants strategized ways that they could use their workplace learning to build confidence in the decision-making they would need when working without supervision.

Learning Strategies: Parachuting and Making the Call

Participants adopted two learning strategies to engender confidence that they could engage in new tasks without the input of supervisors. They pushed themselves to expand their comfort zones in two ways: by trying new management approaches with supervisory support; or by deliberately seeking experiences where supervisory support felt absent. First, participants described acts of parachuting, deliberately seeking to safely try new things while still having the backstop provided by their clinical supervisors. They sought opportunities to trial unfamiliar approaches to patient care, for example treating with a medication they had little experience with or attempting a procedural method that they had not tried before, viewing these instances as moments when they could “widen [their] experience before getting too set in one way” (Participant 7). Participants were able to test their limits or attempt new things with reassurances that supervisors were available to help. Participant 5 described this support structure in the following way:

[T]his parachute that you know is there…no matter how much flexibility and how much autonomy our attendings give us, it’s very clear that there’s somebody to catch you if you fall.

Second, participants described deliberate efforts to adopt a mindset of working without supervisory support, pushing themselves to engage with high-stakes decisions before their supervisors provided input. Participant 13 described stroke evaluations as moments when they found opportunities to “make the call” with no supervisory guidance:

I try to jump on [stroke evaluations], whether or not the attending is there yet, and kind of make a call before that support comes in.

Making the call in this manner fostered self-reliance, and participants expected that experiences like this would lessen the stress of similar decisions in future unsupervised practice:

Whenever I come up with a patient and I’m like, ‘Oh, I need to ask about what to do.’ Then I just pause. I’m like, ‘No, I’m not going to ask. I’m going to figure out what I’m going to do and then present it that way.’ (Participant 2)

In addition to finding opportunities to practice their independent thinking during supervised shifts, several participants sought out authentic experiences of autonomy through moonlighting (working physician shifts for pay outside of their regular training, often unsupervised) to build confidence at the end of training.

Reflection: Internal Emotions and External Standards

Gauging whether they were ready for increased autonomy was difficult for participants, and they struggled to link their readiness to existing performance metrics within their residency training structures. Participants instead reflected on their emotional reactions in the workplace, as well as implicit feedback from their clinical supervisors as more holistic measures of whether they could be confidently autonomous. During high-stakes scenarios, participants took stock of their own internal emotional states, using these reactions as a measuring stick for whether they could handle the pressures of unsupervised practice. Participant 3 reflected on their comfort level while leading a pediatric code as evidence that they were ready to handle the increased autonomy:

Yeah, you know I felt uncomfortable in regard to it being a 3-year-old and it being stressful, but I didn’t feel out of my depth by any means. I felt like if that showed up at [a community hospital], even without a ton of surgeons, I felt like I would have been able to handle it.

Residents also cited their supervisors as external reference points that helped them reflect on their abilities to work autonomously. They described a practice of comparing their management plans to those of their supervisors, using instances of alignment or misalignment as ongoing sources of feedback. Working alongside attending physicians provided opportunities to identify a range of acceptable practice and evaluate decision-making against a trusted standard, either reaffirming or calling into question their sense of practice readiness.

“I am just quizzing myself against the attendings… you’re ready for more independent practice when all of your care decisions seem to fall within the range of people that you work with.” (Participant 9)

DISCUSSION

Senior residents in this study described a range of strategies that they used to prepare themselves for unsupervised practice. Using SRL as a framework to interrogate these strategies, we showed how residents identified specific knowledge and skill gaps and the need to build confidence in their ability to work autonomously. They then strategically leveraged their familiarity with their training environments toward tailored approaches that addressed these self-identified areas of development. To reflect on whether their learning strategies were effective, they looked to programmatic feedback, their own emotions, or their performance relative to their supervisors, using these sources of feedback to prepare themselves for new cycles of learning. Taken together, these acts of gap identification, strategic action, and reflection provided unique cycles of SRL specific to their upcoming transitions into practice (Figure).

This study adds to previous research on transitions into practice, which has historically focused on perspectives of attending physicians who have already entered their new professional roles.18,31-35 Teunissen and Westerman have argued that “a transition is not a moment, but rather a dynamic process,”36(p45) encompassing the periods both leading up to and succeeding an advance in training. Other authors have questioned the very notion of “preparedness” for medical trainees, for whom performance depends heavily on the shifting contexts of their work environment.37 Our study provides a different perspective, namely that trainees can exert agency in how they use goal setting, strategic actions, and ongoing reflection to prepare themselves for the needs they anticipate in unsupervised practice, even if the specifics of their future practice remain unpredictable.

Our participants’ learning strategies align with recent work that has described SRL as context dependent.12,15 Senior trainees are more likely to employ nuanced learning strategies because they have developed competence with routine aspects of care over time within the contexts of their training environments.5 Furthermore, because they were familiar with their learning environments and supervisors, and perhaps because they felt a sense of urgency from the upcoming transition to practice, our participants seemed empowered to prioritize learning over service to the department. While many participants did reference a tension between “moving the meat” and taking time to grapple with new learning,38 senior residents in this study seemed more comfortable deferring non-emergent patient care to focus on high-yield learning opportunities. Our results also resonate with other models of self-regulated learning that have been studied in medical training, such as the master adaptive learner framework.39,40 Regan et al noted that master adaptive learners identify knowledge and skill gaps based on a combination of performance- and community-related cues; they triage learning opportunities based on complex considerations of needs, desires, and obligations; and they self-assess the effectiveness of their learning efforts.41 These authors also note the influence of context on SRL, with transitions in training prompting trainees to re-evaluate and adapt their learning strategies, and the specifics of each learning environment helping to shape the goals that trainees set for themselves.42 Our study’s participants showed similar processes of gap identification, learning, and self-assessment, all heavily influenced by the context of preparing for unsupervised practice during the final months of residency training. It would be informative to study how trainees develop specific learning goals regarding other significant transitions or milestones in training.

Figure. Gap identification, learning strategies, and means of reflection described by participants in a study of how senior residents in emergency medicine programs prepare for independent practice.

While these senior residents’ learning strategies were geared toward proactively seeking educational opportunities and fostering autonomy, supervisors clearly played a fundamental role in these experiences. Participants drew from supervisors’ support in both explicit and tacit ways—borrowing from supervisors’ experience to stretch their learning through hypotheticals that they might see in practice or engaging in new tasks equipped with their metaphorical parachutes. This expands the traditional framing of learner-centric SRL cycles toward paradigms such as “co-regulated learning” that emphasize the critical aspects of supervisory support at each step.43 Supervising physicians can guide senior residents’ goals for their workplace learning by highlighting differences between their training environments and their future practice settings, or spotlighting the skills that will maximize their confidence and future success. Supervisors can also guide senior residents to authentic experiences of autonomy and productive struggle,44 allowing them to grapple with clinical uncertainty while still being available for support.45

These findings present several important considerations for residency training programs. First, programs can more meaningfully consider residents’ individualized post-training needs, probing for perceived gaps and letting residents select (or even design) experiences that are likely to set them up for success in their unique practice contexts. Second, they can provide level-appropriate supervisory opportunities for senior residents while still in training, whether junior-attending shifts in the ED46 or moonlighting opportunities that allow them to assume the duties of an attending physician.47 Experiences in which trainees are pushed to “make the call” clearly build confidence for future autonomy.

Finally, while residents in this study identified individualized learning goals, their means for reflection often involved less specific tools such as exam scores and procedure logs. This suggests opportunities for programs to better support each resident’s self-regulated learning efforts by helping them identify sources of feedback that meaningfully address their unique and contextualized learning goals. Our results suggest that residents gauge their own performance through multiple sources of feedback that are both explicit (eg, post-shift discussions with attendings, workplace-based assessments, and semi-annual reviews), and implicit (eg, social cues generated from their interactions with attendings, staff, and patients).48

LIMITATIONS

Our results and analyses reflect several methodological decisions. We interviewed residents from two residency programs to allow for more diverse perspectives; however, both programs featured large Level I trauma centers and academic hospitals, and most of these residents were preparing for transitions to community practice. Thus, these participants’ actual and perceived knowledge gaps and learning goals may not reflect those of trainees from other programs. We focused only on residents’ strategies in anticipation of unsupervised practice; thus, our study was not designed to follow up with participants after graduation to see whether these strategies were actually helpful in fostering preparedness.

It is important to note that this was a preplanned return to a dataset that was initially collected as part of a broader study about residents’ preparedness for practice.3 Returning to these data with an SRL lens enabled us to focus on specific dimensions of participants’ stories that were germane to cycles of learning, although this choice may have necessarily excluded other important aspects of their experiences.49 Finally, MG had professional relationships with the participants that he interviewed, either as a supervisor or former co-resident, and this may have shaped their responses as well as his interpretation of the data.

CONCLUSION

Emergency medicine residents strategically deploy learning strategies in their final months of training to bridge perceived gaps between their self-assessed capabilities and those they anticipate needing to succeed in unsupervised practice. We present these strategies—cherry-picking, case-based hypotheticals, parachuting, and making the call—within cyclical processes of self-regulated learning, noting that they depend heavily on supervisory support. These findings suggest tailored approaches whereby programs can support learning experiences that foster senior residents’ agency when preparing for the challenges of future practice.

Address for Correspondence: Max Griffith, MD, University of Washington, Department of Emergency Medicine, Harborview Medical Center, Box 359702, 325 Ninth Avenue, Seattle, WA 98104-2499. Email: maxgrif@uw.edu.

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Griffith et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

REFERENCES

1. Hamstra SJ, Yamazaki K. A validity framework for effective analysis and interpretation of milestones data. J Grad Med Educ. 2021;13(2s):75-80.

2. Frank JR, Snell L, Sherbino J. CanMEDS 2015 Physician Competency Framework. 2015. Available at: https://www.royalcollege.ca/content/dam/document/standards-and-accreditation/2015-canmeds-framework-reduced-e.pdf. Accessed July 11, 2025.

3. Griffith M, Garrett A, Watsjold BK, et al. Ready, or not? A qualitative study of emergency medicine senior residents’ perspectives on preparing for practice. AEM Educ Train. 2025;9(1):e70005.

4. Gamborg ML, Mylopoulos M, Mehlsen M, et al. Exploring adaptive expertise in residency: the (missed) opportunity of uncertainty. Adv Health Sci Educ Theory Pract. 2024;29(2):389-424.

5. Teunissen PW, Watling CJ, Schrewe B, et al. Contextual competence: how residents develop competent performance in new settings. Med Educ. 2021;55(9):1100-9.

6. Cooney RR, Murano T, Ring H, et al. The Emergency Medicine Milestones 2.0: setting the stage for 2025 and beyond. AEM Educ Train. 2021;5(3).

7. Holmboe ES, Yamazaki K, Nasca TJ, et al. Using longitudinal milestones data and learning analytics to facilitate the professional development of residents: early lessons from three specialties. Acad Med. 2020;95(1):97-103.

8. Eva KW, Regehr G. “I’ll never play professional football” and other fallacies of self-assessment. J Contin Educ Health Prof. 2008;28(1):14-9.

9. Jain V, Oweis E, Woods CJ. Mapping the distance: from competence to capability. ATS Sch. 2023;4(4):400-4.

10. Stephenson J, Weil SW. Quality in Learning: A Capability Approach in Higher Education. London, UK: Kogan Page Ltd; 1992.

11. Mylopoulos M, Brydges R, Woods NN, et al. Preparation for future learning: a missing competency in health professions education? Med Educ. 2016;50(1):115-23.

12. van Houten-Schat MA, Berkhout JJ, van Dijk N, et al. Self-regulated learning in the clinical context: a systematic review. Med Educ. 2018;52(10):1008-15.

13. Zimmerman BJ. Becoming a self-regulated learner: Which are the key subprocesses? Contemp Educ Psychol. 1986;11(4):307-13.

14. White CB, Gruppen LD, Fantone JC. Self-regulated learning in medical education. In: Swanwick T (Ed.), Understanding Medical Education. Hoboken, NJ: John Wiley & Sons; 2013:201-11.

15. Brydges R, Butler D. A reflective analysis of medical education research on self-regulation in learning and practice. Med Educ. 2012;46(1):71-9.

16. Berkhout JJ, Helmich E, Teunissen PW, et al. Exploring the factors influencing clinical students’ self-regulated learning. Med Educ. 2015;49(6):589-600.

17. Roten C, Baumgartner C, Mosimann S, et al. Challenges in the transition from resident to attending physician in general internal medicine: a multicenter qualitative study. BMC Med Educ. 2022;22(1):336.

18. Collini A, Alstead E, Knight A, et al. “You may think that the consultants are great, and they know everything, but they don’t”: exploring how new emergency medicine consultants experience uncertainty. Emerg Med J. 2023;40(9):624-9.

19. Watsjold BK, Griffith M, Ilgen JS. Stuck in the middle: the liminal experiences of entering practice. Emerg Med J. 2023;40(9):622-3.

20. Parikh AB. On the transition to attendinghood. J Cancer Educ. 2021;36(1):207-9.

21. Schrewe B. Thrown into the world of independent practice: from unexpected uncertainty to new identities. Adv Health Sci Educ Theory Pract. 2018;23(5):1051-64.

22. Varpio L, Aschenbrener C, Bates J. Tackling wicked problems: how theories of agency can provide new insights. Med Educ. 2017;51(4):353-65.

23. Charmaz K. An invitation to grounded theory. In: Charmaz K. Constructing Grounded Theory, 2nd ed. Thousand Oaks, CA: Sage Publications; 2014:1-21.

24. Albert M, Mylopoulos M, Laberge S. Examining grounded theory through the lens of rationalist epistemology. Adv Health Sci Educ Theory Pract. 2019;24(4):827-37.

25. Mylopoulos M, Regehr G, Ginsburg S. Exploring residents’ perceptions of expertise and expert development. Acad Med. 2011;86:S46-9.

26. Lajoie SP, Gube M. Adaptive expertise in medical education: accelerating learning trajectories by fostering self-regulated learning. Med Teach. 2018;40(8):809-12.

27. Panadero E. A review of self-regulated learning: six models and four directions for research. Front Psychol. 2017;8(APR).

28. Apramian T, Cristancho S, Watling C, et al. (Re)grounding grounded theory: a close reading of theory in four schools. Qualitative Research. 2017;17(4):359-76.

29. Charmaz K. Reconstructing theory in grounded theory studies. In: Constructing Grounded Theory: A Practical Guide Through Qualitative Analysis. Thousand Oaks, CA: Sage Publications; 2006:158.

30. Malterud K, Siersma VD, Guassora AD. Sample size in qualitative interview studies. Qual Health Res. 2016;26(13):1753-60.

31. Westerman M, Teunissen PW, van der Vleuten CPM, et al. Understanding the transition from resident to attending physician: a transdisciplinary, qualitative study. Acad Med. 2010;85(12):1914-9.

32. Wiebe N, Hunt A, Taylor T. “Everything new is happening all at once”: a qualitative study of early career obstetrician and gynaecologists’ preparedness for independent practice. Can Med Educ J. 2024;15(3):6-17.

33. Cogbill TH, Shapiro SB. Transition from training to surgical practice. Surg Clin North Am. 2016;96(1):25-33.

34. De Leo AN, Drescher N, Bates JE, et al. Challenges in the transition to independent radiation oncology practice and targeted interventions for improvement. Tech Innov Patient Support Radiat Oncol. 2022;24:113-7.

35. de Montbrun S, Patel P, Mobilio MH, et al. Am I cut out for this? Transitioning from surgical trainee to attending. J Surg Educ. 2018;75(3):606-12.

36. Teunissen PW, Westerman M. Opportunity or threat: the ambiguity of the consequences of transitions in medical education. Med Educ. 2011;45(1):51-9.

37. Kilminster S, Zukas M, Quinton N, et al. Preparedness is not enough: Understanding transitions as critically intensive learning periods. Med Educ. 2011;45(10):1006-15.

38. Veysman BD. Butchers move the meat; doctors care for patients. Ann Emerg Med. 2010;56(5):578-9.

39. Auerbach L, Santen SA, Cutrer WB, et al. The educators’ experience: learning environments that support the master adaptive learner. Med Teach. 2020;42(11):1270-4.

40. Cutrer WB, Miller B, Pusic MV, et al. Fostering the development of master adaptive learners: a conceptual model to guide skill acquisition in medical education. Acad Med. 2017;92(1):70-5.

41. Regan L, Hopson LR, Gisondi MA, et al. Learning to learn: a qualitative study to uncover strategies used by master adaptive learners in the planning of learning. Med Teach. 2019;41(11):1252-62.

42. Regan L, Hopson LR, Gisondi MA, et al. Creating a better learning environment: a qualitative study uncovering the experiences of master adaptive learners in residency. BMC Med Educ. 2022;22(1):141.

43. Rich JV. Proposing a Model of co-regulated learning for graduate medical education. Acad Med. 2017;92(8):1100-4.

44. Mylopoulos M, Steenhof N, Kaushal A, et al. Twelve tips for designing curricula that support the development of adaptive expertise. Med Teach. 2018;40(8):850-4.

45. Ilgen JS, de Bruin ABH, Teunissen PW, et al. Supported Independence: the role of supervision to help trainees manage uncertainty. Acad Med. 2021;96(11S):S81-6.

46. Dunbar-Yaffe R, Wu PE, Kay T, et al. Understanding the influence of the junior attending role on transition to practice: a qualitative study. J Grad Med Educ. 2022;14(1):89-98.

47. Kaji A, Stevens C. Moonlighting and the emergency medicine resident. Ann Emerg Med. 2002;40(1):63-6.

48. Yama BA, Hodgins M, Boydell K, et al. A qualitative exploration: questioning multisource feedback in residency education. BMC Med Educ. 2018;18(1).

49. Bolander Laksov K, Dornan T, Teunissen PW. Making theory explicit - An analysis of how medical education research(ers) describe how they connect to theory. BMC Med Educ. 2017;17(1).

Characteristics and Educational Support Resources Available to Emergency Medicine Core Faculty: A National Survey

Jaime Jordan, MD, MAEd*†

Laura R. Hopson, MD, MEd‡

Fiona Gallahue, MD§

James A. Cranford, PhD‡

John C. Burkhardt, MD, PhD‡||

Keith E. Kocher, MD, MPH‡||

Drew L. Robinett, MD†

Moshe Weizberg, MD#

Tiffany Murano, MD¶

*Oregon Health & Science University, Department of Emergency Medicine, Portland, Oregon

†David Geffen School of Medicine at University of California Los Angeles, Department of Emergency Medicine, Los Angeles, California

‡University of Michigan Medical School, Department of Emergency Medicine, Ann Arbor, Michigan

§University of Washington, Department of Emergency Medicine, Seattle, Washington

||University of Michigan Medical School, Department of Learning Health Sciences, Ann Arbor, Michigan

#Maimonides Midwood Community Hospital, Department of Emergency Medicine, Brooklyn, New York

¶Columbia University, Department of Emergency Medicine, New York, New York

Section Editor: Benjamin Holden Schnapp, MD, MEd

Submission history: Submitted February 8, 2025; Revision received May 17, 2025; Accepted May 21, 2025

Electronically published September 2, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem. DOI: 10.5811/westjem.42503

Introduction: Core faculty are key to supporting the educational mission in emergency medicine (EM). Changes in the Accreditation Council for Graduate Medical Education (ACGME) requirements for minimum protected time for core faculty may no longer guarantee adequate support. We sought to assess EM core faculty characteristics, support, and the impact of the 2019 revisions to ACGME regulations. We explored the influence of individual and institutional characteristics on support and the impact of the regulatory changes.

Methods: This was a cross-sectional survey study of a convenience sample of EM core faculty. Participants completed an online survey of multiple-choice and completion items between April–June 2022. We calculated descriptive and comparative statistics to assess associations between individual (e.g., sociodemographics, rank) and institutional (e.g., region, program type) factors on resources and impact of ACGME revisions.

Results: A total of 596 individuals (57% male) from 116 residency programs participated, including 15 (3%) instructors/lecturers, 280 (47%) assistant professors, 182 (31%) associate professors, and 80 (13%) professors. Most (64%) were 36–50 years of age; 246 (41%) had completed a fellowship. Despite the change to the ACGME requirements in 2019, 417 (70%) reported no modification to their clinical work hours, and 420 (71%) reported no modification to their non-clinical responsibilities. There were statistically significant associations between number of residents per class (P < .001), duration of training program (P < .001), and type of institution (P < .001) with the number of administrative personnel. We also observed statistically significant associations between academic rank (P = .02), region (P = .01), number of residents per class (P = .02), and type of site (P = .01) with change to clinical work hours after changes to ACGME requirements.

Conclusion: A minority of participants reported a change to their clinical and non-clinical expectations after revisions to the ACGME regulations with disproportionate impact across faculty and program type. [West J Emerg Med. 2025;26(5)1162–1169.]

INTRODUCTION

Academic emergency physicians play a unique and valuable role in the US healthcare system. Although academic emergency departments (EDs) make up ~2% of all US EDs, these centers provide care for 5-12% of all acute care patients (> 10 million annually), staffing ~20% of all trauma centers and ~25% of transplant centers.1,2 However, in addition to their complex patient care responsibilities, the core faculty of these academic centers are charged with multiple extra-clinical responsibilities: training residents and medical students; publishing scholarly work; and filling administrative and quality improvement positions both within and outside the hospital.3

Success in these multifaceted roles requires substantial investment in personnel, funds, education, and opportunity.4 However, as of 2019, such support may not be guaranteed. The Accreditation Council for Graduate Medical Education (ACGME) changed prior regulations on protected time for core faculty from a limit on clinical hours to a minimum percentage of support, potentially reducing the administrative and financial support they receive for the extra-clinical responsibilities of their job.5,6 This recent change has renewed a century-old discussion on the intrinsic value of academic faculty and how best to support and compensate their work.7,8 Researchers have investigated the characteristics of this complex issue related to academic faculty roles and support, but often from a top-down perspective in which they summatively assess departments through the responses of program directors or department chairs.1,9,10

To more deeply understand the core-faculty workforce and the resources they are provided to accomplish their critical responsibilities, the field would benefit from data reported directly by the core faculty themselves. In this study, we aimed to characterize this workforce including sociodemographics, roles, responsibilities, administrative support, protected time, and impact of ACGME regulations. We also sought to test the association of these sociodemographic and institutional characteristics on administrative time and funding resources. Understanding these relationships is crucial to informing regulatory bodies and institutional leadership to provide necessary resources and staffing systems that allow faculty to meet the demands of their job tasks and thrive in their uniquely multidimensional roles.

METHODS

Design, Setting and Participants

This is a cross-sectional electronic survey study of a convenience sample of core faculty in emergency medicine (EM). We included individuals who were reported as core faculty to the ACGME. We announced the study and directly recruited participants at the Council of Residency Directors in Emergency Medicine (CORD) 2021 Academic Assembly and through emails on the organizational listserv. We also directly reached out to programs to seek diverse representation with regard to region, duration of training, and institution type. We collected data between April–June 2022.

Population Health Research Capsule

What do we already know about this issue?
Core faculty are essential to the educational mission in EM but may not get adequate support to carry out their tasks.

What was the research question?
What support do core faculty receive, and how have they been impacted by changes in regulatory requirements regarding protected time?

What was the major finding of the study?
Approximately 70% of participants reported no change to their clinical work hours or non-clinical responsibilities after regulatory revisions.

How does this improve population health?
Insights from core faculty themselves on the impact of fewer protected hours illuminate potential downstream impacts on teaching, publishing, and fulfilling administrative duties.

Study Protocol

We emailed participants a link to an online survey. Informed consent was implied by those who clicked the survey link. We sent up to three reminders to non-responders at regular intervals. We provided participants with a $10 gift card for survey completion. To maximize response rates and minimize guessing, we did not require participants to answer all items on the survey.

Instrument Development

Our study team of expert educators and education researchers developed the survey after literature review to optimize content validity. We developed the surveys according to best practices in survey design.11 The survey consisted of multiple-choice and completion items. We read all items aloud among the author group and piloted the survey with a small group of EM faculty to ensure response process validity. We made revisions for clarity and readability based on feedback. The final survey is available in Appendix A.

Data Analysis

As this was an exploratory study, we did not conduct statistical power analyses or sample size estimates. We calculated descriptive statistics, including percentages and measures of central tendency, to detail respondent demographics and responses to survey items with discrete answer choices. We used chi-squared tests, independent-groups t-tests, and correlational analyses to examine associations between individual and institutional characteristics and the outcome variables of number of administrative personnel, job responsibilities, clinical work hours, and non-clinical expectations. An alpha level of .05 was used for all analyses, and all statistical significance tests were two-tailed. We conducted all analyses with the SPSS software package v29.0 (IBM Corp, Armonk, NY).

Institutional Review Board Statement

This study was reviewed by the Institutional Review Board of the University of Michigan and determined to be “exempt” based on federal exemption category 3(i)(B) at 45 CFR 46.104(d).

RESULTS

A total of 596 core faculty from 116 EM residency programs participated in this study. We report the characteristics of participants, programs, and institutions in Table 1. Participants were most motivated to be core faculty by the additional opportunities to mentor and teach trainees, to participate in the educational program, and obtain recognition of their educational work with 475 (80.0%), 429 (72.0%), and 261(44.0%) identifying these as one of their top three most important motivators, respectively. While participants received multiple benefits from being core faculty, they had additional responsibilities (Table 2). They found scholarship requirements, completion of assessments, and involvement in the didactic curriculum to be their most challenging responsibilities, with 336 (68.7%), 298 (60.9%), and 238 (48.7%) ranking these as their top three most difficult responsibilities, respectively.After the change to the ACGME requirements in 2020, 417 core faculty (70%) reported no change to their clinical work hours and 420 (70.5%) reported no change to their non-clinical responsibilities (Table 2). Of the 52 participants (11.1%) who reported that the change in ACGME requirements affected their clinical work hours, a greater percentage of assistant (11.3%) and associate professors (11.5%) were affected compared to professors (4.0 %) and instructors/lecturers (0%), P = .01. The average number of residents per class was statistically significantly lower among those who indicated that the change in ACGME requirements of July 2020 affected their clinical work hours (mean 11.1 ± 3.3) vs those who indicated that it did not (mean 12.1 ± 3.5), P = .02. Type of site was statistically significantly associated with change to clinical work hours after changes to ACGME requirements (P = .01) with 66.7% of military/ Veterans Administration (VA) sites, 14.8% of community sites, 9.2% of county/public sites, 8.8% of university sites, and

Table 1. Participant, program, and institution characteristics in a survey of emergency medicine core faculty.

Sex

in educational role

medicine/Dive medicine

MD, Doctor of Medicine; DO, Doctor of Osteopathic Medicine; MA, Master of Arts

Table 1. Continued

Total N = 596 n (%) MHPE

14 (2)

(2) 121 (21)

(0)

(0)

(0)

(0)

(2)

Table 1. Continued

Academic rank Instructor/Lecturer

Administrative roles**

Program Director

Assistant/Associate Program Director

Clerkship Director

Assistant or Associate Clerkship Director

Fellowship Director

Medical Director or Assistant/Associate

Medical Director

EMS Director or Assistant/Associate EMS Director

Ultrasound Director or Assistant/Associate

Ultrasound Director

Research Director or Assistant/Associate

Research Director

Vice Chair Chair

Designated Institutional Official Assistant/Associate Dean

Institution has specific faculty promotion tracks

Specific faculty promotion track Clinical Administrator Clinical

Promotion track with tenure

(3) 280 (47)

(31) 80 (13) 26 (4)

(2)

(13)

(25)

(10)

(4)

(12)

(13)

(8)

(10)

(5)

(10)

(4)

(1)

(3)

(25)

(13)

(34)

(65)

(2)

(4)

(46)

(1)

(5)

(8)

(35)

(72)

(14)

(15)

(22)

(24)

MHPE, Master of Health Professions Education; PhD, Doctor of Philosophy; MBA, Master of Business Administration; EdD, Doctor of Education; JD, Juris Doctor; PharmD, Doctor of Pharmacy; EMS, emergency medical services.

Faculty employment model of primary training site

School of Medicine Employee

Direct Hospital Employee

Large Contract group (Covers > 10 EDs)

Small Contract group (Covers ≤ 10 EDs)

group

Contractor

of personnel in program administration (mean ± standard deviation)

*Based on n = 246 who responded “Yes” to “Have you completed a fellowship training program?”

**Participants could select more than one role.

PGY, postgraduate year; VA, Veterans Administration; ED, emergency department.

7.1% of other sites reporting a change to clinical hours. Region was also statistically significantly associated with change to clinical work hours after changes to ACGME requirements (P = .09) with 18.0% of programs in the South, 11.7% of programs in the Midwest, 7.5% of programs in the Northeast, and 5.8% of programs in the West reporting a change to clinical hours.Of the 596 study participants, 400 (71.8%) reported that the previous ACGME requirements accurately reflected their commitments and responsibilities. Academic rank was statistically significantly associated with accurate reflection of responsibilities in previous ACGME requirements (P = .18) with 84.4% of professors, 85.4% of associate professors, 76.3% of assistant professors, and 62.5% of instructors/lecturers reporting that that the prior ACGME requirements accurately reflected their commitments and responsibilities. The average number of residents per class was statistically significantly higher among those who indicated that the previous ACGME requirements accurately reflected their commitments and responsibilities (mean 12.1 ± 3.4) vs those who indicated that the previous ACGME

Table 2. Reported benefits and responsibilities of being core faculty in emergency medicine.

Responsibilities and benefits received as core faculty*

Additional clinical time with trainees

Mean percentage of FTE reduction for being core faculty**

Mean additional CME funds (in dollars per year) for being core faculty***

Did the previous ACGME requirements accurately reflect your commitments and responsibilities?

Did the change to the ACGME requirements in July 2019 affect your clinical work hours?

Did the change to the ACGME requirements in July 2019 affect your non-clinical expectations?

If your group decreases their current level of support for core faculty in terms of shift numbers or non-clinical expectations, how would it change your willingness to serve as core faculty?

In the past two years, which of the following scholarship requirements for core faculty status have you met?*

Peer-reviewed publications

Presentations at local/regional/national organizations

*Participants could select more than one response.

*Based on responses from n = 53 participants.

**Based on responses from n = 232 participants.

FTE, full-time equivalent; CME, continuing medical education; ACGME, Accreditation Council for Graduate Medical Education.

requirements did not accurately reflect their commitments and responsibilities (mean 11.2 ± 3.5), P = .02. There were no statistically significant differences between sex, race, academic rank, type of institution, region, residency duration, or number of residents per class on changes to non-clinical expectations after revisions to the ACGME requirements. There were statistically significant differences in the number of personnel in program administration by program duration (three vs four years) and number of residents. The average number of personnel working in program administration was higher among participants from four-year programs (mean 5.0 [SD = 6.2]) compared to participants from three-year programs (3.1 [SD = 2.6]), P < .001. Programs with more residents also had more personnel in program administration (r(575) = .18, P < .001). The mean number of administrative personnel was also higher in county/public (4.5 [SD = 5.7]) and university (4.4 [SD = 4.3]) than community (2.4 [SD = 2.0]), military/VA (2.0 [SD = 0.0]), and other (2.8 [SD = 1.8]) training sites (P < .001) and higher in the West (4.6 [SD = 5.8]) and Midwest (4.0 [SD = 4.5]) than the South (3.3 [SD = 2.1]) and Northeast (2.7 [SD = 2.3]) regions (P < .001).

DISCUSSION

The previous EM program requirements had a 28-hour/week ceiling on the amount of clinical time that core faculty were permitted to work.5 When considering a 40-hour work week, this allotted core faculty 12 hours per week for administration and educational activities. The 2023 requirements establish a floor of 0.1 full-time equivalent (FTE) of protected time for core faculty, or approximately four hours per week in the 40-hour work week model.15 Understanding the workforce composition, its responsibilities, and the impact of the ACGME changes is critical to determining whether this model of support is adequate. Drawing from a broad cross-section of EM core faculty across geographic regions, program types, and training sites, we are able to describe the core faculty workforce. In comparison with other recent studies in EM, residency core faculty have sex distributions similar to those reported in large studies of national specialty organizations.16
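The protected-time arithmetic above can be made explicit with a small sketch. The function names are ours, for illustration only; the 40-hour week, the 28-hour clinical ceiling, and the 0.1-FTE floor come from the text.

```python
def protected_hours_old(workweek_hours=40.0, clinical_ceiling=28.0):
    """Pre-2023 model: a 28-hour/week clinical ceiling leaves the rest
    of a 40-hour week for administration and education."""
    return workweek_hours - clinical_ceiling

def protected_hours_new(workweek_hours=40.0, protected_fte_floor=0.1):
    """2023 model: a floor of 0.1 FTE of protected time."""
    return workweek_hours * protected_fte_floor

# Under a 40-hour work week: 12 hours protected before vs a 4-hour minimum after.
```

The comparison makes the shift concrete: the old rule bounded clinical time from above, while the new rule bounds protected time from below, and the two are equivalent only if the remainder of the week is actually protected.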

Our study noted significant associations between academic rank and faculty responsibilities as well as clinical work hours. Most faculty indicated that the prior ACGME requirements accurately reflected their educational commitments, particularly those at the rank of professor and associate professor. These findings may reflect the solidification of responsibilities and alignment with regulatory requirements as faculty progress in their careers. Although, at the time of data collection, only a small subset of participants (11%) had been impacted by higher clinical work hours, we found that faculty at the assistant or associate professor rank may be disproportionately affected. These mid-career faculty may occupy the "sweet spot" for absorbing additional clinical work: they have advanced beyond the very early career stage and may have some administrative time to lose in favor of clinical work, compared to clinical instructors, who may have already been working substantial clinical time that could not be significantly increased. Yet they are not as advanced in their careers as professors, who may have more secure means of protected time such as grant funding or advanced leadership positions.

It is not surprising that programs with larger numbers of residents had more program administrative personnel, highlighting that resource requirements scale with program size.3,15 This is evident in the ACGME requirements regarding the minimum number of program coordinators, which increase with the size of a program.3,15 The higher numbers of additional personnel in program administration among four-year programs likely reflects four-year programs being larger overall.17 Similar associations between size and duration of program have been seen with other outcomes.18

Interestingly, although clinical expectations did not change for most participants, changes in clinical hours under the new program requirements were associated with faculty from programs with fewer residents. This may be due to a perception that smaller programs require less time to administer. While this may be true, there is still a significant amount of time required for other programmatic and education-related activities that take place regardless of the number of residents in a program (eg, attendance at weekly conference, preparation and delivery of didactic sessions, interview/recruitment efforts, medical student mentoring, scholarship efforts). The correlation between programs with fewer residents and faculty who experienced changes in their clinical work hours as well as their commitments and responsibilities suggests that smaller programs may have less flexibility in redistributing clinical and administrative workloads when the ACGME requirements were modified. This potentially places a higher burden on these faculty, expecting them to perform more administrative duties with less time to do so. We also detected associations between type of site and region and changes to clinical hours. This may reflect variations in employment models, funding streams, and institutional priorities.19

One of the problems with establishing a floor on protected time, rather than capping clinical time, is that there is wide variability among institutions (and EDs) as to what is considered 1.0 FTE. Although hour ranges are not explicitly detailed in the literature, institutional definitions of an FTE have been noted to vary from 40 to ≈60 hours/week based on individual operational needs and expectations. Emergency departments also vary in what is considered a clinical FTE, eg, 32 vs 36 hours/week.22-24 With this lack of standardization, the change in the protected time requirement left room for organizational, institutional, and departmental leadership to interpret the minimum requirement as the only amount of time necessary for core faculty activities.
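Because the floor is expressed as a fraction of an FTE rather than in hours, the absolute protected-time minimum shifts with each institution's definition of 1.0 FTE. A quick sketch (our own illustrative function, using the 40 to ≈60 hour/week range noted above):

```python
def protected_minimum_hours(fte_hours_per_week, protected_fte_floor=0.1):
    """Weekly protected hours implied by a 0.1-FTE floor under a given
    institutional definition of 1.0 FTE."""
    return fte_hours_per_week * protected_fte_floor

# The same 0.1-FTE floor maps to different absolute weekly minimums
# depending on how many hours an institution calls 1.0 FTE.
for fte_definition in (40, 48, 60):
    print(fte_definition, protected_minimum_hours(fte_definition))
```

This is the crux of the standardization problem: the regulatory floor is constant in FTE terms but not in hours, so two faculty members with identical 0.1-FTE protection can receive meaningfully different amounts of protected time.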

LIMITATIONS

This survey-based study was subject to sampling and response bias, with those most engaged in educational programming or most impacted by the ACGME changes potentially being more likely to respond. Future surveys of EM core faculty could be strengthened by systematic assessment of potential non-response bias. While our participants represent only a fraction of the total number of core faculty in EM, they do appear to parallel specialty educator demographics.25-28 Our data cover a broad cross-section of program characteristics; however, the sample may not be completely representative of the whole.

CONCLUSION

This study highlights potential concerns about the impact of the changed ACGME requirements for core faculty support on the educational environment for EM residency training. Additional work will be needed to track temporal trends, the potential for disproportionate impact among faculty members and programs, the effect on the learning environment, and the quality of residency training.

Address for Correspondence: Jaime Jordan, MD, MAEd, David Geffen School of Medicine at UCLA, Department of Emergency Medicine, 1100 Glendon Avenue, Suite 1200, Los Angeles, CA 90024. Email: jaimejordanmd@gmail.com

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Jordan et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/ licenses/by/4.0/

REFERENCES

1. Reznek MA, Scheulen JJ, Harbertson CA, et al. Contributions of academic emergency medicine programs to U.S. health care: summary of the AAAEM-AACEM benchmarking data. Acad Emerg Med. 2018;25(4):444-52.

2. Reznek MA, Michael SS, Harbertson CA, et al. Clinical operations of academic versus non-academic emergency departments: a descriptive comparison of two large emergency department operations surveys. BMC Emerg Med. 2019;19(1):72.

3. Accreditation Council for Graduate Medical Education. Common Program Requirements (Residency). 2023. Available at: https://www.acgme.org/globalassets/pfassets/programrequirements/cprresidency_2023.pdf. Accessed February 7, 2025.

4. Yarris LM, Juve AM, Artino AR Jr, et al. Expertise, time, money, mentoring, and reward: systemic barriers that limit education researcher productivity - proceedings from the AAMC GEA Workshop. J Grad Med Educ. 2014;6(3):430-6.

5. Greenberger SM, Finnell JT 2nd, Chang BP, et al. Changes to the ACGME Common Program Requirements and their potential impact on emergency medicine core faculty protected time. AEM Educ Train. 2020;4(3):244-53.

6. Yuan CM, Young BY, Watson MA, et al. Programmed to fail: the decline of protected time for training program administration. J Grad Med Educ. 2023;15(5):532-5.

7. Duffy TP. The Flexner Report--100 years later. Yale J Biol Med. 2011;84(3):269-76.

8. Gunderman RB. The perils of paying academic physicians according to the clinical revenue they generate. Med Sci Monit. 2004;10(2):RA15-20.

9. Jarrett JB, Griesbach S, Theobald M, et al. Nonclinical time for family medicine residency faculty: national survey results. PRiMER. 2021;5:45.

10. Accreditation Council for Graduate Medical Education. Data Resource Book. 2022-2023. Available at: https://www.acgme.org/about/publications-and-resources/graduate-medical-education-data-resource-book/. Accessed February 7, 2025.

11. Rickards G, Magee C, Artino AR. You can’t fix by analysis what you’ve spoiled by design: developing survey instruments and collecting validity evidence. J Grad Med Educ. 2012;4(4):407–10.

12. Lincoln YS, Lynham SA, Guba EG. Paradigmatic controversies, contradictions, and emerging confluences, revisited. Sage Handbook Qualitat Res. 2011;4:97‐128.

13. Terry G, Hayfield N, Clarke V, et al. Thematic analysis. In: The SAGE Handbook of Qualitative Research in Psychology. Thousand Oaks, CA: SAGE Publications Ltd.; 2017:17‐37.

14. Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Serv Res. 2007;42:1758‐1772.

15. Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Emergency Medicine. 2023. Available at: https://www.acgme.org/globalassets/pfassets/programrequirements/110_emergencymedicine_2023.pdf. Accessed February 7, 2025.

16. Bennett CL, Ling AY, Agrawal P, et al. How we compare: Society for Academic Emergency Medicine faculty membership demographics. AEM Educ Train. 2022;6(Suppl 1):S93-6.

17. Gaeta TJ, Ankel FK, Calderon Y, et al. American Board of Emergency Medicine Report on Residency and Fellowship Training Information (2023-2024). Ann Emerg Med. 2024;84(1):65-81.

18. Jordan J, Hwang M, Kaji AH, et al. Scholarly Tracks in emergency medicine residency programs are associated with increased choice of academic career. West J Emerg Med. 2018;19(3):593-9.

19. Adelman L. 2023 State of the emergency medicine employer market. 2023. Available at: https://assets.ivyclinicians.io/content/2023%20State%20of%20the%20EM%20Employer%20Market_Ivy%20Clinicians.pdf. Accessed February 7, 2025.

20. Li K, Al-Amin M, Rosko MD. Early financial impact of the COVID-19 pandemic on U.S. hospitals. J Healthc Manag. 2023;68(4):268-283.

21. Gottlieb M, Sebok-Syer SS, Bawden A, et al. "Faces on a screen": a qualitative study of the virtual and in-person conference experience. AEM Educ Train. 2022;6(6):e10827.

22. Moorhead JC, Gallery ME, Hirshkorn C, et al. A study of the workforce in emergency medicine: 1999. Ann Emerg Med. 2002;40(1):3-15.

23. Nurok M, Flynn BC, Pineton de Chambrun M, et al. A review and discussion of full-time equivalency and appropriate compensation models for an adult intensivist in the United States across various base specialties. Crit Care Explor. 2024;6(4):e1064.

24. Medscape. Your income vs your peers': Medscape emergency medicine physician compensation report 2023. Available at: https://www.medscape.com/slideshow/2023-compensation-emergencymedicine-6016356?icd=login_success_gg_match_norm. Accessed February 7, 2025.

25. Jordan J, Coates WC, Clarke S, et al. Exploring scholarship and the emergency medicine educator: a workforce study. West J Emerg Med. 2017; 18(1):63-8.

26. Golden A, Diller D, Riddell J, et al. A workforce study of emergency medicine medical education fellowship directors: describing roles, responsibilities, support, and priorities. AEM Educ Train. 2022; 6(5):e10799.

27. Coates WC, Gill AM, Jordan R. Emergency medicine clerkship directors: defining the characteristics of the workforce. Ann Emerg Med. 2005; 45(3):262-8.

28. Beeson MS, Gerson LW, Weigand JV, et al. Characteristics of emergency medicine program directors. Acad Emerg Med. 2006; 13(2):166-72.

EDUCATION SPECIAL ISSUE - ORIGINAL RESEARCH

Substantial Variation Exists in Clinical Exposure to Chief Complaints Among Residents Within an Emergency Medicine Training Program

Corlin M. Jewell, MD*

Amy T. Hummel, MD*†

Dann J. Hekman, MS*

Benjamin H. Schnapp, MD, MEd*

*University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin; †Emergency Medicine Specialists SC, Wauwatosa, Wisconsin

Section Editors: Doug Franzen, MD and Andrew Ketterer, MD

Submission history: Submitted February 25, 2024; Revision received September 27, 2024; Accepted October 11, 2024

Electronically published November 19, 2024

Full text available through open access at http://escholarship.org/uc/uciem_westjem DOI: 10.5811/westjem.20281

Introduction: While many aspects of emergency medicine (EM) residency training are standardized among residents within a single residency program, there is no standard for the distribution of chief complaints (CC) that residents should see over the course of residency. This could result in substantial variability in each resident's clinical exposure. Our objective in this study was to explore EM residents' clinical exposure to CCs to determine whether substantial variation exists. If such variation exists, this could suggest the need for curricular reform to address gaps in resident clinical exposure during training.

Methods: This was a retrospective observational study of EM residents who graduated in the years 2016–2021 at a single, university-affiliated emergency department (ED) in the midwestern United States. All patient encounters where a CC was logged were included and categorized into 1 of 20 clinical domains based on the 2016 American Board of Emergency Medicine Model of Clinical Practice. We calculated descriptive statistics for the top 10 most encountered domains for comparison among residents.

Results: We included a total of 228,916 patient encounters from 69 residents in the analysis. Residents were involved in an average of 3,323 distinct patient encounters during the study period. The overall interquartile range for patient encounters was 523. The three CC domains with the broadest interquartile variation were abdominal and gastrointestinal disorders (116), musculoskeletal disorders (nontraumatic) (93), and traumatic disorders (86).

Conclusion: Within a single, three-year academic EM program, substantial variation existed among residents with regard to the variety of patient CCs seen during their residency training. [West J Emerg Med. 2025;26(1)47–52.]

INTRODUCTION

Medical residency training allows physicians to gain the cognitive and procedural skills necessary to practice independently. Based on experiential learning theory, patient encounters form the foundation upon which physicians in training begin to master the practice of medicine.1 Additionally, the development of "illness scripts," or mental models for the classification of patient presentations, is crucial to the development of clinical skills and reasoning during residency training.2 These models are developed over time by multiple exposures to presentations of similar disease states.3,4 Emergency medicine (EM) trainees must be exposed to a variety of patient chief complaints (CC) throughout the course of residency to develop these scripts and become ready to begin independent practice.

Educators within EM have worked to define many aspects of EM residency training, including optimum number of shifts, on-shift educational goals/practices, and didactic content.5 Despite this, the clinical experience of an individual resident may be highly variable and may be partially driven by self-selection of patients by the resident. Studies in pediatric EM suggest that there is significant variation in the overall number of patients and range in acuity among individual residents.6,7 However, there is little adult EM literature that explores the variation in clinical experience seen by residents within a modern EM program. The literature that does exist in adult EM suggests there is substantial variation in clinical exposures among residents.8 A study from 2006 found that the number of cases seen overall correlated with improved performance on a standardized test designed to assess clinical competence. However, the effect plateaued at around 200 cases.9 Prior work by our group has shown that case volume in an individual domain did not correspond to performance within that domain on corresponding questions on the in-training exam.10

These studies suggest that individuals within a single training program may be gaining variable experience with certain types of patient presentations and lacking exposure (and therefore opportunities to develop mastery) to other complaints and pathology. However, this variability in clinical exposure during training has not been shown in adult EM for over three decades.8 Since then, the number of annual visits to the ED as well as the complexity of medical care provided have substantially increased.11,12 We, therefore, hypothesized that substantial differences in clinical exposure still exist among residents at the time of graduation. Understanding these differences is of critical importance for residency programs as considerable variation could push some residents below a threshold to develop robust illness scripts suitable for independent practice.

METHODS

Study Design and Setting

We conducted this retrospective, observational study at a three-year EM residency program situated within an urban, academic emergency department (ED) in the Midwest. The ED for the primary clinical site has a total of 54 beds and sees an annual volume of approximately 60,000 patient visits. During the study period, the residency had 12 first postgraduate year (PGY-1) positions available each year. The study ED divides its beds into two adult clinical areas and a pediatric clinical area. All three areas are physically connected on a single floor of the hospital. Residents from all three years are assigned to nine-hour shifts in each clinical area. Each shift includes 1–2 junior (PGY-1) residents, 1–2 senior (PGY-2 or PGY-3) residents, and one attending physician. Any resident can assign themselves to patients of any severity regardless of seniority. In Fall 2020,

Population Health Research Capsule

What do we already know about this issue?

Studies from 30 years ago reported variation in the distribution of chief complaints seen by emergency medicine residents during training.

What was the research question?

We hypothesized that substantial differences in clinical exposure still exist among residents at the time of graduation.

What was the major finding of the study?

The three chief complaint domains with the most variability between individual resident experience, as measured by the greatest 25th–75th interquartile ranges, were abdominal and gastrointestinal disorders (median 594 patients per resident, IQR 116), nontraumatic musculoskeletal disorders (median 314, IQR 92), and traumatic disorders (median 525, IQR 86).

How does this improve population health?

Understanding these differences is important, as substantial variation could mean that some residents do not develop robust illness scripts suitable for independent practice.

the study ED shifted from a "pod" model, in which the two adult clinical areas would assign themselves predominately to patients in their clinical area, to a "free-for-all" model, in which either adult team could assign themselves to any adult patient regardless of the clinical area they were roomed in. During the study, physician assistants were employed in the ED and would occasionally take the place of a resident on shift (particularly during weekly resident didactics).

Data Acquisition

Residents were eligible for inclusion if they had completed residency within three consecutive years and graduated in the years 2016–2021 (therefore, the study period was from June 2013–June 2021). The electronic health record (EHR) was used to create a database of patient encounters; all encounters where eligible EM residents were the first resident assigned to the patient were analyzed. We used deidentified patient encounter data, listed by first CC. The CC was used to identify the nature of the patient encounter as this data was available at the time of patient presentation, often dictates the patient's ED workup, and would not have been affected by information discovered during the later stages of a patient's hospital course. This approach is consistent with prior literature.9,13 To maintain anonymity, only the senior author, a member of the residency leadership team, had access to each resident's individualized study identification number.

We excluded from analysis encounters where no CC was listed or no resident was assigned. In cases where multiple residents were assigned to a single encounter (e.g., a patient had been signed out to a different resident), we analyzed this encounter only for the initial resident assigned. This was done as they are typically the most involved in the cognitive workload of determining the patient's initial diagnostic and treatment plan. The CC for each encounter was selected and entered into the EHR by the primary nurse who cared for the patient in the ED initially. At our institution, this is nearly always selected from a list of common CCs, although it can be entered as free text. Encounters in which multiple CCs were listed were only coded into a single domain based on the first listed CC.
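The inclusion rules described above (drop encounters lacking a CC or an assigned resident, keep only the first assigned resident, use only the first listed CC) can be sketched as a small filter. This is a hypothetical illustration; the field names are ours, not the study's EHR schema.

```python
def select_analysis_encounters(raw_encounters):
    """Apply the stated inclusion rules to raw encounter records.

    Each record is a dict with 'chief_complaints' and 'residents' lists.
    Encounters with no CC or no resident are dropped; only the first
    assigned resident and the first listed CC are kept."""
    kept = []
    for encounter in raw_encounters:
        complaints = encounter.get("chief_complaints") or []
        residents = encounter.get("residents") or []
        if not complaints or not residents:
            continue  # excluded: no CC listed or no resident assigned
        kept.append({"resident": residents[0], "cc": complaints[0]})
    return kept
```

Keeping only the first resident and first CC mirrors the authors' rationale: the initial resident carries most of the cognitive workload, and the first CC most strongly drives the initial workup.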

Data Analysis

A list of common CCs in EM has been categorized into a set of 20 content domains via a consensus process by two EM attendings using the 2016 American Board of Emergency Medicine (ABEM) Model of Clinical Practice as a framework.14 For CCs identified in our data that were not already categorized by a previously described method,13 we repeated the same categorization process in which each CC was assigned to a single domain by two board-certified EM attending physicians at our institution. Disagreements between the two reviewers were adjudicated by a third board-certified emergency physician. If a symptom was entered as the CC, such as "fever" (which could correspond to one of multiple domains), it was preferentially categorized into a domain based on what the coding physicians felt was the most likely to dictate the ED workup, rather than the "signs, symptoms, and presentations" domain. We used Excel (Microsoft Corp, Redmond, WA) to calculate descriptive statistics and create plots and tables. The top 10 most encountered domains overall were analyzed. We excluded less common domains given the low number of total encounters in each area, which would have been more vulnerable to random fluctuations in when these patients present to the ED.
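The per-resident tallying and interquartile-range comparison described above can be sketched as follows. This is a minimal illustration rather than the authors' Excel workflow, and the CC-to-domain mapping shown is a tiny hypothetical subset of the 20-domain categorization.

```python
from collections import Counter, defaultdict
from statistics import quantiles

# Hypothetical subset of the CC -> ABEM domain mapping (illustrative only).
CC_TO_DOMAIN = {
    "abdominal pain": "Abdominal and Gastrointestinal Disorders",
    "chest pain": "Cardiovascular Disorders",
    "fever": "Thoracic-Respiratory Disorders",  # coded to most likely workup
}

def tally_by_resident(encounters):
    """encounters: iterable of (resident_id, first_cc) pairs.
    Returns {domain: {resident_id: count}}, using only the first listed CC."""
    tallies = defaultdict(Counter)
    for resident_id, cc in encounters:
        domain = CC_TO_DOMAIN.get(cc.lower())
        if domain is None:
            continue  # CC not in the mapping; skipped in this sketch
        tallies[domain][resident_id] += 1
    return tallies

def domain_iqr(counts_by_resident):
    """25th-75th percentile spread of per-resident encounter counts."""
    q1, _, q3 = quantiles(sorted(counts_by_resident.values()), n=4)
    return q3 - q1
```

A wide `domain_iqr` for a domain is the paper's signal of interest: it means individual residents finished training with very different exposure counts in that domain.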

This project was deemed exempt quality improvement by the University of Wisconsin Health Sciences Institutional Review Board.

RESULTS

A total of 315,614 encounters were initially identified from the EHR. Of these encounters, 198 were excluded as no CC was listed. After excluding residents whose clinical experience was outside the study period and those who had left the training program prior to graduation or had a prolonged leave of absence, a total of 228,916 patient encounters from 69 residents were included in the analysis. Each resident was assigned to an average of 3,323 distinct patient encounters. Assessment of the top 10 most common clinical exposure domains showed wide ranges in the case numbers of individual residents. The Table lists the mean, minimum, maximum, interquartile range (IQR), and 25th and 75th percentile for the 10 most common content domains. The Figure shows the range of exposure to the 10 most common domains in box-and-whisker format.

DISCUSSION

Our data suggest that residents within a single training program have substantial variation in their clinical experiences as measured by the variation in ABEM content

Table. Mean, 25th–75th percentile ranges, interquartile range, and minimum/maximum encounters for the 10 most encountered domains per resident.

IQR, interquartile range.

Figure. Top 10 most common clinical exposure domains seen by graduation per resident. Boxes illustrate the 25th–75th percentile of number of clinical exposures by residents in each domain, with whiskers representing the minima, maxima, and outliers.

domains seen by individual residents. This is similar to what was described by Langdorf et al. in 1990, despite the previous study being performed over three decades ago and the substantial subsequent differences in the utilization of the ED.8 We found wide interquartile ranges between the maximum and minimum number of encounters among residents, suggesting that some residents saw substantially more patients within particular domains than others.

The magnitude of the educational significance of the exposure variability of residents is unclear. It is possible that a resident who sees twice as many musculoskeletal chief complaints as another resident by graduation is significantly more competent in that domain. Alternatively, it is also possible that they have both attained the minimal level of exposure to competently manage musculoskeletal complaints independently. The effects of clinical exposure on clinical competence, including the minimal number of encounters required to demonstrate competency in a particular domain, is an open question and an avenue for further research. However, the formation of illness scripts is continually modified by subsequent patient encounters.3,4 Therefore, the identification of high degrees of variation among residents may prompt program leadership to institute changes in the curriculum or supplement clinical exposure with individualized learning plans. This is likely more important for domains that are encountered less frequently overall, such as psycho-behavioral disorders, where larger relative differences in exposure could result in greater deficits in illness script formation.

In addition to prompting changes made by the program, identification of high variability in clinical exposure may enhance resident self-assessment. As demonstrated previously, self-assessment, when done in isolation, is an imperfect means of driving improvement but can be enhanced greatly when informed by additional information from a variety of sources.15 Understanding the distribution of the patient encounters residents have during training, and the potential gaps in their clinical exposure, could be a potential means of allowing for informed self-assessment of a resident's clinical skills. This could be potentially further enhanced if facilitated under the supervision of faculty coaches within the program, a method that has become increasingly popular in medical education.16,17 Future work could follow a cohort of residents who are able to track their own patient volumes more regularly than was possible in the current study, compare themselves to their peers throughout training, and evaluate whether any differences in clinical competence are identified. This could also allow programs to determine the perceived value of this information to residents. Finally, residents could use this data to drive their patient selection while working in the ED.

Beyond the potential for shaping resident self-assessments, clinical exposure data may have important implications for residency program leadership as we move toward an era of competency-based medical education (CBME). Two of the pillars of CBME, "teaching tailored to competencies" and "effective programmatic assessment,"18 lend themselves well to the identification of program clinical weaknesses as well as to the creation of new curricular experiences designed to address areas of limited clinical exposure identified by resident CC data. These experiences could potentially take the form of targeted readings or simulation sessions designed to supplement lower frequency clinical encounters.

LIMITATIONS

This was a single-center study in an urban, academic ED, and findings may not be generalizable to training programs in different environments. Additionally, the data was retrospective, making the educational utility of this information or any potential causes of variation difficult to determine.

Use of a CC to categorize each patient encounter into a clinical domain has an element of subjectivity and may have led to some encounters being miscategorized with respect to the workup done or final diagnosis. Some additional subjectivity may have been introduced by how we classified CCs that could potentially have been categorized into multiple different domains (such as "fever" or "ingestion"). This was done based on what was determined to be most likely to drive the initial workup in the department. For example, although a CC of "chest pain" could represent a cardiac or pulmonary etiology, in almost all cases, a cardiac etiology must be excluded. Therefore, it was felt that this would influence the formation and modification of the resident's illness script most heavily. It is also possible that encounters were mischaracterized due to only using the first CC listed and not considering the others if multiple CCs were listed. Like the prior limitation, it was felt that the first CC was most likely to dictate the initial ED workup. Using discharge or final diagnoses instead was considered for this study, but it was felt that the CC is more likely to drive the initial differential and diagnostic workup for the patient.

Additionally, ABEM domains may be too broad to capture important differences in exposure (e.g., two residents with the same exposure to "respiratory disorders" could have seen large numbers of pneumonia patients or, alternatively, many patients with asthma). Training is inherently variable as the EM environment differs by clinical site, day, shift, or even season. Therefore, there may have been slight differences in when individual residents were in the ED clinically or the number/type of overall ED shifts worked. It is important to note that some of the included residents' training occurred during the COVID-19 pandemic, which may have had an effect on both the variety and number of clinical exposures seen by these residents. Future work could also explore exposure based on sub-domains from the ABEM model to get a more granular look at individual resident clinical experiences rather than relying on the relatively broad domains.

Other clinical variables may also have an effect on a resident's clinical exposure, including the timing of months rotating in the ED. However, the ED did not undergo major changes in the staffing model of physicians (including residents) during this period. Also, while it is likely that more senior residents assign themselves to critically ill patients, this was felt to be unlikely to meaningfully impact our results given that data was obtained at the time of graduation. Therefore, each resident would have acted in a senior role for the same amount of time. Finally, our use of the EHR at the main clinical training site of the residency to generate the data did not capture the clinical experience at two other training sites for the residency that use a different EHR. This may have served to moderate or exacerbate the differences seen among residents. However, clinical experiences at these other sites comprised a total of only four months of the 36-month curriculum, and so it is likely that our overall findings would not have been substantially affected.

CONCLUSION

Within a single, three-year academic emergency medicine program, there was substantial variation among residents regarding the variety of patient chief complaints seen throughout residency when mapped to ABEM's Model of Clinical Practice.

Address for Correspondence: Corlin M. Jewell, MD, University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, 800 University Bay Dr., Madison, WI 53705. Email: cmjewell@medicine.wisc.edu

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Jewell et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

REFERENCES

1. Teunissen PW, Scheele F, Scherpbier AJ, et al. How residents learn: qualitative evidence for the pivotal role of clinical activities. Med Educ. 2007;41(8):763–70.

2. Bowen JL. Educational strategies to promote clinical diagnostic reasoning. N Engl J Med. 2006;355(21):2217–25.

3. Custers EJ, Regehr G, Norman GR. Mental representations of medical diagnostic knowledge: a review. Acad Med. 1996;71(10 Suppl):S55–61.

4. Hatala R, Norman GR, Brooks LR. Influence of a single example on subsequent electrocardiogram interpretation. Teach Learn Med. 1999;11(2):110–7.

5. Stahmer S and Kuhn G. Optimizing resident training: results and recommendations of the 2009 Council of Residency Directors Consensus Conference. Acad Emerg Med. 2010;17 Suppl 2:S78–86.

6. Li J, Roosevelt G, McCabe K, et al. Pediatric case exposure during emergency medicine residency. AEM Educ Train. 2018;2(4):317–27.

7. Chen EH, Cho CS, Shofer FS, et al. Resident exposure to critical patients in a pediatric emergency department. Pediatr Emerg Care. 2007;23(11):774–8.

8. Langdorf MI, Strange G, Macneil P. Computerized tracking of emergency medicine resident clinical experience. Ann Emerg Med. 1990;19(7):764–73.

9. Kern MW, Jewell CM, Hekman DJ, et al. Number of patient encounters in emergency medicine residency does not correlate with in-training exam domain scores. West J Emerg Med. 2022;24(1):114–8.

10. Hayashino Y, Fukuhara S, Matsui K, et al. Quality of care associated with number of cases seen and self-reports of clinical competence for Japanese physicians-in-training in internal medicine. BMC Med Educ. 2006;6:33.

11. Strange GR and Chen EH. Use of emergency departments by elder patients: a five-year follow-up study. Acad Emerg Med. 1998;5(12):1157–62.

12. Cairns C and Kang K. National Hospital Ambulatory Medical Care Survey: 2021 emergency department summary tables. 2023. Available at: https://ftp.cdc.gov/pub/Health_Statistics/NCHS/Dataset_Documentation/NHAMCS/doc21-ed-508.pdf. Accessed September 21, 2024.

13. Bischof JJ, Emerson G, Mitzman J, et al. Does the emergency medicine in-training examination accurately reflect residents' clinical experiences? AEM Educ Train. 2019;3(4):317–22.

14. Counselman FL, Babu K, Edens MA, et al. The 2016 Model of the Clinical Practice of Emergency Medicine. J Emerg Med. 2017;52(6):846–9.

15. Wolff M, Santen SA, Hopson LR, et al. What's the evidence: self-assessment implications for life-long learning in emergency medicine. J Emerg Med. 2017;53(1):116–20.

16. Deiorio NM, Moore M, Santen SA, et al. Coaching models, theories, and structures: an overview for teaching faculty in the emergency department and educators in the offices. AEM Educ Train. 2022;6(5):e10801.

17. Sargeant J, Lockyer J, Mann K, et al. Facilitated reflective performance feedback: developing an evidence- and theory-based model that builds relationship, explores reactions and content, and coaches for performance change (R2C2). Acad Med. 2015;90(12):1698–706.

18. Van Melle E, Frank JR, Holmboe ES, et al. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019;94(7):1002–9.

EDUCATION SPECIAL ISSUE - ORIGINAL RESEARCH

Harder, Better, Faster, Stronger? Residents Seeing More Patients Per Hour See Lower Complexity

Corlin M. Jewell, MD*
Guangyu (Anthony) Bai, MD†
Dann J. Hekman, MS*
Adam M. Nicholson, MD*
Michael R. Lasarev, MS‡
Roxana Alexandridis, PhD‡
Benjamin H. Schnapp, MD, MEd*

*University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
†Indiana University School of Medicine-Northwest, Gary, Indiana
‡University of Wisconsin School of Medicine and Public Health, Department of Biostatistics and Medical Informatics, Madison, Wisconsin

Section Editors: Ed Ullman, MD, Christine Stehman, MD, and Doug Franzen, MD

Submission history: Submitted February 25, 2024; Revision received November 20, 2024; Accepted November 22, 2024

Electronically published January 31, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem DOI: 10.5811/westjem.20282

Introduction: Patients seen per hour (PPH) is a popular metric for emergency medicine (EM) resident efficiency, although it is likely insufficient for encapsulating overall efficiency. In this study we explored the relationship between higher patient complexity, acuity on shift, and markers of clinical efficiency.

Methods: We performed a retrospective analysis using electronic health record data of the patients seen by EM residents during their final year of training who graduated between 2017–2020 at a single, urban, academic hospital. We compared the number of PPH seen during the third (final) year to patient acuity (Emergency Severity Index), complexity (Current Procedural Terminology [CPT] codes), propensity for admissions, and generated relative value units (RVU).

Results: A total of 46 residents were included in the analysis, representing 178,037 total cases. The number of PPH increased from first to second year of residency and fell slightly during the third year of residency. Overall, for each 50% increase in the odds of treating a patient requiring high-level evaluation and management (CPT code 99285), there was a 7.4% decrease in mean PPH. Each 50% increase in odds of treating a case requiring hospital admission was associated with a 6.7% reduction (95% confidence interval [CI] 0.73–12%; P = 0.03) in mean PPH. Each 0.1-point increase in PPH was associated with a 262 (95% CI 157–367; P < 0.001) unit increase in average RVUs generated.

Conclusion: Seeing a greater number of patients per hour was associated with a lower volume of complex patients and patients requiring admission among EM residents.

[West J Emerg Med. 2025;26(2)254–260.]

INTRODUCTION

The 2019 American Board of Emergency Medicine Model of Clinical Practice recognizes task-switching and multiple patient care as core physician tasks,1 and the Accreditation Council for Graduate Medical Education (ACGME) lists multitasking as Emergency Medicine Patient Care Milestone 7.2 Emergency physicians (EP) must efficiently evaluate and treat a high volume of patients to effectively manage care in the emergency department (ED). Various metrics have been used to evaluate efficiency and quality of care provided in the ED by the ED staff as well as individual EPs (patient length of stay, ED admission rate, etc).3,4 A metric commonly used by programs to measure efficiency in residents is the number of patients seen per hour (PPH). This metric is enticing because it is based on data that is easily retrievable and widely applicable across clinical sites.5,6 However, it is currently unclear whether the number of PPH can adequately encapsulate efficiency in physician trainees. It is also uncertain how residency programs should consider this metric when assessing their trainees, especially if not considered alongside other metrics.

A physician-in-training who sees more PPH could potentially be seen as more capable of independently managing the higher number of patients required for independent practice. This measurement is already commonly used when evaluating EM residents and is also frequently used to evaluate attending EPs.5,6 However, it is unclear whether there are tradeoffs for residents that come with seeing a higher patient volume. It is likely that medical trainees are only able to handle a finite number of cognitive tasks before their performance is impaired and they are unable to take on additional tasks.

One method to conceptualize how patient complexity and acuity impact other aspects of patient care is through cognitive load theory.7 In general, when cognitive load is too high, such as increased extraneous load from managing multiple patients or increased intrinsic load from managing very complex patients, overall cognitive performance may be impaired. This could decrease cognitive bandwidth for new patient-care tasks as well as limit germane load to allow for learning and illness-scheme creation.7 Conversely, simple, straightforward patient presentations may not impose such a significant cognitive load, allowing cognitive resources to be deployed to see a higher volume of patients.8,9 Prior studies have assessed resident efficiency in the ED in terms of number of PPH as training progresses.10 These studies have demonstrated that senior residents can see higher numbers of patients per hour compared to postgraduate year (PGY)-1 residents, which plateaus in the final year of training.11

Compared to advanced practice practitioners (APP) (physician assistants [PA] or nurse practitioners), residents see fewer PPH but generate a higher amount of relative value units (RVU). This suggests residents may see higher acuity patients or document more thoroughly.10 The RVUs are an objective means of measuring the resources needed to provide medical care as a single metric.12 Another means of estimating the resources needed to provide care are ED evaluation and management (E/M) Current Procedural Terminology (CPT) codes. These allow coders to use complexity in documentation as a surrogate marker of complexity of care provided. While RVUs and CPT codes are measures assigned following a patient's ED encounter, the Emergency Severity Index (ESI) is a means of estimating the acuity of the patient in terms of priority and resource allocation based on their initial presentation.

It is currently unknown how patient complexity and acuity may impact markers of clinical efficiency for ED residents. Our aim in this study was to better evaluate this relationship using multiple metrics to allow residency leaders to better contextualize greater resident efficiency in the ED.

Population Health Research Capsule

What do we already know about this issue?

Patients seen per hour (PPH) is commonly used by programs to measure efficiency in residents. It is unclear whether this adequately encapsulates efficiency.

What was the research question?

Can the use of multiple clinical metrics allow programs to better contextualize the meaning of resident efficiency in the ED?

What was the major finding of the study?

For each 50% increase in the odds of treating a high-complexity case, there was a 7.4% (0.79–13.6%; P = 0.03) decrease in mean PPH.

How does this improve population health?

Residents who see more PPH may not treat as many complex patients, which could have implications for their readiness for independent practice.

METHODS
Study Setting

The study was conducted at a single three-year EM residency program associated with an urban, academic ED located in the Midwestern US. The hospital in which the ED is situated is a Level I adult and pediatric trauma, burn, stroke, and STEMI center. The ED has 43 adult beds and sees approximately 60,000 patient visits per year. During the study period, the residency had 12 PGY-1 positions each year.

The adult ED is divided into three separate treatment areas with two primary treatment teams. Each treatment team consists of a single attending physician as well as 2–3 PAs or resident physicians. Shifts are nine hours in duration. Throughout most of the study period, patients were treated by the team of physicians designated to that treatment area. In 2019, the ED shifted to a model in which either treatment team could care for any patient in either treatment area. Each treatment team is staffed by residents of any PGY level with at least one senior resident (PGY-2 or PGY-3). All residents were encouraged to assign themselves to patients of any acuity level. During the study, PAs were employed in the ED and could take the place of a resident on shift (especially during weekly resident didactics). The APPs had no additional restrictions or privileges compared to residents in assigning themselves to patients.

As staffing is variable, there is no specific number of patients that each resident is required to see per shift. All residents staff directly with the attending; no residents supervise other residents. During expected peak times of patient arrival, a triage team consisting of a single attending physician and a PA is also present and generally sees the lowest acuity patients; all residents are assigned approximately the same number of shifts but may freely trade shifts among themselves. While attending physicians can assign themselves to patients primarily (ie, no resident or APP assigned), this is a rare occurrence and typically occurs only during times of excessive patient volume or acuity.

Study Design and Population

We designed this study as a retrospective observational study using aggregated resident case data extracted from the electronic health record (EHR) (Epic Systems, Verona, WI). Data for PGY 1–3 residents were extracted for four consecutive classes of residents who graduated between 2017–2020. To remove significant outliers, we excluded residents if they did not graduate from the program within three consecutive years. We collected data on the characteristics of the patients seen as well as markers of resident efficiency for all available patient encounters during the study period (Table 1). Multiple metrics were used to provide a more accurate measure of patient complexity rather than a single metric in isolation. The research team was composed of a senior resident (TB) and a departmental data analyst (DH), as well as faculty educators (CJ, AN, BS). We chose the selected markers as they have been used as markers of resident clinical efficiency in other studies.6,10

Patient care was attributed to the first assigned resident, as this resident is typically the most cognitively and practically involved in the patient's care. Patients who are signed out to an oncoming ED team are shared equally among all oncoming residents. We excluded pediatric patient encounters (ie, patients <18 years of age) as pediatric cases have substantial differences in terms of the resources and cognitive load required to provide adequate care. Therefore, it was determined that the chosen efficiency metrics could not be meaningfully compared to adult patient encounters.13 For example, the average length of stay between pediatric and adult encounters during the study period was 219 vs 362 minutes. Over the course of their training, residents complete a dedicated block of pediatric ED shifts during their first and second years and complete an additional 1–3 pediatric ED shifts during each adult ED rotation. We calculated the percentage of pediatric patient encounters relative to overall patient encounters.

Given the aggregated nature of the data, which did not contain any patient protected health information or identifying resident data, no informed consent was collected. The data was extracted from the EHR by the departmental data analyst and was stored on a password-protected departmental server available only to members of the study team. No additional chart review was conducted on the included encounters. This study was determined to be quality improvement and exempt from formal review by our institutional review board.

Statistical Analysis

We calculated the PPH for each PGY-3 resident by using the total number of adult patient encounters for which they were the first resident assigned, divided by the total number of hours worked in the adult section of the ED. Residents were grouped based on the year of graduation. A two-sided significance level of P < 0.05 was used for all statistical tests. We performed all statistical analyses and graphics using R version 4.1.1 (R Core Team, R Foundation for Statistical Computing, Vienna, Austria). We used negative binomial regression to assess the relationship between PPH and the odds of treating a patient who required admission, adjusted

Table 1. Emergency medicine resident metrics of efficiency and the characteristics of patients seen.

Patient characteristics
Emergency Severity Index (ESI): Frequency of patient encounters matching each ESI score (1–5). This is a means of estimating time and resource allocation for a patient based on their initial presentation.
Evaluation and management (E/M) Current Procedural Terminology (CPT) codes: Frequency of patient encounters receiving each E/M CPT code (99281–99285). These represent a means of determining patient complexity based on meeting certain documentation criteria.
Hospital admission: Number of patient encounters in which an inpatient admission occurred.

Efficiency metrics
Relative value units (RVU): Total number of work RVUs generated.
Patients seen per hour: Total number of patients seen divided by total number of hours worked in the ED during PGY-3.

ED, emergency department; PGY, postgraduate year.

for hours worked and patient complexity. All analyses were performed at the resident level.
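The PPH metric defined above reduces to a simple ratio. As a minimal sketch (not the authors' code; the function name and example counts are invented for illustration):

```python
def pph(encounters: int, hours_worked: float) -> float:
    """Patients per hour: adult encounters where the resident was
    first assigned, divided by total hours worked in the adult ED."""
    if hours_worked <= 0:
        raise ValueError("hours worked must be positive")
    return encounters / hours_worked

# e.g., a hypothetical PGY-3 first-assigned to 1,520 adult encounters
# over 1,000 hours of adult ED shifts:
print(round(pph(1520, 1000), 2))  # prints 1.52
```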

To determine the relationship between ESI and PPH, we first dichotomized ESI into high and low severity. High severity included encounters from the third year of residency that were labeled ESI 1 and 2, and low severity included encounters that were labeled ESI 3, 4, and 5. The ESI 1 encounters were not separately analyzed as these are relatively rare compared to the overall number of patient encounters. We then calculated the odds of treating a patient with a high-severity ESI. The relationship between CPT codes and PPH was similarly calculated by dichotomizing CPT into more and less complex. More complex included the highest complexity CPT code (99285), and less complex included the remaining four codes (99281–99284). We did not consider CPT code 99291 as only attendings can bill for critical care, and there is significant variation within our attending group in the use of critical care billing. Therefore, we believed that this was less likely to be a resident-sensitive metric. We similarly calculated the odds of treating a patient with a more complex CPT code. To assess significant differences among PGY levels that could introduce bias, we used the Kruskal-Wallis test and the Nemenyi procedure for post hoc comparisons.14
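The dichotomization and odds calculation described above can be sketched as follows; the encounter counts here are invented for illustration, not study data. ESI 1–2 encounters count as high severity, ESI 3–5 as low severity, and the odds are p/(1 - p):

```python
def odds_high_severity(esi_counts: dict[int, int]) -> float:
    """Odds of a high-severity encounter: ESI 1-2 vs ESI 3-5."""
    high = sum(n for esi, n in esi_counts.items() if esi in (1, 2))
    low = sum(n for esi, n in esi_counts.items() if esi in (3, 4, 5))
    p = high / (high + low)      # proportion of high-severity encounters
    return p / (1 - p)           # convert the proportion to odds

# Invented example: 800 high-severity out of 4,000 encounters
counts = {1: 40, 2: 760, 3: 2200, 4: 900, 5: 100}
print(round(odds_high_severity(counts), 3))  # prints 0.25
```

The same dichotomize-then-compute-odds step applies to CPT codes (99285 vs 99281–99284).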

We used RVUs as a proxy for shift complexity and regressed that as the response in a multivariable regression model using PPH, PGY, and the interaction between PPH and PGY as explanatory variables.

RESULTS

A total of 46 residents met inclusion criteria. One resident was excluded who had a non-consecutive training period, and another resident left the program prior to graduation at the end of their PGY-1 year. Overall, 1.6% of the total patient encounters were assigned 99291/99292 CPT codes and were excluded from that analysis. An additional 17.6% of total patient encounters, consisting of pediatric cases, were also excluded, leaving a total of 178,037 patient encounters. Average PPH data for the four included classes can be seen in Table 2. The average ESI during the study period was 2.8.

Current Procedural Terminology

Adjusted for class year, a 50% increase in the odds of treating a complex case was associated with the mean PPH decreasing 7.42% (95% confidence interval [CI] 0.79–13.6% reduction in mean PPH; P = 0.03). The relationship between PPH and odds of treating a high-complexity case can be seen in Figure 1.

Hospital Admission

Each 50% increase in odds of treating a case requiring hospital or intensive care unit (ICU)/intermediate care unit admission was associated with a 6.7% (95% CI 0.73–12%; P = 0.03) reduction in mean PPH. The relationship between

Table 2. Patients seen per hour data for class years 2017–2020.

Class year  Academic year  PGY    Mean PPH (95% CI)
2017        2014–2015      PGY-1  1.20 (1.13–1.28)
            2015–2016      PGY-2  1.51 (1.42–1.61)
            2016–2017      PGY-3  1.52 (1.43–1.62)
2018        2015–2016      PGY-1  1.11 (1.05–1.16)
            2016–2017      PGY-2  1.50 (1.43–1.58)
            2017–2018      PGY-3  1.45 (1.39–1.52)
2019        2016–2017      PGY-1  1.08 (1.03–1.13)
            2017–2018      PGY-2  1.37 (1.31–1.44)
            2018–2019      PGY-3  1.26 (1.21–1.32)
2020        2017–2018      PGY-1  1.01 (0.96–1.05)
            2018–2019      PGY-2  1.33 (1.28–1.39)
            2019–2020*     PGY-3  1.09 (1.04–1.14)

*May have been impacted by the COVID-19 pandemic.
CI, confidence interval; PPH, patients seen per hour; PGY, postgraduate year.

Figure 1. Relationship between odds of treating a high-complexity case and mean patients seen per hour during postgraduate year 3, grouped by graduation year. Shaded regions represent 95% confidence intervals. CPT, Current Procedural Terminology.

PPH and odds of treating a case requiring admission can be seen in Figure 2.

Emergency Severity Index

After controlling for PGY, there was no significant relationship observed between PPH and the odds of treating a high-acuity case (P = 0.30).

Relative Value Units

The models suggested that each 0.1-point increase in PPH is associated with a 262 unit increase (95% CI 157–367; P < 0.001) in average work RVUs generated, with the association between average total RVU and PPH stable across the four years. See Figure 3 for the relationship between RVUs generated and PPH.
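As a back-of-the-envelope illustration of this association (a linear extrapolation from the reported point estimate, not the fitted model itself; the scenario is hypothetical):

```python
def expected_rvu_change(delta_pph: float, slope_per_tenth: float = 262.0) -> float:
    """Approximate change in average work RVUs for a given change in PPH,
    using the reported ~262 RVUs per 0.1-point PPH increase."""
    return (delta_pph / 0.1) * slope_per_tenth

# A hypothetical resident moving from 1.26 to 1.52 PPH (+0.26):
print(round(expected_rvu_change(0.26)))  # prints 681
```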

Figure 2. Relationship between odds of a case resulting in admission and mean patients seen per hour during postgraduate year 3, grouped by graduation year. Shaded regions represent 95% confidence intervals.

IMC, intermediate care unit; ICU, intensive care unit.

Figure 3. The relationship between relative value units generated and patients seen per hour during postgraduate year 3, grouped by graduation year. Shaded regions represent 95% confidence intervals.

RVU, relative value units.

DISCUSSION

Residents seeing higher numbers of patients saw fewer complex patients and fewer patients requiring an inpatient admission. We believe this study is the first to examine the association of patient complexity and acuity with the clinical efficiency with which EM residents operate. As suggested by cognitive load theory, we found that residents' capacity to pick up complex patients in this study was finite. More complex patients and patients requiring admission may impose more of a task load (eg, phone calls to consultants or admitting physicians, review of records, or longer history-taking) than patients with lower acuity. This greater cognitive load could result in a decrease in PPH as complexity goes up. This effect may be mitigated somewhat by a variety of effective clinical practices, such as partnering with nurses or assistance from their supervising attending. However, more research is needed to determine whether other factors, such as the incorporation of evidence-based efficiency practices or adding scribes for documentation, may affect resident efficiency.

Our data show that PPH rises sharply between the PGY-1 and PGY-2 years and then plateaus between the PGY-2 and PGY-3 years. This finding is in line with previous literature.11 While the underlying cause of this finding is ultimately unknown, it may be secondary to changes in focus that occur between the latter years in training. For example, any further increases in the ability of PGY-3 residents to see additional patients over a PGY-2 resident may be offset by a focus on departmental flow, instruction of junior learners, or simply succumbing to "senioritis." It is also possible that the most senior residents preferentially selected the most critically ill patients in the ED and the increased complexity of these patients was the reason for the plateau.

We found no significant relationship between PPH and ESI. However, there was a negative relationship when evaluating PPH and CPT codes as well as the likelihood of caring for a patient who would need to be admitted. This may be because ESI is assigned at the beginning of the patient's treatment course, whereas CPT designation and admission decisions are made later in the patient's course (or after the conclusion of the encounter in the case of CPT). The ESI was also treated as a binary variable for analysis, with ESI 3 treated as a low-acuity patient. However, many of these patients may have a higher acuity illness; it is possible that this dichotomization eliminated a true effect that would otherwise have been seen. Therefore, it could reflect that ESI could not be used to accurately estimate the amount of resources and cognitive effort required to care for these patients.15

While we did not analyze the relationship between patient complexity and overall generation of RVUs, it remains an interesting avenue for future research. While it makes intuitive sense that the care of a single, more complex patient would generate more RVUs than a single, less complex patient, it is unknown whether RVU generation is balanced by the increased amount of time and cognitive load these patients often require. This was not done in the current study as this would also have depended on hospital crowding, which is a confounding variable we chose not to include.

Overall, our results suggest that the use of PPH as a surrogate measure of patient efficiency may paint an incomplete picture of resident performance. While the current study did demonstrate a statistically significant relationship between patient complexity and PPH, the clinical significance is unclear. The required number of patients seen during training represents a critically unexplored area of residency training. Experiential learning theory would suggest that seeing a greater number of patients would result in a higher level of competence, but this may be mediated by complexity or other factors. Residency leadership teams who plan to evaluate their residents on their ability to task-switch between multiple patients (ACGME Milestone PC7) may wish to explore the use of other markers that may correlate with PPH. These may better capture the complexity of the care provided, although further study is required before this can be considered best practice.

LIMITATIONS

An important limitation of this study is its single-center design. The results seen may be due to unique factors of the study site and, therefore, may not be generalizable to other sites. For example, the study site changed in 2019 from a pod-based model, which may have restricted the efficiency of some residents, to a "free-for-all" model where residents could assign themselves to new patients as soon as they were ready. Additionally, there may have been subtle changes to the patient population seen by the residents over the years, or changes to the residency, that were not assessed in the current study. For example, the final year of the study data included a few months that were affected by the COVID-19 pandemic. This would only have impacted a small portion of the final year of training for the Class of 2020. However, it may have led to the discrepancy seen in PPH between the Class of 2020 and the other included classes as seen in Figure 3. It is interesting that this did not result in a substantial change in RVUs generated. No specific documentation interventions were implemented during this time, so this may simply represent general changes in documentation practices.

We did not factor in how patients who were taken in sign-out would affect the utilized metrics. It is likely that being signed out patients requiring multiple additional actions (such as consultation calls, procedures, etc) would negatively impact residents' ability to take on new patients. These cases were excluded because it would have been unfeasible to account for how much additional work was required for these patients. For example, some patients, even those who were critically ill, may be signed out when all major diagnostic and therapeutic interventions have already been completed, and the patient is simply awaiting transfer to the hospital floor.

We did not consider patients who were specifically admitted to our step-down ICU units, or those who went directly to the operating room. While the rate of admission to these locations could certainly imply a level of complexity, the way this is determined varies greatly between institutions and would have added a significant layer of complexity to the current study. At our institution, we have two affiliated hospitals that we can admit patients to, each with different levels of capabilities and different criteria for ICU/step-down unit status. This represents an interesting avenue of future study.

We also excluded patients assigned CPT codes 99291 and 99292 (which denote critical care) from our analysis of the relationship between PPH and CPT codes. This was done as critical care billing can only be done by the attending physician, and documentation practices for this are variable within our attending group. The overall percentage of patients who received 99291 or 99292 CPT codes was only 1.6%. However, these patients were not excluded entirely and would have been included in the analysis of other metrics apart from CPT. As stated earlier, the use of multiple metrics in this study was designed to overcome limitations in individual metrics alone.

It is possible that the presence of a triage physician during peak hours of patient arrival may have impacted the metrics used in this study. While this was not specifically controlled for, the triage physician team primarily sees only the lowest acuity patients (eg, simple laceration repairs, ankle sprains, needlestick injuries) and was felt not to have a large impact on our chosen metrics. We did not wish to exclude shifts in which the triage physician was present as this timeframe represents the highest patient census in our ED. If an impact occurred, this would be expected to decrease the magnitude of the relationship between PPH and the chosen variables. Despite this, a significant effect was still demonstrated.

Finally, this numerical data does not completely encapsulate other factors that would influence a resident's overall efficiency. These factors could include their clinical abilities and medical knowledge. Because of this, we caution residency programs against looking at the variables investigated in this study in isolation when assessing their own trainees.

CONCLUSION

Residents caring for higher numbers of patients per hour saw fewer complex patients and fewer patients who required inpatient admission.

Address for Correspondence: Corlin M. Jewell, MD, University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, 800 University Bay Dr., Madison, WI 53705. Email: cmjewell@medicine.wisc.edu

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Jewell et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

REFERENCES

1. Beeson MS, Ankel F, Bhat R, et al. The 2019 Model of the Clinical Practice of Emergency Medicine. J Emerg Med. 2020;59(1):96–120.

2. Accreditation Council for Graduate Medical Education. ACGME Common Program Requirements (residency). 2022. Available at: https://www.acgme.org/globalassets/pfassets/programrequirements/cprresidency_2023.pdf. Accessed February 25, 2024.

3. Sørup CM, Jacobsen P, Forberg JL. Evaluation of emergency department performance – a systematic review on recommended performance and quality-in-care measures. Scand J Trauma Resusc Emerg Med. 2013;21:62.

4. Wiler JL, Welch S, Pines J, et al. Emergency department performance measures updates: proceedings of the 2014 Emergency Department Benchmarking Alliance Consensus Summit. Acad Emerg Med. 2015;22(5):542–53.

5. Joseph JW, Davis S, Wilker EH, et al. Modelling attending physician productivity in the emergency department: a multicentre study. Emerg Med J. 2018;35(5):317–22.

6. Kirby R, Robinson RD, Dib S, et al. Emergency medicine resident efficiency and emergency department crowding. AEM Educ Train. 2019;3(3):209–17.

7. Young JQ, Van Merrienboer J, Durning S, et al. Cognitive load theory: implications for medical education: AMEE Guide No. 86. Med Teach. 2014;36(5):371–84.

8. Graber ML, Kissam S, Payne VL, et al. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf. 2012;21(7):535–57.

9. Rothschild J, Landrigan C, Cronin JW, et al. The Critical Care Safety Study: the incidence and nature of adverse events and serious medical errors in intensive care. Crit Care Med. 2005;33(8):1694–700.

10. Hamden K, Jeanmonod D, Gualtieri D, et al. Comparison of resident and mid-level provider productivity in a high-acuity emergency department setting. Emerg Med J. 2014;31(3):216–9.

11. Douglass A, Yip K, Lumanauw D, et al. Resident clinical experience in the emergency department: patient encounters by postgraduate year. AEM Educ Train. 2019;3(3):243–50.

12. AMA CPT International. Relative value units. Available at: https://cpt-international.ama-assn.org/relative-value-units. Accessed October 25, 2024.

13. Heaton HA, Nestler DM, Jones DD, et al. Impact of scribes on patient throughput in adult and pediatric academic EDs. Am J Emerg Med. 2016;34(10):1982–5.

14. Nemenyi P. Distribution-free Multiple Comparisons. Princeton, NJ: Princeton University; 1963.

15. Sax DR, Warton EM, Mark DG, et al. Evaluation of version 4 of the Emergency Severity Index in US emergency departments for the rate of mistriage. JAMA Netw Open. 2023;6(3):e233404.

EDUCATION SPECIAL ISSUE

The Effect of Hospital Boarding on Emergency Medicine Residency Productivity

Peter Moffett, MD
Al Best, PhD
Nathan Lewis, MD
Stephen Miller, DO
Grace Hickam, MD
Hannah Kissel-Smith, MD
Laura Barrera, MD
Scott Huang, MD
Joel Moll, MD

Section Editors: Abra Fant, MD

Department of Emergency Medicine, Virginia Commonwealth University School of Medicine, Richmond, Virginia

Submission history: Submitted July 21, 2024; Revision received October 7, 2024; Accepted October 11, 2024

Electronically published November 27, 2024

Full text available through open access at http://escholarship.org/uc/uciem_westjem DOI: 10.5811/westjem.31064

Introduction: Emergency department boarding has escalated to a crisis, impacting patient care, hospital finances, and physician burnout, and contributing to error. No prior studies have examined the effects of boarding hours on resident productivity. If boarding reduces productivity, it may have negative educational impacts. We investigated the effect of boarding on resident productivity as measured by patients per hour and hypothesized that increased boarding leads to decreased productivity.

Methods: This was a retrospective study at a quaternary, urban, academic Level I trauma center from 2017–2021 with a three-year emergency medicine residency of 10–12 residents per year and annual volumes of 80,000–101,000. Boarding was defined as the time between an admission order and the patient leaving the ED. We created a multivariable mixed model with fixed covariates for year, month, day of week, resident experience, shift duration, and total daily ED patients, and with residents as repeated measures. The effect of boarding was estimated after covarying out all other factors.

Results: All variables included in the model were significantly associated with changes in productivity. Resident experience had the largest effect, such that for each month of residency experience, a resident adds 0.012 patients per hour (95% confidence interval [CI] 0.010–0.014). Isolating the effect of boarding demonstrated that for every additional 100 hours of boarding, a resident's productivity decreased by 0.022 patients per hour (95% CI 0.016–0.028). In the study, the median daily boarding was 261 hours; if this were eliminated (assuming a resident completes 100 10-hour shifts annually), a resident could be expected to see 56.9 more patients per year (95% CI 40.7–73.1).

Conclusion: Hospital boarding significantly reduces resident productivity as measured by patients per hour. Further studies are warranted to determine the educational impact. [West J Emerg Med. 2025;26(1)53–61.]

INTRODUCTION

Emergency department (ED) boarding (defined as patients admitted to the hospital but remaining in the ED) has reached critical levels and has been declared a crisis by the American College of Emergency Physicians.1 The scope of the crisis is daunting, with effects on patient care, errors, physician burnout, hospital economic stress, and ambulance diversion.2 Increased ED boarding also leads to increases in

medication errors, time to antibiotics, time to percutaneous coronary intervention for patients with myocardial infarction, time to care for patients with acute stroke, patient mortality, and risk-adjusted hospital spending, and has effects on all levels of acuity.3–10

Within the context of boarding, EDs must also provide sound educational training involving both quality and quantity of patient experiences. Residency programs seek to improve efficiency and productivity in their residents throughout their training. Many variables have been associated with resident productivity including time of shift, shift length, and resident experience.11–13 There are, however, few studies that evaluate the effect of ED crowding and boarding time on emergency medicine (EM) resident productivity.14 If boarding decreases the number of patients seen during a residency, there may be an impact on resident education.

In this study we aimed to investigate the effect of boarding on EM resident productivity as measured by patients per hour. We hypothesized that increased hospital boarding would result in decreased resident productivity.

METHODS

Study Design and Setting

This was a retrospective study conducted at the Virginia Commonwealth University Health System, the only comprehensive Level I trauma center in Richmond, VA. During the study period from January 2017–June 2021, total patient volumes ranged from 80,000–101,000 per year. On average, 30% of patients were admitted to the hospital, of whom 5% went to the intensive care unit. Patients <18 years of age constituted 22% of the total volume. The department is staffed with board-certified emergency physicians, and during the study period 81% of patients were seen by a resident. The remaining non-resident cases were seen by advanced practice practitioners (APP) in a low-acuity area of the ED or by attending physicians and were not included in the study. Throughout the study there was no change in this staffing model, such that APPs were never competing for the same patients as residents. The department has 76 beds with 35 in an acute area, 10 in trauma/resuscitation, 10 in a mid-track area, 16 in a pediatric department, and five in a fast-track zone.

Our residency program is three years in length, and class sizes ranged from 10 residents in 2017 to 12 residents in 2021. During postgraduate years (PGY) 1, 2, and 3, residents work in the ED for 26 weeks, 29 weeks, and 35 weeks, respectively. Resident shift lengths varied from 9–12 hours, with the most typical shift being 10 hours. On average, each 24-hour period had a total of 137 hours of resident coverage in overlapping shifts. The EM residents saw patients in all Emergency Severity Index (ESI) categories and were the primary physicians for all emergent patients (ESI 1 and 2). Residents cared for patients in all areas of the ED other than the

Population Health Research Capsule

What do we already know about this issue?

Emergency department boarding negatively impacts patient care, hospital efficiency, and physician well-being.

What was the research question?

Does increased ED boarding reduce emergency medicine resident productivity, as measured by patients per hour?

What was the major finding of the study?

For every additional 100 hours of ED boarding, a resident's productivity decreased by 0.022 patients per hour (95% CI 0.016–0.028); a resident sees 57 fewer patients per year due to boarding.

How does this improve population health?

Understanding the negative effects of boarding on productivity may help policymakers find solutions to improve patient flow, patient care, resident education, and overall health outcomes.

low-acuity area. All residents staff patients directly with an attending physician without oversight by a more senior resident; therefore, the productivity numbers for residents in all three years of training are independent. The study was granted exempt status by the Virginia Commonwealth University Institutional Review Board (HM20024717).

Selection of Participants

Data from all patients evaluated by an EM resident was captured in a database, and in conjunction with scheduling data it was used to determine the average number of patients per hour. Only EM residents were included. The study period was selected as this was the maximum amount of time for which data was available prior to the hospital switching to a new electronic health record. As the database was initially created to provide feedback to residents, certain data was removed and not available to us for analysis. Information from the first month of EM for each resident was not provided, and due to initial effects from the COVID-19 pandemic, data from April–July 2020 was not included.

Measurements

We combined three databases for analysis: the patient database of all ED encounters; the resident scheduling database; and the hospital boarding database.

During the study period, the EM residency program received monthly, system-generated reports listing the unique patient identifier, name of the resident assigned to care for the patient, the ESI acuity level, the date/time of first contact and checkout, and the disposition. The resident assignment was derived from tracking board data, and in scenarios where multiple residents were assigned to a patient encounter, only the first resident assigned was credited for each unique patient encounter. The EM residents were scheduled for 9-hour, 10-hour, or 12-hour shifts during the study period. All non-EM residents and staff were excluded from the patient database.

Boarding data was reported daily from hospital analytics. The number of hours of boarding was defined as the time between an admission order and when the patient left the ED. Boarding hours was selected as this was the variable available to us from the hospital analytics database.

Outcomes

We designed a model to isolate the effects of ED boarding on resident productivity as measured by patients per hour. Patients per hour was defined as the total number of new patients seen during the shift divided by the duration of the shift in hours. The covariates were chosen from those found in previous studies to be related to resident productivity.11,13,15,16 These included year, month, day of the week, cumulative residency months in training, shift duration, total patients per day, and boarding. Months in training was chosen as a continuous covariate to delineate resident experience, rather than the rough classification of PGY-1, -2, or -3, based on the observation that resident productivity begins low in the PGY-1 year, increases in the PGY-2 year, and then plateaus. This monthly experience variable was modeled using cubic regression.
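The outcome and the cubic experience covariate described above can be sketched in a few lines. This is an illustrative sketch only, not the authors' code; the function names are our own, and the centering at month 18 follows the reference values reported in the Analysis section:

```python
def patients_per_hour(new_patients: int, shift_hours: float) -> float:
    """Shift-level productivity: new patients seen divided by shift length."""
    return new_patients / shift_hours

def experience_terms(months_in_training: int) -> tuple:
    """Cubic-regression terms for resident experience, centered on month 18
    (the reference month used in the paper's model)."""
    m = months_in_training - 18
    return (m, m**2, m**3)
```

For example, a resident who sees 11 new patients on a 10-hour shift has a productivity of 1.1 patients per hour, and a resident in month 18 contributes (0, 0, 0) to the experience terms.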

Analysis

We described the data using counts and percentages. Patients per hour was modeled using a multivariable mixed model, with covariates defined as fixed effects and residents as repeated measures. We used an autoregressive (AR1) covariance structure to account for the dependence between repeated measures. The fixed effects were year (reference = 2019), month (reference = 12), day of the week (reference = Thursday), resident month in training (centered on 18), total patients per day/100, shift duration, and daily boarding hours/100. We chose the year 2019 as a reference as it was the last full year of data prior to the start of the COVID-19 pandemic. December was chosen as it aligns with the 18th month of residency, which is when productivity plateaued in our model. Thursday was selected as it is thought to represent the day with the most ideal flow since it avoids weekend, Monday, and Friday patient surges, as well as Wednesday morning didactics when EM residents are not working clinically. The total patients per day, shift duration, and

boarding hours were referenced at the median values in our dataset.

We estimated the effect of boarding from the marginal regression model after covarying out all other factors. Estimates are described using 95% confidence intervals. All data management and analysis were performed using SAS software (version 9.4) and JMP Pro (version 17.2) (SAS Institute Inc, Cary, NC).
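The authors fit the model in SAS; as a simplified illustration (not the authors' code) of how the marginal prediction responds to the boarding covariate, the reference-scenario value reported in the Results (1.10 patients per hour at the 261-hour daily median) and the reported boarding coefficient imply:

```python
PPH_AT_MEDIAN = 1.10    # reference-scenario prediction (Results, Figure 2A)
BETA_BOARDING = -0.022  # patients/hour per additional 100 daily boarding hours
MEDIAN_BOARDING = 261   # median daily boarding hours in the study

def predicted_pph(boarding_hours: float) -> float:
    """Marginal prediction with all other covariates held at reference values."""
    return PPH_AT_MEDIAN + BETA_BOARDING * (boarding_hours - MEDIAN_BOARDING) / 100

print(round(predicted_pph(0), 2))
```

With these rounded inputs, eliminating boarding predicts about 1.16 patients per hour; the paper's Figure 2 reports 1.15, computed from unrounded model estimates.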

RESULTS

Characteristics of Study Subjects

During the study period, 263,058 patients were seen in the ED by 601 clinicians, including the 80 EM residents studied. During the 49 months studied between 2017–2021, EM residents were scheduled for 16,949 shifts and were assigned 188,685 patients (Table 1). Total daily patient volume varied considerably during this time (mean 177, SD 26, range

Table 1. Characteristics of the emergency department residents' shifts and patients evaluated (January 2017–June 2021).

Characteristic         Shifts N    Patients N (%)
Total                  16,949      188,685
Year
  2017                 3,496       44,119 (23)
  2018                 3,955       47,569 (25)
  2019 (11 months)*    4,053       47,035 (25)
  2020 (8 months)†     3,101       29,191 (15)
  2021 (6 months)      2,344       20,771 (11)
Month
  1 - January          1,909       21,052 (11)
  2 - February         1,576       18,004 (10)
  3 - March            1,680       18,901 (10)
  4 - April†           1,302       15,229 (8)
  5 - May†             1,371       16,385 (9)
  6 - June†            1,337       15,191 (8)
  7 - July†            820         10,129 (5)
  8 - August           1,560       15,543 (8)
  9 - September        1,376       14,741 (8)
  10 - October         1,431       15,299 (8)
  11 - November*       1,062       11,639 (6)
  12 - December        1,525       16,572 (9)
Day of week
  Sunday               2,249       25,887 (14)
  Monday               2,679       29,099 (15)
  Tuesday              2,756       29,504 (16)
  Wednesday‡           1,989       21,970 (12)
  Thursday             2,601       27,874 (15)
  Friday               2,525       28,785 (15)
  Saturday             2,150       25,566 (14)
Shift
  7 AM to 5 PM         1,688       16,332 (9)
  7 AM to 7 PM         180         2,512 (1)
  9 AM to 7 PM         2,546       28,306 (15)
  12 PM to 10 PM       3,386       38,586 (20)
  2 PM to 12 AM        2,470       28,631 (15)
  3 PM to 12 AM        3,553       41,138 (22)
  9 PM to 7 AM         3,126       33,180 (18)
PGY
  PGY-1§               5,162       44,817 (24)
  PGY-2                4,756       57,447 (30)
  PGY-3                7,031       86,421 (46)
Disposition
  Admitted                         74,663 (40)
  Discharged                       114,022 (60)

*November 2019 was excluded as the hospital information management system was down.
†April 2020 through July 2020 was excluded due to COVID-19 and hospital changes.
‡Wednesday mornings are resident didactics.
§The first month of residency was excluded (orientation month).
ESI, Emergency Severity Index; PGY, postgraduate year.

88–263). As indicated in the table, patient volumes varied across years, months, days of the week, shifts, and PGY levels. Of all 188,167 patients seen by EM residents, 40% were admitted.

Boarding hours per day varied considerably (mean 281, SD 127, range 50.8–914.4; Figure 1). The hospital information system calculated boarding hours daily; however, across the 1,490 days studied, there were six impossible (negative) values and nine very low values. Low values were identified by large residuals in the multiple regression model. Rather than treating these as missing values, we used a multiple regression model to impute the 15 values in question.

Main Results

All the factors in the repeated-measures mixed model were significant (P < 0.001). Table 2 shows the estimated effect of each term in the model. The joint effect of all the factors on resident productivity is shown in Figure 2. These profile plots show the marginal model-predicted value of resident productivity on the vertical axis across all the covariates on the separate horizontal axes. The importance

Figure 1. Boarding across study years.

Line set at median boarding hours across the entire study period (261 hours/day).

Each box plot represents a month (line = median, box = 25th to 75th quartile, whiskers = typical extremes, circles = outliers).

Note: April 2020–July 2020 hours are not available as they correspond to the beginning of the COVID-19 pandemic.

of a factor is visualized by the steepness of the prediction trace.

Isolating the effect of boarding demonstrated that for every additional 100 hours of daily departmental boarding, individual resident productivity decreased by 0.022 patients per hour (95% confidence interval [CI] 0.016–0.028, Table 2). In the reference standard scenario, a resident could be expected to see 1.10 patients per hour with boarding at the daily median (261 hours) but could see 1.15 patients per hour if boarding were eliminated (Figure 2, Panel C). Table 3 shows how resident productivity was degraded by boarding across the range of values seen at our institution. A resident would see 1.14 patients per hour when boarding was at the lowest level in the study compared to 0.95 patients per hour at the maximum level of boarding, a difference of 0.19 patients per hour (95% CI 0.15–0.22). Assuming a resident completes approximately 100 shifts a year of 10 hours duration and boarding were eliminated, a resident could be expected to see 56.9 more patients per year (95% CI 40.7–73.1). This would represent a 5% increase in patient volume per resident annually. Resident experience had the largest effect on resident productivity. Resident productivity was low initially at 0.5 patients per hour (95% CI 0.46–0.54) by the second month of training (Figure 2). Improvement was initially rapid to 0.75 patients per hour at seven months, then plateaued near the 18-month point (1.10 patients per hour) to finally reach 1.12 patients per hour at the end of the 36 months (95% CI 1.08–1.17). When evaluating our data by PGY level, our

Table 2. Multiple regression results predicting new patients per hour per resident for each variable.

Effect               Estimated new patients per hour*    Standard error    95% CI
Year
  2019 [reference]
Month
  1 - January        0.0635                              0.0172            0.0298 to 0.0972
  2 - February       0.0776                              0.0182            0.0420 to 0.1133
  3 - March          0.0498                              0.0181            0.0144 to 0.0852
  4 - April          0.0840                              0.0197            0.0453 to 0.1227
  5 - May            0.0750                              0.0196            0.0366 to 0.1133
  6 - June           0.0585                              0.0201            0.0191 to 0.0979
  7 - July
  8 - August
  9 - September
  10 - October
  12 - December [reference]
Day of the week

The mixed model also included resident as a repeated effect with an AR(1) covariance structure.
*Continuous covariates were referenced to the median value. Median resident month = 18, total patients per day/100 = 1.77, shift duration = 10 hours, boarded hours/100 = 2.61.
CI, confidence interval.

PGY-1 residents saw 0.75 patients per hour, PGY-2 residents saw 1.10 patients per hour, and PGY-3 residents saw 1.12 patients per hour.
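The annual figure of 56.9 additional patients quoted above follows from the boarding coefficient by simple arithmetic; a quick check using the rounded coefficient (the paper's 56.9 comes from unrounded model estimates, so the check differs slightly):

```python
EFFECT_PER_100H = 0.022   # patients/hour lost per 100 daily boarding hours
MEDIAN_BOARDING = 261     # median daily boarding hours in the study
ANNUAL_HOURS = 100 * 10   # assumed ~100 shifts/year at 10 hours/shift

pph_recovered = EFFECT_PER_100H * MEDIAN_BOARDING / 100  # ~0.057 patients/hour
extra_patients = pph_recovered * ANNUAL_HOURS            # ~57 patients/year
```

This recovers roughly 57 patients per resident per year, consistent with the reported 56.9 (95% CI 40.7–73.1).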

Total patients per day presenting to the ED was the next most important factor in resident productivity. For every 100 new patients presenting to the ED, an individual resident

would be expected to add 0.40 patients per hour (95% CI 0.37–0.43). The median value for daily total patient volume was 177 patients per day, but a low-volume day at the 10th percentile (143 total patients) resulted in a corresponding decrease in resident productivity to 0.96 patients per hour (95% CI 0.92–1.00). For a high-volume day at the 90th

Figure 2. Multiple regression results predicting new patients per hour per resident for each variable.
All values (year, month, day of week, EM resident months, total patients, shift duration) in the model are held at reference standards with adjustments to boarding (last panel of each graph). Expected patients per hour in each scenario is indicated by the red number on the Y axis with 95% confidence intervals in blue. As boarded hours change (last panel of each graph) so do patients per hour (red number to left of each graph) in each of the three scenarios (A: median boarding of 261 hours; B: reducing boarding by 100 hours; C: eliminating boarding hours).

percentile (210 patients), resident productivity increased to 1.23 patients per hour (95% CI 1.19–1.26).

Resident productivity also changed based on the year, shift duration, and day of the week. Resident

productivity was highest in 2017 at 1.25 patients per hour (95% CI 1.21–1.28) and steadily decreased to the 0.93 patients per hour seen in 2021. Resident productivity for a nine-hour shift was predicted to be 1.21 patients per hour

Table 3. Estimated resident productivity by boarding hours.

Cutoff    Boarded (hours)    Estimated patients per hour    Standard error    95% CI

Marginal estimates from the mixed model with the following factors held constant: year = 2019, month = 12, day of the week = 5 (Thursday), resident month in training = 18, total patients per day/100 = 1.77, shift duration = 10 hours. CI, confidence interval.

(95% CI 1.19–1.26), whereas for a 12-hour shift it was predicted to be 0.84 patients per hour (95% CI 0.80–0.89). Saturdays and Wednesdays averaged approximately 1.21 patients per hour; Sundays and Fridays approximately 1.15 per hour; and Mondays, Tuesdays, and Thursdays 1.10 patients per hour.

Month-to-month variability had the smallest effect on resident productivity. Compared with the other months, July and December had lower resident productivity (1.09 vs 1.16 patients per hour).

DISCUSSION

To our knowledge, this is the first study to demonstrate that there is a significant reduction in resident productivity (measured as patients per hour) due to hospital boarding in the ED. In our model, this resulted in a decrease of 0.022 patients per hour (95% CI 0.016–0.028) for every 100 hours of daily boarding. While performed at a single institution, our dataset broadly aligns with multiple studies previously completed regarding resident productivity. In our study, we analyzed resident experience as the number of months in training rather than divided into PGY level. This was based on our observation that productivity rapidly increased during the PGY-1 year and then plateaued in the middle of the PGY-2 year.

When evaluating our data by PGY level, our PGY-1 residents saw 0.75 patients per hour, PGY-2 residents saw 1.10 patients per hour, and PGY-3 residents saw 1.12 patients per hour. Prior studies have demonstrated similar patterns, with PGY-1 to -3 residents seeing between 0.79–0.81 patients per hour, 1.05–1.2 patients per hour, and 1.22–1.27 patients per hour, respectively.17–19 A study by Henning et al showed rapid progression from the PGY-1 to PGY-2 year and then gradual progression in the PGY-3 year but was based on patients per day.20 Similarly, a study by Turner-Lawrence and Todd saw increasing productivity from 1.2 patients per hour to 1.5 patients per hour to 1.6 patients per hour for PGY-1 to -3 residents, respectively.13 While these productivity numbers are higher than those in our study, the authors did not adjust for additional variables.

In a more comparable study, Kirby et al reported the efficiency of EM residents during ED crowding.14 The authors used the National Emergency Department Overcrowding Study (NEDOCS) scoring system to categorize states in the ED as not crowded, crowded, and overcrowded. They found that resident productivity measured as new patients per hour increased initially in all year groups as the ED transitioned from not crowded to crowded, but then remained stable when transitioning from crowded to overcrowded. While the NEDOCS score uses a measure of ED boarding (the waiting time of the longest admitted patient), it does not include total patient boarding hours as in our study. Our study more directly examines the effect of boarding (one element of crowding) on resident

productivity. The paradoxical increase in resident productivity in the Kirby study may have been due to an increased number of patients presenting to the ED, which could have increased the NEDOCS score. Our study demonstrated that resident productivity increased with higher patient volumes, and including this in our model allowed us to better isolate the effect of boarding.

According to the annual benchmark survey by the Academy of Administrators in Academic Emergency Medicine and the Association of Academic Chairs of Emergency Medicine, boarding times have dramatically increased since the COVID-19 pandemic.21 By the end of their study period, the median number of boarding hours per month was 11,480, which approximates to 382 hours of daily boarding. In our study, which includes a pre-pandemic period, the median daily boarding was 261 hours, suggesting that boarding is likely worsening over time and is a problem at many academic medical centers.

The educational impact of decreased patient volumes caused by boarding is uncertain. It is reasonable to expect that residents seeing fewer cases may lose valuable learning opportunities, but this has not been well studied, and no firm numbers exist to suggest a threshold at which education suffers. Prior authors have surveyed residents regarding a perceived decrease in education during crowding.22,23 These studies concluded that residents did not perceive a difference in education during these times, but they used differing measures of crowding, were survey-based, and were underpowered. Educators may switch to different models of teaching during periods of high boarding, leading to residents perceiving a less deleterious effect.24

Others have postulated an educational Starling effect whereby some boarding allows supervising physicians more time to teach, but at some point there are diminished returns as fewer new patients become available to discuss.25 A more recent study was conducted during the current boarding epidemic; the authors surveyed EM program directors regarding their perceptions of the impact of boarding on resident training.26 In this study, 80% of the respondents felt that boarding negatively affected resident education, especially in the domains of managing department throughput and managing high volumes of patients per resident. While survey-based in nature, the study results broadly align with the prior studies in this area.

Theoretically, residents who see fewer cases may lose valuable learning opportunities. While the components of Bloom's domains of educational activities can be learned via different modalities of instructional techniques, clinical experience allows for the linking of knowledge to skills and then to attitudes/emotions.27 By decreasing a learner's exposure to patients, one could argue that residents may lose valuable experiential learning opportunities. While some of these can be replicated in simulation or case-based discussion, other skills cannot and are best learned via hands-on, experiential learning encounters. Experiential learning theory, as described by Kolb, highlights the importance of real-life experience and the influence this has on learning.28 Unlike traditional learning and instructional methodology that focuses on rote memorization, experiential learning is an active process where residents are engaged in concept transformation through action as well as reflection on their experiences and patient encounters.

This learning theory also emphasizes principles of adult education in which prior learning experiences can be leveraged to create more meaningful and relevant educational experiences.29 Additionally, decreasing patient interaction may also affect residents' application and translation of knowledge into practice. Behavioral learning theory emphasizes learning through interactions with the environment, where reinforcement and feedback can encourage modification of behaviors. By incorporating behavioral learning strategies, medical education can foster not only technical competencies but also the development of professional habits such as effective communication between team members and patients.30

LIMITATIONS

This study has several limitations. This was a single-center study that took place in a high-acuity, quaternary-care center that also experiences high levels of boarding, which may limit generalizability to other centers. The database that captured the resident patient assignment was based on tracking board data and may have occasionally miscredited a resident with a patient encounter; however, as the dataset was large and involved multiple years with complete datasets for three full classes of residents, this is unlikely to have greatly influenced the data. Our resident class size did increase during the 2021 year and thus could theoretically have decreased the number of patients available per resident. While we did not study that directly, it is unlikely to have impacted the data greatly, as the additional residents allowed for the creation of an outside rotation at a free-standing emergency center and, therefore, resident staffing hours stayed generally consistent at the study site.

Our model did not include a measure of patient acuity as a covariate. While the ESI category and disposition were recorded for each patient, we did not feel there was a reliable way to convert this data into a meaningful measure of hourly acuity that influenced the amount of time a resident might dedicate toward patient care. For example, an ESI-1 patient who is admitted for an ST-segment elevation myocardial infarction may stay in the department for 15 minutes, leaving the bed open for a new patient, while an ESI-3 patient requiring a workup for abdominal pain including imaging who is discharged may occupy a room and a resident for multiple hours. Since our dataset was large, it was assumed that all residents would be exposed equally to the same mix of acuities on individual shifts by the end of their residency,

thus limiting the effect on the data. Additionally, recent studies have called into question the accuracy of the ESI.31,32 A prior study on resident productivity did not show a correlation between ESI and clinician disposition times.14

Our study also included data from the COVID-19 pandemic, which affected patient volumes and ED boarding. The dataset we used was initially meant for reporting individual residents' productivity measures, so data from the first few months of the pandemic was not available for our current study. This likely served to decrease the effect of the initial pandemic response on our data. Just prior to the pandemic our ED had seen a growth in patient volumes from 87,000 patients per year to a peak of 101,000 patients per year, which was followed by a rapid decline to 83,000 a year in the 2021–2022 year. The volumes did slowly rise after the study period. This may have influenced some of the data from our later resident-year groups and served to decrease productivity.

Our measure of boarding may also have limitations. Total boarding hours per day was the variable available from our hospital analytics department. The number of boarded patients per day may have provided different data. For example, in our model a single behavioral health patient boarding for 20 hours on one day would be indistinguishable from 20 patients boarding in 20 individual rooms for a single hour each. As the dataset is large, and all residents were exposed to the same conditions throughout their time, it is unlikely any one resident's data (or the trend) would be affected by these types of outliers.

CONCLUSION

We found a significant reduction in resident productivity as measured by patients per hour during periods of increased boarding. Further studies are warranted to determine the educational impact of these findings.

Address for Correspondence: Peter Moffett, MD, Virginia Commonwealth University, Department of Emergency Medicine, 1250 E Marshall Street, Suite 600, Richmond, VA 23298. Email: peter.moffett@vcuhealth.org

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources, and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Moffett et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

REFERENCES

1. American College of Emergency Physicians. Emergency department boarding and crowding. Available at: https://www.acep.org/administration/crowding–boarding/. Accessed December 29, 2022.

2. Kelen GD, Wolfe R, D'Onofrio G, et al. Emergency department crowding: the canary in the health care system. NEJM Catalyst Innovations in Care Delivery. 2021. Available at: https://catalyst.nejm.org/doi/full/10.1056/CAT.21.0217. Accessed April 4, 2022.

3. Kulstad EB, Sikka R, Sweis RT, et al. ED overcrowding is associated with an increased frequency of medication errors. Am J Emerg Med. 2010;28(3):304–9.

4. Chatterjee P, Cucchiara BL, Lazarciuc N, et al. Emergency department crowding and time to care in patients with acute stroke. Stroke. 2011;42(4):1074–80.

5. Kulstad EB and Kelley KM. Overcrowding is associated with delays in percutaneous coronary intervention for acute myocardial infarction. Int J Emerg Med. 2009;2(3):149–54.

6. Fee C, Weber EJ, Maak CA, et al. Effect of emergency department crowding on time to antibiotics in patients admitted with community-acquired pneumonia. Ann Emerg Med. 2007;50(5):501–9, 509.e1.

7. Baloescu C, Kinsman J, Ravi S, et al. The cost of waiting: association of ED boarding with hospitalization costs. Am J Emerg Med. 2021;40:169–72.

8. Richardson DB. Increase in patient mortality at 10 days associated with emergency department overcrowding. Med J Austral. 2006;184(5):213–6.

9. McCarthy ML, Zeger SL, Ding R, et al. Crowding delays treatment and lengthens emergency department length of stay, even among high-acuity patients. Ann Emerg Med. 2009;54(4):492–503.e4.

10. White BA, Biddinger PD, Chang Y, et al. Boarding inpatients in the emergency department increases discharged patient length of stay. J Emerg Med. 2013;44(1):230–5.

11. Joseph JW, Henning DJ, Strouse CS, et al. Modeling hourly resident productivity in the emergency department. Ann Emerg Med. 2017;70(2):185–90.e6.

12. Jeanmonod R, Jeanmonod D, Ngiam R. Resident productivity: does shift length matter? Am J Emerg Med. 2008;26(7):789–91.

13. Turner-Lawrence D and Todd BR. Monthly progression of emergency medicine resident efficiency: what can we expect of our residents throughout training? J Emerg Med. 2019;57(1):77–81.

14. Kirby R, Robinson RD, Dib S, et al. Emergency medicine resident efficiency and emergency department crowding. AEM Educ Train. 2019;3(3):209–17.

15. Jeanmonod R, Brook C, Winther M, et al. Resident productivity as a function of emergency department volume, shift time of day, and cumulative time in the emergency department. Am J Emerg Med. 2009;27(3):313–9.

16. Joseph JW, Davis S, Wilker EH, et al. Modelling attending physician productivity in the emergency department: a multicentre study. Emerg Med J. 2018;35(5):317–22.

17. DeBehnke D, O'Brien S, Leschke R. Emergency medicine resident work productivity in an academic emergency department. Acad Emerg Med. 2000;7(1):90–2.

18. Fredette J, Kim T, McHugh D, et al. A descriptive analysis of emergency medicine resident productivity over the course of training. AEM Educ Train. 2021;5(S1):S44–8.

19. Jeanmonod R, Damewood S, Brook C. Resident productivity: trends over consecutive shifts. Int J Emerg Med. 2009;2(2):107–10.

20. Henning DJ, McGillicuddy DC, Sanchez LD. Evaluating the effect of emergency residency training on productivity in the emergency department. J Emerg Med. 2013;45(3):414–8.

21. Kilaru AS, Scheulen JJ, Harbertson CA, et al. Boarding in US academic emergency departments during the COVID-19 pandemic. Ann Emerg Med. 2023;82(3):247–54.

22. Mahler SA, McCartney JR, Swoboda TK, et al. The impact of emergency department overcrowding on resident education. J Emerg Med. 2012;42(1):69–73.

23. Pines JM, Prabhu A, McCusker CM, et al. The effect of ED crowding on education. Am J Emerg Med. 2010;28(2):217–20.

24. Atzema C, Bandiera G, Schull MJ, et al. Emergency department crowding: the effect on resident education. Ann Emerg Med. 2005;45(3):276–81.

25. Shayne P, Lin M, Ufberg JW, et al. The effect of emergency department crowding on education: blessing or curse? Acad Emerg Med. 2009;16(1):76–82.

26. Goldflam K, Bradby C, Coughlin RF, et al. Is boarding compromising our residents' education? A national survey of emergency medicine program directors. AEM Educ Train. 2024;8(2):e10973.

27. Bloom BS, Engelhart MD, Furst EJ, et al. Taxonomy of Educational Objectives: Cognitive and Affective Domains. New York, NY: David McKay Company, Inc; 1956.

28. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice-Hall; 1984.

29. Schultz K, McEwen L, Griffiths J. Applying Kolb's learning cycle to competency-based residency education. Acad Med. 2014;91(2):284.

30. Merriam S. Adult Learning: Linking Theory and Practice. San Francisco, CA: Jossey-Bass; 2013.

31. Sax DR, Warton EM, Mark DG, et al. Evaluation of the emergency severity index in US emergency departments for the rate of mistriage. JAMA Netw Open. 2023;6(3):e233404.

32. Mistry B, Stewart De Ramirez S, Kelen G, et al. Accuracy and reliability of emergency department triage using the emergency severity index: an international multicenter assessment. Ann Emerg Med. 2018;71(5):581–7.e3.

EDUCATION SPECIAL ISSUE - ORIGINAL RESEARCH

A National Survey of Emergency Medicine Residency Program Directors

Mary McLean, MD*

Justin Stowens, MD†

Ryan Barnicle, MD, MSEd‡

Negar Mafi, MD§

Kaushal Shah, MD∥

*AdventHealth East Orlando, Department of Emergency Medicine, Orlando, Florida

†ChristianaCare Health System, Department of Emergency Medicine, Newark, Delaware

‡The Warren Alpert Medical School of Brown University, Department of Emergency Medicine, Providence, Rhode Island

§San Joaquin General Hospital, Department of Emergency Medicine, French Camp, California

∥Weill Cornell Medicine, Department of Emergency Medicine, New York, New York

Section Editors: Jules Jung, MD and Sharon Bord, MD

Submission history: Submitted April 11, 2024; Revision received November 18, 2024; Accepted November 19, 2024

Electronically published December 31, 2024

Full text available through open access at http://escholarship.org/uc/uciem_westjem DOI: 10.5811/westjem.20787

Introduction: The utility of the three-part bedside oculomotor exam HINTS (head impulse test, nystagmus, test of skew) in the hands of emergency physicians remains under debate despite being supported by the most recent literature. Educators historically lack consensus on how specifically to teach this skill to emergency medicine (EM) residents, and it is unknown whether and how EM residency programs have begun to implement HINTS training into their curricula. We aimed to characterize the state of HINTS education in EM residency and develop a needs assessment.

Methods: In this cross-sectional study, we administered a survey to EM residency directors, the themes of which centered around HINTS education perceptions, practices, resources, and needs. We analyzed Likert scales with means and 95% confidence intervals for normally distributed data, and with medians and interquartile ranges for non-normally distributed data. Frequency distributions, means, and standard deviations were used in all other analyses.
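The normal-versus-non-normal summary rule described here can be sketched with Python's standard library. This is an illustrative sketch, not the authors' code, and it uses a large-sample normal approximation for the 95% CI rather than a t-interval:

```python
from statistics import NormalDist, mean, quantiles, stdev

def likert_summary(scores: list, normal: bool) -> dict:
    """Mean with a normal-approximation 95% CI for roughly normal data,
    else median with interquartile range."""
    if normal:
        half_width = NormalDist().inv_cdf(0.975) * stdev(scores) / len(scores) ** 0.5
        m = mean(scores)
        return {"mean": m, "ci95": (m - half_width, m + half_width)}
    q1, med, q3 = quantiles(scores, n=4)  # exclusive-method quartiles
    return {"median": med, "iqr": (q1, q3)}
```

For example, `likert_summary([1, 2, 3, 4, 5], normal=True)` reports a mean of 3 with its CI, while the non-normal branch reports the median and quartile bounds instead.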

Results: Of 250 eligible participants, 201 (80.4%) responded and consented. Of the 192 respondents providing usable data, 149/191 (78.0%) believed the HINTS exam is valuable to teach; 124/192 (64.6%) reported HINTS educational offerings in conference; and 148/192 (77.1%) reported clinical bedside teaching by faculty. The most-effective educational modalities were clinical bedside teaching, online videos, and simulation. Subtopic teaching struggles with regard to HINTS were head impulse test and test-of-skew conduction and interpretation, selection of the correct patients, and overall HINTS interpretation. Teaching barriers centered around lack of faculty expertise, concern for poor HINTS reproducibility, and lack of resources. Leadership would dedicate a mean of 2.0 hours/year (SD 1.3 hours/year) to implementing a formal, standardized HINTS curriculum.

Conclusion: Despitecontroversysurroundingtheutilityof theHINTSexaminEM,mostresidencydirectors believeitisimportanttoteach.Thisneedsassessmentcanguidedevelopmentofformaleducationaland simulationcurriculafocusingonresidencydirectors’ citedHINTSexameducationalstruggles,barriers,and reportedmost-effectiveteachingmodalities.[WestJEmergMed.2025;26(1)70–77.]

INTRODUCTION

Background

Posterior stroke presenting with dizziness is misdiagnosed by emergency physicians (EP) in 35% of cases,1 which can lead to severe debilitation and sometimes death.2 Paradoxically, of patients discharged from the emergency department (ED) with a diagnosis of dizziness or vertigo, only 1 in 500 is diagnosed with a stroke within the first month.3 With advances in stroke treatment modalities, it makes sense that there is heightened emphasis on detection. In 2013, the annual cost of imaging for acute dizziness in United States EDs was nearly $4 billion.4 Much of this cost is due to utilization of non-contrast computed tomography (CT) of the head5 despite its low sensitivity for detecting posterior fossa stroke (mean 41.8%, 95% confidence interval 30.1–54.4%)6 and the low lifetime cost-effectiveness compared to magnetic resonance imaging (MRI).7 Specifically, it is estimated that over $1 billion per year is wasted on inappropriate CT imaging for patients with dizziness/vertigo.8

A worldwide survey of EPs published in 2008 found that the development of a better clinical decision rule for identification of central vertigo was the second highest clinical priority for participants.9 Management of dizziness and vertigo is included in the Joint Task Force Emergency Medicine Model of Clinical Practice,10 and EPs are expected to diagnose and manage patients with these chief complaints. It is, therefore, incumbent upon EM residency programs to provide adequate education and training on dizziness and vertigo. However, a 2005 study found that only 35% of EM residency programs required a clinical neurology or neurosurgery rotation, and an annual mean of 12 hours (of 280 total didactic education hours) was dedicated to neurologic emergencies.11 It is unknown how much of this time is devoted specifically to dizziness and vertigo, or exactly what is being taught regarding appropriate history, physical, and diagnostic workup recommendations.

The clinical HINTS exam (head impulse test, nystagmus, test of skew)12 is a three-part bedside oculomotor exam with diagnostic accuracy for central vertigo similar to that of MRI. A 2023 Cochrane Review of 12 studies and 1,890 participants found the clinical HINTS exam to be 94% sensitive and 87% specific.13 This exam may be appealing to the EP because it is purported to be a rapid and low-cost bedside evaluation. However, literature suggests that its diagnostic accuracy has fallen short for EPs using the HINTS exam in clinical practice, with findings suggesting that the reasons are application to inappropriate patients (eg, those without acute vestibular syndrome and nystagmus) and difficulty in interpreting head impulse test (HIT) results.14,15 In addition, the literature has shown poor inter-rater reliability among EPs using the HINTS exam.16 With these concerns in mind, two other clinical decision tools have since built on HINTS principles. The first is the HINTS "plus" tool, which adds a hearing test (95.3% sensitive and 72.9% specific).13 The second is the STANDING (spontaneous nystagmus, direction, head impulse test) algorithm, which uses two parts of the HINTS exam and additional physical exam maneuvers (93–100% sensitive and 72–94% specific).17–19

Population Health Research Capsule

What do we already know about this issue?

When properly used, the HINTS exam has high diagnostic accuracy for central causes in dizzy patients, but the state of HINTS education in emergency medicine (EM) is inadequately characterized.

What was the research question?

What are program leadership perceptions, educational practices, and barriers to teaching HINTS in EM residencies?

What was the major finding of the study?

78.0% of program leaders believe the 3-part oculomotor exam is valuable to teach, and 64.6% offer formal HINTS education sessions.

How does this improve population health?

Teaching HINTS to EM residents requires improved curricula, resources, and faculty expertise. Better education may help translate promising HINTS literature into clinical practice.

The 2023 American College of Emergency Physicians (ACEP) Clinical Practice Guideline offers specific HINTS exam recommendations and cautions: "Before employing a maneuver such as HINTS, physicians should have sufficient education to perform the technique; not using tools such as HINTS may lead to excessive testing and admission; and incorrect implementation may lead to an increased risk of misdiagnosis."20 In addition to ACEP's recommendations, in 2023, the Society for Academic Emergency Medicine released Guidelines for Reasonable and Appropriate Care in the Emergency Department (GRACE-3): Acute Dizziness and Vertigo in the Emergency Department. They had similar recommendations that EP education should involve the following: "receive training in the HINTS exam; use the HINTS exam (once properly trained) in patients with nystagmus; and consider the HINTS exam as the first-line test over MRI (if a HINTS-trained clinician is available)."21,22 The authors of GRACE-3 also acknowledged a discordance in that most EPs have not received special training in the use of the HINTS exam. This lack of special training may have led to the HINTS testing inaccuracies reported in the recent literature.14,15 This begs the question of which, if any, educational tactics have been effective.

From the recent GRACE-3 guidelines21 and releases by EM societies,10,20 there is a clear call for EM HINTS education and HINTS exam integration into the EM clinical arena. However, the current state of HINTS exam acceptance, education, and training is unclear. If HINTS curricular implementation has occurred, information about the needs, barriers, teaching struggles, and educator perspectives may add further weight to the argument for our specialty's overall acceptance of the HINTS exam.

Importance

The standard of care for the ED evaluation of dizzy patients may be evolving to embrace the HINTS exam, but translation of the literature to clinical practice remains unclear. It is also unclear what proportion of EPs have been adequately trained in the use of the HINTS exam. Furthermore, residency programs may lack the faculty expertise, time, and funding to add new items such as HINTS education to their curricula. Programs that have adopted the societal guidelines addressing the HINTS exam may have already adjusted their didactic and simulation content. Supporters of the HINTS exam will recognize the importance of a needs assessment with regard to residency efforts and perceived challenges and barriers to dizziness evaluation and HINTS education. Skeptics will find the knowledge of current HINTS teaching paradigms useful to determine their own practice and the potentially evolving standard of care.

Goals of This Investigation

While recent research supports a need for change in our ED clinical practice, it has yet to be assessed whether these ideas are currently being taught within EM residency programs, and if so, how they are being taught. Our goal in this investigation was to assess current United States EM residency program leadership perspectives, teaching paradigms, teaching barriers, and future needs for implementing educational curricula on assessment of the dizzy patient, with a particular focus on the HINTS exam. The results of this educational needs assessment can serve to guide and refine the construction of educational resources including didactic and simulation modalities.

METHODS

Study Design and Setting

This was a cross-sectional observational study in a virtual setting. Participants were offered no incentives, there was no funding, and the study was institutional review board-approved as exempt. An electronic survey was administered to EM residency directors between April 6–July 13, 2023.

The study was conducted in compliance with STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) cross-sectional reporting guidelines.23

Selection of Participants

Included were current program directors for categorical EM residency programs in the US. Excluded were program directors from residency programs that received initial accreditation from the Accreditation Council for Graduate Medical Education on or after January 1, 2020. The rationale for this exclusion was that new programs were less likely to have administered an entire educational curriculum cycle. The target population included 250 program leaders (one from each eligible program). Program director contact information was obtained from medical society databases and residency program websites. While both work and personal emails were often publicly available, we prioritized making contact via work emails. See Appendix A for the participant recruitment message.

Survey Development

The survey instrument was developed, tested, and validated using a rigorous process with close guidance and leadership from seasoned national medical education experts via a formal Medical Education Research Certification program through the Council of Residency Directors in Emergency Medicine. We followed the systematic, seven-step protocol for developing medical education research questionnaires described by Artino et al.24 Formal focus groups were used to propose, discuss, and rework survey items using an iterative process until consensus was reached regarding face validity and internal consistency. The survey was piloted by a group of 20 members of the nonprofit medical education alliance ALLNYCEM (consisting of EP medical educators, residency leadership members, and resident education fellows) for feedback on clarity and usability. The sole consensus recommendation was to shorten the survey, which was done prior to national distribution. Final survey items included program/institution demographics and questions about perceptions and practices regarding dizziness, vertigo, and HINTS exam education within each residency program. See Appendix B for a copy of the complete survey tool.

Study Protocol

We used the electronic platform SurveyMonkey (SurveyMonkey Enterprise, San Mateo, CA) to distribute the survey and collect data. The 250 program directors were initially contacted individually via email with the recruitment message and their personalized survey link. Subsequent contact attempts (required for 235 program directors) were made for non-responses or incomplete surveys. At the end of the data collection period, all complete and partial surveys were included in analysis if the participant provided data beyond the informed consent question. Except for the informed consent question, no survey question was required. This allowed participants to opt out of answering specific questions if they wished while still enabling them to participate. Missing data from participants who opted out of a question was not included in the calculations for subsequent statistical analysis for that item.

Outcomes

Intended outcomes centered around residency directors' HINTS exam perceptions as well as current HINTS educational practices within residency programs, resources available, and curricular needs. The purpose of gathering information on these outcomes was to generate a needs assessment for dizziness and HINTS exam curricula in EM residencies.

Analysis

We analyzed data using R version 4.3.2 for MacOS (R Foundation for Statistical Computing, Vienna, Austria). Likert-scale data was analyzed using medians and interquartile ranges for non-normal data distributions or using means and 95% confidence intervals (CI) for normal data distributions. We tested normality of data distributions by examining estimates of skewness and kurtosis for each scale, as well as by plotting histograms and comparing distributions to the normal curve. Normality was concluded only if all estimates of skewness and kurtosis fell below the thresholds of 2 and 7, respectively, and all histograms aligned closely with the normal curve.25 We used the Wilson score statistic for calculation of 95% CIs for binomial proportion items (yes/no items with an answer of "yes" defined as a positive result).26,27 Frequency distributions were used to analyze questions about struggles and barriers to teaching the HINTS exam. We used descriptive statistics (means and standard deviations) for all other quantitative data. As participants were permitted to skip any question, missing data was omitted from item-level analyses. See Appendix C for details on missing data and item-level response rates.
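The Wilson score interval used for the yes/no items can be reproduced directly from its closed form. As an illustrative sketch only (the authors worked in R; this is not their code), the interval for the 149/191 "valuable to teach" item works out as follows:

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion.

    z = 1.96 gives the 95% interval, matching the paper's analysis.
    """
    p_hat = successes / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return center - half, center + half

# 149 of 191 respondents believed the HINTS exam is valuable to teach (78.0%)
lo, hi = wilson_ci(149, 191)
print(f"78.0% (95% CI {lo:.1%}-{hi:.1%})")
```

Unlike the naive Wald interval, the Wilson interval stays inside [0, 1] and behaves well for proportions near the extremes, which is why it is the usual recommendation for survey items like these.26,27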

RESULTS

Characteristics of Study Subjects

Of 250 eligible programs, leadership from 204 opened the survey and 201 provided informed consent for an overall survey response rate of 80.4%. Among consenting respondents, 192 programs provided useful data beyond the initial informed consent question. See Appendix D for the enrollment flowsheet. Participating program demographic characteristics were well representative of the population of all eligible programs (see Appendix E).

Main Results

Program leadership reported perceptions that their residency graduates were, on average, more confident and competent than their faculty members at performance and interpretation of the HINTS exam. They also reported perceptions that, for both residency graduates and faculty members, confidence was higher than competence. However, none of these patterns reached statistical significance. See Appendix F.

The most frequently cited HINTS subtopic teaching struggles centered around the HIT, test of skew, HINTS application to correct patients, and overall HINTS interpretation. See Figure 2. The most frequently cited HINTS teaching barriers centered around lack of faculty expertise, concern for poor HINTS exam reproducibility, and lack of resources. See Figure 3. Lastly, program leadership indicated that they would dedicate a mean of 2.0 hours/year (SD 1.3 hours/year) to implementing a formal, standardized HINTS exam curriculum if such a curriculum were widely available.

Figure 1. Box-and-whisker plot of leadership-perceived educational modality effectiveness for teaching the HINTS* examination: 188 participants provided usable data on this item (item-level response rate 75.2%). Likert-scale ratings from least (1) to most (7) effective were used. Data for one of the 12 modalities (clinical bedside teaching) was not normally distributed and thus medians and interquartile ranges (IQR) were used for all analyses. Medians are represented by thick vertical lines, IQRs are represented by gray boxes, whiskers represent 1.5*IQR, and outlier data is represented by black dots, with the area of each dot proportional to the answer frequency.

Overall, 149/191 (78.0%) believed the HINTS exam is valuable to teach, 16/191 (8.4%) believed it is not, and 25/191 (13.1%) were unsure. On subgroup analyses of these and other key survey items, program demographic factors (program length, setting, type, and region) were of no statistical significance after controlling for multiple comparisons. The most effective educational modalities for teaching the HINTS exam were reported to be clinical bedside teaching, videocasts/online videos, and simulation. Perceptions of modality effectiveness varied widely. See Figure 1.

*HINTS, head impulse test, nystagmus, test of skew.

Figure 2. Frequency of residency program director-reported HINTS exam subtopic teaching struggles: 190 participants provided usable data on this item (item-level response rate 76.0%). Error bars represent the 95% confidence interval for these binomial proportion items using the Wilson statistic.

Figure 3. Frequency of residency program director-reported barriers to teaching the HINTS* examination: 185 participants provided usable data on this item (item-level response rate 74.0%). Error bars represent the 95% confidence interval for these binomial proportion items using the Wilson statistic. *HINTS, head impulse test, nystagmus, test of skew.

DISCUSSION

Our findings may reflect underlying causes for the difficulties EPs have teaching and using the HINTS exam. Faculty members may hesitate to teach it if they exhibit discomfort with their own HINTS exam skills. In our findings, program leadership expressed lack of faculty expertise as an educational barrier, and they also perceived their residency graduates to be more competent with this skill than their faculty members (although this finding did not reach statistical significance). Additionally, program directors' cited barrier of concern for poor HINTS exam reproducibility may point toward physicians' innate desire for diagnostic certainty and the perception that the HINTS exam is fallible. Our respondents reported that HINTS is valuable to teach, but they less often reported HINTS offerings in conferences. The cited reasons for this discordance centered around lack of time, resources, and faculty expertise.

It has been shown in the original HINTS literature and subsequent research released from neurology and EM collaboration efforts that it is possible to effectively learn this skill,13,21 yet our results do not describe widespread skill acquisition among EPs. It is possible that educational collaboration between the specialties could positively impact EPs' proficiency in the exam itself and in optimal educational methods. Regardless, our survey results support a desire to address the lack of faculty expertise. Much collaboration is already occurring, as evidenced by recent dizziness and vertigo literature authored by teams including members of both specialties,21,28–30 as well as research with mixed cohorts from both specialties.1

Our results show that simulation was perceived as one of the most effective modalities for HINTS education, but lack of simulation models was also cited as a top educational barrier. The HINTS education literature from neurology and neuro-subspecialties has proposed some innovative simulation adjuncts. For example, one study found that neurology trainees' utilization of video-oculography (VOG) technology in simulation correlated with significant improvements in HIT performance.31 Two studies used virtual reality-enhanced manikin task trainers for HINTS simulation, demonstrating exam sensitivity and specificity improvements, including among EP cohorts.32,33 While such "partial task trainers" may have utility, our survey suggests they are not widely used or commercially available. The VOG devices are commercially available and have quality assurance (QA) features to assist examiners' HIT performance via feedback on maneuver angles and velocities. A recent study used these QA features in EM resident HIT simulation and reported significant improvement in HIT maneuver performance.34

There are no published parameters from the neurology or subspecialty literature regarding the optimal HINTS curriculum training durations. However, a 2022 systematic review of HINTS and STANDING education reported on five institutions' EM educational practices. They found wide curricular variability in didactic time (1–5 hours), workshop time (1–8 hours), neurology exposure (clinical rotations), and proctored exams (up to 15) over each resident's duration in the program.35 Our participants indicated they would dedicate a mean of 2.0 hours/year to HINTS education, and over the course of a three- or four-year residency, this would be adequate time for the parameters described.35 Despite willingness to commit this time, other literature suggests that the exam application and maneuvers may be more complex than our specialty recognizes.14 As reflected in our results, program leadership perceived higher confidence than competence among graduates and faculty alike (although this did not reach statistical significance). This phenomenon, the Dunning-Kruger effect, is present in medicine, and existing literature suggests that assessments by examiners from multiple disciplines are required to ensure proficiency in such high-level skills.36,37 This would potentially add more time to a HINTS curriculum.

Our results contribute to a growing description of the HINTS educational modalities in use, but each modality has pros and cons beyond the training hours required. Clinical bedside teaching (the highest-rated modality in our survey) provides the highest-fidelity and real-life experience but is dependent on case convenience (dizzy/vertiginous patients presenting) and educator availability on shift. The opportunity cost of bedside training must be considered as well. The survey does not explore the hypothetical on-shift faculty time spent and associated opportunity cost, which would be a useful topic for future research.

Simulation tied for second place as the highest-rated HINTS educational modality. It mitigates the case convenience issue by providing on-demand patient cases in a controlled setting, but it also presents a faculty opportunity cost issue by increasing training time in the simulation center. Hands-on skill simulation requires small-group or individual instruction, which uses more faculty time and the use of simulation models, and possibly other simulation adjuncts. Our survey did not ask about specific HINTS simulation equipment or techniques being used at EM residency programs in the US, but even if aggressive cost-of-implementation estimates are made, the return on investment would make HINTS educational initiatives financially worthwhile. The nationwide capital expenditures (specifically, VOG devices for simulation) cost about $9.76 million, which amounts to 1% of the estimated $1 billion/year spent on inappropriate CT imaging for patients with dizziness/vertigo in the US.8 The estimated national yearly cost after capital investment (specifically, the cost of faculty time) is about $331,883 in addition to costs for any equipment repair or new devices/adjuncts. See Appendix G for the cost-of-implementation analysis.34,38–41
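The headline capital figure is consistent with a simple back-of-envelope calculation. The sketch below is our illustration, not the authors' Appendix G analysis: the one-device-per-program assumption and the per-device price (taken near the top of the $12,000–40,000 commercial range quoted in the text) are ours.

```python
# Back-of-envelope reconstruction of the paper's headline cost figures.
# ASSUMPTIONS (not the authors'): one VOG device per eligible residency
# program, priced near the top of the $12,000-40,000 commercial range
# cited in the text. The authors' actual inputs are in Appendix G.
num_programs = 250           # eligible US EM residency programs surveyed
device_cost = 39_040         # assumed per-device capital cost (USD)

capital = num_programs * device_cost     # nationwide capital outlay
annual_ct_waste = 1_000_000_000          # est. $1 billion/year inappropriate CT

print(f"Capital outlay: ${capital / 1e6:.2f}M "
      f"({capital / annual_ct_waste:.1%} of annual inappropriate CT spend)")
```

Under these assumptions the outlay comes to $9.76 million, about 1% of the cited $1 billion/year, matching the proportion reported in the text.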

Notably, HINTS manikin "partial task trainers" have been developed and tested, but none are widely available.32,33 The 2023 ACEP Clinical Policy recommended incorporation of technology such as Frenzel goggles and ocular tracking software in training.20 The VOG devices are commercially available for $12,000–40,000 per device and have shown promise in the simulation environment.31,34,38,42

To describe the effectiveness of many educational options (including those amenable to asynchronous and large-group sessions), we asked about several other modalities in addition to clinical bedside teaching and HINTS simulation. Online videos and videocasts were tied with simulation for the second highest-rated teaching modality among participants. Contrary to bedside and simulation teaching, this modality requires no faculty time or supervision and is free. Online HINTS educational videos can be used as an asynchronous supplement to clinical bedside teaching and simulation, but watching videos is a passive learning technique with no hands-on practice or opportunity for acquisition of muscle memory. However, recent studies suggest that achieving HINTS exam skills (particularly HIT skills) does require a hands-on component for motor skills acquisition.34

Overall, more time, effort, funding, and educational research could be targeted toward creating HINTS curricula and simulation modalities, and on making these resources widely available to improve EM residency HINTS educational options. The variability in our survey results shows that multiple education modalities are likely being employed across the residency training programs in the US but with some consensus about the most useful modalities. In such a situation where multiple modalities are being employed to the same end, further research toward development of a standardized training plan is needed.

LIMITATIONS

To achieve adequate response rates from our survey, the length of the survey was limited at the recommendations of the expert pilot test group. Additionally, variability of the question design was employed to hold participants' interest and increase response rates. As a result, some questions were asked in a binary "yes/no" format instead of Likert scales or rankings, potentially sacrificing some depth of response interpretation. Another concern with our survey design was response bias. While allowing questions on the survey to be left unanswered supports overall increased response rates, bias may have been introduced via respondent-allocated missing data. It is possible that program leaders who answered fewer questions had more passive opinions about the HINTS exam, exhibiting neutral response bias wherein, for example, they selected "neutral" or "no opinion" on classic Likert-scale questions. The opposite is also possible wherein the survey results are biased toward those strongly in favor of or strongly against the HINTS exam (extreme response bias). Fortunately, our overall high response rates and wide variability of responses suggest these limitations are minimal.

Surveys were initially sent to EM residency program directors, who had the option of either completing the survey themselves or assigning the responsibility to an associate program director or to the faculty leader of the residency's curricular content. There is, thus, a possibility that answers varied depending on the role of the survey-taker for each program, which was not recorded.

CONCLUSION

Emergency medicine residency programs and medical educators should focus their HINTS educational priorities on development of a formalized curriculum with adequate resources. Programs will also need to address the barrier of lack of faculty expertise.

Address for Correspondence: Mary McLean, MD, AdventHealth East Orlando, Department of Emergency Medicine, 7727 Lake Underhill Rd., Orlando, FL 32822. Email: Mary.McLean.MD@AdventHealth.com

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 McLean et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

REFERENCES

1. Kerber KA, Brown DL, Lisabeth LD, et al. Stroke among patients with dizziness, vertigo, and imbalance in the emergency department: a population-based study. Stroke. 2006;37(10):2484–7.

2. Savitz SI, Caplan LR, Edlow JA. Pitfalls in the diagnosis of cerebellar infarction. Acad Emerg Med. 2007;14(1):63–8.

3. Kim AS, Fullerton HJ, Johnston SC. Risk of vascular events in emergency department patients discharged home with diagnosis of dizziness or vertigo. Ann Emerg Med. 2011;57(1):34–41.

4. Ahsan SF, Syamal MN, Yaremchuk K, et al. The costs and utility of imaging in evaluating dizzy patients in the emergency room. Laryngoscope. 2013;123(9):2250–3.

5. Saber Tehrani AS, Coughlan D, Hsieh YH, et al. Rising annual costs of dizziness presentations to U.S. emergency departments. Acad Emerg Med. 2013;20(7):689–96.

6. Hwang DY, Silva GS, Furie KL, et al. Comparative sensitivity of computed tomography vs. magnetic resonance imaging for detecting acute posterior fossa infarct. J Emerg Med. 2012;42(5):559–65.

7. Tu LH, Melnick E, Venkatesh AK, et al. Cost-effectiveness of CT, CTA, MRI, and specialized MRI for evaluation of patients presenting to the emergency department with dizziness. AJR Am J Roentgenol. 2024;222(2):e2330060.

8. Keita M, Nassery N, Sebestyen K, et al. Diagnostic errors, harms, and waste in evaluating dizziness and vertigo in ambulatory care settings across the United States [abstract]. Diagnostic Error in Medicine 11th International Conference. November 4–6, 2018; New Orleans, Louisiana.

9. Eagles D, Stiell IG, Clement CM, et al. International survey of emergency physicians' priorities for clinical decision rules. Acad Emerg Med. 2008;15(2):177–82.

10. Beeson MS, Ankel F, Bhat R, et al. The 2019 model of the clinical practice of emergency medicine. J Emerg Med. 2020;59(1):96–120.

11. Stettler BA, Jauch EC, Kissela B, et al. Neurologic education in emergency medicine training programs. Acad Emerg Med. 2005;12(9):909–11.

12. Kattah JC, Talkad AV, Wang DZ, et al. HINTS to diagnose stroke in the acute vestibular syndrome: three-step bedside oculomotor examination more sensitive than early MRI diffusion-weighted imaging. Stroke. 2009;40(11):3504–10.

13. Gottlieb M, Peksa GD, Carlson JN. Head impulse, nystagmus, and test of skew examination for diagnosing central causes of acute vestibular syndrome. Cochrane Database Syst Rev. 2023;(11):CD015089.

14. Dmitriew C, Regis A, Bodunde O, et al. Diagnostic accuracy of the HINTS exam in an emergency department: a retrospective chart review. Acad Emerg Med. 2021;28(4):387–93.

15. Ohle R, Montpellier RA, Marchadier V, et al. Can emergency physicians accurately rule out a central cause of vertigo using the HINTS examination? A systematic review and meta-analysis. Acad Emerg Med. 2020;27(9):887–96.

16. Henriksen AC and Hallas P. Inter-rater variability in the interpretation of the head impulse test results. Clin Exp Emerg Med. 2018;5(1):69–70.

17. Gerlier C, Hoarau M, Fels A, et al. Differentiating central from peripheral causes of acute vertigo in an emergency setting with the HINTS, STANDING, and ABCD2 tests: a diagnostic cohort study. Acad Emerg Med. 2021;28(12):1368–78.

18. Vanni S, Pecci R, Edlow JA, et al. Differential diagnosis of vertigo in the emergency department: a prospective validation study of the STANDING algorithm. Front Neurol. 2017;8:590.

19. Vanni S, Nazerian P, Pecci R, et al. Timing for nystagmus evaluation by STANDING or HINTS in patients with vertigo/dizziness in the emergency department. Acad Emerg Med. 2023;30(5):592–4.

20. American College of Emergency Physicians Clinical Policies Subcommittee (Writing Committee) on Acute Ischemic Stroke, Lo BM, Carpenter CR, et al. Clinical policy: critical issues in the management of adult patients presenting to the emergency department with acute ischemic stroke. Ann Emerg Med. 2023;82(2):e17–64.

21. Edlow JA, Carpenter C, Akhter M, et al. Guidelines for reasonable and appropriate care in the emergency department 3 (GRACE-3): acute dizziness and vertigo in the emergency department. Acad Emerg Med. 2023;30(5):442–86.

22. Shah VP, Oliveira J e Silva L, Farah W, et al. Diagnostic accuracy of the physical examination in emergency department patients with acute vertigo or dizziness: a systematic review and meta-analysis for GRACE-3. Acad Emerg Med. 2023;30(5):552–78.

23. von Elm E, Altman DG, Egger M, et al. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS Med. 2007;4(10):e296.

24. Artino AR Jr, La Rochelle JS, Dezee KJ, et al. Developing questionnaires for educational research: AMEE guide no. 87. Med Teach. 2014;36(6):463–74.

25. Fidell LS and Tabachnick BG. Preparatory data analysis. In: Schinka JA & Velicer WF (Eds.) Handbook of Psychology: Research Methods in Psychology (115–141), Vol. 2. Hoboken, NJ: John Wiley & Sons, Inc, 2003.

26. Agresti A and Coull BA. Approximate is better than "exact" for interval estimation of binomial proportions. Am Stat. 1998;52(2):119–26.

27. Brown LD, Cai TT, DasGupta A. Interval estimation for a binomial proportion. Stat Sci. 2001;16(2):101–17.

28. Edlow JA, Gurley KL, Newman-Toker DE. A new diagnostic approach to the adult patient with acute dizziness. J Emerg Med. 2018;54(4):469–83.

29. Newman-Toker DE, Peterson SM, Badihian S, et al. (2022). Diagnostic errors in the emergency department: a systematic review. Rockville, MD: Agency for Healthcare Research and Quality (US).

30. Puissant MM, Giampalmo S, Wira CR III, et al. Approach to acute dizziness/vertigo in the emergency department: selected controversies regarding specialty consultation. Stroke. 2024;55(10):2584–8.

31. Korda A, Sauter TC, Caversaccio MD, et al. Quantifying a learning curve for video head impulse test: pitfalls and pearls. Front Neurol. 2021;11:615651.

32. Charlery-Adèle A, Guigou C, Ryard J, et al. Effects of saccade delay, side of deficit, and training on detection of catch-up saccades during head-impulse test in virtual-reality-enhanced mannequin. Sci Rep. 2023;13(1):2718.

33. Ursat G, Corda M, Ryard J, et al. Virtual-reality-enhanced mannequin to train emergency physicians to examine dizzy patients using the HINTS method. Front Neurol. 2024;14:1335121.

34. Lenning JC, Messman AM, Kline JA. Application of motor learning theory to teach the head impulse test to emergency medicine resident physicians. AEM Educ Train. 2024;8(1):e10936.

35. Nakatsuka M and Molloy EE. The HINTS examination and STANDING algorithm in acute vestibular syndrome: a systematic review and meta-analysis involving frontline point-of-care emergency physicians. PLoS One. 2022;17(5):e0266252.

36. Kruger J and Dunning D. Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self-assessments. J Pers Soc Psychol. 1999;77(6):1121–34.

37. Barnsley L, Lyon PM, Ralston SJ, et al. Clinical skills in junior medical officers: a comparison of self-reported confidence and observed competence. Med Educ. 2004;38(4):358–67.

38. Bastani PB, Badihian S, Phillips V, et al. Smartphones versus goggles for video-oculography: current status and future direction. Res Vestib Sci. 2024;23(3):63–70.

39. Koval ML. Medscape emergency medicine physician compensation report 2024: bigger checks, yet many doctors still see an underpaid profession. 2024. Available at: https://www.medscape.com/slideshow/2024-compensation-emergency-medicine-6017136. Accessed October 1, 2024.

40. Association of American Medical Colleges. Report on residents. 2023. Available at: https://www.aamc.org/data-reports/students-residents/report/report-residents. Accessed October 1, 2024.

41. Lenio R. Emergency medicine annual worked hours: market norms vary widely. 2023. Available at: https://www.ajg.com/us/news-and-insights/2023/mar/emergency-medicine-annual-worked-hours-market-norms-vary-widely/. Accessed October 1, 2024.

42. Newman-Toker DE, Saber Tehrani AS, Mantokoudis G, et al. Quantitative video-oculography to help diagnose stroke in acute vertigo and dizziness: toward an ECG for the eyes. Stroke. 2013;44(4):1158–61.

EDUCATION SPECIAL ISSUE - ORIGINAL RESEARCH

Michael Kiemeney, MD*

James Morris, MD, MPH†

Lauren Lamparter, MD‡

Moshe Weizberg, MD§

Andy Little, DO∥

Brian Milman, MD¶

*Loma Linda University School of Medicine, Loma Linda, California

†Texas Tech University Health Sciences Center, Lubbock, Texas

‡University of Illinois Chicago, Chicago, Illinois

§Staten Island University, Staten Island, New York

∥AdventHealth East Orlando, Orlando, Florida

¶University of Texas Southwestern Medical Center, Dallas, Texas

Section Editors: Matthew Tews, MD, Andrew Ketterer, MD, and Andrew Golden, MD

Submission history: Submitted May 24, 2024; Revision received November 8, 2024; Accepted November 25, 2024

Electronically published February 14, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem DOI: 10.5811/westjem.21249

Introduction: Emergencymedicine(EM)historicallyenjoyedanearly100%matchrate.Arapidchange saw46%ofEMprogramswithoneormoreunfilledpositionsafterthe2023Match.Muchhasbeen discussedaboutpotentialcauses,andcharacteristicsofunfilledprogramshavebeeninvestigated.We surveyedrecentapplicantstoEMtofurtherunderstandwhatcontinuestodrawthemtoEMandwhat concernsdeterthemfromchoosingacareerinEM.

Methods: Across-sectional,mixedmethodssurveywasdistributedinthesummerof2023toa conveniencesampleofrespondentsviathelistservsofnationalEMresidentandstudentorganizations aswellasclerkshipdirectorsinEM.Wedidnotcalculateresponserateduetolistservconvenience sampling.Atotalof213responseswerereceived,representing7.7%ofthetotalnumberofEM applicants(2,765)in2023.Applicantswereaskedtorankfrom1to5theirexperienceswithEMandthe characteristicsofthespecialtythatwereimportantintheircareerdecision.Wecalculatedmeansand 95%confidenceintervalsforquantitativeresults.Weperformedqualitativeanalysisoffree-text responsestoidentifythemes.

Results: Positive factors for applicants were interactions with EM faculty (4.29 on a 1–5 scale) and residents (4.42) as well as clinical experiences in third-year (4.53) and fourth-year clerkships (4.62). Applicants continue to be drawn to EM by the variety of pathology encountered (4.66), flexible lifestyle (4.63), and high-acuity patient care (4.43). Most applicants (68.5%) experienced advisement away from EM. Of those who received negative advisement, non-emergency physicians were the most common source (73.3%). Factors negatively influencing a career choice in EM were corporate influence (2.51), ED crowding (2.52), burnout (2.59), presence of advanced practice practitioners (APP) in EM (2.63), and workforce concerns (2.85). Job concerns stemming from the 2021 EM workforce report were identified by respondents as the primary reason for recent Match results.

Conclusion: Applicants noted clinical experiences in the emergency department and interactions with EM attendings and residents as positive experiences. High-acuity patient care, variety of pathology, and flexible lifestyle continue to attract applicants. Applicants identified EM workforce concerns as the primary contributor to recent EM Match results. Corporate influence, ED crowding, burnout, and presence of APPs in the ED were also significant issues. [West J Emerg Med. 2025;26(2):261–270.]

INTRODUCTION

Emergency medicine (EM) has historically enjoyed a very competitive outcome in the National Resident Matching Program (NRMP, or "the Match"), with >95% of programs filling their spots.1 Beginning in 2022, however, a dramatic decline occurred, leaving many programs unfilled.2 This decline continued in 2023, with 46% of EM programs remaining unfilled.3 Although 79.1% of those programs filled in the Supplemental Offer and Acceptance Program (SOAP),4 this represents a tremendous change from previous years.

The cause of this change is likely multifactorial, with major contributing factors being the expansion of the number of residency positions, student perceptions of the future job market within EM, and the virtual interview format.5,6 Other proposed etiologies of the decline include the corporate practice of EM (which occurs when a non-physician or corporation exerts control over the medical decision-making or collects reimbursement for the medical services of physicians),7 the expanded use of advanced practice practitioners (APP) such as physician assistants and nurse practitioners in the emergency department (ED), and increased burnout following a global pandemic.6 Concerns regarding the job market and expanded use of APPs are likely related to the 2021 EM workforce report by Marco et al, which proposed a range of potential outlooks based on various factors, with the most publicized result being a projected oversupply of emergency physicians by 2030.8

Several factors affected which programs were more likely to go unfilled in the Match. Gettel et al found that programs accredited within the previous five years, as well as programs that were under for-profit ownership, were more likely to go unfilled.9 Another study found that predictors of not filling were having unfilled positions in the previous Match, a smaller program size, location in the Mid-Atlantic or East North Central area, prior American Osteopathic Association accreditation, and corporate ownership structure.10 Overall, programs felt their match outcomes were worse than in previous years, but they perceived the quality of applicants as similar to previous years.5

Many factors influence a student's decision on which specialty to pursue, including role models, financial incentives, gender, degree of patient contact, procedural skills, prestige, and lifestyle.11–14 The factors most associated with a choice to specialize in EM include lifestyle, diversity of patient presentations, flexibility in choosing a practice location, work-life balance, and perceived job satisfaction.15–19 Factors associated with earlier selection of EM include early exposure to the field, presence of an EM residency program at a student's medical school, prior employment in the ED, previous experience as a prehospital practitioner, and completion of a third-year EM clerkship.16

In this study we surveyed EM applicants from 2022 and 2023 to identify factors deterring or attracting them to the

Population Health Research Capsule

What do we already know about this issue?
Applicant and specialty characteristics attracting applicants to EM have been previously documented.

What was the research question?
What factors deterred and attracted applicants to EM during the 2023 Match?

What was the major finding of the study?
The 4th-year clerkship was the major attracting factor (mean 4.62, 95% CI 4.50–4.74), while corporate influence (mean 2.51, 95% CI 2.33–2.69) was the strongest deterring factor.

How does this improve population health?
These findings offer new insights into applicant perspectives of EM and specialty-choice considerations following the 2023 Match.

specialty as well as modifiable influences impacting their career decisions. To restore the competitive nature of EM in the Match, it is important to know what motivates medical students to select EM as a specialty in the current environment. It is additionally important to further understand the factors contributing to decreased interest in EM, so that we can continue to address these as a specialty.

METHODS

The project was conceived by the Council of Residency Directors in Emergency Medicine (CORD) Match Task Force, which includes representatives from the American Academy of Emergency Medicine (AAEM), American Academy of Emergency Medicine Resident and Student Association (AAEM/RSA), American College of Emergency Physicians (ACEP), American College of Osteopathic Emergency Physicians (ACOEP), ACOEP Resident and Student Organization (ACOEP RSO), Association of Academic Chairs in Emergency Medicine (AACEM), CORD, Emergency Medicine Residents' Association (EMRA), the Society for Academic Emergency Medicine (SAEM), and SAEM Residents and Medical Students (SAEM RAMS). Task force members collaborated to design the survey instrument. The conclusions in this paper represent the views and opinions of the individual authors and do not represent the views of the organizations. The study was approved by the Loma Linda University Health Institutional Review Board.

We performed a literature review using PubMed to collect studies investigating factors impacting residency applicants' specialty choice. Questions were adapted from prior published studies.16,20 Current factors not previously investigated, such as COVID-19 or EM workforce projections, were added following an iterative process of consensus development within the research group. The survey was reviewed by the CORD Match Task Force members and edited. The survey was then pilot-tested by current medical students and residents. We analyzed the responses, and the survey was revised for clarity and brevity following the beta respondents' feedback.

Medical students were asked multiple-choice questions regarding their residency application strategy, including whether they had applied to more than one specialty and, if so, which specialties they applied to. The survey participants were asked to rank specialty characteristics influencing their choice of EM as a career on a five-point Likert scale from strongly positive to strongly negative. They were also asked to rank the impact of prior experiences on their specialty choice on a five-point Likert scale from very positive to very negative. We investigated the impact of career advisement using multiple-choice questions with the option to select up to three responses. Finally, free-text response questions were asked to assess applicants' opinions about the causative factors leading to the 2023 EM Match results. Comment in this space was optional and not meant to reach saturation of themes; rather, it was meant to provide participants the opportunity to give additional details about their experiences.

We used a convenience sample of EM-bound medical students who applied in both the 2022 and 2023 Match and those who considered or are considering applying to EM in upcoming Match cycles. Survey respondents were sent a web-based survey via Qualtrics (Qualtrics International, Inc, Seattle, WA) in the summer of 2023. Reminder messages were distributed monthly during the data collection period. The survey was distributed through the listservs of current medical students interested in EM as identified by their membership in an EM national organization, including AAEM/RSA, ACOEP RSO, EMRA, and SAEM RAMS. Surveys were also distributed through the SAEM Clerkship Directors in Emergency Medicine (CDEM) listserv to be sent to their recently matched applicants who matched into EM or had considered but ultimately decided not to pursue EM. Convenience sampling via listserv distribution did not allow for survey distribution quantification or response-rate calculation. Comparing the number of survey responses (213) to the number of applicants to EM in the 2023 Match (2,765) shows our survey responses were equal to 7.7% of the total number of EM applicants in 2023. The intended survey participants included medical students who 1) considered but ultimately did not apply to EM residency; 2) applied to EM as their only specialty choice; 3) dual applied to EM and an alternate specialty choice; or 4) entered EM through the SOAP.

A financial incentive of a $10 electronic gift card was given to the first 160 participants. Financial support for the study was provided by AAEM, AAEM/RSA, ACEP, ACOEP, AACEM, CORD, and SAEM.

We analyzed data using Microsoft Excel 365 (Microsoft Corporation, Redmond, WA) to calculate means and percentages. We calculated 95% confidence intervals (CI) using an online tool.21 We used a phenomenological approach to qualitative analysis: free-text responses were coded by two authors with experience in qualitative analysis (JM, BM) after establishing a codebook through an iterative process, to generate an understanding of the phenomenon of the EM match process in concert with the quantitative questions. Any disagreements between codes were resolved by a third author (MK).
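The quantitative summary the authors describe — a mean with a normal-approximation 95% CI for each Likert item — can be reproduced in a few lines. This is a sketch with made-up ratings, not the study data, and the helper name `mean_ci` is illustrative:

```python
import math

def mean_ci(ratings, z=1.96):
    """Mean and normal-approximation 95% CI for a list of Likert ratings."""
    n = len(ratings)
    mean = sum(ratings) / n
    # Sample standard deviation (n - 1 denominator).
    sd = math.sqrt(sum((x - mean) ** 2 for x in ratings) / (n - 1))
    half_width = z * sd / math.sqrt(n)  # z times the standard error
    return mean, (mean - half_width, mean + half_width)

# Hypothetical 1-5 responses for one survey item (not the study data).
ratings = [5, 4, 5, 3, 4, 5, 4, 2, 5, 4]
m, (lo, hi) = mean_ci(ratings)
print(f"mean {m:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An item would be flagged as a negative influence under the paper's criterion when the entire interval (`lo` through `hi`) falls below the neutral midpoint of 3.0.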

RESULTS

We received responses from 213 individuals. Demographics are shown in Table 1. Most respondents (92.8%) had applied to residency already. Of those, 87.2% applied to EM in the Match. Respondents secured an EM residency position in the 2023 Match (69.5%), 2022 Match (9.6%), 2023 SOAP (12.3%), 2022 SOAP (0.5%), and by other means (5.3%). A small proportion of respondents (2.7%) were not entering EM residency.

In comparison to applicants securing a position in the 2023 Match, our sample was fairly similar with regard to gender breakdown (57.2% male, 39.9% female in our sample vs 54.8% male, 45.2% female in the Match) but oversampled osteopathic seniors (42.7% in our study vs 24.3% in the Match). Regarding application strategy, 70.1% applied to only EM residencies. Some individuals applied to more than one specialty with EM preferred (12.3%). The most common secondary specialties were internal medicine and family medicine. Applying to EM as the secondary specialty occurred in 2.1% of individuals, with primary specialties being anesthesiology, interventional radiology, orthopedic surgery, and physical medicine and rehabilitation. Respondents who chose not to apply to EM at all made up 13.4% of responses. This group of individuals most commonly chose to apply to anesthesiology (39.1%), orthopedic surgery (17.4%), general surgery (17.4%), family medicine (13.0%), internal medicine, pathology, and preliminary year (each 8.7%). (Response option was "Select all that apply"; responses sum to >100%.)

Applicants most commonly chose to apply to EM in the third year of medical school (33.5%) or before medical school (33.0%). The remaining responses were evenly split among the pre-clinical years of medical school (6.8%), the fourth year of medical school (8.9%), after medical school (6.8%),

Table 1. Demographic data of survey respondents.

Characteristics N (%)

Age (years) (n = 173)

Gender identity (n = 173)
Male 99 (57.2%)
Female 69 (39.9%)
Non-binary/third gender 1 (0.6%)
Prefer not to say 4 (2.3%)

Race (n = 177)
American Indian/Alaska Native 1 (0.6%)
Asian 20 (11.3%)
Black/African American
Hawaiian/Pacific Islander 0
White
Prefer not to say 6 (3.4%)

Ethnicity (n = 173)
Hispanic/Latino 18 (10.4%)
Not Hispanic/Latino
Prefer not to say 8 (4.6%)

Medical school background (n = 211)
MD in US 85 (40.3%)
DO in US 90 (42.7%)
US citizen IMG 28 (13.3%)
Non-US citizen IMG 8 (3.8%)

Medical school type (n = 171)
Private
Public
Other 1 (0.6%)

Medical school geographic region (n = 171)
Central (IA, IL, IN, KS, MI, MN, MO, MT, ND, NE, OH, SD, WI) 43 (25.1%)
Northeast (CT, DC, DE, MA, MD, ME, NH, NJ, PA, RI, VT) 29 (17.0%)
South (AL, AR, FL, GA, KY, LA, OK, MS, NC, SC, TN, TX, VA, WV) 70 (40.9%)
West (AK, AZ, CA, CO, HI, ID, NM, NV, OR, UT, WA, WY) 29 (17.0%)

and during SOAP (8.4%). Participants were exposed to EM in their medical school via required EM clerkships in the fourth year (42.1%), required clerkships in the third year (24.0%), EM electives in the fourth year (17.0%), and EM electives in the third year (11.1%). Table 2 shows the degree of influence each factor held in the applicants' choice of EM as a career. The most frequently cited positive influences were EM residents on shift (4.42 on a 1–5 scale), EM attendings on shift (4.29), the fourth-year EM clerkship (4.62), and the third-year EM clerkship/elective (4.53). Prior experience in the ED in a non-physician role (4.43), in emergency medical services (EMS) (4.52), or as a scribe (4.55) was cited less frequently but rated very positively.

Job concerns/workforce report (65.8%), burnout (56.7%), increased use of advanced practice practitioners (APP) (50.8%), and corporate influence in EM (42.5%) were the most-cited reasons for advising applicants away from EM. Emergency department crowding (12.5%) and EM experience during the COVID-19 pandemic (5.8%) were less commonly cited concerns. Participants were asked about advisement and its influence on their specialty choice: 68.5% reported being advised against choosing EM residency training. The most common sources of advisement away from EM were attendings/residents in non-EM specialties (73.3%), peers (50.0%), social media/message boards (47.5%), and EM attendings (37.5%). Medical school representatives in the Dean's office accounted for a small proportion of advisement away from EM (15.8%). Most participants in our survey (81.8%) reported that advising against entering EM did not change their application strategy. Of those who initially pursued a different specialty, 5.7% ultimately entered EM in the SOAP, 5.0% applied to another specialty as a backup to EM, and 3.3% applied to EM as a backup specialty. Of those applicants who did not change application strategy despite negative advice about EM, the most commonly cited reasons were perceived fit with EM (73.7%), flexible lifestyle of EM (64.6%), lack of interest in other specialties (49.5%), and doubt in accuracy of the workforce report (49.5%).

Very few participants said they would not advise a friend to apply to EM for the 2024 Match (2.3%). Most (75%) would advise a friend to choose EM. Most of those who indicated they would advise a friend against applying to EM would do so because of concerns about fit for the specialty (42.9%) and the job market (22.9%), with corporatization of medicine, APP expansion, and burnout also mentioned.

IMG, international medical graduate; MD, Doctor of Medicine; DO, Doctor of Osteopathic Medicine.

Most somewhat agreed or strongly agreed that their peers would be more interested in EM as a career if they were exposed to EM during a rotation in the third year or earlier (82.7%). Participants were asked what they thought would make EM more appealing to peers who were undecided

Table 2. Factors influencing selection of career in emergency medicine.

CI, confidence interval.

about a specialty but were considering EM. The most common responses included early exposure to EM (31.5%) and alleviating concerns about job security raised by the EM workforce report (30.2%). Other suggestions included addressing the expanded use of APPs in the ED (10.1%), improving the perception of EM among medical students and physicians (9.4%), and improving work-life balance and compensation (8.7% and 8.1%, respectively).

Table 3 shows how applicants ranked different factors when choosing EM as a career. The most important positive factors were variety of patient pathology (4.66 on a 1–5 scale), lifestyle/flexibility (4.63), high-acuity patient care (4.43), length of residency training (4.37), and family considerations (4.36). Participants were asked specifically if they believed that EM is a "lifestyle specialty": 60.1% responded yes, 9.0% did not consider EM a lifestyle specialty, 28.1% were neutral, and 2.8% were unsure. The factors negatively influencing a career choice in EM, defined as a 95% CI less than 3.0, were corporate influence in EM (2.51, 2.33–2.69), ED crowding (2.52, 2.37–2.67), burnout (2.59, 2.44–2.74), and use of APPs in EM (2.63, 2.47–2.79). Average ratings of concerns about EM experience during the COVID-19 pandemic (2.95) and the workforce report/job security (2.85) were negative; however, the upper limits of their 95% CIs were positive, at 3.12 and 3.03, respectively.

Applicants were asked to identify the most important reason contributing to a larger-than-normal number of unfilled positions in the EM Match. They identified concerns about job security and the future EM workforce as the primary concern (Table 4). Qualitative responses to the increase in unfilled spots in the EM Match predominantly reflected concerns regarding the EM workforce report and job security. Themes and representative quotations are included in Table 5.

DISCUSSION

Applicants in our survey were drawn to EM by clinical experiences in the ED during the third and fourth year and by interactions with ED residents and attending physicians during those experiences. Unfortunately, only a small proportion of applicants in our survey had required EM clinical experience during the third year of training. Developing best-practice recommendations for early exposure to EM during medical school may be an area to target to increase interest in future applicants. Additionally, employment in an EM-related field (ie, EMS, scribe) prior to medical school was also a positive experience. Early identification of those students with prior EM-related employment may be an area for mentorship efforts by EM advisors.

Applicants continue to be drawn to the high-acuity patient care, diverse patient pathology, and the flexible lifestyle EM offers. These findings are in line with prior studies of EM applicant attitudes and the cornerstone of EM's appeal.12–19,23 Additional factors that appeal to applicants are the variety of fellowship options available after EM residency, the length of residency training, compensation, and availability of jobs in their desired location. Family considerations are important to applicants and, coupled with the desire for a flexible lifestyle, signal a desire for work-life balance. Shift work in the ED has downsides such as sleep

Table 3. Importance of various aspects of emergency medicine to applicants in the 2023 Match. How important were the following factors in your decision to apply to EM residency?

APPs, advanced practice practitioners; CI, confidence interval; EM, emergency medicine; ED, emergency department.

transitions associated with night shifts and working weekends and holidays. However, applicants signaled that those issues are still preferable to being on call or working in a

Table 4. Single most important reason for unfilled emergency medicine (EM) residency positions in the 2022 and 2023 Match, per EM applicants.

Number of respondents

Programs' failure to adapt to changing applicant pool 2 (1.3%)

Note: Totals exceed 100%, as respondents could indicate more than one item; % indicates the percent of total respondents endorsing a choice.

APP, advanced practice practitioner; EM, emergency medicine.

clinic five days a week. Highlighting the factors that resonate with applicants is a good starting point when promoting the specialty.

With regard to factors pushing applicants away from EM, most applicants experienced badmouthing of EM and advising away from the specialty. In prior studies, over three-quarters of respondents reported experience with badmouthing of another specialty, and one-quarter changed their specialty choice because of it.24–26 When uncertain applicants are narrowing their specialty choices between a few serious options, contending with negativity about their career choice, both now and in the future, from friends or mentors in other specialties may be enough to sway someone away from EM.

The most common source of advice against EM in 2023 was not from peers, formal mentors, or Dean's offices but from attendings and residents in non-EM specialties. Experiencing negative advisement from a trusted mentor about one's desired specialty is likely impactful. In addition, applicants reported receiving negative pressure from their peers and social media. Most people involved in EM medical education suspected applicants were being advised away from EM. This was suggested by our data. Most assumed advisors from the Dean's office were advising students away from EM toward more prestigious specialties or those with safer match rates. But that was not the case in our survey, as

Table 5. Qualitative analysis themes and representative quotations regarding the 2022 and 2023 EM Match.

Theme: Employment opportunities

Code: Workforce/job security
Guideline for use: This code is used when participants discuss the workforce report, job security, employment opportunities, or difficulty finding jobs.

• There is a myth going around that there are not enough jobs for EM physicians after residency. I know a lot of people that made this comment upon saying I was applying to EM

• Covid, and that damn memo. Yall shot yourselves in the damn foot with that bonehead move

• Workforce report hysteria

• The infamous report predicting a coming labor surplus. The timing lines up and it tracks with what friends in med school were saying

Code: Number of residencies
Guideline for use: This code is used when participants discuss residency expansion.

• Increased amount of residency program spots created by CMG hospitals

• Too many residency programs

• Surplus of "pop-up" programs leveraging resident labor with no intention of real training

Code: APP expansion
Guideline for use: This code is used when participants discuss competition with APPs for employment or increased use of APPs in EM.

• Midlevel creep

• increasing number of NPs/PAs filling in positions

• PA/NP takeover

• Increased NP/PA replacing jobs and then MD license on the line for anything they do. Including signing their charts

Theme: Practice environment

Code: Burnout
Guideline for use: This code is used when participants discuss burnout.

• Concern over burnout

• Fear of burnout

• Emergency doctors burnt out

Code: COVID-19
Guideline for use: This code is used when participants discuss the impact of COVID-19.

• Treatment during COVID-19

• COVID-19 experiences, lack of patient care opportunities during COVID-19

• High stress, especially during COVID-19

• COVID-19 showed EM's true colors

• COVID-19 experiences and fears of future health risks

Code: Corporatization
Guideline for use: This code is used when participants discuss corporatization of emergency medicine or private equity influence.

• Corporate takeover, thus physicians lose power every day

• Corporate practice of medicine

• HCA programs!!!!! There are a ton of new, sketchy programs.

• Increase in for-profit hospital slots available in Texas, Cali, and Florida

Code: Quality of life, change in practice environment (boarding, volume, etc.)
Guideline for use: This code is used when participants discuss negative practice factors.

• Lack of perceived quality of life

• Bad job prospects and ED culture has become toxic

• Seeing patients in waiting rooms/bed holds

• Culture of what EM has become. No one wants to choose to work in this overrun environment especially when the job market is uncertain when there are specialties like dermatology and sub-specialties where you don't have to deal with the chaos and patient volumes we are now seeing in the ED. ER medicine is at an all-time low and never used to be this overwhelming pre-pandemic.

Theme: Applicant or match factors

Code: Programs' failure to adapt to changing applicant pool
Guideline for use: This code is used when participants discuss residency programs' failure to assess competitiveness or select applicants efficiently.

• Mismatch between programs' opinion of themselves/how they are perceived vs actual applicant perceptions of programs.

• Programs being overly selective and not honestly introspecting regarding how applicants perceive their program

Code: Perception of emergency medicine
Guideline for use: This code is used when participants discuss negative perceptions of emergency medicine among students or through social media or mentors.

• Lack of respect to emergency physicians and thought that we are not that smart

• Perception from attendings of both EM and non-EM

• Social media influence and immaturity on behalf of applicants

• Decreased perceived competitiveness leading to lack of interest

• Bad reputation among consultant specialties

• Jack of all trades/EM incompetency stigma

APP, advanced practice practitioner; CMG, contract management group; EM, emergency medicine; ED, emergency department; HCA, Hospital Corporation of America; NP, nurse practitioner; PA, physician assistant.

advisors in the Dean's office ranked as the sixth most frequent source of advisement away from EM.

Additional factors pushing applicants away from EM were corporate influence in EM, ED crowding, burnout, the use of APPs in EM, the experience of emergency physicians during COVID-19, and concerns regarding job security stemming from the 2021 EM workforce report. Applicants are wary of entering a specialty dominated by corporations that place profits over patient care. Residencies at for-profit clinical sites had 1.3 times greater risk of not filling in 2023.9 Applicants are showing an aversion to training at these sites. However, spots continue to fill during the time-limited SOAP, as unmatched applicants are likely excited about the ability to secure any training position. Further understanding applicant concerns and the experiences of residents in for-profit programs is important and requires additional study. Likewise, understanding the experience of EM residents who enter training via the SOAP is valuable for future investigation.

Emergency department crowding not only negatively impacts quality of patient care; it also deters future emergency physicians from entering the field. Students on ED rotations see the challenges of finding space to re-evaluate patients, delays in workup, and prolonged care of patients boarding in the ED who are awaiting inpatient beds. Efforts to address boarding, as well as the implementation of surge capacity plans, may improve this factor as students consider specialty choice.

Furthermore, burnout generated the largest number of moderate or strongly negative responses. Emergency medicine is widely cited as the specialty with the highest rates of burnout.27,28 Requirements to promote well-being and counter burnout exist in both undergraduate (Liaison Committee on Medical Education standard 12.3)29 and graduate medical education (Accreditation Council for Graduate Medical Education Common Program Requirements for residency VI.C).30 Prior qualitative research suggests faculty modeling may influence residents' career perspectives, indicating that targeting faculty for education on well-being and burnout may yield substantial benefits for both current and prospective residents.31

Applicants additionally have concerns about the use of APPs in the ED. Many free-text responses cited "scope creep" of APPs as well as the negative impact on physician job availability as negative factors. Applicants signaled that they are paying attention to the topic of APP usage in the ED and that it is an important issue to them. National leaders in EM are actively working to protect the scope of all practitioners in the ED and continue to emphasize the importance of physician-led patient care teams. Further dissemination of these advocacy efforts and their effects on our specialty would be beneficial for applicants.

Lastly, the workforce report has been frequently hypothesized as a major contributing factor to the rapid decline in EM residency applications over the last two years.8 Applicants to EM in our survey confirmed this hypothesis, citing projections stemming from the report as the most important factor leading to the significant rise in unfilled EM residency positions in the 2022 and 2023 Matches. Subsequent studies have addressed workforce considerations such as physician attrition and geographic distribution.32,33 Further investigation and clarity into the future EM workforce would aid applicants as they weigh their career decisions.

Reinforcing the positive aspects of EM while addressing the negative factors above will go a long way toward bolstering the EM applicant pool and future workforce. The 2023 EM Match was unprecedented, with 554 unfilled positions. However, EM still matched 2,456 applicants, the fourth largest number in the 2023 Match.3 Our survey yields insights into the positive aspects of EM that draw applicants to the specialty and identifies negative factors following the 2023 EM Match.

LIMITATIONS

Our survey may be impacted by selection bias, as our distribution method did not guarantee that every residency applicant who considered applying to EM residency was included. The exact number of individuals who received the survey solicitation is not known, so a response rate could not be calculated, and it is unknown to what extent our results are representative of all EM residency applicants in the 2022 and 2023 Match cycles. Additionally, recall bias may also contribute, as responses from applicants who matched to EM in 2022 were included. As potential survey participants were identified through their membership in national EM resident and student organizations, this study may not be representative of individuals who considered EM early in their medical school career and ultimately did not pursue EM. Our survey responses represent 7.7% of the total number of applicants to EM in 2023, although it is unlikely the survey reached all applicants in the pool. Future studies may benefit from a longitudinal approach soliciting EM interest-group participants in the first two years of medical school and following them through their respective Match years to improve response rate.

CONCLUSION

The specialty of emergency medicine experienced a sharp increase in unfilled positions in the 2022 and 2023 Matches. Most applicants received advisement away from EM, with the most common source being physicians in non-EM specialties. Applicants perceive corporate influence in EM, ED crowding, burnout, influence of advanced practice practitioners in EM, and workforce concerns as driving forces behind the EM Match results. Applicants cited clinical experiences in the ED and interactions with EM attendings and residents as positive factors. High-acuity patient care, diverse patient pathology, and flexible lifestyle were seen as positive characteristics of a career in EM.

ACKNOWLEDGMENTS

The authors would like to thank the boards of directors of AAEM, AAEM/RSA, ACEP, ACOEP, AACEM, CORD, and SAEM for providing funds for participant incentives. The authors would also like to thank AAEM/RSA, ACOEP/RSO, CDEM, EMRA, and SAEM RAMS for assistance with distributing the survey via their respective listservs, and to thank the members of the CORD Match Task Force for assistance in developing the survey tool.

Address for Correspondence: Michael Kiemeney, MD, Loma Linda University School of Medicine, 11234 Anderson St, MC-A208, Loma Linda, CA 92354. Email: mkiemeney@llu.edu

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources, and financial or management relationships that could be perceived as potential sources of bias. Survey incentive support was provided by the respective Boards of Directors for AAEM, AAEM/RSA, ACEP, ACOEP, AACEM, CORD, and SAEM. The manuscript represents the individual authors' opinions and does not represent the opinions of the organizations providing financial support. There are no other conflicts of interest or sources of funding to declare.

Copyright: © 2025 Kiemeney et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

REFERENCES

1. National Resident Matching Program. Results and data: 2021 main residency match. 2021. Available at: https://www.nrmp.org/wp-content/uploads/2021/08/MRM-Results_and-Data_2021.pdf. Accessed October 3, 2023.

2. National Resident Matching Program. 2022 NRMP main residency match: match rates by specialty and state. 2022. Available at: https://www.nrmp.org/wp-content/uploads/2022/11/2022-Main-Match-Results-and-Data-Final-Revised.pdf. Accessed October 3, 2023.

3. National Resident Matching Program. Results and data: 2023 main residency match. Available at: https://www.nrmp.org/wp-content/uploads/2023/05/2023-Main-Match-Results-and-Data-Book-FINAL.pdf. Accessed October 3, 2023.

4. National Resident Matching Program. Results and data: 2023 main residency match. Page 46. Available at: https://www.nrmp.org/wp-content/uploads/2023/05/2023-Main-Match-Results-and-Data-Book-FINAL.pdf. Accessed October 3, 2023.

5. Murano T, Weizberg M, Burns B, et al. Deciphering a changing match environment in emergency medicine and identifying residency program needs. West J Emerg Med. 2023;24(1):1–7.

6. Lewis M, Williams K, Timpe J, et al. The 2022 and 2023 emergency medicine residency match: a cautionary tale. Cureus. 2023;15(5):e38601.

7. American Academy of Emergency Medicine. Corporate practice. Available at: https://www.aaem.org/publications/key-issues/corporatepractice/. Accessed October 24, 2024.

8. Marco CA, Courtney DM, Ling LJ, et al. The emergency medicine physician workforce: projections for 2030. Ann Emerg Med. 2021;78(6):726–37.

9. Gettel CJ, Bennett CL, Rothenberg C, et al. Unfilled in emergency medicine: an analysis of the 2022 and 2023 match by program accreditation, ownership, and geography. AEM Educ Train. 2023;7(4):e10902.

10. Preiksaitis C, Krzyzaniak S, Bowers K, et al. Characteristics of emergency medicine residency programs with unfilled positions in the 2023 match. Ann Emerg Med. 2023;82(5):598–607.

11. Yoon JD, Ham SA, Reddy ST, et al. Role models' influence on specialty choice for residency training: a national longitudinal study. J Grad Med Educ. 2018;10(2):149–54.

12. DeZee KJ, Maurer D, Colt R, et al. Effect of financial remuneration on specialty choice of fourth-year U.S. medical students. Acad Med. 2011;86(2):187–93.

13. Lambert EM and Holmboe ES. The relationship between specialty choice and gender of U.S. medical students, 1990–2003. Acad Med. 2005;80(9):797–802.

14. Teitelbaum HS, Ehrlich N, Travis L. Factors affecting specialty choice among osteopathic medical students. Acad Med. 2009;84(6):718–23.

15. Rosen B, Rosen P, Schofer J, et al. Is emergency medicine the right choice for me? J Emerg Med. 2019;56(3):e35–8.

16. Ray JC, Hopson LR, Peterson W, et al. Choosing emergency medicine: influences on medical students' choice of emergency medicine. PLoS One. 2018;13(5):e0196639.

17. Boyd JS, Clyne B, Reinert SE, et al. Emergency medicine career choice: a profile of factors and influences from the Association of American Medical Colleges (AAMC) graduation questionnaires. Acad Emerg Med. 2009;16(6):544–9.

18. Keith KC, Smith E, Reddy S, et al. Lifestyle factors and other influences on medical students choosing a career in emergency medicine. AEM Educ Train. 2021;5(1):37–42.

19. Kazzi AA, Langdorf MI, Ghadishah D, et al. Motivations for a career in emergency medicine: a profile of the 1996 US applicant pool. CJEM. 2001;3(02):99–104.

20. AAMC. Graduation questionnaire (GQ). 2024. Available at: https://www.aamc.org/data-reports/students-residents/report/graduation-questionnaire-gq. Accessed May 15, 2024.

21. Calculator.net. Confidence interval calculator. Available at: https://www.calculator.net/confidence-interval-calculator.html. Accessed February 6, 2024.

22. Emergency Medicine Residents' Association. Consensus statement for the emergency medicine 2022–2023 residency application cycle regarding emergency medicine away rotations. 2022. Available at: https://www.emra.org/be-involved/be-an-advocate/working-for-you/statement-for-residency-application-cycle-em-away-rotations. Accessed May 7, 2024.

23. Mackey C, Feldman J, Peng C, et al. How do emergency medicine applicants evaluate residency programs in the post-COVID-19 era? AEM Educ Train. 2022;6(6):e10805.

24. Ajaz A, David R, Brown D, et al. BASH: badmouthing, attitudes and stigmatisation in healthcare as experienced by medical students. BJPsych Bull. 2016;40(2):97–102.

25. Holmes D, Tumiel-Berhalter LM, Zayas LE, et al. "Bashing" of medical specialties: students' experiences and recommendations. Fam Med. 2008;40(6):400–6.

26. Alston M, Cawse-Lucas J, Hughes LS, et al. The persistence of specialty disrespect: student perspectives. PRiMER. 2019;3:1.

27. Shanafelt TD, Boone S, Tan L, et al. Burnout and satisfaction with work-life balance among US physicians relative to the general US population. Arch Intern Med. 2012;172(18):1377–85.

28. Lu DW, Dresden S, McCloskey C, et al. Impact of burnout on self-reported patient care among emergency physicians. West J Emerg Med. 2015;16(7):996–1001.

29. Liaison Committee on Medical Education. Functions and structure of a medical school. Standards for accreditation of medical education programs leading to the MD degree. 2023. Available at: https://lcme.org/publications/. Accessed October 3, 2023.

30. ACGME. Common program requirements for residency, effective July 1, 2023. Available at: https://www.acgme.org/programs-and-institutions/programs/common-program-requirements/

31.LuDW,GermannCA,NelsonSW,etal. “Pullingtheparachute”:a qualitativestudyofburnout’sinfluenceonemergencymedicineresident careerchoices. AEMEducTrain. 2020;5(3):e10535.

32.GettelCJ,CourtneyDM,JankeAT,etal.The2013to2019emergency medicineworkforce:clinicianentryandattritionacrosstheUS geography. AnnEmergMed. 2022;80(3):260–71.

33.GettelCJ,CourtneyDM,AgrawalP,etal.Emergencymedicine physicianworkforceattritiondifferencesbyageandgender. Acad EmergMed. 2023;30(11):1092–100.

Emergency Medicine Residency Website Wellness Pages: A Content Analysis

Louisiana State University Health Sciences Center, Department of Emergency Medicine, New Orleans, Louisiana
University of Texas Southwestern Medical Center, Department of Emergency Medicine, Dallas, Texas

Section Editor: Jules Jung, MD, MEd

Submission history: Submitted August 28, 2024; Revision received February 16, 2025; Accepted February 21, 2025

Electronically published May 16, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem

DOI 10.5811/westjem.34873

Introduction: The COVID-19 pandemic impacted the way medical students seek residency positions. In 2020, the Accreditation Council for Graduate Medical Education advocated for virtual interviews. Most emergency medicine (EM) interviews in 2023 remained virtual, and this format will persist for the foreseeable future. Since students are not evaluating programs in person in most cases, residency websites are crucial for prospective residents. Resident wellness is critical for resident training and important to prospective residents; it follows that programs must be transparent about resident wellness on websites. In this study we aimed to quantify the number of EM programs with wellness pages on their websites and identify themes portrayed on those pages.

Methods: We analyzed residency website wellness pages from EM websites based on the 2022 directory of the Electronic Residency Application Service. We independently coded wellness statements through an inductive process. Codes were revised iteratively to consensus and organized into themes.

Results: We identified 278 (100%) EM residency websites. Of these websites, 57 (20.5%) had a wellness page, 45 (16.2%) linked to an institutional page that discussed wellness, 169 (60.8%) discussed wellness themes on their website in areas other than a wellness page, and 69 (24.8%) had no direct mention of wellness anywhere on their website. Using this information, we identified themes including community involvement, growth and development, nutrition and health, psychological well-being, social and relaxation activities, wellness culture and environment, wellness curriculum, wellness structure and resources, and work-life integration.

Conclusion: Most EM program websites do not include a wellness page. Of the programs that do, we identified important themes. The absence of dedicated wellness pages on most EM websites suggests an opportunity for programs to better communicate their wellness initiatives to applicants, helping them identify programs that align with their values. [West J Emerg Med. 2025;26(3):573–579.]

INTRODUCTION

Physician wellness is critical.1 Burnout, as defined in the 1970s by Herbert Freudenberger, describes the repercussions of significant stress in "helping professions."2 Emergency medicine (EM) has a higher rate of burnout when compared to other specialties.3 A national EM resident wellness survey disseminated in 2017 found that 77.7% of residents were identified as burned out.4 The COVID-19 pandemic exacerbated levels of burnout in subsequent years.3

The COVID-19 pandemic also created a shift in the way medical students apply for residency. Specifically, in June 2020 the Accreditation Council for Graduate Medical Education released a statement that advocated for virtual interviews.5 The Association of American Medical Colleges advocated for a continuation of virtual interviews for the 2022-2023 cycle, as it eliminates financial and scheduling challenges for programs and applicants. The Council of Residency Directors in Emergency Medicine further encouraged EM residency programs to follow virtual interview guidelines. It appears that virtual residency interviews will remain the dominant format for years to come. Emergency medicine-bound students list program websites as the most important factor in determining their rank lists in the post-COVID-19 era and rank program websites as important in determining faculty reputation, program diversity and inclusion, and program culture.6,7 We can expect EM students to continue to rely heavily on program websites for the foreseeable future.

Residency training is known to have negative effects on physical, emotional, and social wellbeing.8 Without the chance to evaluate perceived happiness and camaraderie among residents in person, programs must be transparent about resident wellness on their websites.5 A survey conducted in 2022 showed that medical students identified resident wellness as the most important content on a residency website.9 Studies evaluating wellness content on program websites have been performed in several fields, but not in EM. By analyzing how wellness is presented on EM websites, we aimed to provide insight into the current landscape of this aspect of wellness communication.

METHODS

We obtained a list of all EM residency programs accepting Electronic Residency Application Service (ERAS) applications for the 2022 application cycle. The EM ERAS directory was accessed in March 2023, and we accessed each EM program website between April 1–April 30, 2023. If the ERAS directory did not link to a program website, we performed a Google search to identify the residency program website. Each program website was reviewed by a single reviewer to determine whether there was a wellness page on the website. We also reviewed each website in its entirety to determine whether other pages on the website discussed themes of wellness or wellbeing. If a website linked to an institutional page discussing wellness separate from the residency website, that page was reviewed as well. Wellness pages and linked graduate medical education (GME) pages were recorded in an Excel spreadsheet with the full text from that webpage.

At the time of analysis, AS was a medical student, bringing a unique perspective as a prospective applicant navigating the residency application process. BM was an associate program director during analysis. The combination of these viewpoints allowed for a more comprehensive understanding of the wellness information presented on residency program websites. Using constructivist grounded theory, each author independently examined the full text from 15 wellness pages to generate initial codes. AS identified 138 descriptive codes, and BM identified 70 descriptive codes (184 unique codes combined). After discussion and review of each of the first 15 statements, areas of overlap were identified, and the initial codes were consolidated into a codebook containing 47 codes.

Population Health Research Capsule

What do we already know about this issue?
Wellness is important to applicants when deciding how to rank residency programs. However, wellness content is not always available on program websites.

What was the research question?
What percentage of EM programs have a wellness page on their website and what themes are discussed?

What was the major finding of the study?
20.5% of programs had a wellness page, while 24.8% did not mention wellness anywhere on their website.

How does this improve population health?
Enhancing website wellness content can improve applicant decision-making and encourage programs to think deliberatively about their wellness initiatives.

We independently reviewed an additional 10 website texts, and the codebook was revised. One code was added, two were removed, and five were redefined, resulting in 46 final codes. We then re-coded the first 25 wellness statements using the updated codebook. The coding structure was stable, and no additional codes were added to the codebook. We coded the remaining wellness statements and identified themes. We then held a Zoom meeting to discuss and resolve discrepancies. During this meeting, we reviewed the source text simultaneously with the codebook open. Through discussion and mutual agreement, we resolved all discrepancies without having to involve an additional coder or arbitrator. Themes were identified by grouping related codes into broader conceptual categories that represented patterns in website wellness content. We discussed these themes and agreed upon them. This study was determined to be non-regulated research by the University of Oklahoma Institutional Review Board in February 2023.
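The coding workflow above was performed manually through an inductive, consensus-based process. As a rough illustration of the bookkeeping involved, the sketch below applies a simple keyword-style codebook to page text and tallies how many pages received each code; the codebook entries here are hypothetical examples, not the authors' actual 46 codes.

```python
# Illustrative sketch only: the study's coding was manual and inductive.
# This shows one way to tally code frequencies across wellness pages if
# codes were reduced to keyword matches. The codebook is hypothetical.
from collections import Counter

codebook = {
    "social_events": ["retreat", "outing", "social event"],
    "mental_health": ["counseling", "mental health", "burnout"],
    "physical_health": ["gym", "fitness", "nutrition"],
}

def code_page(text: str) -> set:
    """Return the set of codes whose keywords appear in a wellness page."""
    lowered = text.lower()
    return {code for code, keywords in codebook.items()
            if any(k in lowered for k in keywords)}

def tally(pages: list) -> Counter:
    """Count how many pages received each code (presence, not frequency)."""
    counts = Counter()
    for page in pages:
        counts.update(code_page(page))
    return counts

pages = [
    "Our annual retreat and counseling resources support residents.",
    "Residents enjoy a free gym membership and nutrition stipend.",
]
print(tally(pages))  # one page each for social, mental, and physical codes
```

In practice, qualitative codes rarely reduce to keywords, which is why the authors coded independently and reconciled to consensus rather than automating this step.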

RESULTS

We identified 278 EM residency programs based on the 2022 ERAS Directory list of participating programs and specialties. Websites were identified and accessed for 278 (100%) programs. Fifty-seven programs (20.5%) had a main page or subpage dedicated to wellness or wellbeing, 169 (60.79%) programs discussed wellness somewhere on their website other than on a page dedicated to wellness, 45 (16.19%) programs linked to a GME page highlighting wellness, and 69 (24.82%) programs did not directly mention wellness or wellbeing anywhere on their website. Programs were counted in multiple categories if information was included in multiple areas of their website.
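As a quick sanity check, the reported percentages all follow from the category counts over the 278-program denominator; because programs could fall into multiple categories, the figures sum past 100%.

```python
# Reproduce the reported category percentages (denominator: 278 programs).
# Categories are not mutually exclusive, so they sum to more than 100%.
TOTAL = 278
categories = {
    "dedicated wellness page": 57,
    "wellness discussed elsewhere on site": 169,
    "linked GME wellness page": 45,
    "no mention of wellness": 69,
}
percentages = {name: round(100 * n / TOTAL, 1) for name, n in categories.items()}
print(percentages)
# → 20.5, 60.8, 16.2, and 24.8 respectively
```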

Of the 57 programs that had a page dedicated to wellness on their departmental website, 22 (38.6%) were titled “Wellness,” 12 (21.05%) were titled “Resident Wellness,” and the remainder were a variation of wellness or wellbeing. A complete list of page titles can be found in Appendix A. One wellness page contained pictures only, and it was not included in the content analysis. The most common subjects discussed on wellness pages included social events, mental health, physical health, institutional support, wellness didactics, and burnout. The percentage of programs that discussed each subject can be found in Table 1. The least common subjects, defined as <5%, that appeared on residency wellness pages were empathy, achievement, personal development, legal concerns, leadership skill development, lack of personal fulfillment, imposter syndrome, and harassment.

Nine broader themes emerged from analysis of EM residency website wellness pages:

Theme 1: Social and Relaxation Activities

The most common theme was social and relaxation activities: 83.9% of programs highlighted retreats, class activities, and other social events on their wellness page, and 68.4% included pictures of their residents participating in social activities.

These outings include paintball, large group dinners, and outdoor activities such as skiing and team sports. In the past they have organized softball games and ping-pong and bowling tournaments between the other local EM residencies.

Theme 2: Psychological Wellbeing

Many programs found it difficult to discuss wellness without discussing burnout. Programs also included their approach to mitigating burnout and building resilience.

Healthcare providers are not immune to poor wellness and well-being, and their high prevalence of burnout, depression, anxiety, and sleep disorders are all contributing factors.

At this monthly get together, at an attending’s house we discuss building resilience and professional excellence and externalizing and highlighting serious threats to wellness like substance abuse, interpersonal conflict, and PTSD.

Programs acknowledged that residents are partners in improving wellness and the best initiatives are often resident driven.

We understood the importance of resident input and feedback into their own wellness. Who else would know what residents need, in terms of wellness, other than residents themselves?

This year we added a 4th elected chief resident position specifically dedicated to wellness!

Wellness is not one-size-fits-all and frequently requires a more individualized approach.

We understand that wellness is not mandatory events, meditation, and yoga for everyone. While we have a robust curriculum to explore the different avenues of wellness, we encourage our residents to identify their own stress relieving practices and to maintain those activities to avoid burning out.

While we know that residency is hard, we also know that “wellness” is a moving target and that which makes a person “well” is highly individualized.

Theme 3: Nutrition and Health

About half of programs mentioned food available to residents while at the hospital and on shift, as well as gyms and other physical fitness resources available to residents.

Residents have their own dedicated lounge and fully stocked fridge with food and drinks. There are food trucks available at nearly all hours of the night out front of the main facility to take care of those evening and late-night cravings. Tired of the cafeteria food? Need something quick? The snack shack in the main ED provides this – of course available to use with your meal stipend.

Theme 4: Wellness Structure and Resources

Many programs had both departmental and institutional structures for wellness and addressed their holistic approach on their website.

The infrastructure supports and promotes preventative care, healthy living, mental health, second-victim support, work-life balance, and peer-peer counseling and mentoring.

Our wellness activities focus on service, resiliency, and career development, and will continue to grow with creative ideas to support and empower residents.

Through intentional reflective practices, didactic sessions, and interactive social opportunities, our goal is to help residents maintain perspective and create healthy habits that promote longevity in Emergency Medicine.

Programs also included information about counseling services or methods of monitoring mental health throughout residency.

Trainees are required to complete the Well-being Index twice each year while in their training program. During resident/fellow semi-annual review meetings with their Program Director, one of the topics for discussion will be the trainee’s self-care and completion of the Wellbeing Index.

Additionally, 42.9% of wellness pages talked about a wellness committee.

Specific goals of the wellness committee include: Promote a healthy work life balance. Provide physical, psychological, social and professional wellness education. Maintain a peer support and advocacy network for the residents.

The Wellness Committee, made up of attendings and residents from all years, provides resources, workshops and events to build and support the physical, psychological and emotional well-being of our emergency department.

The Department of Emergency Medicine has established a wellness committee to promote the wellness of its residents through a multifaceted approach that includes education, social programming, mentorship, and organization-directed interventions.

Theme 5: Wellness Culture and Environment

The clinical environment can be an impediment to resident wellness. Some programs discussed their wellness culture and how they can make changes in the clinical and learning environments to positively impact their team.

Implementing projects designed to improve the meaning residents find in their daily work.

Advocating for changes in the learning environment that will improve resident well-being without compromising patient care or education.

Theme 6: Wellness Curriculum

Nearly all programs have a didactics section on their website, but 51.8% of programs with wellness pages featured ways that they incorporate wellness topics into their didactic sessions.

Developing a wellness curriculum that includes traditional lectures (depression, substance use), faculty panels (sleep, work-life balance), guest speakers (financial health), and experiential exercises (yoga, mindfulness).

Theme 7: Work-Life Integration

The scheduling demands of residency are one of the drivers of decreased wellness. Many programs mentioned how their schedule and other residency requirements directly affect wellness.

Sleep loss has negative effects including learning and cognition which is why it is important to avoid sleepless nights and to watch for circadian violations.

Resident centric scheduling, maximizing vacation preferences.

Every block, each residency class will have a protected Wednesday evening as a class after Grand Rounds to spend time as a class, have social events sponsored by the residency program, catch up on appointments, errands, or have family time.

In order to decrease physician burn-out, our shifts are 8-hours in length. We also encourage our providers to stop seeing new patients 1-hour before shifts end in order to decrease charting-time past your shift. Moreover, as you progress in your residency, the total number of your shifts per block gradually decreases, allowing for more time for other activities.

Theme 8: Growth and Development

Professional growth and development is a natural and necessary part of residency. Many programs outlined various curricula and mentoring programs that help their residents succeed professionally and improve their mental wellbeing.

The focus of coaching is to improve current performance by helping a person learn how to do things better to reach their desired outcome. The goal of coaching is to help trainees reach their peak potential, personally and professionally while in training.

Faculty mentors are chosen for their professional and life experience and ability to model and mentor healthy life/work balance and continued joy and success in their practice of medicine. Through the faculty mentorship program, residents are guided through their residency and are able to learn and adopt skills from their mentor’s years of experience.

Theme 9: Community Involvement

Although community involvement appeared less frequently than many other themes, some programs highlighted their involvement in wellness and advocacy on a national level.

Our section strives to provide local, regional and national leadership toward improving the health and wellness of all physicians and healthcare providers. Leaders in our department have been involved in advocating for and promoting local, regional, and national change in the healthcare system with the goal to improve wellness for physicians and healthcare providers.

Forty-five programs linked to a general GME wellness page that applied to all residencies, not just EM. These pages were analyzed as well using the same codebook developed for EM pages. The most common subjects on GME wellness pages included wellness resources, mental health, physical health, and institutional support. All subjects discussed on GME pages can be found in Appendix B.

Most programs (60.79%) discussed their program’s commitment and approach to wellness on areas of their website other than a dedicated wellness page. The most common website pages on which themes of wellness were mentioned included PD/Chair Welcome, Curriculum, Why Us, Mission/Values, FAQs, and Overview. The complete list of page titles that include themes of wellness are outlined in Appendix C.

DISCUSSION

Residency applicants strongly consider wellness when determining which programs to apply to and rank.9 Residency websites are one of the few ways that applicants can learn about a program’s approach to wellness. Despite this, only 68.35% of EM residency websites discussed wellness directly on their websites, and only 20.50% had website pages dedicated to wellness. Over half of programs that had wellness pages on their websites discussed social events, mental health, physical health, having institutional structure for wellness, wellness didactics, and burnout.

Forty-five programs linked to institutional pages. These pages were evaluated as well, but the topics that occurred most were different from the sub-themes on EM-specific pages. Conveying wellness information is clearest with a page dedicated to wellness, but wellness information appears throughout residency websites. Students may not access all parts of the website, so featuring wellness information in a prominent section such as a "PD Welcome" or "Program Highlights" section would be most visible.

To our knowledge, this is the first study in EM to explore residency websites for wellness-related statements. In 2021 Pollock et al performed a descriptive analysis of EM residency websites to characterize the presence of 38 items organized into the following categories: general program information; application process; research; facility information; resident information; lifestyle; and social media.10 Wellness was not directly assessed in their analysis. A radiology group previously performed a deductive analysis of radiology residency websites to determine the presence or absence of 26 predefined criteria related to resident wellness.5 They found that financial, clinical, and technical aspects of programs were commonly present on websites, but less than 10% of radiology programs mentioned resident mentoring, wellness committees, or their non-clinical curricula.

Similarly, in internal medicine a group reviewed 579 internal medicine websites for variables that a focus group found to be important to wellness and found that 81% of internal medicine websites mentioned wellness, and 41% had a page dedicated to wellness.11 Pavuluri et al accessed urology residency websites to determine whether the words "wellness" or "wellbeing" were used anywhere on the website and found that only 20% of programs mentioned one of these terms.12 Using a two-proportion z-test, we found that a significantly higher percentage of EM programs mentioned wellness directly than urology programs (P < 0.001).
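The two-proportion z-test mentioned above can be sketched with the standard library alone. The EM counts (190 of 278 programs, ~68.35%) follow from the article's figures; the urology denominator below (145 programs, 20% mentioning wellness) is a hypothetical stand-in, since the cited source reports only a percentage.

```python
# Hedged sketch of a two-proportion z-test. EM counts are from the article;
# the urology sample size is hypothetical (the source gives only 20%).
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_ztest(190, 278, 29, 145)  # EM vs. hypothetical urology n
print(f"z = {z:.2f}")  # any |z| > 3.29 corresponds to a two-sided p < 0.001
```

With a gap this large (68% vs 20%), the conclusion is insensitive to the assumed urology sample size.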

In contrast to the reviews published in the radiology, internal medicine, and urology literature, we performed an inductive conceptual analysis. We sought to characterize all concepts discussed on residency wellness pages rather than a predetermined list of criteria. While a deductive approach employed by other groups may be less prone to bias, it also misses important content, and the depth of analysis is limited. Our approach allowed us to assess the topics that residency programs deemed important to convey to applicants or the public. Social events were the most common sub-theme on EM residency websites, with 83.9% of programs discussing events that they hold for residents. In programs that linked to a GME page highlighting wellness, the most common sub-theme was institutional resources.

While there are no published studies that establish a correlation between a residency website’s representation of wellness and actual resident wellbeing, describing website representation of wellness among EM programs is still valuable for residency program leadership, marketing teams, web design teams, and social media teams. We hope that this article will lead programs to enhance the quality of wellness information on their website and, more so, to continue to improve wellness initiatives for their program. After reviewing 278 EM websites, we believe that the following information should be included on EM residency program pages:

1. A subpage dedicated to wellness

2. Wellness information featured in a prominent location such as the program director's message or the program highlights

3. The program’s approach to social activities, psychological wellbeing, health, wellness resources, culture, and curriculum focused on wellness

4. Specific examples of programming, curricula, committees, resources, or social events that increase wellness and mitigate burnout through inclusion of descriptions, photos, videos, or linked pages

A comprehensive and mission-aligned description of wellness on the program website could increase medical student interest, engagement, and ultimately recruitment to programs. Additionally, reviewing the approach to each of the themes discussed above may lead programs to improve aspects of their program’s overall wellness structure.

LIMITATIONS

Our study provides insight into how EM residency programs convey their wellness structure and culture to applicants through their websites, but it is important to acknowledge some limitations. First, we accessed websites in Spring 2023. Because program websites are constantly evolving, the current website content and page structure may be different from when we reviewed them. Additionally, websites are only one way that programs communicate information to applicants. In addition to their website, programs may use social media, virtual meet-and-greet sessions, second-look events, interview days, and other means to highlight aspects of their program's wellness efforts. None of those communication platforms were considered in the current study.

Second, we had a small team, and coding was performed by two individuals. While we followed rigorous methodology and iteratively developed and refined a codebook, this type of analysis lends itself to bias. Third, some programs have multiple websites. In those instances, we considered all wellness statements that appeared on any website we found. While we made our best efforts to include data from every program website, it is possible we missed programs with multiple websites under varying names or nicknames. Finally, our content analysis was based on the presence of concepts and did not assess the detail or quality of the information provided.

CONCLUSION

Residency websites are an important resource for medical students when they are reviewing programs for residency. Information about wellness is important to most students. There is significant variation in how programs address wellness topics on their websites; 75.18% of programs discuss wellness either on a dedicated wellness page, in other locations on their website, or on a linked institutional wellness page. The 20.5% of programs with a dedicated wellness page explore themes related to community involvement, growth and development, nutrition and health, psychological wellbeing, social and relaxation activities, wellness culture and environment, wellness curriculum, wellness structure and resources, and work-life integration. We hope this study encourages improvement in the way EM residency programs present wellness information on their websites, as the internet will continue to be a vital source of information for applicants. Future research could explore the alignment or misalignment between wellness programs offered and the perceived needs of EM residents.

Address for Correspondence: Brian Milman, MD, University of Texas Southwestern Medical Center, Department of Emergency Medicine, 5323 Harry Hines Boulevard E4.300, Dallas, TX 75390-8579. Email: brian.milman@utsouthwestern.edu.

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Sappington et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/ licenses/by/4.0/

REFERENCES

1. Battaglioli N, Ankel F, Doty CI, et al. Executive summary from the 2017 Emergency Medicine Resident Wellness Consensus Summit. West J Emerg Med. 2018;19(2):332–6.

2. Freudenberger HJ. Staff burn‐out. J Soc Issues. 1974;30(1):159-65.

3. Shopen N, Schneider A, Aviv Mordechai R, et al. Emergency medicine physician burnout before and during the COVID-19 pandemic. Isr J Health Policy Res. 2022;11(1):30.

4. Li-Sauerwine S, Rebillot K, Melamed M, et al. A 2-question summative score correlates with the Maslach Burnout Inventory. West J Emerg Med. 2020;21(3):610–7.

5. Wong TY, Huang JJ, Hoffmann JC, et al. Resident wellness in radiology as portrayed by departmental websites. Acad Radiol. 2022;(8):1259-65.

6. Taher A, Hart A, Dattani ND, et al. Emergency medicine resident wellness: lessons learned from a national survey. CJEM. 2018;20(5):721-4.

7. Li-Sauerwine S, Weygandt PL, Smylie L, et al. The more things change the more they stay the same: factors influencing emergency medicine residency selection in the virtual era. AEM Educ Train. 2023;7(6):e10921.

8. Mackey C, Feldman J, Peng C, et al. How do emergency medicine applicants evaluate residency programs in the post-COVID-19 era? AEM Educ Train. 2022;6(6):e10805.

9. Ganguli S, Chen SW, Maghami S, et al. Residency program website content may not meet applicant needs. Int J Med Stud. 2024;12(1):60-8.

10. Pollock JR, Weyand JA, Reyes AB, et al. Descriptive analysis of components of emergency medicine residency program websites. West J Emerg Med. 2021;22(4):937-42.

11. Storm K, Kelly G, Kottapalli A, et al. Published support for wellness, diversity, equity, and inclusion among internal medicine residency program websites. Cureus. 2022;14(9):e29328.

12. Pavuluri H, Malik R, Seideman CA. An assessment of residency wellness programming in urology training programs. Urology. 2022;165:113-9.

Inequities in the National Clinical Assessment Tool for Medical Students in the Emergency Department

Bushra Z. Amin, MD*

C. Jessica Dine, MD, MSHP*†§

Erica R. Tabakin, MD*‡

Michael Trotter, MD*‡

Janae K. Heath, MD, MSCE*†

*Perelman School of Medicine at the University of Pennsylvania, Philadelphia, Pennsylvania

†Hospital of the University of Pennsylvania, Department of Medicine, Philadelphia, Pennsylvania

‡Hospital of the University of Pennsylvania, Department of Emergency Medicine, Philadelphia, Pennsylvania

§Leonard Davis Institute of Health Economics at the University of Pennsylvania, Philadelphia, Pennsylvania

Section Editor: Jules Jung, MD

Submission history: Submitted February 20, 2025; Revision received June 3, 2025; Accepted June 12, 2025

Electronically published October 3, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem. DOI: 10.5811/westjem.43506

Introduction: The National Clinical Assessment Tool for Emergency Medicine (NCAT-EM) was designed to standardize medical student assessments during emergency medicine clinical rotations. While multiple assessment tools implemented in medical education have been prone to inequities, it remains unknown how student and rater demographics impact NCAT-EM scores. In this study we examined how a student’s gender and status as under-represented in medicine (URM) affected NCAT-EM scores.

Methods: This was a retrospective cohort study of all NCAT-EM assessments of clerkship medical students at a single institution in 2022. We performed mixed-effect ordinal logistic regression analyses to determine the association between the seven NCAT-EM domains (history/physical, prioritized differential, formulation of plans, observation/monitoring, emergency management, communication, and global assessment) and student gender, as well as the NCAT-EM domains and students’ URM status (specifically in domains of race and ethnicity). We adjusted our analyses for the site of rotation, time, the rater’s role (attending or resident), and rater demographics (gender, URM status). We then evaluated the effect of gender concordance and URM-status concordance on outcomes using interaction terms.

Results: A total of 1,881 NCAT-EM assessment forms were submitted on 142 students by 266 raters. There were no significant associations between student gender and NCAT-EM ratings across the seven domains. We found an association between URM status and lower scores in multiple NCAT-EM domains, including global assessment (odds ratio [OR] 0.50, CI 0.25-0.99, P = .048); history/physical (OR 0.38, CI 0.19-0.77, P = .01); and prioritized differential (OR 0.47, CI 0.26-0.88, P = .02). This effect was moderated by a significant positive interaction effect with URM concordance between raters and students in the prioritized differential and observation/monitoring domains.

Conclusion: This is the first study to examine the association of both gender and under-represented in medicine status with scores on the nationally implemented NCAT-EM assessment tool. Women students were rated similarly to men across the NCAT-EM domains, with no association between gender and ratings. However, students’ URM status was associated with lower scores in multiple NCAT-EM domains. This finding was mitigated by URM concordance between raters and students. Our findings support the need for additional studies to understand bias and inequities in the application of the NCAT-EM tool nationally. [West J Emerg Med. 2025;26(5)1250–1259.]

INTRODUCTION

A longstanding challenge in medical education has been to accurately assess medical students on clinical rotations with assessment tools that have strong validity and reliability evidence.1,2 The fairness and accuracy of clinical assessment of medical students is critical, as it informs clinical grades, the Medical Student Performance Evaluation (MSPE, or “dean’s letter”), and, for those applying to emergency medicine (EM) residency, the Standardized Letter of Evaluation (SLOE).3 As EM residency program directors have consistently ranked SLOEs and EM rotation grades as some of the most important criteria when offering interviews and ranking applicants,4 ensuring fairness and accuracy in these assessments plays a paramount role in achieving equity among EM applicants.

In an effort to improve the fairness and accuracy of EM medical student assessments (and thus that of SLOEs and EM rotation grades), the National Clinical Assessment Tool for Emergency Medicine (NCAT-EM) was developed in 2016 via consensus as a standardized assessment tool.5 This tool allowed for post-shift assessment of students by faculty or residents across six clinical performance domains: history and physical exam skills; prioritized differential diagnosis; ability to formulate a plan; observation, monitoring, and follow-up; emergency recognition and management; and patient- and team-centered communication. This tool has begun to replace institution-specific tools in numerous EM rotations across the United States,2,5,6 and was the first nationally standardized, specialty-specific, entrustable professional activities-based assessment tool for medical students.5

Prior work evaluating early implementation of the NCAT-EM suggests it is achieving some of its stated goals. Specifically, Hiller et al noted high internal consistency in scores within a given institution,6 suggesting that this tool supports reliable comparison of students within an institution during the residency application process. However, this work also showcased gaps in the validity evidence, noting site-specific variation in ratings, which may suggest limitations in the response-process domain of validity (ie, how raters differentially use the tool) or in the generalizability of the tool.1 Additionally, this prior work was predominantly limited to medical students in their fourth year (with a high percentage interested in EM residency), limiting generalizability. Moreover, although the authors found site-specific differences based on student and rater gender, they were unable to examine the association of race or ethnicity with NCAT-EM scores (despite known racial disparities in other standardized EM assessments,7,8 such as the SLOE9–11). Additional work investigating the various domains of validity of this tool would add to the literature.

Unfortunately, such disparities in assessment have been observed throughout medical education,12 potentially contributing to the known leadership disparities and pay-based disparities based on gender and race.13–18 Studies have identified gender- and race-based differences in language used

Population Health Research Capsule

What do we already know about this issue? Clinical assessment tools in medical education often show racial and sex disparities in scoring and narrative feedback.

What was the research question?

Do student and rater demographics affect National Clinical Assessment Tool for Emergency Medicine (NCAT-EM) scores in emergency medicine clerkships?

What was the major finding of the study?

Under-represented in medicine (URM) students had lower global scores (OR 0.50, 95% CI 0.25-0.99, P =.048), which was mitigated by URM rater concordance.

How does this improve population health?

Identifying disparities in clerkship assessments supports equitable evaluation, critical for building a diverse physician workforce.

in the MSPE19,20 and in language used in clerkship evaluations.7,19,21 Gender- and race-based differences have also been observed in clinical grades (with lower clinical grades for non-White students, even after adjusting for variables such as scores on Step 1 of the US Medical Licensing Examination)7,22 and in overall recommendations on SLOEs.9–11 These differences likely represent inequities, especially when the observed variations are explained not by student performance but by other factors such as the clinical learning environment or the evaluator.23,24 It is not known whether such differences persist despite the implementation of a nationally standardized tool such as the NCAT-EM.

To address this gap, our goal in this study was to analyze the association between student and rater demographics and NCAT-EM scores of clerkship students rotating through various emergency department sites at a single institution. Given prior evidence suggesting that concordance in demographics may impact ultimate evaluation,25,26 we similarly assessed how concordance in student and rater demographics was associated with NCAT-EM scores.

METHODS

Setting and Participants

We performed a retrospective, single-center cohort study of all electronically completed NCAT-EM assessments of clerkship medical students at the University of Pennsylvania. Individualized NCAT-EM forms were made available electronically through a Qualtrics (Qualtrics International Inc, Provo, UT) QR code. The dataset included all submitted NCAT-EM assessments from January–December 2022, as assessments completed during this year were unaffected by the COVID-19 pandemic. Given our interest in the impact of demographics on NCAT-EM scores, we excluded assessments for which demographic information (for either the student or the rater) was unavailable.

The Perelman School of Medicine curriculum includes 1.5 years of pre-clerkship content, followed by a one-year clerkship phase consisting of eight core clerkships (emergency medicine, family medicine, internal medicine, neurology, obstetrics and gynecology, pediatrics, psychiatry, and surgery) graded on an honors/high pass/pass/fail basis. During the clerkship year, students also complete an additional month of otolaryngology, orthopedic surgery, anesthesia, and ophthalmology (one week each, graded on a pass/fail basis). The EM clerkship is a four-week core clerkship completed at either one or two of six affiliated clinical sites.

The NCAT-EM has been used for clinical assessment of EM clerkship students at this institution since 2018. The NCAT-EM consists of six clinical performance domains rated on a four-point entrustability scale, a global assessment domain, a professionalism section, and mandatory free-text comments for strengths and suggestions for improvement (Supplemental Table 1). Students are required to present a QR code linking to the NCAT-EM to attendings or EM residents during every shift; the form is then completed at the rater’s convenience on an online platform.

During the EM clerkship, students were assigned to work 14 eight-hour shifts (or the total hourly equivalent for sites with 10- or 12-hour shifts) for the duration of the clerkship. Students were required to present the QR code to at least one rater per shift (either an attending or a supervising resident ranging from postgraduate year [PGY] 2-4). The dataset did not include discrete PGY-level data for resident raters. Prior to and during this study, the clerkship directors performed annual education focused on the NCAT-EM tool and the process of assessment, which consisted of an introduction to the tool, a brief overview of the scale, and a review of medical student evaluation processes. This information session included both faculty and residents at all sites (for all individuals who would be working with students).

Data Collection and Analysis

Data collected in addition to completed NCAT-EM forms included student factors (gender and under-represented in medicine [URM] status) and rater factors (gender, URM status, and role, either resident or faculty). We extracted gender and race data for students from admissions demographics based on self-identification. For faculty, self-reported gender and race were obtained through the university’s faculty affairs database. We defined URM status for both faculty and students using the Association of American Medical Colleges (AAMC) definitions. The URM status was specifically chosen as a binary variable (as opposed to granular race and ethnicity data) to improve power in our statistical analysis. Importantly, the definition of URM can broadly include groups that are minoritized, such as first-generation or low-income students, or students with disabilities, although for this work we used the AAMC definition of URM based on race and ethnicity. The dataset also included the quarter of the year in which the student was completing the clerkship (block 1, 2, 3, or 4) and the clinical site where they were rotating. All data were deidentified prior to analysis.

We performed univariate ordinal logistic regression analyses to determine the association between the global assessment on the NCAT-EM tool (bottom third, middle third, top third, or top 10%) and student gender, student URM status, rater gender, rater URM status, clerkship site, and rotation block. We then performed mixed-effect ordinal logistic regression analyses to determine the association between NCAT-EM scores and student gender, clustered by student, after adjusting for site of rotation, time, role of rater, student URM status, and rater demographics (gender, URM status). To assess the association with URM status, we performed mixed-effect ordinal logistic regression analyses to determine the association between NCAT-EM scores and student URM status, clustered by student, after adjusting for site of rotation, block, role of rater, student gender, and rater demographics. Given the hypothesis that concordance between rater and student gender and URM status might be associated with NCAT-EM scores, we also assessed the interaction between student gender and rater gender, and between student URM status and rater URM status.
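In the usual notation for a proportional-odds (ordinal logistic) mixed model, the adjusted analysis described above can be sketched as follows; the symbols here are ours for illustration and are not taken from the authors' specification:

```latex
\operatorname{logit} P(Y_{isr} \le k) = \alpha_k - \big(
  \beta_1\,\mathrm{URM}_s + \beta_2\,\mathrm{gender}_s
  + \beta_3\,\mathrm{role}_r + \beta_4\,\mathrm{gender}_r
  + \beta_5\,\mathrm{URM}_r
  + \beta_6\,\mathrm{URM}_s \times \mathrm{URM}_r
  + \gamma_{\mathrm{site}} + \delta_{\mathrm{block}} \big)
  - u_s - v_r,
\qquad u_s \sim N(0, \sigma_u^2), \quad v_r \sim N(0, \sigma_v^2)
```

where Y_{isr} is the ordinal NCAT-EM rating on assessment i of student s by rater r, the α_k are category thresholds, and u_s and v_r are the student- and rater-level random intercepts that account for clustering; exponentiated β coefficients correspond to the reported odds ratios.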

For each analysis, the model was clustered on student (random effects) and rater (random effects). This model was used intentionally to account for the non-independent nature of students and raters throughout the dataset, as it provides adjusted standard errors accounting for student and/or rater clustering (random effects). Based on prior factor analyses showing that each domain in the NCAT-EM assesses a unique construct, we repeated the above analysis for each of the six clinical performance domains of the assessment. (See Supplemental Table 1 for the NCAT-EM domains.)

While our primary analysis included URM (as per the AAMC definition) as a binary variable, we aimed to further understand our findings in the context of URM categories, recognizing that URM individuals who spanned different identities might have had different experiences with assessment. Thus, we performed a sensitivity analysis using racial and ethnic groups within the AAMC definition of URM (African American/Black; Hispanic/Latino; Native American, including American Indian, Alaska Native, and Native Hawaiian; Pacific Islander; and mainland Puerto Rican). We then performed a second sensitivity analysis comparing individuals identifying as Black to other individuals (noting that the large proportion of individuals within the cohort identifying as URM were Black, and the distinct experiences of this population27,28).

We completed statistical analysis using STATA v18.0 (StataCorp, LLC, College Station, TX). Statistical significance was determined using a P-value of .05 (not adjusting for multiple comparisons given the exploratory nature of the analysis, to reduce the risk of type 2 error). This study was deemed exempt by the University of Pennsylvania Institutional Review Board.

RESULTS

Over the course of 2022, 1,881 complete NCAT-EM assessment forms were submitted on 142 distinct students (74 women [52%] and 68 men [48%], including 34 [24%] who identified as URM), completed by 266 different raters. We excluded 122 NCAT-EM forms prior to analysis due to incomplete demographic information for the rater. The median number of completed forms per student was 13 (IQR 11-15), which was similar between genders (mean 13.4 for men vs 13.1 for women, P = .59). There were fewer NCAT-EM assessments completed on students who identified as URM (mean 11.8 vs 13.8 assessments for non-URM students, P = .01). Most assessments were completed by raters who identified as men (n = 1,070, 60%), and 11% (n = 195) were completed by raters who identified as URM. The racial demographics of raters identified as URM (per AAMC definitions) were 119 (61%) Black, 55 (28%) Hispanic or Latino, and 21 (11%) Pacific Islander. The racial demographics of students identified as URM (per AAMC definitions) were 224 (56%) Black, 38 (10%) Hispanic or Latino, and 138 (35%) spanning multiple groups. Complete demographic information for completed NCAT-EM forms is included in Table 1.

Distribution of scores for each of the six clinical performance domains on the NCAT-EM, as well as the global assessment domain (see Supplemental Table 1), is summarized in Table 2. Global assessment scores were skewed toward the upper categories (consistent with prior national data5), with 38 (2.1%) ratings in the lower third, 506 (28%) in the middle third, 878 (49%) in the top third, and 387 (21%) in the top 10% (“exceptional”).

The results of the univariate ordinal logistic regression are shown in Table 3. In the univariate analysis, there was a significant association based on rater role, with faculty raters being less likely than resident raters to assign students the higher entrustment ratings for all domains (P < .001 for all domains); thus, this variable was included in the multivariate analysis. There were also associations between rotation block and NCAT-EM scores, and between site of rotation and NCAT-EM scores; thus, these variables were included in the final

Table 1. Student characteristics, rater characteristics, clinical site, and block of completed NCAT-EM assessment forms.

Characteristic: completed NCAT-EM forms (n, %)

Student gender: 1,881 (100.0%)
    Men: 913 (48.5%)
    Women: 968 (51.5%)
Student URM status: 1,881 (100.0%)
    URM: 400 (21.3%)
        African American/Black: 224 (56%)
        Hispanic/Latino: 38 (10%)
        Multiple AAMC URM groups: 138 (35%)
    Non-URM: 1,481 (78.7%)
Rater role: (100.0%)
    Resident: (72.7%)
    Faculty: (27.3%)
Rater gender:
    Men: 1,070 (60.0%)
    Women: (40.0%)
Rater URM status: 1,759 (100.0%)

[Counts for the rater rows, and the rows for clinical site and rotation block, were not recoverable from the extracted text.]

All results are expressed as number of NCAT assessment forms completed within each category, followed by percent of assessments for which data are available.

NCAT-EM, National Clinical Assessment Tool for Emergency Medicine; URM, under-represented in medicine; AAMC, Association of American Medical Colleges.

regression model.

The results of the multivariate ordinal logistic regression, clustered by student and rater, are shown in Table 4, using a significance threshold of P = .05 (rather than adjusting for

Table 2. Overview of ratings for National Clinical Assessment Tool for Emergency Medicine clinical performance domains and global assessment.

Focused history and physical exam skills: 1,879 (100%)
    Unable to assess: 25 (1%)
    Pre-entrustable: 118 (6%)
    Mostly entrustable: 742 (40%)
    Fully entrustable/Milestone 1: 754 (40%)
    Outstanding/Milestone 2: 240 (13%)

Ability to generate a prioritized differential diagnosis: 1,866 (100%)
    Unable to assess: 31 (2%)
    Pre-entrustable: 152 (8%)
    Mostly entrustable: 818 (44%)
    Fully entrustable/Milestone 1: 664 (36%)
    Outstanding/Milestone 2: 201 (11%)

Ability to formulate plan (diagnostic, therapeutic, disposition): 1,861 (100%)
    Unable to assess: 34 (2%)
    Pre-entrustable: 163 (9%)
    Mostly entrustable: 894 (48%)
    Fully entrustable/Milestone 1: 592 (32%)
    Outstanding/Milestone 2: 178 (10%)

Observation, monitoring, and follow-up: 1,861 (100%)
    Unable to assess: 42 (2%)
    Pre-entrustable: 120 (6%)
    Mostly entrustable: 683 (37%)
    Fully entrustable/Milestone 1: 747 (40%)
    Outstanding/Milestone 2: 263 (14%)

Emergency recognition and management: 1,851 (100%)
    Unable to assess: 385 (21%)
    Pre-entrustable: 98 (5%)
    Mostly entrustable: 673 (36%)
    Fully entrustable/Milestone 1: 523 (28%)
    Outstanding/Milestone 2: 172 (9%)

Patient- and team-centered communication: 1,848 (100%)
    Unable to assess: 40 (2%)
    Pre-entrustable: 65 (4%)
    Mostly entrustable: 605 (33%)
    Fully entrustable/Milestone 1: 794 (43%)
    Outstanding/Milestone 2: 344 (19%)

Global assessment: 1,809 (100%)
    Lower third: 38 (2%)
    Middle third: 506 (28%)
    Top third: 878 (49%)
    Exceptional (top 10%): 387 (21%)

All results are expressed as number of NCAT-EM assessment forms completed within each category, followed by percentage of assessments for which data are available.

NCAT-EM, National Clinical Assessment Tool for Emergency Medicine.

multiple comparisons due to the exploratory nature of the study). As the interaction between student gender and rater gender was not significant in most analyses, this interaction term was excluded from the final regression model. The final regression model included rater gender and student gender, rater and student URM status (and the interaction between them), rater role (faculty vs resident), clinical site, and rotation block. The mixed regression identified no significant associations between student gender and NCAT-EM scores for any NCAT-EM domain (Table 4). Before its exclusion, the interaction between student gender and rater gender was significant in the history and physical exam domain (OR 0.31, CI 0.11-0.83, P = .02), with no other significant interaction effects in the remaining domains.

Student URM status was associated with lower scores for the global assessment (OR 0.50, CI 0.25-0.99, P = .05), the history/physical exam domain (OR 0.38, CI 0.19-0.77, P = .01), and the prioritized differential diagnosis domain (OR 0.47, CI 0.26-0.88, P = .02) after multivariate adjustment, as shown in Table 4. These findings were moderated by a significant positive interaction effect between student and rater URM status in the observation/monitoring domain (OR 4.55, CI 1.21-17.1, P = .03), suggesting that concordance in URM status between raters and students lessened (and in some instances reversed) the negative association of URM status with NCAT-EM scores. More specifically, the adjusted ORs for URM-concordant dyads were 0.83 for the global assessment (suggesting the difference persisted); 1.06 for the history/physical domain; and 1.76 for observation/monitoring (suggesting reversal, with URM-concordant dyads having higher odds of receiving a higher score than the reference cohort).
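Because ordinal logistic regression is linear on the log-odds scale, the adjusted OR for a URM-concordant dyad is the product of the main-effect OR and the interaction OR. A minimal sketch of this arithmetic; the 0.39 main-effect OR for the observation/monitoring domain is an illustrative back-calculation from the reported values, not a figure reported in the text:

```python
import math

def concordant_or(or_main: float, or_interaction: float) -> float:
    """Combine a main-effect OR and an interaction OR on the log-odds scale.

    exp(log(a) + log(b)) == a * b; the log-scale form mirrors how the
    regression coefficients add before exponentiation.
    """
    return math.exp(math.log(or_main) + math.log(or_interaction))

# Observation/monitoring domain: reported interaction OR 4.55.
# An illustrative (not reported) main-effect OR of ~0.39 approximately
# recovers the adjusted concordant-dyad OR of ~1.76 given in the text.
print(round(concordant_or(0.39, 4.55), 2))
```

The same multiplication explains why a sub-1 main effect (lower scores for URM students overall) combined with a large positive interaction can yield a concordant-dyad OR above 1, ie, a reversal.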

The sensitivity analysis using racial and ethnic groups within the URM category (Black; Hispanic/Latino; Native American, including American Indian, Alaska Native, and Native Hawaiian; Pacific Islander; and mainland Puerto Rican) demonstrated a poorer fit to the data than the original model, based on likelihood ratio tests and comparisons of the Bayesian information criterion and Akaike information criterion. As this approach risked underestimating the true effect, it was not included in the results. In our sensitivity analysis comparing individuals who were Black to other individuals in the cohort, we found significant associations with NCAT-EM ratings in the history/physical domain (OR 0.32, CI 0.14-0.76, P = .01); the prioritized differential domain (OR 0.36, CI 0.17-0.79, P = .01); and the ability to formulate a plan domain (OR 0.46, CI 0.59-1.48, P = .05), with significant interaction effects noted in the majority of domains. (See Table 5 for full details.)

DISCUSSION

Our study demonstrates important associations between both rater and student demographics and NCAT-EM scores

[Table 3 body not recoverable from the extracted text; see the Table 3 caption below.]

OR, odds ratio; NCAT-EM, National Clinical Assessment Tool for Emergency Medicine; URM, under-represented in medicine.

within our cohort, with notable findings based on student URM status. Our multivariate analysis did not find any gender-related differences in NCAT-EM domains. However, the multivariate analysis showed that students identified as URM received lower NCAT-EM scores in several domains, including the history/physical exam domain and the prioritized differential diagnosis domain. This effect was moderated (and in some cases reversed) by concordance of URM status between raters and students in some of the domains, such that concordance in URM status between students and raters was associated with higher NCAT-EM scores.

The association between student URM status with lower NCAT-EM scores is consistent with prior literature

documenting longstanding racial disparities in clerkship grading.21,29,30 Although prior studies have noted excellent internal consistency for the NCAT-EM,6 our findings suggest that scores remain affected by differential use of the tool by raters, raising concern about additional domains of validity evidence for the tool.1 It is important for us as medical educators to ensure that an assessment tool widely used to guide clerkship grading does not introduce construct-irrelevant variance.31,32

In this study, the observed score differences by URM status may be the result of implicit bias of raters affecting both their global perception and perception of competency-related behaviors of students, lack of mentorship leading to

Table 3. Univariate associations between ratings for National Clinical Assessment Tool for Emergency Medicine domains and student and rater characteristics.

Table 4. Multivariate associations between ratings for National Clinical Assessment Tool for Emergency Medicine domains and student sex and under-represented in medicine (URM) status, after adjusting for rater sex and URM status, concordance of student-rater URM status, clinical site, and time.

[Table 4 body not recoverable from the extracted text.]

OR, odds ratio; NCAT-EM, National Clinical Assessment Tool for Emergency Medicine; URM, under-represented in medicine.

Table 5. Multivariate associations between ratings for National Clinical Assessment Tool for Emergency Medicine domains and Black vs non-Black race, after adjusting for rater sex and race (Black vs non-Black), concordance of student-rater race, clinical site, and time.

[Table 5 body not recoverable from the extracted text.]

OR, odds ratio; NCAT-EM, National Clinical Assessment Tool for Emergency Medicine.

inequitable opportunities, or different lived experiences of URM students impacting their experience and performance in the clinical environment (including stereotype threat, microaggressions, patient mistreatment, being tasked with serving as a racial ambassador, unrewarded labor, limited resources, and othering).17,21,33–35 The complexity of this amalgam of factors exacerbating disparities in clerkship grading has been described as the “social milieu of medical education,”35 and may ultimately contribute to the inequities observed in other standardized assessments used in EM, such as the SLOE. Improved understanding of these disparities in the EM clerkship setting, and further evaluation of the validity evidence of the NCAT-EM tool, is critical to identifying solutions to mitigate these issues.

Perhaps more interestingly, some of the findings of a differential score based on URM status were mitigated (and in some cases, reversed) by URM concordance between the student and rater, specifically within the prioritization of a differential and the observation and monitoring domains. As a possible explanation, URM concordance may reduce implicit bias of the rater as well as other effects of racism on the medical student, such as stereotype threat. Concordance between the rater and student can also enhance the student’s performance through the role-model effect.33 This aligns with prior studies that have shown the importance of racial concordance in multiple domains, including patient care, professional development,17 and medical education assessment,36,37 further highlighting the critical nature of supporting equity initiatives to advance diversity across EM faculty and residents.

It is not clear why this phenomenon would be present for only two of the NCAT-EM domains, although it could represent something unique about those domains, including that they may capture more direct interaction between students and raters (such as prioritizing differentials) and, thus, concordance would be more heavily impacted. Regardless, this further suggests that additional robust validity studies of the NCAT-EM tool are needed. Additionally, while we noted various impacts of URM concordance on the NCAT-EM scores, the impact on the overall disparities identified in our study may be negligible, and further work is needed to examine this phenomenon across a larger sample of more diverse learners and raters.

Another interesting observation within our study was that URM students had fewer submitted NCAT-EM forms in the full cohort, which persisted after adjusting for minor site differences. The structure of the NCAT-EM within our institution requires learners to seek out designated feedback and collect assessments via a QR code. This difference could indicate differences in self-promotion behavior,38 which may uniquely disadvantage URM students. Specifically, there is a complex interplay between evaluations, biases, and the associated impact on confidence, self-esteem, and motivation. In URM students, negative evaluations, even if biased, may

reinforce stereotype threat (a fear of confirming negative stereotypes about one’s group)17 and ultimately hinder professional growth. Understanding this, as well as other unique barriers to seeking evaluation faced by URM students as observed in this study, should be further evaluated.

We also found no difference in NCAT-EM scores based on gender, with no significant difference in scores between men and women in the cohort. This absence of gender associations across the NCAT-EM performance domains was surprising and in contrast to prior work analyzing the NCAT-EM in medical students. Specifically, in a study by Hiller and colleagues, there were student gender-based differences in composite NCAT-EM scores at 4 of the 13 sites included in their study.6 However, that work was conducted with limited demographic data and an overrepresentation of male students, predominantly in their final year of medical school. It is possible that gender disparities across diverse assessment domains become apparent at later stages in training (as has been observed in residency assessments).39,40

As NCAT-EM scores inform clinical grades, and subsequently the SLOE and MSPE, it is critical to mitigate disparities in use of the NCAT-EM tool such as those found in our study. The NCAT-EM incorporates features of prior recommendations to reduce grading inequity, including workplace-based assessment, criterion-based rubrics, and competency-based grading. Our data show that rating disparities are still present despite the high internal consistency of the NCAT-EM. This suggests that the disparities stem not from the tool itself but from its real-world use by raters. Ultimately, the scoring differences found in our study support the use of rater training, which has been shown to improve the accuracy of workplace-based entrustment ratings of medical learners.41

Additionally, ongoing efforts to promote an equitable and diverse workforce are necessary, given the role of concordance in some of these disparities. Ultimately, achieving fairness and accuracy in NCAT-EM assessments is crucial to promoting gender and racial equity among EM applicants, especially with national implementation of the NCAT-EM tool. In addition, clerkships in other specialties should note that, despite the positive impacts of using a standardized national assessment instrument with strong reliability, such an instrument by itself is not the solution for overcoming observed differences that are not explained by student performance.

LIMITATIONS

Although we identified compelling findings, this study had several limitations. Our analytic approach involved multiple regressions, and we did not adjust for multiple comparisons due to the exploratory nature of the study, which increased the risk of type 1 error in our conclusions. However, the consistency of findings in our sensitivity analyses suggests a true trend. In addition, the study was limited to a single institution. Although it included six different sites within the institution (each with a unique culture and patient population), conducting a multicenter study across distinct geographical regions is an important next step to fully evaluate the effect of student and rater demographics on NCAT-EM performance nationally. Furthermore, it is not yet clear whether these findings among second- and third-year clerkship students can be generalized to more senior medical students on sub-internships or electives, and additional work evaluating the impact of URM status on advanced students is needed. Our study did not include individuals who identified as gender diverse, whom it would also be important to include in future research. Finally, while we noted some differences in NCAT-EM use by site and by rater level (residents’ NCAT-EM scores were higher than those of faculty), we were unable to assess resident PGY level or the impact of faculty development on use of the tool. This is an important area of future work.

CONCLUSION

While we found no association between student gender and NCAT-EM scores, we did find an association with under-represented in medicine (URM) status in two of six NCAT-EM performance domains, an effect mediated by concordance in URM status between student and rater. Future multi-institution research is needed to verify grading disparities based on student and rater characteristics at the national level, which would further support multifaceted interventions to mitigate rating disparities, including diversity efforts in recruitment practices, equitable access to medical school resources, gender- or URM-specific student support, and rater training, to ultimately promote equity among emergency physicians.

Address for Correspondence: Janae K. Heath, MD, MSCE, Hospital of the University of Pennsylvania, Department of Medicine, 3600 Spruce St, 822 West Gates Building, Philadelphia, PA 19104. Email: Janae.heath@pennmedicine.upenn.edu

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Amin et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

REFERENCES

1. Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med. 2006;119(2):166.e7-16.

2. Lawson L, Jung J, Franzen D, et al. Clinical assessment of medical students in emergency medicine clerkships: a survey of current practice. J Emerg Med. 2016;51(6):705-11.

3. Negaard M, Assimacopoulos E, Harland K, et al. Emergency medicine residency selection criteria: an update and comparison. AEM Educ Train. 2018;2(2):146-153.

4. Katzung KG, Ankel F, Clark M, et al. What do program directors look for in an applicant? J Emerg Med. 2019;56(5):e95-101.

5. Jung J, Franzen D, Lawson L, et al. The National Clinical Assessment Tool for Medical Students in the emergency department (NCAT-EM). West J Emerg Med. 2018;19(1):66-74.

6. Hiller K, Jung J, Lawson L, et al. Multi-institutional implementation of the national clinical assessment tool in emergency medicine: data from the first year of use. AEM Educ Train. 2021;5(2):e10496.

7. Gauer JL, Mustapha T, Violato C. Race and gender bias in clerkship grading. Teach Learn Med. 2024;36(3):304-11.

8. Nguemeni Tiako MJ, Ray V, South EC. Medical schools as racialized organizations: how race-neutral structures sustain racial inequality in medical education—a narrative review. J Gen Intern Med. 2022;37(9):2259-66.

9. Winfield A, Amin DP. To the Editor: Where Is equity in the SLOE? J Grad Med Educ. 2022;14(3):357.

10. Calles I. To the Editor: For equity in assessment: a comment on bias in the emergency medicine Standardized Letter of Evaluation. J Grad Med Educ. 2023;15(1):129.

11. Kukulski P, Ahn J. Validity evidence for the emergency medicine Standardized Letter of Evaluation. J Grad Med Educ. 2021;13(4):490-9.

12. Colson ER, Pérez M, Blaylock L, et al. Washington University School of Medicine in St. Louis Case Study: A Process for Understanding and Addressing Bias in Clerkship Grading. Acad Med. 2020 Dec;95(12S Addressing Harmful Bias and Eliminating Discrimination in Health Professions Learning Environments):S131-5.

13. Cheng D, Promes S, Clem K, et al. Chairperson and faculty gender in academic emergency medicine departments. Acad Emerg Med. 2006;13(8):904-6.

14. Madsen TE, Linden JA, Rounds K, et al. Current status of gender and racial/ethnic disparities among academic emergency medicine physicians. Acad Emerg Med. 2017;24(10):1182-92.

15. Wiler JL, Wendel SK, Rounds K, et al. Salary disparities based on gender in academic emergency medicine leadership. Acad Emerg Med. 2022;29(3):286-93.

16. Jena AB, Khullar D, Ho O, et al. Sex differences in academic rank in US medical schools in 2014. JAMA. 2015;314(11):1149-58.

17. Bullock JL, Lockspeiser T, Del Pino-Jones A, et al. They don’t see a lot of people my color: a mixed methods study of racial/ethnic stereotype threat among medical students on core clerkships. Acad Med. 2020;95(11S Association of American Medical Colleges Learn Serve Lead: Proceedings of the 59th Annual Research in Medical Education Presentations):S58-S66.

18. Ackerman-Barger K, Boatright D, Gonzalez-Colaso R, et al. Seeking inclusion excellence: understanding racial microaggressions as experienced by underrepresented medical and nursing students. Acad Med. 2020;95(5):758-63.

19. Ross DA, Boatright D, Nunez-Smith M, et al. Differences in words used to describe racial and gender groups in Medical Student Performance Evaluations. PLoS ONE. 2017;12(8):e0181659.

20. Axelson RD, Solow CM, Ferguson KJ, et al. Assessing implicit gender bias in Medical Student Performance Evaluations. Eval Health Prof. 2010;33(3):365-85.

21. Hanson JL, Pérez M, Mason HRC, et al. Racial/ethnic disparities in clerkship grading: perspectives of students and teachers. Acad Med. 2022;97(11S):S35-S45.

22. O’Sullivan L, Kagabo W, Prasad N, et al. Racial and ethnic bias in medical school clinical grading: a review. J Surg Educ. 2023;80(6):806-16.

23. Lucey CR, Hauer KE, Boatright D, et al. Medical education’s wicked problem: achieving equity in assessment for medical learners. Acad Med. 2020;95(12S Addressing Harmful Bias and Eliminating Discrimination in Health Professions Learning Environments):S98-S108.

24. Kakara Anderson HL, Govaerts M, et al. Clarifying and expanding equity in assessment by considering three orientations: Fairness, inclusion and justice. Med Educ. 2025;59(5):494-502.

25. Takeshita J, Wang S, Loren AW, et al. Association of racial/ethnic and gender concordance between patients and physicians with patient experience ratings. JAMA Netw Open. 2020;3(11):e2024583.

26. McOwen KS, Bellini LM, Guerra CE, et al. Evaluation of clinical faculty: gender and minority implications. Acad Med. 2007;82(10):S94.

27. Nguemeni Tiako MJ, Wages JE 3rd, Perry SP. Black medical students’ sense of belonging and confidence in scholastic abilities at historically Black vs predominantly White medical schools: a prospective study. J Gen Intern Med. 2023 Jan;38(1):122-4.

28. Ghanem N, Goldberg DG, Granger E, et al. A critical qualitative study to understand current Black women medical student perspectives on anti-racist reform in US medical education. Med Educ Online. 2024;29(1):2393436.

29. Boatright D, Anderson N, Kim JG, et al. Racial and ethnic differences in internal medicine residency assessments. JAMA Netw Open. 2022;5(12):e2247649.

30. Low D, Pollack SW, Liao ZC, et al. Racial/ethnic disparities in clinical grading in medical school. Teach Learn Med. 2019;31(5):487-96.

31. Downing SM. Threats to the validity of locally developed multiple-choice tests in medical education: construct-irrelevant variance and construct underrepresentation. Adv Health Sci Educ Theory Pract. 2002;7(3):235-41.

32. Tavakol M, Dennick R. The foundations of measurement and assessment in medical education. Med Teach. 2017;39(10):1010-5.

33. Wheeler M, de Bourmont S, Paul-Emile K, et al. Physician and trainee experiences with patient bias. JAMA Intern Med. 2019;179(12):1678-85.

34. Colson ER, Pérez M, Chibueze S, et al. Understanding and addressing bias in grading: progress at Washington University School of Medicine. Acad Med. 2023;98(8S):S64.

35. Osseo-Asare A, Balasuriya L, Huot SJ, et al. Minority resident physicians’ views on the role of race/ethnicity in their training experiences in the workplace. JAMA Netw Open. 2018;1(5):e182723.

36. Heath JK, Dine CJ, LaMarra D, et al. The impact of trainee and standardized patient race and gender on internal medicine resident communication assessment scores. J Grad Med Educ. 2021;13(5):643-9.

37. Berg K, Blatt B, Lopreiato J, et al. Standardized patient assessment of medical student empathy: ethnicity and gender effects in a multi-institutional study. Acad Med. 2015;90(1):105-11.

38. Pololi L, Conrad P, Knight S, et al. A Study of the relational aspects of the culture of academic medicine. Acad Med. 2009;84(1):106.

39. Dayal A, O’Connor DM, Qadri U, et al. Comparison of male vs female resident milestone evaluations by faculty during emergency medicine residency training. JAMA Intern Med. 2017;177(5):651-7.

40. Santen SA, Yamazaki K, Holmboe ES, et al. Comparison of male and female resident milestone assessments during emergency medicine residency training: a national study. Acad Med. 2020;95(2):263.

41. Kogan JR, Dine CJ, Conforti LN, et al. Can rater training improve the quality and accuracy of workplace-based assessment narrative comments and entrustment ratings? A randomized controlled trial. Acad Med. 2023;98(2):237-47.

Program Director Perspectives on the Impact of the Proposed 48-Month Emergency Medicine Residency Requirement: A National Survey

Richard Austin, MD*

Chinmay Patel, DO†

Kristin Delfino, PhD‡

Sharon Kim, PhD*

*Southern Illinois University, School of Medicine, Department of Emergency Medicine, Springfield, Illinois

†Baylor Scott & White All Saints Medical Center, Department of Emergency Medicine, Fort Worth, Texas

‡Southern Illinois University, School of Medicine, Department of Surgery, Springfield, Illinois

Section Editor: Kendra Parekh, MD, MHPE

Submission history: Submitted June 2, 2025; Revision received October 15, 2025; Accepted October 15, 2025

Electronically published November 26, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem DOI: 10.5811/westjem.48359

Introduction: In early 2025, the Accreditation Council for Graduate Medical Education (ACGME) announced proposed revisions to emergency medicine (EM) residency training to include substantial changes to the length of training programs, required rotations, and structured experiences. To date, no published national survey has sought to determine how these changes would impact individual programs.

Methods: Over a three-week period in April 2025, we anonymously surveyed program directors or their designees online through the Council of Residency Directors in Emergency Medicine listserv. Survey respondents were asked about the impact the changes would have on their programs and their overall opinions of the proposed 48-month minimum requirement.

Results: A total of 86 program directors responded to the survey (response rate of 29.9%) with representative samples from current three-year (83.7%, 72/86) and four-year (16.3%, 14/86) programs. Most program directors reported that they would have to make significant revisions in either structured experiences, required rotations, or both. Most survey respondents from three-year programs (52/72) do not support the proposed changes, whereas all respondents from four-year programs (14/14) do support the changes (P<.001).

Conclusion: Proposed program requirements may require modifications in both three- and four-year programs; 33 of the 86 program directors surveyed reported that they would need more than one year to meet the requirements, if adopted. This raises the concern that programs may not be prepared to implement the revisions within the proposed timeline, potentially impacting resident education and the future EM workforce. The ACGME should consider a staged rollout of requirements to allow them to be thoughtfully implemented in a meaningful way. [West J Emerg Med. 2025;26(6):1504–1509.]

INTRODUCTION

On February 12, 2025, the Accreditation Council for Graduate Medical Education (ACGME) proposed significant revisions to the program requirements for emergency medicine (EM) residency training in the United States, with the most notable change being the standardization of training length to 48 months for all programs, effective July 1, 2027.1 This proposed change has generated considerable discussion and debate within the EM community, with concerns raised about its potential impact on resident education, program finances, and the EM workforce. Currently, most programs are three years in length, with four-year programs comprising less than 25% of EM residency programs in the US.2

Approximately 60% of EM program directors (PDs) (173/289) from the ACGME database completed a survey created by the Program Requirements Writing Group,3 which found that summed averages for necessary experiences were 41.6 months for three-year programs and 50.7 months for four-year programs. This survey has subsequently been used as justification for the proposed new program requirements, including the 48-month minimum program length. However, the survey did not specifically ask about support for a change from three to four years of training, and it was not designed to examine the impact of any potential changes. The ACGME's rationale for this change includes concerns about declining board pass rates, potentially attributed to shorter EM shifts and fewer patient encounters during training.4 Yet the available published data show that graduates of three- and four-year programs perform similarly in clinical practice and on board pass rates.5,6

To further explore the perceived challenges and opportunities associated with this change, we surveyed EM PDs on the changes that would be required within their programs and anticipated challenges with the new requirements, and we gauged their support for the proposed requirement of 48 months of training for all EM programs.

METHODS

We conducted a national cross-sectional survey of EM PDs, or their selected designees (defined as a faculty member delegated by the PD or other residency leadership), from ACGME-accredited EM residency programs in the US over a three-week period in April 2025. After we developed the survey instrument, it was piloted for content validity, clarity, and relevance by four members of our educational leadership teams who have experience in program leadership and survey-based research. All feedback was incorporated into the survey, which was approved by our institutional review board as an exempt study. The survey was designed in SurveyMonkey (Momentive Inc, San Mateo, CA) and disseminated to EM PDs through the Council of Residency Directors in Emergency Medicine (CORD) program director listserv. Reminders were sent at one-week intervals, for a total of three reminders. At the time of the study there were 288 PDs in ACGME-accredited EM programs.

The survey (Appendix A) consisted of 11 questions and was divided into three sections: demographic information; curricular changes; and reflection. In the section on proposed curricular changes, participants were asked to assume that the program requirements had been adopted and to answer questions on anticipated changes to their program's required rotations (62 weeks at primary emergency department [ED], low-resource ED, high-resource ED, low-acuity area, critical care, pediatric intensive care unit, pediatric ED, administration/quality assurance, toxicology/addiction medicine, and emergency medical services). They were then asked about anticipated changes that would be necessary to meet the required structured experiences (non-laboratory diagnostics such as ultrasound, telemedicine, primary assessment and decision-making, airway management, ophthalmologic procedures, acute psychiatric emergencies, sensitive exams, transitions of care, and observation medicine). The final section of the survey included questions on the time needed to adopt the 48-month format, additional resources required (additional funding aside from salary, additional training sites, additional core faculty, additional clinical faculty, more protected time, additional simulation or procedure lab time), and agreement on the proposed changes. We summarized categorical survey responses with frequencies and percentages. Chi-square tests were used to evaluate associations, with P-values < .05 considered statistically significant. We performed the analysis using SAS v9.4 (SAS Institute Inc, Cary, NC).

Population Health Research Capsule

What do we already know about this issue?
The ACGME has proposed major changes to emergency medicine (EM) training.

What was the research question?
How do program directors view the proposed ACGME changes, and what resources are needed to comply?

What was the major finding of the study?
33.6% of 3-year and 100% of 4-year programs support the change to a 48-month minimum residency training in emergency medicine (P < .001).

How does this improve population health?
The study identifies changes that EM programs would need to implement to meet the new standards, helping ensure the future workforce is well prepared to deliver quality care.
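The chi-square comparison described in the Methods can be illustrated with the support counts reported in the Results (14/14 four-year PDs supporting the change; 15 three-year PDs supporting and 52 opposing). The study itself used SAS v9.4; the following pure-Python sketch merely reproduces the Pearson chi-square arithmetic for a 2×2 table and is not the authors' code.

```python
# Illustrative re-computation of the 2x2 chi-square test of PD support for the
# 48-month requirement (four-year vs three-year programs), using counts
# reported in the Results. The study used SAS; this is only a sketch.

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Rows: program length; columns: [support, oppose]
# Four-year programs: 14 support, 0 oppose; three-year: 15 support, 52 oppose.
stat = chi2_2x2(14, 0, 15, 52)
print(f"chi-square = {stat:.1f}")  # ~30.3, far above the df = 1 critical
# value of 10.83 for P = .001, consistent with the reported P < .001
```

With Yates continuity correction (as some packages apply by default for 2×2 tables) the statistic is smaller but the conclusion P < .001 is unchanged.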

RESULTS

A total of 92 respondents completed the survey. However, six were excluded because they did not identify as either a PD or a designee, and their data were not included in the analysis. In total, 86 EM programs were included in the analysis of the survey data, for a final response rate of 29.9%. Of the 86 PDs who completed the survey, 72 (83.7%) were from three-year programs and 14 (16.3%) from four-year programs, which is similar to the breakdown of three- and four-year programs currently listed on the Emergency Medicine Residents' Association Match website.7 Most programs were university-based (33, 38.4%), followed by community-based university-affiliated (31, 36%), and community-based (22, 25.6%). Participant programs were geographically representative (Table) of the EM academic community based on Fellowship and Residency Electronic Interactive Database Access geographic regions.7

Curriculum Changes

Of the 86 PDs who responded to the survey, 50 (58%) anticipated needing to make three or more changes to their curricula to meet the required nine structured experiences (Table). This did not differ significantly between three- and four-year programs. Of the 86 respondents, 14 (16%) reported already having all the required rotations in the proposed requirements, and 34 (40%) reported that they would need to add three or more rotations. Forty-four respondents (51.2%) indicated that they would be likely to increase their complement of residents, whereas 12 (14%) indicated that they would likely decrease the number of residents. Twenty-five respondents (29.1%) indicated that they were unlikely to change their complement, and five respondents (5.8%) did not answer the question. Most of the programs that would look to expand their complement were currently three-year programs (42/44, 95.5%).

Time Needed

Forty-nine (57%) PDs indicated readiness for all the changes within a one-year period, while 33 (38.4%) reported they would require more than one year to prepare: 20 (23%) would require two years, and 13 (15%) would require more than three years.

Table. Survey results comparing three- and four-year programs by demographic region, program type, time required to implement changes, additional resources required to implement changes, agreement on 48 months of training, total changes needed in experiences, and total changes needed in rotations.

[Table values not recoverable from the extracted text. Row categories: geographic region as listed in FRIEDA (East North Central: IL, IN, MI, OH, WI; East South Central: AL, KY, MS, TN; Middle Atlantic: NJ, NY, …; Mountain: AZ, CO, ID, MT, NM, NV, UT, WY; New England: CT, MA, ME, NH, RI, VT; Pacific: AK, CA, HI, OR, WA; South Atlantic: DC, DE, FL, GA, MD, NC, SC, VA, WV; West North Central: IA, KS, MN, MO, ND, NE, SD; West South Central: AR, LA, OK, TX); program type; time needed to create the new rotations and experiences given current resources; additional resources required to meet the new requirements (*not mutually exclusive); agreement with the change to require 48 months of training for all EM programs; and total changes needed in experiences and in required rotations. EM, emergency medicine; FRIEDA, Fellowship and Residency Electronic Interactive Database Access.]

Overall Support

Of the PDs of four-year programs, 100% (14/14) supported the change to a minimum 48 months of residency training. However, only 21% (15/72) of three-year PDs supported the change (P<.001).

DISCUSSION

Structured Experiences

Our survey results indicate that the new program requirements would create a substantial need for curricular revision, with impacts that differ across programs. For the new “experiences” requirement, only one PD surveyed reported already having all components in place. By contrast, 36% of PDs (31/86) reported that they would probably require one to two changes, and 26% (22/86) would have to make ≥ 5 curricular changes to meet the “experiences” requirements. Curricular revision, including time to pilot, revise, and assess the curricula, is a time-intensive process that can take over a year.

Required Rotations

The required rotations also pose challenges for programs. While 16% (14/86) reported already having all required rotations, 40% (34/86) of the PDs surveyed reported that they would require ≥ 3 revisions to their rotations. When these new rotations require a new training site, such as adding a low-resource ED, it takes a considerable amount of time to research sites and reach agreements. These external sites can also impact the funding of programs through the Centers for Medicare and Medicaid Services.8 Additionally, new rotations may require the addition of new faculty, more faculty development, and institutional agreements that cost money and take time.

Time Needed

While 57% of the PDs surveyed (49/86) reported readiness for changes in one year, 38% (33/86) reported they would need more than one year to prepare for the new requirements. This affected both three-year (28/72, 39%) and four-year (5/14, 36%) programs. More concerning is that 15% (13/86) of the PDs surveyed anticipated needing more than three years to prepare for the new program requirements. If those programs were truly unable to prepare in the time frame proposed by the Residency Review Committee (RRC) and decided to close their programs, this could have a major impact on the number of trainees in EM. Additionally, should the RRC-EM grant exceptions or extensions to some programs transitioning to a 48-month format, it could create a competitive advantage to those programs in resident recruitment.

Overall Support

While support for the change to 48 months of training was universal among PDs of four-year programs, there was considerable disagreement among PDs of three-year programs, with only 15 (22%) supporting the change and 52 (78%) opposing. Our findings of support for a 48-month training requirement mirror a previous study that showed a strong correlation between the current length of a program and its PD's support for that format.9

There has been robust discussion regarding the proposed changes since they were presented. Emergency medicine is not alone in considering lengthening residency training time. Family medicine has also discussed a transition to a 48-month training program, which would entail more study and a gradual transition rather than a sudden turnaround.10 The majority of EM program directors surveyed indicated that they would be likely to increase their complements of residents, which could significantly impact the future workforce in EM. Further study is needed to determine how these complement changes may affect the total number of residency spots available in EM each year. More study is needed to fully understand the impacts these changes would have on EM training programs, as well as their impact on the costs involved, especially when considering the lack of evidence to support the extension of training.

LIMITATIONS

The results of this survey do not include all EM residency programs in the United States, as not all programs are part of CORD, and not all members participate in the listserv we used to disseminate the survey. Additionally, only 29.9% of programs responded to the survey, creating a significant risk of non-responder bias; however, the respondents were representative of programs both geographically and in terms of length of training, which supports the generalizability of the data to the broader EM academic community. The narrow three-week window to respond may also have limited the total number of responses. Finally, we did not collect data on whether the respondent was the program director or a designee, only that they attested to being one or the other.

CONCLUSION

Most of the program directors who responded to a survey on the proposed new minimum of 48 months of training in emergency medicine were opposed to the change, and a significant minority reported being unprepared to implement the new requirements within one year as proposed by the RRC-EM. If the ACGME adopts the proposed program requirements in total, multiple years may be required for programs to create new and effective curricula and rotations. More study is needed on the impact of the proposed changes, focusing on the outcomes of graduates. Previous studies have already shown that graduates of three- and four-year programs perform similarly on the American Board of Emergency Medicine certifying exam and in clinical practice.5,6 The ACGME should consider a phased rollout of new requirements to ensure programs have time to adhere to the new requirements thoughtfully and meaningfully, in a way that is beneficial to their trainees.

Address for Correspondence: Richard Austin, MD, Southern Illinois University, School of Medicine, Department of Emergency Medicine, 701 North First Street, Springfield, IL 62781. Email: raustin@siumed.edu.

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Austin et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

REFERENCES

1. Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Emergency Medicine. 2025. Available at: https://www.acgme.org/globalassets/pfassets/reviewandcomment/2025/110_emergencymedicine_rc_02122025.pdf. Accessed March 2, 2025.

2. Nelson LS, Calderon Y, Ankel FK, et al. American Board of Emergency Medicine report on residency and fellowship training information (2021‐2022). Ann Emerg Med. 2022;80(1):74‐83.

3. Regan L, McGee D, Davis F, et al. Building the future curriculum for emergency medicine residency training. J Grad Med Educ. 2025;17(2):248-53.

4. Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Emergency Medicine Summary and Impact of Major Requirement Revisions. 2025. Available at: https://www.acgme.org/globalassets/pfassets/reviewandcomment/2025/110_emergencymedicine_impact_02122025.pdf. Accessed March 2, 2025.

5. Beeson MS, Barton MA, Reisdorff EJ, et al. Comparison of performance data between emergency medicine 1-3 and 1-4 program formats. J Am Coll Emerg Physicians Open. 2023;4(3):e12991.

6. Nikolla DA, Zocchi MS, Pines JM, et al. Four- and three-year emergency medicine residency graduates perform similarly in their first year of practice compared to experienced physicians. Am J Emerg Med. 2023;69:100-7.

7. Emergency Medicine Residents’ Association. EMRA Match. 2025. Available at: https://www.match.emra.org. Accessed March 2, 2025.

8. Association of American Medical Colleges. Medicare Payments for Graduate Medical Education: What Every Medical Student, Resident, and Advisor Needs to Know. 2025. Available at: https://www.aamc.org/media/71701/download?attachment. Accessed September 10, 2025.

9. Hopson L, Regan L, Gisondi MA, et al. Program director opinion on the ideal length of residency training in emergency medicine. Acad Emerg Med. 2016;23(7):823-7.

10. Green LA, Miller WL, Frey JJ 3rd, et al. The time is now: a plan to redesign family medicine residency education. Fam Med. 2022 Jan;54(1):7-15.

EDUCATION SPECIAL ISSUE - BRIEF RESEARCH REPORT

Christine Motzkus, MD, PhD*
Casey Frey, MD†

Aloysius Humbert, MD*

*Indiana University School of Medicine, Department of Emergency Medicine, Indianapolis, Indiana
†Boone County Emergency Medicine, Indianapolis, Indiana

Section Editors: Jules Jung, MD and Andrew Golden, MD

Submission history: Submitted June 2, 2024; Revision received November 22, 2024; Accepted November 26, 2024

Electronically published February 5, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem DOI: 10.5811/westjem.21292

Introduction: Incorporatingvirtualinterviewsintoresidencyrecruitmentmayhelpdiversifyaccessto residencyprogramswhilereducingthecostinvolvedwithtravelandlodging.Programsmaybemore likelytorankstudentstheyhavemetinpersonataninterviewwhencomparedtounknownvirtual applicants.Ourobjectivewastocharacterizehomeinstitution,in-state,andin-regionmatchratesto emergencymedicine(EM)residencyprogramsforfourth-yearmedicalstudents.

Methods: WeusedNationalResidencyMatchingProgramdataavailabletotheprogramdirectorto identifymedicalschoolandmatchlocationoffourth-yearmedicalstudentswhointerviewedatalargeEM residencyprogramintheMidwestfrom2018–2023.Students’ medicalschoolsandultimatelymatched programsweremappedtoElectronicResidencyApplicationServicegeographicregions;subgroup analysesevaluatedallopathicandosteopathicmedicalstudentsseparately.Weusedchi-squareteststo compareproportionsofstudentsmatchingtohome,in-state,orin-regionprogramsacrossyears.

Results: Therewere1,401applicantswithmatchinformationavailable.Thepercentageofstudents matchingtoahomeinstitutionremainedstableoverthecourseofthestudy.Thepercentageofstudents matchingtoanin-stateinstitutionincreasedoverthe firsttwoyearsofvirtualinterviewsrisingfrom23.2% inthe2020matchto30.8%in-statematchesforthe2022match.Chi-squaretestsdidnotrevealany significantdifferencesamonggroupsforallapplicants.Allopathicmedicalstudentsdemonstrateda significantincreaseinmatchestohomeinstitutions.In-regionmatchesstayedrelativelystableoverthe studytimeframeregardlessofsubgroup.

Conclusion: Virtual interviews changed the landscape of residency interviews. Home institution and in-state matches may be more likely for applicants from allopathic schools who participated in a virtual interview, as both programs and applicants are more familiar with each other; however, our study did not find convincing evidence of this possibility among all applicants. Additional study is needed to determine ongoing effects of the transition to virtual interviews. [West J Emerg Med. 2025;26(2)285–289.]

INTRODUCTION

Interviews are a critical element of the residency match process for both residency programs and medical students to ensure selection of high-quality applicants and training programs. Until the COVID-19 pandemic struck in early 2020, nearly all interviews were conducted in person, requiring medical students to arrange travel to different program locations, a process known to be expensive and time-consuming.1 With travel restrictions and social distancing concerns, the 2021 Match cycle marked the first use of virtual interviews for emergency medicine (EM) residency spots.

The transition to virtual interviews was marked with uncertainty from both students and programs. Students were uncertain as to how they would be able to assess programs, while programs felt similarly about the ability to assess students, particularly those who had not completed a rotation at their program. Program directors have also been noted to report difficulty assessing the fit of applicants despite the increased convenience of virtual interviews.2 However, virtual interviews offer increased opportunities for students to complete additional interviews at lower cost, which has been noted in surgical specialties with a transition to virtual interviews.3 Program directors also expressed concerns that programs would match more students from their home programs, reducing opportunities for programs to benefit from students with non-homogenous medical student training.2 For fellowship applicants, similar concerns have been expressed; however, there was not found to be a significant increase in interviews completed by pediatric EM fellowship applicants or a change in fellowship applicants matching within their preferred state.4

We evaluated whether the transition to virtual interviews at one large, Midwestern EM program correlated with increased numbers of students matching to their home programs. Additionally, we evaluated whether the transition to virtual interviews correlated with increased numbers of students matching to in-state or in-region programs.

METHODS

Study Population

We obtained data from the National Resident Matching Program (NRMP) for ranked medical students from one Midwestern EM residency program for the years 2018–2023.

Data Collection and Analysis

All medical students who interviewed at one Midwestern university from 2018–2023 had their home and matched programs recorded as part of routine NRMP record keeping. All data were stored on a secure server. The data were deidentified by the program director and coded to determine whether the interviewee matched with a program from any of the following: 1) the same institution as their medical school; 2) the same state as their medical school; and 3) the same region as their medical school. Regions were defined according to Electronic Residency Application Service (ERAS) geographic preference regions; these regions were designated beginning in 2022. Interviewees were able to signal a geographic preference according to these regions. Areas of disagreement regarding program affiliation were discussed between authors and resolved. Author AH performed the initial coding and, after review by author CM, any discrepancies between affiliations were resolved using resources including the Accreditation Council for Graduate Medical Education and program websites to verify affiliations. We used chi-square tests to assess differences between groups.5 We conducted subgroup analyses to evaluate differences between applicants from allopathic (MD) and osteopathic (DO) schools.
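The chi-square comparison described above can be illustrated with `scipy.stats.chi2_contingency`. The year-by-match-location counts below are invented for demonstration only; they are not the study's NRMP data.

```python
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows are interview years,
# columns are [in-state matches, out-of-state matches].
counts = [
    [52, 172],  # e.g., a 2020-style cohort (~23% in-state)
    [63, 141],  # e.g., a 2021-style cohort
    [68, 153],  # e.g., a 2022-style cohort (~31% in-state)
]

# Test whether the proportion matching in-state differs across years
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```

A non-significant p value here, as in the study's all-applicant analysis, would indicate that the observed year-to-year shifts in in-state match proportions are compatible with chance.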

Outcome

The primary outcome of this study was the percentage of students who matched to programs within their home institution, state, or region.

Ethics Statement

This study was reviewed and approved by the institutional review board. No funding was obtained for this study.

RESULTS

Over the six interview cycles included in the study period, 1,401 students contributed data to the NRMP and were subsequently coded as having matched at their home program or to programs within the same state or region. There was an increase in the number of interviews completed by the program over the six-year period, with an average of 201 interviews completed in an in-person format prior to and during the 2020 pre-pandemic interview season. After the global COVID-19 pandemic began, there was an initial increase in the number of interviews offered in the 2021 recruitment season as the format switched to virtual. Virtual interviews continued throughout the 2022 and 2023 interview seasons, but overall numbers of interviews decreased during this time frame (Table 1).

An increasing percentage of students matched to their home institution from 2020–2023, with the largest increase observed over the 2020–2021 season corresponding with the transition to virtual interviews; however, this trend was not statistically significant. Notably, proportions of students matching to home institutions were similar in 2018 and 2023. An increasing number of students matched to in-state institutions from 2020 to 2021; further increases in the percentage of in-state matches were observed from 2021 to 2022 before stabilizing at approximately 30% in-state matches in the final included year, close to 2018 levels. In-region matches remained roughly stable across the study period, with slightly less than half of students matching to an institution in their home ERAS geographic region (Table 1). Chi-square tests did not reveal any significant differences between groups.

When evaluating the subgroup of applicants from allopathic schools, an overall increased proportion of these applicants matched to their home institutions over the course of the six years of the study (P < 0.01). This increase was most notable in 2023, when 31.8% of these applicants matched to their home institutions, nearly double that of any prior year. There was also an increase in MD applicants matching to institutions within the same state as their medical school over the study period (P = 0.01). Regional institution matches for allopathic applicants remained stable over the study period. Osteopathic applicants did show an increase in the proportion matching to in-state or in-region institutions; however, these trends were not statistically significant (Table 2).

DISCUSSION

We found no statistically significant difference in match location among all applicants applying to one Midwestern EM residency program after the implementation of virtual interviews. Similar numbers of applicants matched to the same ERAS region as their medical school regardless of in-person or virtual interview format. Applicants from allopathic schools did show an increased proportion matching at their home or in-state institutions after the implementation of virtual interviews, and this finding was statistically significant. An increasing number of osteopathic applicants matched to in-state institutions after the implementation of virtual interviews; this trend did not reach statistical significance but did approach it. Virtual interviews reduce cost to applicants and may allow applicants to complete interviews at additional programs. Correspondingly, the number of interviews conducted by the program increased in the first year of virtual interviews prior to stabilizing at a somewhat higher number than in the previous time frame with in-person interviews. Increased numbers of interviews offered meant increased time demands on faculty participating in those interviews and may have contributed to interview fatigue. Notably, one obstetrics/gynecology program did not find an increase in numbers of interviews offered to or completed by applicants.6 Conversely, the ability of applicants to complete more interviews may reduce financial disparities among students, as some students may have previously limited interviews due to cost concerns. An Association of American Medical Colleges survey showed that previous monetary costs for residency interviews ranged from $1,000 to $11,580 (median $4,000).7 Using a virtual process may also benefit financially challenged students by eliminating the cost of flights, hotels, and other travel expenses previously necessary to complete the interview season. The transition to virtual interviews may have downstream effects on the diversity of the EM workforce if applicants are less likely to match outside their home or in-state programs.8

Table 2. Allopathic and osteopathic applicant match location by year.

Higher percentages of allopathic students matching to home and in-state programs may indicate that programs and applicants alike preferentially rank each other due to familiarity, although given the uncertainties of the COVID-19 pandemic and restrictions on away rotations from 2021 onward, it is difficult to attribute this increase to one factor. It is well known that most students have a strong geographic preference to match near their home and that location is a significant driver of residency program choice.9

MD, Doctor of Medicine; DO, Doctor of Osteopathic Medicine.

Table 1. Applicant match location by year.

This trend has also been seen in orthopedic surgery programs with their transition to a virtual interview process10; however, this did not hold true for neurology and general surgery programs.11,12 Students’ geographic preferences in EM seem to have been amplified by the transition to virtual interviews, particularly among allopathic applicants. While virtual interviews are not the only change that occurred in the resident recruitment process during the 2021 and subsequent interview seasons, it is plausible that interview format is one of many factors influencing student interview behavior, although we did not find evidence of this behavior among all applicants in our study.

It was not possible to determine what effect other factors, including travel restrictions, societal unrest, and other changes, had on applicant behavior and their process of selecting application locations, interviews, and ultimately match location. Further, it is difficult to understand what effect the advent of program signaling had on both interviewee and interviewer behavior after its introduction in 2022, and this remains an active area of study. The stability of the in-region match rates is difficult to interpret but suggests that similar numbers of students are looking to leave their medical school region over time. The ERAS regions were also defined during this time frame, which may have altered students’ perceptions of region. These geographic preferences are an area for ongoing study as programs evaluate residency matches to serve their communities and ensure mutually beneficial matches between programs and applicants.

LIMITATIONS

This study has multiple limitations. First, only one large, Midwestern EM residency program is represented. Second, multiple other factors, including the numerous social and societal changes that took place during the COVID-19 pandemic as well as the introduction of preference signaling, certainly impacted applicants’ match preferences and interview behaviors in addition to the transition to a virtual interview model. We were unable to control for these factors or for other changes to applicant behavior, such as the potential desire to remain closer to home when travel was more constrained during the global pandemic or as a result of ongoing societal unrest. Of note, overall applicant behavior also changed across match years, with a decrease in applications beginning in 2022 and increased proportions of osteopathic and international medical graduates.13 Additionally, EM applicants continue to be advised to complete no more than one away rotation per interview cycle, which limits program and applicant exposure to each other. Further, while ERAS regions were used, this does not account for applicants who may have matched just across the border in another region, creating a false inflation of geographic distance.

CONCLUSION

Virtual interviews are now a fixture of the residency application process, with EM programs requiring this format to participate in the match.14 We did not find statistically significant differences in home institution or in-state match rates for all applicants; however, allopathic applicants did have an increase in the proportion of students matching to their home institution. While our data do not suggest an overall impact of virtual interviews on match decisions made by applicants or programs, these trends warrant additional monitoring for ongoing impact, particularly among allopathic applicants, where the increase in home and in-state matches was statistically significant. Further, larger studies would be helpful to understand how transitioning to this model affects applicant match behavior. Additional studies would be beneficial to help programs further understand key areas of focus and ensure successful interview planning for EM programs.

Address for Correspondence: Christine A. Motzkus, MD, PhD, Indiana University School of Medicine, Department of Emergency Medicine, 2651 E Discovery Pkwy., Room C3018, Bloomington, IN 47408. Email: cmotzkus@iuhealth.org

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Motzkus et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

REFERENCES

1. Joshi A, Bloom D, Spencer A, et al. Video interviewing: a review and recommendations for implementation in the era of COVID-19 and beyond. Acad Radiol. 2020;27(9):1316–22.

2. Ponterio JM, Levy L, Lakhi NA. Evaluation of the virtual interview format for resident recruitment as a result of COVID-19 restrictions: residency program directors’ perspectives. Acad Med. 2022;97(9):1360–7.

3. Newsome K, Selvakumar S, McKenny M, et al. Shifting the surgical residency match to a 100% virtual interview format during the COVID-19 pandemic: how has it affected placement into surgical training programs? Am Surg. 2023;89(4):935–41.

4. Baghdassarian A, Bailey JA, Cagler D, et al. Virtual interviews and the pediatric emergency medicine match geography: a national survey. West J Emerg Med. 2024;25(2):186–90.

5. Preacher KJ. Calculation for the chi-square test: an interactive calculation tool for chi-square tests of goodness of fit and independence [computer software]. 2001. Available at: http://quantpsy.org. Accessed January 20, 2024.

6. Santos-Parker KS, Hammoud MM, Winkel AF, et al. Distributions of residency interviews with the implementation of virtual interviews and standardization of interview offer dates. J Surg Educ. 2022;79(5):1105–12.

7. Association of American Medical Colleges. The cost of interviewing for residency. 2024. Available at: https://students-residents.aamc.org/financial-aid-resources/cost-interviewing-residency. Accessed March 30, 2024.

8. Capers Q, Johnson A, Berlacher K, et al. The urgent and ongoing need for diversity, equity, and inclusion in the cardiology workforce in the United States. J Am Heart Assoc. 2021;10(6):e018893.

9. Hasnie UA, Hasnie AA, Preda-Naumescu A, et al. Exploring match space: how medical school and specialty characteristics affect residency match geography in the United States. Acad Med. 2022;97(9):1368–73.

10. Nestler AJ, Feibel BM, Beason AM, et al. The student you know: orthopedic surgery home program match rates and geographic relationships before and after COVID-19. J Surg Educ. 2022;80(3):476–82.

11. Beinhoff P, Attlassy N, Carlson C. No geographic distribution change among residency applicants in the neurology match during COVID-19. Cureus. 2023;15(2):e34898.

12. Beesley H, Pernar L, Kettoola Y, et al. The association between virtual interviewing and geographical distribution of matched residency programs for general surgery applicants. J Surg Educ. 2022;80(2):194–9.

13. National Resident Matching Program. Results and data: 2023 main residency match. 2023. Available at: https://www.nrmp.org/match-data/2023/06/results-and-data-2023-main-residency-match/. Accessed November 6, 2024.

14. Association of American Medical Colleges. AAMC interview guidance for the 2022–2023 residency cycle. 2024. Available at: https://www.aamc.org/about-us/mission-areas/medical-education/interviews-gme-where-do-we-go-here. Accessed November 6, 2024.

EDUCATION SPECIAL ISSUE - EDUCATIONAL ADVANCES

Development of a Reliable, Valid Procedural Checklist for Assessment of Emergency Medicine Resident Performance of Emergency Cricothyrotomy

Dana E. Loke, MD, MS*
Andrew M. Rogers, MD, MBA†
Morgan L. McCarthy, MD‡§
Maren K. Leibowitz, MD¶
Elizabeth T. Stulpin, MD#
David H. Salzman, MD, MEd‡∥

*University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, Madison, Wisconsin
†NorthShore University HealthSystem, Division of Emergency Medicine, Evanston, Illinois
‡Northwestern University, Feinberg School of Medicine, Department of Emergency Medicine, Chicago, Illinois
§St Luke’s Hospital, Department of Emergency Medicine, New Bedford, Massachusetts
¶Icahn School of Medicine at Mount Sinai, Institute of Critical Care Medicine, New York, New York
#Emory University Hospital, Department of Emergency Medicine, Atlanta, Georgia
∥Northwestern University, Feinberg School of Medicine, Department of Medical Education, Chicago, Illinois

Section Editors: Matthew Tews, MD and Christine Stehman, MD

Submission history: Submitted June 15, 2024; Revision received November 8, 2024; Accepted November 12, 2024

Electronically published January 30, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem. DOI: 10.5811/westjem.20365

Introduction: Emergency cricothyrotomy is a rare but potentially life-saving procedure performed by emergency physicians. A comprehensive, dichotomous procedural checklist for emergency cricothyrotomy for emergency medicine (EM) resident education does not exist.

Objectives: We aimed to develop a checklist containing the critical steps for performing an open emergency cricothyrotomy, to assess performance of EM residents performing an open emergency cricothyrotomy using the checklist on a simulator, and to evaluate the reliability and validity of the checklist for performing the procedure.

Curricular Design: We developed a preliminary checklist based on literature review and sent it to experts in EM and trauma surgery. A modified Delphi approach was used to revise the checklist and reach consensus on a final version of the checklist. To assess usability of the checklist, we assessed EM residents using a cricothyrotomy task trainer. Scores were determined by the number of correctly performed items. We calculated inter-rater reliability using the Cohen kappa coefficient. Validity was assessed using the Welch t-test to compare the performance of residents who had and had not performed an open emergency cricothyrotomy, and we used analysis of variance to compare performance of postgraduate-year (PGY) cohorts.

Impact/Effectiveness: The final 27-item checklist was developed after three rounds of revisions. Inter-rater reliability was strong overall (κ = 0.812), with individual checklist items ranging from slight to nearly perfect agreement. A total of 56 residents participated, with an average score of 14.3 (52.9%). Performance varied significantly among PGY groups (P < 0.001). Residents who had performed an emergency cricothyrotomy previously performed significantly better than those who had not (P = 0.005). The developed checklist can be used in procedural training for open emergency cricothyrotomy, and the overall poor performance of this cohort suggests that improved approaches to teaching and assessing emergency cricothyrotomy are needed. [West J Emerg Med. 2025;26(2)279–284.]

BACKGROUND

Emergency cricothyrotomy is a rare but potentially life-saving procedure that emergency physicians (EP) must be able to competently perform. It is performed when the EP is unable to oxygenate and ventilate a patient after rapid sequence intubation is initiated and, therefore, must pursue cricothyrotomy in a time-sensitive manner. Thus, it is essential for EPs to be able to perform the procedure correctly. Furthermore, the Accreditation Council for Graduate Medical Education includes cricothyrotomy as a “key procedure” for which residents “must demonstrate competence.”1 However, there are few opportunities to learn this procedure in the clinical environment, with one study demonstrating that only 22% of graduating emergency medicine (EM) residents had the opportunity to perform cricothyrotomy on a living patient.2 Another study indicated that even experienced EPs felt that they lacked training in performing cricothyrotomy and that this procedural inexperience could directly affect the survival of a patient and lead to high emotional pressure.3 Lastly, the critically important nature of the procedure makes learning on shift a patient safety issue.

The combination of competency-based approaches using checklist-based assessments and the simulation environment has demonstrated a long track record of improving resident performance on specific procedural skills.4–8 While various instructional videos and checklists meant for different specialties are available, a standardized, reliable, valid, comprehensive, and dichotomous procedural checklist for assessment of performing emergency cricothyrotomy for EM resident education is lacking.9–11 Historically, the study site program’s method for teaching the open emergency cricothyrotomy occurred during the annual “rare procedures” simulation lab. These sessions involved non-standardized practice with a task trainer or sheep larynx that did not follow a competency-based training model.

OBJECTIVES

Recognizing this unmet need in EM procedural training for our learners, we set several objectives in this study. The primary objective was to develop a checklist containing the critical steps for performing an open emergency cricothyrotomy based on input from a multidisciplinary team of experts. The second objective was to evaluate the reliability and validity of the checklist for performing open emergency cricothyrotomy. Finally, the third objective was to use the checklist to assess a group of EM residents on their ability to perform the procedure on a simulator and compare performance by training year.

CURRICULAR DESIGN

Checklist Development

We performed a literature review in MEDLINE and the MedEdPortal to assess published literature for emergency cricothyrotomy procedure checklists and curricula. Key phrases for literature searches included “emergency cricothyrotomy curriculum,” “emergency cricothyrotomy checklist,” “emergency cricothyrotomy procedure,” “emergency cricothyrotomy simulation,” “emergency cricothyrotomy resident,” “emergency cricothyrotomy residency,” “emergency cricothyrotomy education,” and variations and combinations of the key words/phrases. Searches included all articles published until the search date of November 1, 2020. An EM procedural skills textbook and a surgical technique textbook were reviewed as well.12,13 We also evaluated relevant articles from the bibliographies of the textbooks and the included studies.

We used the Stufflebeam framework for checklist development after the literature review was completed.14 A preliminary dichotomous (“done” vs “incorrect/not done”) checklist was developed based on this literature review. The initial checklist was sent to a panel of 13 experts comprising emergency physicians and trauma surgeons of varying practice type (academic, community, military), geographic practice location (within the United States), and gender. Practice type included 10 academic, two community, and one military hospital; practice location included five internal and eight external; and breakdown by sex was five female and eight male. Experts were blinded to each other’s identities and comments. We informed the expert panel of the curriculum’s intended audience of EM residents with anticipated use for a competency-based curriculum. We used a modified Delphi approach to serially refine the checklist and reach consensus on a final checklist.15,16 We then pilot-tested the checklist to ensure the items, wording, and formatting were ideally operationalized. Finally, the expert panel reviewed it for final approval.

Study Population

The study was performed at a single urban academic center with a four-year EM residency training program. Four residents were excluded from the study due to their participation in the checklist design and assessment process. All other EM residents were included in the education as part of the annual simulation curriculum; however, participation in the study was voluntary. The study was reviewed by the institutional review board (IRB) at Northwestern University, Feinberg School of Medicine and determined to be exempt. Written informed consent was obtained from participants using a consent form approved by the IRB.

Assessment

Assessments occurred in the simulation center using a simulation manikin (TraumaMan; Simulab, Seattle, WA) from August 31–September 28, 2021. Performance assessments were documented using an electronic version of the checklist in Qualtrics (Qualtrics, Seattle, WA), including a dichotomous “Yes” or “No” for completion of each step.

One in-person rater (DL) was situated adjacent to the simulation manikin with the ability to move about the simulation room to ensure ideal visualization. Audiovisual recording of the assessment included one camera overhead providing a direct overhead view and a second camera situated to provide a view from the side. Each participant assessment was recorded from start to completion of the checklist. The dual video feeds with audio were saved as a single side-by-side video recording. These recorded videos were reviewed by a second rater at a later time. We used an online random number picker (https://www.random.org/lists/) to select 30% of the participants for scoring by the second rater.17 The second rater (AR) scored the randomly selected sample of video recordings using the same electronic assessment instrument in Qualtrics.
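The 30% random-sample selection described above can be sketched with Python's standard library in place of the online picker the authors used; the participant identifiers below are hypothetical.

```python
import random

# Hypothetical IDs for the 56 residents in the cohort
participants = [f"P{i:02d}" for i in range(1, 57)]

# Draw 30% of participants for scoring by the second rater
k = round(0.30 * len(participants))  # 56 * 0.30 = 16.8 -> 17
random.seed(2021)                    # fixed seed for a reproducible draw
second_rater_sample = random.sample(participants, k)
```

`random.sample` draws without replacement, so no video is selected twice; fixing the seed makes the draw auditable, which an online picker does not guarantee.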

Data Analysis

The checklist was analyzed for inter-rater reliability and validity among a cohort of EM residents ranging from PGY 1–4. Inter-rater reliability was calculated overall and for each checklist step using the Cohen kappa coefficient. We determined validity using the Welch t-test to compare the performance of participants who had and had not performed an emergency cricothyrotomy in clinical practice or simulation, and also between consecutive PGY groups. Analysis of variance was used to compare performance among PGY cohorts.
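A minimal sketch of these three analyses, assuming the commonly used `sklearn`/`scipy` implementations; all scores below are made up for illustration and are not the study data.

```python
from sklearn.metrics import cohen_kappa_score
from scipy.stats import ttest_ind, f_oneway

# Inter-rater reliability: two raters' dichotomous scores on the same items
rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
kappa = cohen_kappa_score(rater1, rater2)

# Validity: Welch t-test (unequal variances) comparing total checklist
# scores of residents with vs without prior cricothyrotomy experience
experienced = [17, 15, 16, 18, 14]
inexperienced = [12, 11, 13, 10, 12]
t_stat, p_welch = ttest_ind(experienced, inexperienced, equal_var=False)

# One-way ANOVA comparing checklist scores across PGY cohorts
pgy1, pgy2 = [11, 12, 10], [13, 14, 13]
pgy3, pgy4 = [16, 15, 17], [17, 18, 16]
f_stat, p_anova = f_oneway(pgy1, pgy2, pgy3, pgy4)
```

Passing `equal_var=False` to `ttest_ind` is what makes it the Welch variant named in the text; the default (`True`) would assume pooled variances across the two experience groups.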

IMPACT/EFFECTIVENESS

Results

The literature search produced a total of 394 articles. After review, 13 articles were deemed suitable to inform checklist development. An additional two articles were identified and included upon reviewing references of the included articles and the two textbooks. We developed a preliminary 33-item dichotomous checklist based on this literature review. Consensus was achieved after three rounds of revisions, resulting in the fourth version of the checklist being the final version. We then tested the final 27-item checklist among ourselves for usability. Only minor wording and formatting changes were made to ensure ideal operationalization of the checklist. The final checklist was approved by the expert panel after usability testing, and no additional revisions were suggested.

The table includes percentage correct of checklist items, inter-rater agreement, and Cohen kappa coefficients for each checklist item. Overall, inter-rater reliability was strong (κ = 0.812), with individual checklist items ranging from fair to nearly perfect agreement, with one item having slight agreement. A total of 56 residents participated, including 15 PGY-1, 14 PGY-2, 13 PGY-3, and 14 PGY-4 residents. While only one resident had performed an emergency cricothyrotomy on a live patient, 69.6% had previously performed an emergency cricothyrotomy in simulation. The average checklist score for the overall resident cohort was 14.3 (52.9%). Emergency medicine resident checklist performance varied by PGY class (Figure). Performance varied significantly among PGY groups (P < 0.001). The PGY-4s performed best, with an average score of 16.7 (61.9%) of checklist items completed correctly. They performed better than PGY-3s, but not significantly (61.9% vs 59.5%, P = 0.21). The PGY-3s performed significantly better than PGY-2s (59.5% vs 48.9%, P = 0.01). The PGY-2 performance was better but not significantly different compared to PGY-1 performance (48.9% vs 42.7%, P = 0.13). The residents who had previously performed an emergency cricothyrotomy on a live patient or in simulation performed significantly better than those who had not (56.8% vs 44.2%, P = 0.005).

Discussion

Although we identified procedural narratives and checklists with varying degrees of specificity for our learner group at the time of our literature review, our search demonstrated a lack of a standardized, validated, reliable, and dichotomous procedural checklist for emergency cricothyrotomy for EM residents. This checklist adds to more recently published articles targeting attendings, students, and “novice” learners. This newly developed procedural checklist for emergency cricothyrotomy addresses this unmet need for EM resident procedural training.

The expert panel provided critical insight during the checklist development. Our initial checklist focused on the classic “hook and dilator,” scalpel-based approach to emergency cricothyrotomy. However, we ultimately revised the checklist based on expert feedback to include the additional accepted approaches of “scalpel only” and “bougie-assisted” emergency cricothyrotomy. The inclusion of all three accepted approaches allowed for a more versatile checklist that is more generalizable to all resource settings and better reflects the variable real-world environment and urgency of the procedure. The inclusion of multiple techniques also suggests generalizability to other clinical environments, such as surgery and otolaryngology; however, this was not the intended audience at the time the checklist was developed. While there are several potential options for performing an emergency cricothyrotomy, including a needle/wire Seldinger technique, this checklist reflects development with the primary construct of using a scalpel-based approach.

This study’s strong overall inter-rater reliability, obtained with one in-person rater and one remote video rater, is consistent with prior checklist-development studies using a similar technique.18,19 Most individual items had moderate to near-perfect inter-rater

Table. Percent correct, inter-rater agreement, and reliability for individual checklist-item scoring.

1. Gathers sterile supplies
2. Gathers primary cricothyrotomy procedure supplies
3. Gathers secondary/supplemental cricothyrotomy procedure supplies
4. Gathers supplemental intubation supplies (0% correct; 100% agreement; κ n/a*)
5. Washes hands
6. Sterilizes the neck
7. Dons personal protective equipment
8. Proceduralist positions on the patient’s right side
9. Identifies cricothyroid membrane (CTM)
10. Uses thumb and middle finger of non-dominant hand to stabilize airway
11. Confirms incision site with palpation by index finger on the CTM using non-dominant hand while maintaining stabilization using thumb and middle finger of non-dominant hand
12. Uses scalpel to make vertical skin incision ~2–4 cm in length over the CTM using dominant hand
13. Dissects down to CTM
14. Re-identifies CTM by palpation or visualization
15. Makes ~1–2 cm (width of scalpel blade) horizontal incision through CTM with dominant hand and maintains scalpel blade in trachea
16. Maintains patency of tract
17. Removes scalpel, only after tracheal hook, Trousseau dilator, bougie, or secondary scalpel handle is in place, maintaining patency of CTM
18. Proceduralist dilates CTM
19. Inserts endotracheal tube or trach
20. Inserts endotracheal tube or trach to correct depth
21. Inflates the cuff with a 10-cc syringe
22. Connects bag-valve-mask to endotracheal tube/trach and begins assisted ventilation (92.9% correct; 94.1% agreement; κ = 0.638)
23. Uses capnography to confirm tube location
24. Listens for bilateral breath sounds
25. Secures endotracheal tube/trach
26. Orders chest radiograph
27. Documents procedure

*Unable to calculate kappa coefficient due to one or both raters giving the same score to all scored participants.

reliability, overall demonstrating the reliability of the checklist.20

The items with the lowest kappa scores included “gathers sterile supplies” (item 1), “identifies cricothyroid membrane” (item 9), and “uses scalpel to make vertical skin incision ~2–4 cm in length over the cricothyroid membrane using dominant hand” (item 12). We suspect that this likely reflects the remote nature of the second rater, as mishearing a request for a single piece of equipment or an inability to accurately visualize the membrane or exact length of incision on a recorded video would lead raters to score differently. This could have been improved with greater verbalization of all steps by the learner and primary rater, or by having a second in-person rater when able.

The residents who had performed an emergency cricothyrotomy previously performed significantly better than those who had not, demonstrating criterion validity for this checklist, as there was correlation with this group’s prior experience. Several studies with similar methods have also demonstrated congruent findings on checklist validity.18,19 While not significant, more senior PGY residents performed better as well. This may have been due to increased clinical exposure from seeing an emergent cricothyrotomy performed or improved procedural experience with practice in the simulation environment. However, despite these potential exposures and previous experiences, this cohort only correctly completed just over half of the checklist items.

Figure. Emergency cricothyrotomy checklist performance by emergency medicine resident postgraduate year. Box limits represent the 25th and 75th percentiles, with the median checklist score represented by the bar. PGY, postgraduate year.

Additionally, certain items had particularly low completion rates, including “Gathers supplemental intubation supplies” (item 4) (0%); “Proceduralist dilates cricothyroid membrane” (item 18) (3.6%); and “Documents procedure” (item 27) (8.9%). While some of these completion rates may be attributable to the simulation environment, it is important to highlight that merely planning for an intubation would not necessarily ensure that all equipment necessary for a cricothyrotomy was also available. The overall performance of this resident group, with residents only completing roughly 50% of the checklist items, suggests that the current, non-standardized technique for teaching emergency cricothyrotomy in this cohort is lacking and that a competency-based approach using a well-developed procedural checklist may improve performance.

LIMITATIONS

This study has several limitations. First, the single-site nature of the study may not reflect resident performance at other institutions. Studying the checklist's use at other residency sites would help to understand its generalizability to other environments with different approaches to teaching open cricothyrotomy. Second, while we recruited an expert panel including EM and trauma surgery representatives with diversity in practice type, practice location, and gender, most of the experts practiced in an academic environment. Despite this, the steps to performing the procedure should not vary by practice environment and, therefore, we do not believe that this limits validity or generalizability of the checklist. Expert panel review including additional community and hybrid experts would help test this hypothesis.

Third, the checklist and testing were performed using a bloodless simulation task trainer, which may not ideally represent an actual patient encounter. However, the infrequent nature of the procedure, as evidenced by only one resident having performed an emergency cricothyrotomy during their training, necessitates training in a non-clinical simulation environment. While emergency cricothyrotomy simulation experience has been documented using sheep larynx and 3D-printed models, our study was not performed using these models and instead used a commercially available training device. Therefore, we do not know the influence of different simulation methods on the study and checklist performance, and this remains an area for future study.

CONCLUSION

We designed a reliable, valid, dichotomous procedural checklist to assess EM residents' ability to perform emergency cricothyrotomy. The overall performance of the residents tested in this study suggests that the current method of teaching emergency cricothyrotomy for this group is insufficient. Given the need to develop procedural competency for this rare but potentially life-saving procedure, a curriculum such as simulation-based mastery learning should be developed to ensure mastery of this procedure for EM residents. The checklist developed in this study could serve as a foundation for such a curriculum.

Address for Correspondence: Dana E. Loke, MD, MS, University of Wisconsin School of Medicine and Public Health, BerbeeWalsh Department of Emergency Medicine, 800 University Bay Dr., Suite 310, Madison, WI 53705. Email: dloke@medicine.wisc.edu

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Loke et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

REFERENCES

1. Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in emergency medicine. 2023. Available at: https://www.acgme.org/specialties/emergency-medicine/program-requirements-and-faqs-and-applications/. Accessed September 11, 2023.

2. Makowski AL. A survey of graduating emergency medicine residents' experience with cricothyrotomy. West J Emerg Med. 2013;14(6):654–61.

3. Zink W, Bernhard M, Keul W, et al. Invasive techniques in emergency medicine. I. Practice-oriented training concept to ensure adequately qualified emergency physicians. Anaesthesist. 2004;53(11):1086–92.

4. Barsuk JH, Cohen ER, Caprio T, et al. Simulation-based education with mastery learning improves residents' lumbar puncture skills. Neurology. 2012;79(2):132–7.

5. Barsuk JH, McGaghie WC, Cohen ER, et al. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4(7):397–403.

6. Miller DT, Zaidi HQ, Sista P, et al. Creation and implementation of a mastery learning curriculum for emergency department thoracotomy. West J Emerg Med. 2020;21(5):1258–65.

7. Klein MR, Schmitz ZP, Adler MD, et al. Simulation-based mastery learning improves emergency medicine residents' ability to perform temporary transvenous cardiac pacing. West J Emerg Med. 2022;24(1):43–9.

8. Klein MR, Loke DE, Barsuk JH, et al. Twelve tips for developing simulation-based mastery learning clinical skills checklists. Med Teach. 2024. In press.

9. Melchiors J, Todsen T, Nilsson P, et al. Preparing for emergency: a valid, reliable assessment tool for emergency cricothyroidotomy skills. Otolaryngol Head Neck Surg. 2015;152(2):260–5.

10. Hock SM, Martin JJ, Stanfield SC, et al. Novel cricothyrotomy assessment tool for attending physicians: a multicenter study of an error avoidance checklist. AEM Educ Train. 2021;5(4):e10687.

11. Issa N, Liddy WE, Samant S, et al. Emergency cricothyrotomy during the COVID-19 pandemic: how to suppress aerosolization. Trauma Surg Acute Care Open. 2020;5(1):e000542.

12. Roberts JR, Custalow CB, Thomsen TW. Cricothyrotomy and percutaneous translaryngeal ventilation. In: Clinical Procedures in Emergency Medicine and Acute Care. Philadelphia, PA: Elsevier, 2020:127–141.e3.

13. Cioffi WG, Asensio JA, Adams CA, et al. Chapter 3. Surgical airways: tracheostomy and cricothyroidotomy. In: Townsend CM and Evers BM (Eds), Atlas of Trauma/Emergency Surgical Techniques. Philadelphia, PA: Elsevier, 2014:23–34.

14. Stufflebeam DL. Guidelines for developing evaluation checklists: the checklists development checklist (CDC). 2000. Available at: https://wmich.edu/sites/default/files/attachments/u350/2014/guidelines_cdc.pdf. Accessed September 11, 2023.

15. Waggoner J, Carline JD, Durning SJ. Is there a consensus on consensus methodology? Descriptions and recommendations for future consensus research. Acad Med. 2016;91(5):663–8.

16. Hasson F, Keeney S, McKenna H. Research guidelines for the Delphi survey technique. J Adv Nurs. 2000;32(4):1008–15.

17. Walter SD, Eliasziw M, Donner A. Sample size and optimal designs for reliability studies. Stat Med. 1998;17(1):101–10.

18. Klein MR, Schmitz ZP, Adler MD, et al. Development of a rigorously designed procedural checklist for assessment of emergency medicine resident performance of temporary transvenous cardiac pacing. AEM Educ Train. 2021;5(3):e10566.

19. Zaidi HQ, Dhake SS, Miller DT, et al. Emergency department thoracotomy: development of a reliable, validated checklist for procedural training. AEM Educ Train. 2019;4(2):139–46.

20. McHugh ML. Interrater reliability: the kappa statistic. Biochem Med (Zagreb). 2012;22(3):276–82.

EDUCATION SPECIAL ISSUE - BRIEF EDUCATIONAL ADVANCES

A Taste of Our Own Medicine: Fostering Empathy in Medical Learners Through Patient Simulation


The University of Chicago Medical Center, Department of Internal Medicine, Chicago, Illinois

Rush University Medical Center, Department of Emergency Medicine, Chicago, Illinois

Section Editor: Danielle Hart, MD, MACM

Submission history: Submitted June 11, 2025; Revision received October 10, 2025; Accepted October 10, 2025

Electronically published November 26, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem DOI 10.5811/westjem.48535.

Introduction: Residents and medical students spend thousands of hours of medical education learning the physician’s perspective but rarely find themselves on the other side of the stethoscope. In this study we evaluated whether a brief, novel curriculum of simulating the patient experience could improve medical learners’ reported empathy for patients and ability to explain medical interventions.

Curricular Design: Fifty-eight medical learners (medical students and resident physicians) participated in a 50-minute didactic session where learners simulated patient experiences such as wearing a patient gown and cervical collar, walking with crutches, and tasting potassium chloride and thickened water. Learners evaluated their perceptions of the curriculum with a survey.

Impact/Effectiveness: Participants reported limited experience as patients, with 66.7% never having been hospitalized and 50% not taking any daily medications. Learners rated the curriculum highly on a seven-point Likert scale with 98% expressing it helped them to empathize with patients (90% either agreed or strongly agreed) and 95% expressing that it would help them explain interventions (81% either agreed or strongly agreed). There was no difference between medical students and residents regarding reported effect on empathy (M 6.24 vs 6.44; P = .30) or effect on ability to explain the intervention (M 6.06 vs 6.24; P = .43). This brief curriculum simulating the patient experience was well-received by medical student and resident learners, who overwhelmingly felt it improved their empathy for patients and explanations of common interventions. This approach to fostering empathy could help both medical student and resident learners, many of whom may have limited experience as a patient. [West J Emerg Med. 2025;26(6)1526–1529.]

BACKGROUND

Resident physicians and medical students receive extensive medical education focused on thinking like a physician. While trainees have significant clinical knowledge, many lack personal experience as a patient, with < 5% of those aged 18-24 years hospitalized per year.1 Curricular activities where learners simulate the patient perspective can potentially overcome trainees' gaps in patient experience. However, such studies have been limited. Some prior studies have maintained a very narrow focus, such as tasting various oral antibiotics or participating in a diabetic shopping experience.2,3 While such activities can lead to increased empathy, the effects may not easily translate to other domains.

Other studies have involved longer interventions such as an overnight hospital stay or a three-hour visit to the emergency department (ED).4,5 While these broaden the spectrum of experiences encountered, they are very resource intensive for learners and may tax hospitals with limited bed capacity and education funding. A recent simulation-based study of residents role-playing as patients was rated favorably but without measurable improvements in empathy.6 Empathy has been associated with increased patient satisfaction, patient adherence to plans, and improved clinical outcomes.7 We hypothesized that having learners undergo simulated patient interventions would improve reported learner empathy.

CURRICULAR DESIGN

In this study we piloted a brief patient simulation curriculum employing common but uncomfortable activities that exemplify a spectrum of medical experiences faced by patients. The curriculum was designed using Kern’s six-step approach to curriculum development, with needs assessment from a group of five medical educators and 12 learners.8 This needs assessment noted a lack of direct experience with the interventions that trainees were learning. Experiences were chosen to cover a broad variety of common medical interventions while also retaining a brevity that allowed for easy integration into existing didactics. We considered but did not pursue other interventions, such as tasting oral antibiotics (risk of side effects) or trying bilevel positive airway pressure (resource intensive).

We hypothesized that this curriculum would increase medical learners’ reported empathy for their patients (primary outcome) and their perceived ability to explain these medical interventions (secondary outcome). This study consisted of a 50-minute didactic session where medical learners simulated patient experiences. Emergency medicine (EM) residents, internal medicine residents, and medical students were recruited to the study. A total of 58 learners participated: 33 medical students and 25 EM or internal medicine residents over the course of two separate days. This study was approved by the University of Chicago Institutional Review Board [IRB21-1203].

During the didactic session, learners were separated into two groups of 12-15 learners, each with an instructor. They followed an activity flow starting with donning patient gowns and taping intravenous tubing to their arms to simulate hospital garb (five minutes). Learners subsequently underwent a "trauma" station where they were fitted with cervical collars and then placed on a hard trauma backboard with a simplified trauma roll performed by other learners (seven minutes). After the trauma roll, they were instructed to ambulate using crutches (five minutes). Finally, learners experienced a "per os" station where they were given 20 mL of thickened water and 20 mL of a typical potassium chloride oral solution to simulate a dysphagia diet and potassium repletion, respectively (five minutes). As trainees transitioned between activities, instructors elicited learner experiences and led a 2-3 minute debrief of each activity.

After completing all activities, learners filled out an anonymous survey regarding their perceptions of the curriculum and prior patient experience. Learners used a Likert scale to rate how they felt the study changed their empathy for and explanations to patients. Finally, the survey collected qualitative data focusing on learners' feelings during their time as "patients" and how the activity might impact their medical practice. After completing the survey, the trainees had a large-group debrief for approximately 10 minutes where they shared their experience and personal learning points. Learners were compensated with a $10 gift card for their participation.

We analyzed survey findings in Stata (StataCorp LLC, College Station, TX) and Microsoft Excel (Microsoft Corporation, Redmond, WA) using two-sample t-tests with Bonferroni correction. Qualitative data were coded using an inductive approach to generate themes with two coders; discrepancies were discussed until the coders agreed. An a priori power analysis indicated that a sample of 32 participants would provide 80% power to detect a 20% change in perceived empathy (Cohen d = 0.8, α = 0.05), a threshold chosen de novo. This study exceeded that sample size.
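As a rough illustration of this kind of power calculation, the standard normal-approximation formula for a two-sample comparison can be sketched as follows. This is a generic sketch with a function name of our own choosing; it does not reproduce the authors' exact figure of 32 participants, which may rest on different assumptions (e.g., a paired design or software-specific adjustments).

```python
# Normal-approximation sample size per group for a two-sample t-test.
# Generic sketch; not the authors' exact calculation.
from math import ceil
from scipy.stats import norm

def n_per_group(d, alpha=0.05, power=0.80):
    """Participants needed per group to detect effect size d (Cohen's d)."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided significance threshold
    z_beta = norm.ppf(power)           # power requirement
    return ceil(2 * (z_alpha + z_beta) ** 2 / d ** 2)

print(n_per_group(0.8))  # → 25 per group under the normal approximation
```

The exact t-distribution adds roughly one participant per group to the normal approximation, which is why published tables often quote slightly larger numbers.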

IMPACT / EFFECTIVENESS

A total of 58 learners participated across two separate sessions: 33 medical students and 25 residents, with an equal male/female ratio; all participants completed the survey. Participants reported limited experience as patients, with the majority never having been hospitalized and half taking no daily medications (Table 1).

Table 1. Baseline demographic information of medical student/resident learners who participated in a didactic session that simulated patient experience.

Training level: Medical student, 33 (57%); Resident, 25 (43%)
Sex: Female, 29 (50%); Male, 29 (50%)
Prior hospitalizations: Never, 38 (66%); Once, 15 (26%); Twice, 2 (3%); ≥ Three times, 3 (5%)
Daily medication use: Yes, 29 (50%); No, 29 (50%)

Table 2. Survey findings of medical student/resident learners who participated in a didactic session that simulated patient experience.

Learners rated the curriculum highly on a seven-point Likert scale: 99% of participants expressed that the curriculum helped them to empathize with patients, with 90% either agreeing or strongly agreeing (Table 2); and 95% of learners reported that the session would help them better explain interventions to patients, with 81% either agreeing or strongly agreeing. There was no difference between medical students and residents regarding reported effect on empathy (M 6.24 vs 6.44; P = .30) or ability to explain interventions (M 6.06 vs 6.24; P = .43). However, learners who had never been hospitalized prior to the study reported a significantly greater increase in empathy compared to learners who had been previously hospitalized (M 6.47 vs 6.05; P = .03). There was no difference in reported improvement in explanations to patients between learners who had been hospitalized and those who had not (M 5.85 vs 6.29; P = .06). Of those who participated in the session, 97% reported they would change how they describe interventions to patients based on their experience in the study.
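The group comparisons reported above are two-sample t-tests with a Bonferroni adjustment for multiple outcomes. A minimal sketch of that analysis, using hypothetical Likert ratings rather than the study's data:

```python
# Two-sample t-test with Bonferroni correction across two outcomes.
# The Likert ratings below are hypothetical, for illustration only.
from scipy.stats import ttest_ind

students = [6, 7, 6, 5, 7, 6, 7, 6]    # hypothetical 7-point ratings
residents = [7, 6, 7, 7, 6, 7, 6, 7]

n_comparisons = 2                       # e.g., empathy + explanation outcomes
t_stat, p_value = ttest_ind(students, residents)
p_adjusted = min(p_value * n_comparisons, 1.0)  # Bonferroni adjustment
print(f"t = {t_stat:.2f}, adjusted p = {p_adjusted:.3f}")
```

The Bonferroni step simply multiplies each raw p-value by the number of comparisons (capped at 1.0), a conservative guard against false positives when testing several outcomes from one survey.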

Qualitatively, the two most common themes identified were 1) discomfort leading to reconsideration of interventions; and 2) empathy toward what the patients were experiencing (Table 3). A representative quote of these two changes was as follows: “[The study] will help me prepare patients for uncomfortable parts of their hospitalization and be conscious about when I can back off on uncomfortable interventions.”

LESSONS LEARNED

We devised a learner handout with the station flow and instructions, which helped learners track their progress. Speech/swallow staff generously provided liquid thickening mix, and hospital pharmacists provided potassium chloride. Learners had the option to change into gowns in the bathroom or don gowns over their clothes. None chose to fully change, which facilitated a discussion about patient vulnerability. When learners had emotional responses to the stimuli, it helped to empathize and then remind them of the shift in magnitude as a patient: “Now imagine you have to drink thickened water every single day from now on.” The one-hour duration was feasible to implement during weekly didactics, and the various stations could support small-group rotations with floating instructors.

This was a single-center study focused on perceived changes in empathy. It used only a post-survey, which could have led to response shift or recall bias. Future iterations could evaluate higher levels on the Kirkpatrick model to establish improved communication or change in practice. The questions of the validated Jefferson Scale of Empathy had a focus beyond the scope of this intervention but could be considered as a future measure.

Table 3. Qualitative themes in survey of participants in a didactic session that simulated patient experience.

Gratitude: "I feel appreciative of my health."

Address for Correspondence: William Weber, MD, MPH, Rush University Medical Center, Department of Emergency Medicine, 1750 W. Harrison St, Kellogg 103, Chicago, IL 60612. Email: william_weber@rush.edu.

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Peña et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/ licenses/by/4.0/

REFERENCES

1. National Center for Health Statistics. Table HospStay. People with hospital stays in the past year, by selected characteristics: United States, selected years 1997-2019. 2023. Available at: https://www.cdc.gov/nchs/data/hus/hus20-21.pdf. Accessed September 14, 2025.

2. Trujillo JM, Hardy Y. A nutrition journal and diabetes shopping experience to improve pharmacy students’ empathy and cultural competence. Am J Pharm Educ. 2009;73(2):37.

3. Gee SC, Hagemann TM. Palatability of liquid anti-infectives: clinician and student perceptions and practice outcomes. J Pediatr Pharmacol Ther. 2007;12(4):216-23.

4. Wilkes M, Milgrom E, Hoffman JR. Towards more empathic medical students: a medical student hospitalization experience. Med Educ. 2002;36(6):528-33.

5. Nelson S, Germann C, MacVane C, et al. Intern as patient: a patient experience simulation to cultivate empathy in emergency medicine residents. West J Emerg Med. 2018;19(1):41-8.

6. Culhane A, Martin J, Huston Z, et al. Simulating empathy: a qualitative experiential study of embedded resident learners in an empathy curriculum. AEM Educ Train. 2024;8(2):e10957.

7. Derksen F, Bensing J, Lagro-Janssen A. Effectiveness of empathy in general practice: a systematic review. Br J Gen Pract. 2012;63(606):e76-84.

8. Singh MK, Gullett HL, Thomas PA. Using Kern’s 6-step approach to integrate health systems science curricula into medical education. Acad Med. 2021;96(9):1282-90.

Effectiveness of a Collaborative, Virtual Outreach Curriculum for 4th-Year EM-bound Students at a Medical School Affiliated with a Historically Black College and University

Cortlyn Brown, MD, MCSO*; Richard Carter, MD†; Nicholas Hartman, MD, MPH‡; Aaryn Hammond, MD‡; Emily MacNeill, MD*; Lynne Holden, MD§; Ava Pierce, MD∥; Linelle Campbell, MD§; Marquita Norman, MD, MBA∥

*Atrium Health Carolinas Medical Center, Department of Emergency Medicine, Charlotte, North Carolina

†Howard University, College of Medicine, Washington, DC
‡Wake Forest University, School of Medicine, Winston-Salem, North Carolina

§Albert Einstein College of Medicine, Montefiore Medical Center, Bronx, New York

∥UT Southwestern Medical Center, Dallas, Texas

Section Editors: Matthew Tews, MD and Christine Stehman, MD

Submission history: Submitted January 24, 2024; Revision received October 9, 2024; Accepted October 17, 2024

Electronically published December 16, 2024

Full text available through open access at http://escholarship.org/uc/uciem_westjem DOI: 10.5811/westjem.18748

Background: Diversity within the physician workforce is associated with improved clinical outcomes and patient satisfaction. Despite this, the US physician workforce, particularly in emergency medicine (EM), remains relatively homogeneous. Of all Black medical school students in the US, 14% attend the four Historically Black Colleges and Universities (HBCU) that have a medical school. Unfortunately, none of these schools are affiliated with an academic EM program. Because of this, there is less professional mentorship focused on obtaining a career in EM and potentially less formal curricula for senior medical students doing their home sub-internship in EM.

Objectives: Our objective was to fill the gap left by the absence of an academic EM department at Howard University College of Medicine (HUCOM) by creating a collaborative educational experience for fourth-year medical students during their home EM sub-internship. The curricular objectives were to teach core principles of EM, build relationships with students, and prepare them for pursuing EM residency training.

Curricular Design: Four EM academic departments collaborated to create and implement a virtual curriculum using the six-step approach to curricular development.

Impact/Effectiveness: After completion of the course, five students (100%) reported strongly agreeing with the following statements. These sessions 1) helped me learn the approach to core EM topics more than I would have been able to do on my own; 2) helped me learn key skills for excelling in an EM rotation more than I would have been able to do on my own; and 3) allowed me to connect with faculty and resident mentors to learn more about the field of EM. Of these five students, 80% and 20% reported strongly agreeing and agreeing, respectively, that these sessions helped them learn about the process of applying to and selecting an EM residency program. [West J Emerg Med. 2025;26(1)129–134.]

INTRODUCTION

Need for Innovation

Medical students interested in emergency medicine (EM) who attend a historically Black college or university (HBCU) do not have the teaching and mentorship that occurs when a medical school is affiliated with an academic EM program. We formed a collaborative program among four academic EM departments to help fill this need for EM-bound students at Howard University College of Medicine (HUCOM). To our knowledge, this is the first such program to be reported in the literature.

Background

A diverse physician workforce is associated with increased access to and utilization of the healthcare system, improved health outcomes and patient experience, and improved fiscal margins for hospitals.1–6 Despite this, the medical field as a whole has made minimal advances in increasing physician diversity. In 2008 the percentage of Black or Hispanic US physicians from all specialties was 6.3% and 5.5%, respectively. By 2018, however, those percentages were only 5.0% and 5.8%, respectively. Even more concerning given the diverse patient population that the emergency department (ED) serves, EM remains among the medical specialties with the lowest number of physicians from backgrounds under-represented in medicine (URiM). Between 2008–2018, the percentage of emergency physicians who identified as Black decreased from 5.0% to 4.5%, and stayed constant at 5.3% for Hispanic/Latinos.7

When surveyed, 35% of EM program directors reported that the small number of URiM residency applicants was the greatest barrier to obtaining a diverse residency class.8 Of all Black medical school students in the US, 14% attend the four HBCUs with a medical school. Because none of these schools are affiliated with an academic EM program, their medical students have decreased exposure to EM in the pre-clinical years, less professional mentorship focused on obtaining a career in EM, and fewer formal curricula for senior medical students doing their home sub-internship (sub-I) in EM. This lack of mentorship has been identified as a critical barrier for URiM students across various specialties, contributing to lower application rates and residency placement. Studies suggest that mentorship increases both career satisfaction and inclusivity and the likelihood of these students entering and succeeding in competitive fields like EM.9,10 In addition, a national survey of clerkship directors found that having a structured, standardized sub-I curriculum significantly improved the preparedness of students for residency, especially when these rotations were affiliated with residency programs.11

The Emory University Department of Emergency Medicine created a program with Morehouse School of Medicine to provide guidance to medical students interested in EM. A total of 115 Morehouse students completed an EM clerkship at Emory, and 62.6% successfully matched into EM.12 While this program was successful, students typically rely on their home sub-I to prepare for mandatory away rotations. This absence of support from an academic department prior to away rotations may cause the students to find themselves less prepared and at a competitive disadvantage when they begin their away rotations. Furthermore, many EM residencies are not in proximity to an HBCU, requiring students to bear the financial burden of traveling to other cities and states for their away rotations.

At HUCOM, the EM sub-I relied heavily on an older, recorded online lecture series from an external institution, supplemented by bedside teaching from community attendings at one site, Howard Hospital. Students noted that the absence of formal educational components, such as weekly didactics, journal clubs, and simulation, resulted in limited exposure to "cutting-edge" EM practices. Moreover, the lack of interaction with academic attendings who are dedicated to medical student education, along with the absence of residents, who represent the next step in career progression, left students without access to critical mentorship and guidance. This gap hindered students' ability to visualize their own progression and receive practical advice from individuals at a similar stage in training, further limiting their connection to the broader EM community.

To help overcome that barrier, we created a collaboration between four academic EDs and HUCOM in an attempt to augment curricular offerings for EM-interested students on their HUCOM fourth-year EM home rotation. The collaboration between four academic EDs broadens the exposure students receive to different teaching styles, institutional cultures, and clinical perspectives. This variety provides a more comprehensive educational experience than what can be offered by a single institution alone.

Objective of Innovation

We aimed to address the absence of an academic ED at HUCOM by developing a collaborative educational experience. This program focuses on core principles of EM and residency preparation and was designed specifically for fourth-year medical students during their home EM sub-I at HUCOM. We obtained institutional review board approval from Wake Forest University School of Medicine.

Development Process

We used the six-step approach to curricular development. All final curricular design and content was agreed upon by the faculty representatives at each of the four participating residency sites.13,14 1) Problem identification and general needs assessment. Unlike traditional curriculum development where the needs assessment is based on a specific health problem, our needs assessment was based on the need to increase the diversity of emergency clinicians by helping prepare under-represented students to succeed in away rotations and the match. 2) Determining and prioritizing content. While individuals at each participating institution were involved with teaching at their own institution, the needs of the HUCOM students were unique. Therefore, educational objectives were developed in conjunction with the faculty advisor to the fourth-year EM rotation at HUCOM, who conducted stakeholder interviews with five current medical students and five alumni who had recently graduated and were currently in EM residencies across the country. It was decided that curricular content would include a mix of core EM topics (as determined from stakeholder interviews) and advising sessions.

After all sessions, students were provided with the contact information for the faculty lecturers and were encouraged to reach out. 3) Goals and objectives. Broad curricular goals were developed. These were to a) teach the approach to core complaints in EM; b) teach key skills in EM; c) demystify the process of applying to an EM residency program; and d) connect students with residents and faculty in the field of EM. After this, specific measurable lecture goals were developed based on cognitive, affective, and psychomotor objectives for the learner. 4) Educational strategies. We created an entirely virtual, four-week didactic program, with content organized into weekly four-hour blocks, each led by a different academic ED, on an interactive platform that allowed for case-based discussions, small-group discussions, and standard lecture format. Since implementation in 2022, the program has been mandatory for all students completing their fourth-year EM sub-I at HUCOM.

Each week, the sessions required the participation of four to five faculty members who volunteered their time, with the majority of lectures delivered by a single faculty member. However, select sessions, such as the "Application and Interviewing Process," were co-led by a dynamic team consisting of the assistant program director, program director, and chief residents, providing a well-rounded perspective and valuable insights for the participants. Content was mapped and coordinated, and pre-reading was assigned from the Academy for Diversity and Inclusion in Emergency Medicine webinar series "How to Be a Successful EM Applicant" and the Clerkship Directors in Emergency Medicine/Society for Academic Emergency Medicine M4 curriculum. Each day included a mix of clinical topics and "advising" sessions (Table 1). 5) Implementation. Approval from the EM director was obtained, and the curriculum was implemented. 6) Evaluation and feedback. After each block of content, evaluations for each individual session (including the presenter) were sent to participating students via REDCap (Research Electronic Data Capture), hosted at Howard University School of Medicine.

These evaluations consisted of one question for each session: "Please rate the effectiveness of the following session in accomplishing its learning objectives: Session, Presenter." At the end of the month-long program, an overall evaluation of the program was sent to participating students, also via REDCap. The program evaluation survey tool, including four multiple-choice questions regarding the overall learning objectives, is reflected in Figure 1. The tool also included two free-response questions: 1) "Which parts of the curriculum were of most value to you?"; and 2) "Which parts of the curriculum could be improved?" We refined the curricula each year during an end-of-year debrief.

ImplementationPhase

Prior to the first session, students were provided a spreadsheet with pre-session work, curriculum topics, presenting faculty and residents, dates and times, and links to access the weekly virtual sessions. Each EM program provided four hours of interactive didactics to the students according to the scheduled dates and times.

Outcomes

A post-curricular survey found universal agreement from students that the curriculum was effective in meeting the above goals. Of the five students, 100% reported strongly agreeing with the following statements. These sessions 1) helped me learn the approach to core EM topics more than I would have been able to do on my own; 2) helped me learn key skills for excelling in an EM rotation more than I would have been able to do on my own; and 3) allowed me to connect with faculty and resident mentors to learn more about the field of EM. Of the five students, 80% and 20% reported strongly agreeing and agreeing, respectively, that these sessions helped them learn about the process of applying to and selecting an EM residency program.

Table 1. Curricula from sample block.

Didactic session one, Institution one: Chest pain; Altered mental status; Toxicology overview; Application and interviewing process

Headache; Shortness of breath; Abdominal pain; Shock and sepsis; Gynecologic and urologic emergencies

Radiographs; Electrocardiogram introduction; Vaginal bleeding; Endocrine and electrolytes

Social emergency medicine; Ultrasound basics; Advanced trauma life support; Advanced cardiac life support, basic life support

Figure 1. Program evaluation survey tool.

Session: / Presenter: / Please rate the effectiveness of the following session in accomplishing its learning objectives on a scale from 1 (not effective) to 5 (very effective).

Questions (response options: strongly agree, agree, neutral, disagree, strongly disagree):

These sessions helped me learn the approach to core emergency medicine topics (abdominal pain, chest pain, headache, etc.) more so than I would have been able to do on my own.

These sessions helped me learn key skills for excelling in an emergency medicine rotation, including oral presentations, EKG interpretation, x-ray interpretation, and ultrasound, more so than I would have been able to do on my own.

These sessions helped me learn about the process of applying to and selecting an EM residency program.

These sessions allowed me to connect with faculty and resident mentors to learn more about the field of emergency medicine.

Free-response questions: Which parts of the curriculum were of most value to you? Which parts of the curriculum could be improved?

Narrative feedback from students, such as the quotes below, highlighted the value of meeting with faculty and residents from different programs and of going through cases in real time.

Meeting the faculty and program directors at various EM programs really was the highlight of the curriculum. It was great to get an inside look at each program and learn more about their culture, approach, and the people there.

I really enjoyed hearing the residents’ perspective on how to navigate the application process.

Narrative feedback, such as the quotes below, also emphasized the value of the curriculum’s interactive nature and how traditionally in-person topics were effectively adapted for virtual learning.

My favorite part was participating in real-time cases. Being involved as the case unfolded felt like hands-on practice.

It was incredible to have the mechanisms of ultrasound explained in such detail. Breaking it down to the basics really helped me understand ultrasound for the first time.

REFLECTIONS AND LESSONS LEARNED

Engagement of the Home Institution

Successful implementation required active engagement from HUCOM, specifically the clerkship director and administrative staff, who served as lead contacts. Control over rotation scheduling was essential to ensure all students were fully engaged in the sessions. In addition, as participating institutions used various online platforms to communicate and disseminate curricular materials, such as Tintinalli’s Emergency Medicine, with their students, it was necessary to have HUCOM manage a central communications and video-conferencing platform that was accessible to all lecturing institutions and participating students.

Figure 1. Evaluation form sent to students after each session.

Engagement of Collaborating Institutions

Recruiting faculty and residents for each institution’s week was challenging, but having representatives with strong connections in medical education made a significant difference. These relationships allowed them to quickly and effectively recruit lecturers, leveraging their networks to secure individuals who were both willing and enthusiastic to participate. This highlights the value of having institutional leads with established ties to their educational infrastructure, streamlining the recruitment process.

Collaborative Power

The success of this project involved a high degree of trust, as many of the institutional representatives had not worked together. To develop this trust, we followed the framework of engaging, listening, framing, envisioning, and committing.15 The power of this program is truly in the collective rather than the individual. While students could learn about atrial fibrillation from one institution, the real learning occurs when they see the collaboration, get a sense of the scope of EM as a professional field, and are able to interact with varied institutions that have different approaches to teaching and the practice of medicine.

Challenges with Small Student Cohorts

Unlike traditional EM rotations that attract students from across the country, our program had a small cohort composed solely of HUCOM students, as there was no affiliated residency. This small group size meant that if one student missed a session due to interviews, illness, or other reasons, it noticeably impacted the learning environment, limiting group dynamics and peer-to-peer learning.

Program Limitations and Adaptations

Virtual learning posed challenges for teaching interactive skills such as ultrasound. We addressed this by incorporating case-based learning with curated image libraries and real-time feedback. To further enhance the learning experience, future iterations should explore the integration of ultrasound simulation software to better mimic hands-on scenarios.

Scalability and Expansion

Although initially designed for HUCOM students, this model could be expanded to other medical schools without academic EDs, especially those with a high proportion of URiM students. With the opening of additional HBCU medical schools, there is an even greater need for programs that increase access to EM education.

Limitations

Study limitations include the small sample size as well as lack of a comparison group. Future analyses will address these limitations and include evaluation of match outcomes as well as other learner-centered targets such as performance in Standardized Letters of Evaluation or subsequent rotations and intern-year performance.

Address for Correspondence: Cortlyn Brown, MD, MCSO, Atrium Health Carolinas, Department of Emergency Medicine, 1000 Blythe Blvd., Charlotte, NC 28203. Email: Cortlyn.Brown@atriumhealth.org

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Brown et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/

REFERENCES

1. Komaromy M, Grumbach K, Drake M, et al. The role of Black and Hispanic physicians in providing health care for underserved populations. N Engl J Med. 1996;334(20):1305–10.

2. Marrast LM, Zallman L, Woolhandler S, et al. Minority physicians’ role in the care of underserved patients: diversifying the physician workforce may be key in addressing health disparities. JAMA Intern Med. 2014;174(2):289.

3. Gomez LE and Bernet P. Diversity improves performance and outcomes. J Natl Med Assoc. 2019;111(4):383–92.

4. Saha S, Komaromy M, Koepsell TD, et al. Patient-physician racial concordance and the perceived quality and use of health care. Arch Intern Med. 1999;159(9):997.

5. Takeshita J, Wang S, Loren AW, et al. Association of racial/ethnic and gender concordance between patients and physicians with patient experience ratings. JAMA Netw Open. 2020;3(11):e2024583.

6. Shen MJ, Peterson EB, Costas-Muniz R, et al. The effects of race and racial concordance on patient-physician communication: a systematic review of the literature. J Racial Ethn Health Disparities. 2018;5(1):117–40.

7. Garrick JF, Perez B, Anaebere TC, et al. The diversity snowball effect: the quest to increase diversity in emergency medicine. A case study of Highland’s emergency medicine residency program. Ann Emerg Med. 2019;73(6):639–47.

8. Boatright D, Tunson J, Caruso E, et al. The impact of the 2008 Council of Emergency Residency Directors (CORD) panel on emergency medicine resident diversity. J Emerg Med. 2016;51(5):576–83.

9. Mabeza RM, Christophers B, Ederaine SA, et al. Interventions associated with racial and ethnic diversity in US graduate medical education: a scoping review. JAMA Netw Open. 2023;6(1):e2249335.

10. Martinez J, Mieres JH, Roswell RO. URiM (underrepresented in medicine) learner and faculty mentoring. In: Fornari A and Shah DT (Eds.), Mentoring in Health Professions Education (34–43). IAMSE manuals. Cham, Switzerland: Springer International Publishing, 2021.

11. De La Cruz MSD, Sairenji T, Stumbar SE, et al. Curricular recommendations for a national family medicine subinternship: a qualitative analysis from multiple stakeholders. Fam Med. 2021;53(10):835–42.

12. Goines J, Iledare E, Ander D, et al. A model partnership: mentoring underrepresented students in medicine (URiM) in emergency medicine. West J Emerg Med. 2021;22(2):213–7.

13. Sweet L and Palazzi D. Application of Kern’s six-step approach to curriculum development by global health residents. Educ Health. 2015;28(2):138.

14. Thomas PA, Kern DE, Hughes MT, et al. (Eds.). Curriculum Development for Medical Education: A Six-Step Approach. 3rd ed. Baltimore, Maryland: Johns Hopkins University Press, 2016.

15. Maister D, Green C, Galford R. The Trusted Advisor. 20th Anniversary ed. New York, New York: Simon & Schuster, 2021.

A 30-year History of the Emergency Medicine Standardized Letter of Evaluation

Jenna S. Hegarty, BS*

Cullen B. Hegarty, MD†

Jeffrey N. Love, MD, MHPE‡

Alexis Pelletier-Bui, MD§

Sharon Bord, MD||

Michael C. Bond, MD#

Samuel M. Keim, MD, MS**

Kevin Hamilton, BS††

Eric F. Shappell, MD, MHPE‡‡

*Rosalind Franklin University of Medicine and Science, Chicago, Illinois

†University of Minnesota Medical School, HealthPartners Institute/Regions Hospital, Department of Emergency Medicine, St. Paul, Minnesota

‡Georgetown University School of Medicine, Department of Emergency Medicine, Washington, DC

§Cooper Medical School of Rowan University/Cooper University Hospital, Department of Emergency Medicine, Camden, New Jersey

||The Johns Hopkins University School of Medicine, Department of Emergency Medicine, Baltimore, Maryland

#University of Maryland School of Medicine, Department of Emergency Medicine, Baltimore, Maryland

**University of Arizona, Department of Emergency Medicine, Tucson, Arizona

††University of Maryland Medical System Center for Technology Innovation, Baltimore, Maryland

‡‡Massachusetts General Hospital/Harvard Medical School, Department of Emergency Medicine, Boston, Massachusetts

Section Editor: Jules Jung, MD, MEd

Submission history: Submitted April 30, 2025; Revision received November 6, 2025; Accepted November 3, 2025

Electronically published November 26, 2025

Full text available through open access at http://escholarship.org/uc/uciem_westjem DOI 10.5811/westjem.47110

Thirty years ago, education leaders in emergency medicine (EM) developed a standardized letter of recommendation to address limitations of narrative letters of recommendation in the residency selection process. Since then, multiple iterations and improvements with specialty-wide adoption have led to this letter being cited as one of the most essential pieces of a residency application. Based on the experience and success in EM, many other specialties have also now adopted standardized letters of their own. In this paper, we detail the 30-year history of the EM standardized letter including form changes and technological innovations, research and validity evidence, and discussion of research and administrative priorities for the future. [West J Emerg Med. 2025;26(6):1544–1548.]

INTRODUCTION

Emergency medicine (EM) was the first specialty to adopt a standardized letter for residency applications. Noting the shortcomings of narrative letters of recommendation, which featured lengthy descriptions and heterogeneous content and structure, the Council of Residency Directors in Emergency Medicine (CORD) assembled a task force in 1995 to develop a standardized, concise, and discerning structured assessment to replace narrative letters.1 The resulting assessment form became the Standardized Letter of Recommendation (SLOR) and debuted in the 1995-1996 EM residency application cycle.

Over the past 30 years, the EM standardized letter has evolved through multiple iterations and advancements, including updates to the items and domains assessed, migration from disseminated paper forms to a centralized electronic database, and development of form variants for subspecialty and off-service rotations. Now known as

the Standardized Letter of Evaluation (SLOE),2 the EM SLOE has led the way for other specialties that recognized the strength of this approach and developed their own standardized letters of evaluation in subsequent years, such as plastic surgery in 2012, internal medicine and orthopedic surgery in 2017, obstetrics and gynecology in 2022, and more.3-6 Additionally, the Coalition for Physician Accountability’s Undergraduate Medical Education-Graduate Medical Education Review Committee recommended in 2021 that all specialties develop and move to structured evaluation letters instead of narrative letters.7

In this paper, we detail the 30-year history of the EM SLOE including form changes and technological innovations in response to evolving needs and priorities of the broader EM community. This history and accompanying context can inform efforts of those responsible for developing, researching, writing, and interpreting SLOEs by standardizing the language used to describe SLOE versions and variants,

summarizing the literature on the topic, and mapping research and administrative agendas for the future.

STANDARDIZED LETTER VERSIONS

The first version of the EM standardized letter, the SLOR, was produced in 1995 by a task force commissioned by CORD (Table 1).1 This letter debuted in the 1995-1996 residency application cycle and featured sections assessing qualifications for EM (commitment, work ethic, ability to formulate a differential and plan, personality) in addition to a global assessment, estimated match list position, and comments (Appendix 1A). From 1995-2011 the SLOR became an influential aspect of the EM residency match process, and the original letter format was iterated upon.8

In 2011, CORD re-established the SLOR task force to study, re-evaluate and update the letter.8-10 The result was the second official version of the letter, the 2012 SLOE (Table, Appendix 1B), which debuted in the 2012-2013 residency application cycle. At this time, the name SLOR was changed to SLOE to better represent that the letter’s purpose was not necessarily to recommend a student, but rather to provide a standardized evaluation of their performance. In addition to the name change came additions and edits to the letter, such as asking which EM rotation this was for the student and the dates during which the student rotated. In Part B, “Qualifications for EM,” the personality question was removed, and questions were added regarding ability to work with a team, ability to communicate a caring nature to patients, and anticipated guidance during residency. The anchors in this section also moved away from adjectives and toward peer comparison. In Part C, “Global Assessment,” Question 2 changed the rank-list options from descriptions of likelihood of matching to numeric anchors (eg, “Top 10%”). Question 2 also added a clarifying question, “Are you on the committee that determines the final rank list?” to understand whether the letter writer had experience with such rankings. Lastly, the narrative section now had a reduced word limit of 250 words or less to encourage letter writers to be more concise, and to decrease the common practice of advertising the institution where the rotation was completed. Throughout the time spanning these first two iterations of the standardized letter, some authors customized or changed the form.8 This variance weakened the SLOE by straying from one of its core tenets: standardization. 
Many efforts from 2011–2016 were made by the SLOE task force to increase standardization and prevent customization such as providing author guidelines and training (including lectures, workshops, and discussion groups), and advocating for a non-modifiable electronic template.9 To further promote standardization in the use of the SLOE, an electronic portal to write and save letters was developed in 2016, in addition to a new letter form referred to as the electronic SLOE (eSLOE) (Table, Appendix 1C).11 This change ensured that no alterations could be made to the form, thus standardizing SLOE data for

review and comparison. This form also introduced a section after the narrative to describe the institution, both to provide context for the reader and preempt the use of the narrative to describe institutional characteristics. Additionally, the eSLOE website saved all letter information. It produced copies in the correct format, making uploading to the Electronic Residency Application Service system easier for authors, and establishing an electronic database amenable to research and quality improvement initiatives.

To allow authors to provide context for the unprecedented pandemic conditions of the 2020 and 2021 application cycles, the SLOE committee added a single narrative question to the evaluation asking how the student’s rotation was affected by COVID-19. While this change technically makes the 2020 edition of the eSLOE a different version of the standardized letter (Table, Appendix 1D), it is otherwise the same 2016 eSLOE.

Following the addition of the COVID-19 question in 2020, the SLOE committee again re-evaluated and updated the eSLOE, resulting in the 2022 electronic SLOE 2.0 (eSLOE 2.0) (Table, Appendix 1E).12 The most notable change in the 2022 eSLOE was the addition of criterion-referenced items and removal of some norm-referenced items. This transition was made in step with broader trends in medical education toward assessments that compare performance to a standard as opposed to other trainees. Another competency-based assessment for EM students in use at that time, the National Clinical Assessment Tool for Medical Students in Emergency Medicine (NCAT-EM),13 provided helpful context as a field-tested, criterion-referenced clinical assessment to emulate in the 2022 eSLOE.14,15 A question was added to provide more insight into the sources of information used in compiling the SLOE. Authors could also denote whether this evaluation was based on a rotation taken by all students at the letter writer’s institution or just by EM sub-interns, as each would presumably result in a different grading breakdown. There was also the ability to denote any changes in grading practices to inform comparisons of grades across years. With the transition of US Medical Licensing Exam Step 1 scores to pass/fail and more institutions moving to pass/fail curricula, the SLOE committee added a section regarding test-taking ability, identifying any standardized testing completed during the rotation (eg, National Board of Medical Examiners shelf exam, Society for Academic Emergency Medicine tests, or home-grown assessment).

Given the growing number of SLOE iterations, it is important to standardize the nomenclature to improve clarity in discussions and future literature on this topic. When referring to these evaluations generally and inclusive of SLOR and SLOE versions, we propose reference to the emergency medicine standardized letter. When referring to specific versions of the EM standardized letter, we propose referring to the year the version was first used in practice and either SLOR or SLOE, as appropriate (eg, 1995 SLOR,

2016 SLOE). Modifiers to further distinguish versions (eg, 2016 eSLOE or 2022 eSLOE 2.0) may also be used; however, we recommend still including the year in these cases to avoid potential misunderstanding via errors of omission (eg, omitting “2.0” from “eSLOE 2.0” for brevity or by mistake could lead to the reader interpreting this as the 2016 version when the 2022 version was intended, whereas “2022 eSLOE” is unambiguous).

STANDARDIZED LETTER VARIANTS

From the use of the SLOR through the 2016 SLOE, writers and reviewers began to identify and report to CORD leadership opportunities where clearer differentiation between types of authors and rotations would be beneficial for writers, reviewers, and researchers. These opportunities for clearer differentiation resulted in the creation of multiple SLOE variants. In 2016, the SLOE for Non-Residency-based EM Physicians was introduced (Table 1). This variant removed the item requiring authors to describe where the candidate would reside on their rank list, noting that this question was inappropriate for physicians not involved in a residency program. This new form allowed students to still receive evaluations from this group of authors but provided additional context for reviewers by clearly describing the source of the letter. Additionally, this separation facilitated more granular data for research and quality assurance initiatives. Also released in 2016, the subspecialty SLOE extended evaluation opportunities to include EM subspecialists in toxicology, ultrasound, pediatric EM, and emergency medical services (Table).

The COVID-19 pandemic in 2020 prompted significant restrictions on visiting clerkships nationwide, resulting in limited opportunities for students to receive outside SLOEs. To create more opportunities for students to receive standardized letters in the absence of additional EM clerkship availability, the Off Service Standardized Letter of Evaluation (O-SLOE) was developed (Table 1, Appendix 2G). The O-SLOE expanded access to standardized letters from off-service faculty in non-EM specialties. At this time a question regarding COVID-19 was also added to both the subspecialty SLOE and the SLOE for non-academic emergency physicians (Table 1, Appendix 2A, 2D).

All three variants of the SLOE were updated by the CORD SLOE Committee again in 2022 to match the updated 2022 SLOE (Table, Appendix 2B, 2E, 2H). The variants were also added to the eSLOE database at that time. The latest addition to SLOE variants in 2024 was a bar at the top of each PDF with a unique color to signify each variant, making it clear to SLOE readers which type of SLOE variant they were reading (Table, Appendix 2C, 2F, 2I).

RESEARCH

Highlights of SLOE research, drawn from author experience and a PubMed search for “standardized letter of evaluation,” include a broad scope of topics. Past research has highlighted the SLOE’s value as one of the most heavily weighted aspects of an applicant’s file.9,13,16 When compared to narrative letters of recommendation, the EM SLOE was interpreted faster by recruitment committees and had higher interrater reliability.17

Research investigating the process of how SLOEs are written has noted an increasing proportion of SLOEs authored by groups compared to those authored by individuals.18 Program directors in EM have cited increased trust of group SLOEs compared to those authored by individuals, despite limitations noted in past analysis of group SLOE authorship processes.8,9,19 This may be due to slight but statistically significantly higher ratings seen in individual SLOEs compared to group SLOEs, which some may interpret as grade inflation in individual SLOEs. It is worth noting, however, that these score differences are smaller and even reversed when comparing only individual SLOEs written by clerkship directors to group SLOEs, suggesting that clerkship directors authoring individual SLOEs exhibit little to no grade inflation compared to group SLOE authors.18 Data presented at the 2025 CORD Academic Assembly has also linked the quantity of SLOEs authored per year with rating trends, noting that lower-volume SLOE authors gave higher mean ratings compared to high-volume authors for both individual and group SLOEs.20

Trends in ratings by writer experience and home vs away rotations have also been explored, noting higher ratings from less experienced writers and on home rotations.21,22 While it is encouraging that high-volume author and clerkship director ratings are similar to group SLOE ratings on average, optimizing standardization of ratings across all author types remains a potential growth area for the SLOE. Form updates and consistent messaging and training efforts through CORD have been shown to decrease evidence of rating leniency,11 as has defaulting score selections to the midpoint of the range and creating a pop-up notification for when score extremes are selected23; however, recent evidence suggests persistence of variable rating practices across institutions that warrants continued efforts in this area.24

The competitiveness of applicants based on SLOE information has also been explored through the lenses of the simultaneous goals of 1) optimizing match outcomes for applicants and 2) providing programs with stratifying performance information. Analysis of match outcomes for applicants with lower ratings in one study shows increased risk of not matching, but lower ratings did not preclude a successful match.25 Another study noted that adherence to rating standards did not seem likely to increase the risk of applicants failing to match in EM.26 Both of these studies support the notion that whole rating scales can and should be used, although with consistency and transparency to decrease the risk that lower rankings are seen as outlier red flags, which has been described in a qualitative study investigating how SLOEs are interpreted.19

Multiple recent studies demonstrate a high degree of faculty consensus regarding the level of competitiveness of an applicant based on the SLOE.13,27,28 These studies also show promise for algorithms to predict consensus levels of competitiveness. These models outperformed artificial intelligence software when comparing their ability to predict faculty consensus rankings of competitiveness.29 How these algorithms can be operationalized to improve the application process is an ongoing area of discussion, but this could involve applicant-facing applications such as broad competitiveness feedback to tailor application quantity and breadth, or program-facing applications such as competitiveness estimations to which faculty ratings could be compared to assess for potential bias or to cut down on time needed for reviews.

There have been limited investigations into the association of SLOE ratings with future performance.30–32 Published studies face challenges of small sample sizes and, in two studies, the use of unvalidated outcome measures. In the study assessing the association between SLOE and Accreditation Council for Graduate Medical Education (ACGME) Milestones ratings, only one year of Milestones data was used, which limits the scope of these results.31 National data presented at the 2025 ACGME conference, however, shows a clear association between algorithm-derived SLOE competitiveness and multiple measures of residency Milestones performance, including mean first and last Milestone ratings by competency and the binary outcome of residency completion.33 Future SLOE research should continue to prioritize studies linking SLOE ratings to future performance.

While many strengths of the EM standardized evaluation have been discovered, areas for improvement have also been identified. Literature suggests that both sex-based and racial bias are demonstrated in certain components of the eSLOE.34–36 There is also evidence that institutional rating patterns and adherence to written standards vary widely, which has raised long-standing concerns about grade inflation and its impact on the ability to stratify applicant performance.8,11,24,26,37,38 Additionally, a review of validity evidence for the 2016 SLOE highlights areas of improvement to consider, although more recent research has addressed some of these concerns.39

NEXT STEPS

Emergency medicine has led the field in standardized letters for the residency application process for the past 30 years. Looking forward to how EM can lead in the next 30 years, several areas stand out: mitigating the influence of bias on standardized letter of evaluation assessments; continuing to adapt the SLOE instructions, questions, data points, and form to improve response processes and data quality (including efforts to curb, or at least track and facilitate adjustment for, grade inflation); and further bolstering the validity evidence for the SLOE through research, including measuring the association of SLOE ratings with future performance.

Address for Correspondence: Eric F. Shappell, MD, MHPE, Harvard Medical School/Massachusetts General Hospital, Department of Emergency Medicine, 125 Nashua St. Room 2426, Boston, MA 02114. Email: eshappell@mgh.harvard.edu.

Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. No author has professional or financial relationships with any companies that are relevant to this study. There are no conflicts of interest or sources of funding to declare.

Copyright: © 2025 Hegarty et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/ licenses/by/4.0/

REFERENCES

1. Keim SM, Rein JA, Chisholm C, et al. A standardized letter of recommendation for residency application. Acad Emerg Med. 1999;6(11):1141-6.

2. Garmel GM, Grover CA, Quinn A, et al. Letters of recommendation. J Emerg Med. 2019;57(3):405-10.

3. Alweis R, Collichio F, Milne CK, et al. Guidelines for a standardized fellowship letter of recommendation. Am J Med. 2017;130(5):606-11.

4. Inclan PM, Cooperstein AA, Powers A, et al. When (almost) everyone is above average: a critical analysis of American Orthopaedic Association Committee of Residency Directors standardized letters of recommendation. JBJS Open Access. 2020;5(3):e20.00013.

5. Tavarez MM, Baghdassarian A, Bailey J, et al. A call to action for standardizing letters of recommendation. J Grad Med Educ. 2022;14(6):642-6.

6. Reghunathan M, Mehta I, Gosman AA. Improving the standardized letter of recommendation in the plastic surgery resident selection process. J Surg Educ. 2021;78(3):801-12.

7. Alweis R, Angus S, Barone M, et al. The Coalition for Physician Accountability’s Undergraduate Medical Education-Graduate Medical Education Review Committee (UGRC): recommendations for comprehensive improvement of the UME-GME transition. Coalition for Physician Accountability; 2021:86.

8. Love JN, Deiorio NM, Ronan-Bentle S, et al. Characterization of the Council of Emergency Medicine Residency Directors’ standardized letter of recommendation in 2011-2012. Acad Emerg Med. 2013;20(9):926-32.

9. Love JN, Smith J, Weizberg M, et al. Council of Emergency Medicine Residency Directors’ standardized letter of recommendation: the program director’s perspective. Acad Emerg Med. 2014;21(6):680-7.

10. Hegarty CB, Lane DR, Love JN, et al. Council of Emergency Medicine Residency Directors standardized letter of recommendation writers’ questionnaire. J Grad Med Educ. 2014;6(2):301-6.

11. Jackson JS, Bond M, Love JN, et al. Emergency Medicine Standardized Letter of Evaluation (SLOE): findings from the new electronic SLOE format. J Grad Med Educ. 2019;11(2):182-6.

12. Bord S, Dubosh N, Hegarty C, et al. SLOE: a step in the right direction. AEM Educ Train. 2023;7(3):e10881.

13. Schrepel C, Sehdev M, Dubosh NM, et al. Decoding competitiveness: exploring how emergency medicine faculty interpret standardized letters of evaluation. AEM Educ Train. 2024;8(4):e11019.

14. Jung J, Franzen D, Lawson L, et al. The National Clinical Assessment Tool for Medical Students in the Emergency Department (NCAT-EM). West J Emerg Med. 2018;19(1):66-74.

15. O’Dowd E, Lydon S, O’Connor P, et al. A systematic review of 7 years of research on entrustable professional activities in graduate medical education, 2011-2018. Med Educ. 2019;53(3):234-49.

16. Negaard M, Assimacopoulos E, Harland K, et al. Emergency medicine residency selection criteria: an update and comparison. AEM Educ Train. 2018;2(2):146-53.

17. Girzadas DV, Harwood RC, Dearie J, et al. A comparison of standardized and narrative letters of recommendation. Acad Emerg Med. 1998;5(11):1101-4.

18. Sehdev M, Egan DJ, Bord S, et al. Prevalence and characteristics of group standardized letters of evaluation in emergency medicine: a cross-sectional observational study. AEM Educ Train. 2025;9(1):e11057.

19. Love JN, Doty CI, Smith JL, et al. The Emergency Medicine Group Standardized Letter of Evaluation as a workplace-based assessment: the validity is in the detail. West J Emerg Med. 2020;21(3):600-9.

20. Wright K, Sapp R, Commissaris C, Monette, et al. Standard Letter of Evaluation rating associations with individual versus group authorship and volume of letters written. West J Emerg Med. 2025;26:S2-3.

21. Beskind DL, Hiller KM, Stolz U, et al. Does the experience of the writer affect the evaluative components on the standardized letter of recommendation in emergency medicine? J Emerg Med. 2014;46(4):544-50.

22. Boysen-Osborn M, Andrusaitis J, Clark C, et al. A retrospective cohort study of the effect of home institution on emergency medicine standardized letters of evaluation. AEM Educ Train. 2019;3(4):340-6.

23. Pelletier-Bui A, Franzen D, Karl E, et al. Evaluating the impact of electronic interventions on EM Standardized Letter of Evaluation Part B ratings. West J Emerg Med. 2025;26:S65-6.

24. Shappell E, Hegarty C, Bord S, et al. Hawks and doves in standardized letters of evaluation: 6 years of rating distributions and trends in emergency medicine. J Grad Med Educ. 2024;16(3):328-32.

25. Hansroth JA, Davis KH, Quedado KD, et al. Lower-third SLOE rankings impede, but do not prevent, a match in emergency medicine residency training. J Med Educ Curric Dev. 2020;7:2382120520980487.

26. Pelletier-Bui A, Van Meter M, Pasirstein M, et al. Relationship between institutional standardized letter of evaluation global assessment ranking practices, interviewing practices, and medical student outcomes. AEM Educ Train. 2018;2(2):73-6.

27. Sehdev M, Schnapp B, Dubosh NM, et al. Measuring and predicting faculty consensus rankings of standardized letters of evaluation. J Grad Med Educ. 2024;16(1):51-8.

28. Schnapp B, Sehdev M, Schrepel C, et al. Faculty consensus on competitiveness for the new competency-based emergency medicine standardized letter of evaluation. AEM Educ Train. 2024;8(5):e11024.

29. Schnapp B, Sehdev M, Schrepel C, et al. ChatG-PD? Comparing large language model artificial intelligence and faculty rankings of the competitiveness of standardized letters of evaluation. AEM Educ Train. 2024;8(6):e11052.

30. Hayden SR, Hayden M, Gamst A. What characteristics of applicants to emergency medicine residency programs predict future success as an emergency medicine resident? Acad Emerg Med. 2005;12(3):206-10.

31. Burkhardt JC, Parekh KP, Gallahue FE, et al. A critical disconnect: residency selection factors lack correlation with intern performance. J Grad Med Educ. 2020;12(6):696-704.

32. Bhat R, Takenaka K, Levine B, et al. Predictors of a top performer during emergency medicine residency. J Emerg Med. 2015;49(4):505-12.

33. Shappell E. Standardized letter of evaluation associations with ACGME Milestones. Accreditation Council for Graduate Medical Education Annual Educational Conference, Nashville, TN, February 2025. Oral presentation.

34. Miller DT, McCarthy DM, Fant AL, et al. The Standardized Letter of Evaluation narrative: differences in language use by gender. West J Emerg Med. 2019;20(6):948-56.

35. Kukulski P, Schwartz A, Hirshfield LE, et al. Racial bias on the Emergency Medicine Standardized Letter of Evaluation. J Grad Med Educ. 2022;14(5):542-8.

36. Mannix A, Monteiro S, Miller D, et al. Gender differences in emergency medicine standardized letters of evaluation. AEM Educ Train. 2022;6(2):e10740.

37. Grall KH, Hiller KM, Stoneking LR. Analysis of the evaluative components on the Standard Letter of Recommendation (SLOR) in emergency medicine. West J Emerg Med. 2014;15(4):419-23.

38. Wilson D, Laoteppitaks C, Chandra S. A comparison of standardized letters of evaluation for emergency medicine residency applicants. West J Emerg Med. 2020;22(1):20-5.

39. Kukulski P, Ahn J. Validity evidence for the Emergency Medicine Standardized Letter of Evaluation. J Grad Med Educ. 2021;13(4):490-9.
