Implementing Scientific Tools into the Selection Process: It’s About Respecting Our Newest Colleagues

Aimee K. Gardner, PhD, and Brian J. Dunkin, MD, FACS

August 1, 2018

This article is intended for those interested in learning more about how advances in selection science can benefit training programs and applicants. This may be of particular interest to residency and fellowship program directors. The objectives are to:

  1. Describe the various ways in which our current selection system is costing both applicants and programs
  2. Provide resources and examples of how advances in selection science can benefit training programs and applicants
  3. Discuss ideas for next steps in advancing selection efforts in surgery

Less than a month ago, more than 1,200 medical students found out they had successfully matched into a general surgery residency program.1 Aside from basking in the excitement of their future careers, they are also likely feeling relieved to have successfully navigated the gauntlet of The Match.

Residency programs are likely similarly relieved to have filled all of their positions, having invested anywhere from $61,000 to $129,000 in time and resources reviewing applications, conducting interviews, and creating rank lists.6

Unfortunately, if past data are any indication, this “match” won’t last long. More than one in five interns entering a surgery residency program won’t make it to the graduation finish line.7 The reasons for this high attrition rate are likely multifaceted, but program directors report that those who leave join a different surgery program (25 percent), join another surgical discipline (52 percent), or leave medicine entirely (21 percent).8 Additionally, at least a third of those remaining will undergo at least one remediation attempt, most often to enhance “soft skills” such as professionalism or interpersonal skills.9–10

These data are concerning for a number of reasons, but they particularly highlight the opportunities for improvement that exist within our current residency selection process. Fortunately, there is an entire field of personnel psychology dedicated to improving the process of selection. What follows are three areas in which selection science may be able to offer insight and value for the surgical education community:

1. Efficiency

As most of the current selection costs are incurred during the on-site visit, a logical starting point for both parties would be to reduce the number of interviews without compromising selection quality. Selection science offers a number of tools to help in this regard; one of the most powerful is the situational judgment test (SJT). SJTs are written exams that present the candidate with real-life situations they are likely to encounter during residency.11–13 Programs are able to judge how applicant responses match those they would expect from a resident in their program, and applicants get a realistic preview of what will be expected of them during training.

Another powerful selection tool is the structured interview. Unlike the typical “get acquainted” interview conducted for most candidates,14–15 a structured interview asks questions designed to determine whether a candidate has the specific attributes the program has identified as important for success. For structured interviews to work, faculty must be trained to conduct them and to use the accompanying rating forms.

One surgical fellowship recently incorporated online SJTs and structured interviews into its application process and was able to reduce the number of on-site interviews by more than 50 percent while still matching top-ranked candidates.16 Exit surveys indicated that applicants found the application and interview process more organized than that of other programs and believed it gave them better insight into their fit with the program.17 Surveyed faculty felt empowered to make better selection decisions based on objective data.

2. Objectivity & Equity

The foundation of any assessment is fairness and freedom from subjectivity or bias.18 Unfortunately, the current process by which residency candidates are assessed—standardized test scores, letters of recommendation, and personal statements14—may not meet these criteria. The USMLE, most often the primary screening tool for deciding which applicants receive further consideration, is prone to issues of fairness, given research showing that cognitively based examinations produce substantial racial differences in test performance.19 The exam is also not predictive of residency performance beyond future test scores,20 so it likely eliminates candidates who might otherwise perform well in the program. In addition, numerous studies have shown sex bias in letters of recommendation,21–23 with women receiving fewer standout adjectives and ability words than men, even when objective criteria indicate no differences in qualifications. Personal statements can demonstrate bias as well, as they may handicap applicants who lack access to individuals in positions of power or prestige who can provide review or coaching.24 Finally, unstructured interviews, which are often given the most weight in final ranking decisions,25 are subject to a number of biases26 and prone to inappropriate questions related to candidate demographics.27

Evidence-based selection focuses on objectivity and equity. Tools such as personality tests, SJTs, and structured interviews can offer important information about a candidate’s fit within a program without introducing the high levels of bias associated with more traditional screening tools. In fact, preliminary research within surgery suggests that these tools can increase diversity in training programs16 when used in the application screening process.

3. Reduced Attrition

Although the reasons for resident attrition are likely multifaceted, one of the primary reasons for dropout may be lack of fit within the program. By designing selection systems based on past resident successes (and failures), residency programs can home in on competencies that have historically been indicators of resident fit (or misfit). For example, Kelz et al.28 worked with an organizational management expert to review historical reasons for resident dismissals, which included deficiencies in stress management, organizational skills, future aspirations, and prioritization abilities. From this review, they restructured their application and interview process to screen for these specific competencies. The authors reported that this novel selection strategy reduced attrition from an overall five-year rate of 27 percent to 3.2 percent.

Conclusion

Surgery is a profession that cares about people. In addition to caring for patients, it is critical that we use systems that show we care about the newest members of our profession—our applicants—and about those on the education team who are tasked with managing the high volume of applications received. By adopting principles from selection science, we have the opportunity to incorporate fair, efficient, and evidence-based screening tools that allow candidates to learn more about our training programs while we learn more about them.

References

  1. 2017 Main Residency Match Results and Data. National Resident Matching Program. http://www.nrmp.org/wp-content/uploads/2017/06/Main-Match-Results-and-Data-2017.pdf. Accessed on February 5, 2018.
  2. Apply smart in general surgery: New data to consider. Association of American Medical Colleges. https://www.aamc.org/cim/480042/applysmartgs.html. Accessed on February 5, 2018.
  3. Historical Specialty Specific Data on ERAS Applicants and Applications. https://www.aamc.org/services/eras/stats/359278/stats.html. Accessed on February 5, 2018.
  4. Results of the 2017 NRMP Applicant Survey. National Resident Matching Program. http://www.nrmp.org/wp-content/uploads/2017/09/Applicant-Survey-Report-2017.pdf. Accessed on February 5, 2018.
  5. Cost of Applying to Residency Questionnaire Report. Association of American Medical Colleges. https://www.aamc.org/download/430902/data/costofapplyingtoresidency.pdf. Accessed on February 5, 2018.
  6. Gardner AK, Smink DS, Scott BG, Korndorffer JR, Harrington DT, Ritter EM. How much are we spending on resident selection? Podium presentation at the Association of Program Directors in Surgery (APDS) Annual Meeting. Austin, Texas. May 4, 2018.
  7. Yeo H, Abelson J, Mao J, Lewis F, Michelassi F, Bell RH, Sedrakyan A, Sosa J. Who makes it to the end? A novel predictive model for identifying surgical residents at risk of dropping out. Ann Surg 2017; 266: 499-507.
  8. Schwed AC, Lee SL, Salcedo ES, Reeves ME, Inaba K, et al. Association of general surgery resident remediation and program director attitudes with resident attrition. JAMA Surgery 2017; 152: 1134-1140.
  9. Bergen PC, Littlefield JH, O’Keefe GE, et al. Identification of high-risk residents. J Surg Res 2000; 92: 239-244.
  10. Yaghoubian A, Galante J, Kaji A, et al. General surgery resident remediation and attrition. Arch Surg 2012; 147: 829-833.
  11. Gardner AK, Ritter EM, Paige JT, Ahmed RA, Fernandez G, Dunkin BJ. Simulation-based selection of medical trainees: Considerations, challenges, and opportunities. J Am Coll Surgeons 2016; 223: 530-536.
  12. Gardner AK, Dunkin BJ. Evaluation of validity evidence for personality, emotional intelligence, and situational judgment tests to identify successful residents. JAMA Surg 2017; doi:10.1001/jamasurg.2017.5013.
  13. Gardner AK, Grantcharov T, Dunkin BJ. The science of selection: Using best practices from industry to improve success in surgery training. J Surg Educ 2017; doi:10.1016/j.jsurg.2017.07.010.
  14. Makdisi G, Takeuchi T, Rodriguez J, Rucinski J, Wise L. How we select our residents – a survey of selection criteria in general surgery residents. J Surg Educ 2011; 68: 67-72.
  15. Stephenson-Famy A, Houmard BS, Oberoi S, Manyak A, Chiang S, Kim S. Use of the interview in resident candidate selection: A review of the literature. J Grad Med Educ 2015; 7: 539-548.
  16. Gardner AK, Dunkin BJ. Leveling the playing field: The power of selection science to enhance diversity, increase efficiency, and provide meaningful data for selecting surgical trainees. Ann Surg (revise and resubmit).
  17. Gardner AK, Dunkin BJ. Applicant perceptions of new selection systems for surgical training: Selection science doesn’t “scare away the good ones.” Podium presentation at the Association for Surgical Education (ASE) annual meeting. Austin, Texas, May 1, 2018.
  18. American Educational Research Association, American Psychological Association, National Council on Measurement in Education, Joint Committee on Standards for Educational and Psychological Testing (U.S.). Standards for Educational and Psychological Testing. Washington, DC: AERA; 2014.
  19. Outtz JL. The role of cognitive ability tests in employment selection. Human Performance. 2002;15:161–71.
  20. De Virgilio C, Yaghoubian A, Kaji A, Collins JC, Deveney K, et al. Predicting performance on the American Board of Surgery qualifying and certifying examinations. A multi-institutional study. JAMA Surgery 2010; 145: 852-856.
  21. Trix F, Psenka C. Exploring the color of glass: Letters of recommendation for female and male medical faculty. Discourse & Society, 2003.
  22. Madera JM, Hebl MR, Martin RC. Gender and letters of recommendation for academia: Agentic and communal differences. J Appl Psychol 2009.
  23. Schmader T, Whitehead J, Wysocki VH. A linguistic comparison of letters of recommendation for male and female chemistry and biochemistry job applicants. Sex Roles 2007; 57: 509-514.
  24. Wright SR, Bradley PM. Has the UK Clinical Aptitude Test improved medical student selection? Med Educ 2010; 1069-1076.
  25. Results of the 2016 NRMP Program Director Survey. National Resident Matching Program. http://www.nrmp.org/wp-content/uploads/2016/09/NRMP-2016-Program-Director-Survey.pdf. Accessed on February 5, 2018.
  26. Gardner AK, D’Onofrio BC, Dunkin BJ. Can we get faculty interviewers on the same page? An examination of a structured interview course for surgeons. J Surg Educ 2017; S1931-7204.
  27. Hern HG, Trivedi T, Alter HJ, Wills CP. How prevalent are potentially illegal questions during residency interviews? A follow-up study of applicants to all specialties in the national resident matching program. Acad Med 2016; 91: 1546-1553.
  28. Kelz RR, Mullen JL, Kaiser LR, Pray LA, Shea GP. Prevention of surgical resident attrition by a novel selection strategy. Ann Surg 2010; 252: 537-543.

About the Authors

Aimee K. Gardner, PhD, is assistant dean of evaluation and research, department of surgery, School of Allied Health Sciences, Baylor College of Medicine, Houston, TX.

Brian J. Dunkin, MD, FACS, is the John F., Jr. and Carolyn Bookout Chair in Surgical Innovation and Technology, Houston Methodist Hospital, Houston, TX.