May 4, 2021
Today, medicine splinters into specialties, subspecialties, and sub-subspecialties. A physician is not just a surgeon, but an orthopaedic surgeon; not just an orthopaedist, but one who focuses on the foot. There are interventional cardiologists, gastroenterologists dedicated to endoscopic retrograde cholangiopancreatography, vascular surgeons who treat only veins, and so on. Medicine was not always so finely divided; this hyper-differentiation largely reflects professional and scientific developments that transpired over the course of the 20th century.
As recently as 1876, Samuel Gross, MD, opined, “It is safe to affirm that there is not a medical man on this continent who devotes himself exclusively to the practice of surgery,” an observation that underscores how every American physician of that era was first and foremost a general practitioner (see Figure 1).1 In the 145 years since, both medicine and surgery have balkanized into hundreds of subfields.2,3 In the process, the American College of Surgeons (ACS) has helped establish and maintain the importance of the specialty of surgery.
While technical specialists, such as cutters for stone, have existed since antiquity, modern specialization began in the 19th century with the ontological conceptualization of disease.
Before individual diagnoses of pneumonia or cancer, patients were simply sick or healthy, an intellectual framework that neither required nor valued specialized knowledge. Only after the medical profession identified, classified, and diagnosed particular diseases, a process that produced an exponential increase in medical knowledge, did physicians focus their attention on individual maladies, develop expertise in specific pathologies, and emerge as specialists.
Two other factors contributed heavily to specialization. First, the rise of cities concentrated people in confined geographic areas, bringing together large numbers of patients with particular diseases. Whereas specialists could not survive economically in small villages, they thrived in larger metropolises. Second, technology expanded specialists’ scope of practice by facilitating diagnosis and treatment. Ophthalmology without the ophthalmoscope could offer little; the stethoscope, along with the electrocardiogram, made cardiology viable.2,4
In the U.S., surgery helped lead the movement toward specialization, and particularly toward its institutionalization. American medicine had long prided itself on general practice: the idea that any physician could go out to the frontier and deliver babies, treat pneumonia, and remove an appendix. By the early 20th century, this practice pattern had become untenable.
In particular, a group of surgeons, led by ACS Founder Franklin H. Martin, MD, FACS, grew increasingly concerned that general practitioners lacked the training and experience to perform operations safely, and that their errors not only hurt individual patients but also discredited the entire profession. Dr. Martin thus established the ACS in 1913 as a way to distinguish, and to advertise, surgical competency (see Figure 2).5 Crucially, the College catalyzed a movement away from self-defined, ad hoc specialism toward a regulated, institutionalized form.
As one of the first national organizations dedicated to this effort, the ACS became a model that other fields emulated. The prestige and standing of specialties grew to the point that the all-encompassing general practitioner effectively disappeared in the decades following World War II. Since then, specialties themselves have divided into sub- and sub-subspecialties, prompting consternation over the apparent “death” of general surgery.6 But with knowledge, technology, and therapeutic possibilities expanding exponentially, a specialized future seems all but certain.