
Diagnosing Ideological Medicine

In the fall of 2022, many commentators were exercised by a new and highly progressive oath taken by the members of the class of 2025 at the University of Minnesota School of Medicine. One organization characterized it as making students who wish to become physicians “read verbatim from ideological pledges,” branding it a “new low.” Another, noting that the oath had been penned by a small group of students on behalf of a much larger class, decried the finished product as just another example of “activists” speaking with “the loudest megaphone.” One critic even speculated that if such trends continued, the school might soon do away with the white coat, the traditional garb of physicians, simply because it is white.

Among the features of the oath that came in for especially harsh criticism were these:

  • The oath noted that the institution is located on “Dakota land” and that “many indigenous people from throughout the state, including Dakota and Ojibwe (ooj-jib-way), call the Twin Cities home,” while also admitting that “this acknowledgment is not enough.”
  • Those who recited the oath committed to “uprooting the legacy and perpetuation of structural violence deeply embedded within the healthcare system.”
  • They recognized “inequities built by past and present traumas rooted in white supremacy, colonialism, the gender binary, ableism, and all forms of oppression.”
  • They committed to “promoting a culture of anti-racism, listening, and amplifying voices for positive change.”
  • They pledged “to honor all indigenous ways of healing that have been historically marginalized by Western medicine.”

Of course, the inclusion of such elements in a professional oath does not by itself generate a diagnosis but merely represents a symptom of broader cultural phenomena.

Medical Progressivism and Its Critics

These elements of the oath proved ripe for criticism on many grounds. First, there was the matter of the Dakota people. In fact, the Dakota had not been the only prior inhabitants of the land now partitioned into the Twin Cities, and in the eighteenth century they and the Anishinaabe had been engaged in vicious conflict over control of Minnesota’s wild rice beds, their principal source of food. The medical students who drafted the oath would have benefited from a deeper grasp of the heritage they sought to acknowledge.

Moreover, in saying that their acknowledgment was “not enough,” they raised the question of what would count as sufficient. Asked one commentator, “What else will they do for the Dakota people? Will they give the land back, or compensate the original ‘owners?’” Should they, perhaps, offer free or reduced-cost care to the descendants of indigenous peoples, or strive extra diligently to ensure that they receive the very best care?

As to the “uprooting” of the “legacy and perpetuation of structural violence deeply embedded within the healthcare system,” commentators attacked the notion of “structural violence.” Coined in 1969 by the Norwegian sociologist Johan Galtung, the term commonly refers to institutionalized ageism, racism, sexism, and speciesism, among other forms of “social injustice.” As one might expect, the fact that such biases are seen as “structural” has led to calls for structural interventions.

Ironically, however, two of the countries in which such structural interventions have been most prominently attempted, Haiti and Rwanda, remain among the poorest and least healthy nations on earth. Merely prohibiting the commodification of health needs and establishing a social “safety net” does not necessarily translate into better health or a better life. This raises a question: what structural interventions are the first-year medical students at the University of Minnesota making, or planning to make, to overcome such structural violence at their own institution, and how are the medical school and university responding?

Commentators also seized on the students’ determination to ferret out “inequities built by past and present traumas rooted in white supremacy, colonialism, the gender binary, ableism, and all forms of oppression.” This immediately raises the thorny problem of establishing responsibility for these traumas. For example, is the responsibility for the enslavement of blacks in the Americas shared only by those actively involved in the slave trade or who held slaves? Or does it extend to their descendants over a certain number of generations? Or is it shared by all white people, whether they or their ancestors happen to have been personally associated with slavery or not?

There are other questions. Should apologies and perhaps reparations for such injustices be directed only at those who suffered under slavery, at their descendants over a certain number of generations, or at all people who self-identify as black, regardless of whether their ancestors were held as slaves? Likewise, who is responsible for perpetuating the gender binary, and what should be done about it? Does the use of such words as boy, girl, man, woman, male, and female establish complicity, and should medical students avoid all such binary terms? Finally, do medical research and the treatment of disabilities—or perhaps different abilities—such as vision and hearing loss, restricted mobility, and dementia represent and perpetuate “ableism”?

Students also committed to “promoting a culture of anti-racism, listening, and amplifying voices for positive change.” What, commentators asked, would a culture of anti-racism in medicine look like? For example, what are students to make of race in the first place? Is it a biological category or a social construct? If the latter, perhaps the best way of reducing racism would be not to make anti-racism a pillar of medical education and practice but to take race out of medical discourse entirely. In many cases, it obscures more than it reveals.


For example, I know of two medical school applicants, one who identified as black and the other as white. The black student grew up in affluence because both of her parents were physicians; she attended elite educational institutions her whole life. The white student grew up in poverty, with unemployed, self-described “hippie” parents who lived in a trailer in the woods; she was the first person in her family to go to college and paid her way through with a combination of loans and self-employment. Because neither applicant elected to describe her family circumstances in her application, the medical schools that reviewed them saw only a member of a historically disadvantaged race and a member of a historically privileged one, and they gave preference to the black student.

Some have argued that taking race into account is justified, in part because of studies suggesting that members of minority groups are better cared for and achieve better outcomes when under the care of health professionals who belong to the same groups. Hence, the argument runs, we need more black males, who are markedly underrepresented in medicine, to pursue medical careers so they can care for patients like themselves. Yet those who make such arguments seem to be suggesting that we segregate patients and health professionals by race, ethnicity, religion, and so forth, and who really wants to invoke segregation as a policy?

The issue is not as simple as meritocracy versus affirmative action, and any suggestion that persons be treated as members of categories rather than as distinct individuals should give anyone with any sense of history serious pause. Even diseases such as sickle cell anemia are not so confined to specific racial and ethnic groups as once supposed. When people begin thinking that they can discern what they need to know to make a decision about a person from a photograph or a box checked on a form, they have positioned themselves in ethically perilous territory whose denizens included some of the most notorious figures of the twentieth century.

Perhaps the deepest problem of all is the assignment of racial categories. Who counts as white, who as black, who as Hispanic, and who as an Asian/Pacific Islander? Should premedical students, medical students, and physicians be subjected to genetic testing, and what threshold level should be established to qualify as a member of a particular race? For example, could one qualify as black with 50%, 25%, or 12.5% of the genes commonly found among people who self-identify as black? Would such genes need to be of African origin, or would genes common among dark-skinned people from other parts of the world count as well? If race is not defined genetically, should it be based on documented family trees, and if so, how would anyone establish the race of their parents, grandparents, great-grandparents, and so on?

And if race cannot be established by either of these means, what alternative is there? Would self-identification be appropriate, and if so, how would medical school officials or health professionals handle individuals who appear to be self-identifying as members of a race to which they do not appear to belong? Perhaps Martin Luther King, Jr. was on to something when he longed for a world in which his children would be judged not by the color of their skin but by the content of their character.

Finally, students pledged “to honor all indigenous ways of healing that have been historically marginalized by Western medicine.” Critics asked how this would play out in practice. Would medical students and physicians caring for patients of Native American descent, for example, recognize taboo transgression, improper animal contact, inappropriate ceremonies, and contact with malignant entities as potential causes of their patients’ illnesses? When puzzled by a diagnosis or a lack of response to therapy, would they refer patients to medicine men or hand tremblers? Would they themselves resort to ceremonies, herbs, and sand paintings as means of treating patients?

To be sure, practitioners of so-called Western medicine have adopted many practices not well supported by evidence, and no doubt many currently accepted diagnoses and therapies will be supplanted in the future, but there is a difference between saying that Western medicine is a work in progress and saying that it supplanted indigenous ways of healing through a power imbalance. There is strong evidence that microscopes, CT scanners, vaccines, and antibiotics really work, while such evidence for many indigenous practices is simply lacking.

In Praise of Idealism

Despite such objections to the student-crafted oath, however, more is called for than the mere venting of spleens at the students’ naïveté. I believe the students were engaged in a fundamentally laudable activity, sketching out aspirations for themselves and their colleagues that extended beyond merely passing their examinations, obtaining the MD degree, avoiding malfeasance and malpractice litigation, and enjoying the fruits of a secure and well-paying career.

Instead, the students were looking up from their books and computer screens long enough to consider the larger contributions they hoped to make as members of a learned and highly respected profession, gifted with first-rate minds and rich educations, and capable of making a real difference in their communities and society at large. They recognized that members of professions such as law, clergy, teaching, and medicine bear special responsibilities as bellwethers and guardians of standards of goodness and service, and they wanted to establish for themselves some sense of where such a profession’s aspirations should lie. Although their list of such aspirations was more than a little bit muddled, they were attempting to do good.

Such aspirations have deep roots in the profession of medicine, arguably best exemplified by the nineteenth-century German physician Rudolf Virchow. Virchow is a towering figure in the history of medicine, perhaps the greatest pathologist who ever lived, the author of some 2,000 scientific papers, the man who described and named such pathologic processes as thrombosis, embolism, and leukemia, and known in his own day as “the Pope of Medicine.”

The young Virchow was also known to harbor highly progressive political ideas, and the Prussian government attempted to sideline him by sending him to study a typhus epidemic in Upper Silesia. The strategy backfired. Virchow soon became convinced that the roots of the epidemic lay above all in poverty, and he emerged as a notable proponent of what today might be called social or political activism in medicine. Returning home to conditions of profound political instability, Virchow founded a journal, The Medical Reform, which agitated for political reorganization. He introduced to medicine such now-venerable slogans as “Medicine is a social science” and “The physician is the natural advocate of the poor.”

Virchow argued that the prevention of disease and promotion of health required immense social changes. Calling on physicians to pay as much attention to social conditions as biology, he wrote:

Medicine is a social science and politics is nothing else but medicine on a large scale. Medicine as a social science, as the science of human beings, has the obligation to point out problems and to attempt their theoretical solution; the politician, the practical anthropologist, must find the means for their actual solution.

Diagnosing the diseases of individual patients and treating them with drugs was not enough. Physicians must shift their focus from the individual patient to the social circumstances of the entire population.

Instead of waiting for diseases to develop and then attempting to cure them, Virchow held, physicians should be actively engaged in their prevention. “The logical answer to the question as to how conditions similar to those that unfolded before our eyes in Upper Silesia can be prevented in the future is,” he wrote, “therefore, very easy and very simple: education, with its daughters, liberty and prosperity.” His response to the problems in Upper Silesia embodied this view, including education in Polish, self-government, a more progressive system of taxation, and improved agricultural techniques and transportation. He wrote, “I am no longer a partial man but a whole one, and my medical creed merges with my political and social ones.”

Virchow was no hypocrite, and he attempted to carry out in action what he advocated in words. He became a leader of the German Progress Party and served for 12 years as a member of the national legislature. He was known nationally and internationally as a leader in the promotion of clean drinking water and the installation of improved sewer systems.

Reflecting on the significance of Virchow’s death in 1902 from complications of a femur fracture, the medical historian Erwin Ackerknecht said that Germany had lost four great men in one: its leading pathologist, its leading anthropologist, its leading sanitarian, and its leading liberal. The great physician William Osler compared his American colleagues to Virchow and found them wanting, decrying the fact that “In [the US], doctors are, as a rule, bad citizens, taking little or no interest in civic, state, or national politics,” while by contrast, Virchow supported “all reasonable measures for the relief of the poor.”

Virchow’s successors in contemporary medicine often refer to the “social determinants of health.” Consider poverty: men at age 25 living below the federal poverty level can expect to live an additional 46 years, while men living above four times the poverty level can expect an additional 54; the comparable figures for women are 52 and 58 years. Poverty also affects health indirectly. There is, for example, a strong positive correlation between economic status and educational attainment, and 57% of men and women without a high school diploma report fair or poor health, compared with only 9% of college graduates.


More broadly speaking, life expectancy for people without a high school degree is about a decade shorter than for those who have completed college. In the Indianapolis metropolitan area, where I live and work, the poorest and least-educated zip codes and the richest and best-educated ones are just 17 minutes apart by car, yet their life expectancies differ by 17 years. One way to improve the health of such cities might be to build new, state-of-the-art hospitals, but a second approach, one that Virchow would no doubt endorse and that might prove more effective, would be to secure stable, well-paying jobs for those who lack them and to improve the quality of schools and educational programs.

Promoting the Contest of Ideas

The real problem in contemporary medical education and practice is not that the students, faculty, and administration are idealistic or have been captivated by one or another ideology, but that many are operating from relatively superficial and untested perspectives. And this, in turn, stems from the fact that the quality of intellectual and cultural discourse in many medical schools, medical practices, hospitals, and health systems ranges from poor to almost entirely absent.

To be sure, physicians and physicians-in-training routinely rely on highly sophisticated scientific, technological, and clinical knowledge and skills, but when it comes to inquiry into core questions about the ethical and political dimensions of medicine and their place in the larger scheme of contemporary society, the silence can be deafening.

The roots of this problem run deep and have their origin, at least in part, in the premedical curriculum. Instead of pursuing a broad liberal education that would prepare them to learn and develop as human beings and citizens, many undergraduate students choose courses and majors with a view to what will help them shine in medical school. They commonly focus their studies on disciplines such as biology, chemistry, and their subdisciplines such as biochemistry and molecular biology. Many eye courses in literature, history, philosophy, and religion with suspicion, partly because they seem less directly relevant to medicine. For one thing, in contrast to the natural sciences, such subjects are not covered on the standardized test students must take to get into medical school.

These difficulties are compounded once students enter medical school. From the first day, many feel as though they are drinking from the proverbial fire hose of scientific subjects such as anatomy, physiology, and pathology and clinical medical specialties such as medicine, surgery, and pediatrics. Most students quickly develop the opinion that to perform well, or simply to survive, they need to spend their time studying for course examinations and standardized, high-stakes tests that determine whether they can progress to the next level of training. Many students experience a gradual contraction of intellectual and cultural interests into tunnel vision focused almost entirely on preparation for the next test.

Further compounding the problem is how many students learn to study. In the past, most students relied heavily on textbooks, lecture notes, and journal articles to get through medical school, and each topic was clearly situated within a larger body of knowledge. Today, however, many students devote most of their time and energy to the study of sample examination questions. Many do not attend lectures in person, or even watch the recordings online at 1.5- or 2-times speed. The experience of many medical students has become more solitary, more fragmented, and more exam-focused than ever before. Intellectual and cultural discourse cannot thrive under such circumstances.

At the core of the problem is the multiple-choice exam mentality. Such exams are relatively inexpensive to administer and score, require relatively little faculty and administrator time, and appear at least superficially fair and objective. One problem, of course, is that actually practicing medicine—that is, caring for patients—differs quite substantially from a multiple-choice test, if for no other reason than the fact that no patient arrives with one correct response and four distractors taped to their chest. Furthermore, selecting the “one best response” is a lot different from asking good questions, formulating and testing hypotheses well, and actually caring for a person.

Even deeper is the problem that learners are nurtured from their first days as students all the way through medical school and residency on a steady diet of multiple-choice questions. They cannot be as well-prepared for lively intellectual and cultural discourse as learners accustomed to vigorous class discussions about engaging texts and evaluation by essays and oral examinations. Medicine is, at least in part, an intrinsically philosophical art, but in making contemporary medical education highly efficient, we have also undermined its inherent capacity to cultivate curious, thoughtful, and compassionate physicians, citizens, and human beings.

To counteract this unfortunate “dumbing down” of contemporary medical students and physicians, radical change is indicated. For one, we need to rethink the real purposes of medical education, shifting the balance from mass-producing medical technicians to providing a first-rate liberal education at the graduate level. This would mean, at least in part, stimulating reflection and conversation. For example, a good medical school might hold a monthly school-wide debate where students explore important questions in medical policy, practice, research, and education, developing their capacity to consider multiple sides of a question and marshal convincing evidence and arguments for different points of view.

Another remedy would be to reduce as much as possible medicine’s addiction to multiple-choice testing. Where possible, examinations should be essay-based, so that students must formulate arguments rather than merely select correct responses. Oral examinations would encourage students to think on their feet, become more adept at responding to challenges, and pursue truth rather than merely win arguments. Moreover, students should be encouraged to explore and develop their own distinctive interests and abilities, as opposed to merely performing well on the same examinations everyone else takes. This could take the form of course essays, theses, and a variety of other scholarly and creative projects that might be combined into each student’s learning dossier.

Instead of simply condemning students of the professions for holding opinions that we regard as wrong-headed, naïve, or even the product of brainwashing, we would do better to take what steps we can to ensure that their views are tested and challenged. No matter how good the scientific and technical aspects of medical education are, we are doing more harm than good if we allow medical students’ capacities to engage in serious ethical and political discourse—and to lead as human beings and citizens—to atrophy. Instead of merely dismissing students at the University of Minnesota and elsewhere, we should be inviting them out for coffee.