If you want an argument in favor of teaching the humanities, I suggest you ask a medical educator.
Across the US, the age-old debate about the value of a liberal arts education has seemingly devolved into mortal combat, leaving the humanities in dire straits on college campuses. The Atlantic recently reported on West Virginia University’s decision to gut its humanities programming, and the New York Times wondered whether the liberal arts would survive the budget cuts sweeping higher education. The burgeoning movement to defund (perhaps even defenestrate) the liberal arts is not only capricious and hasty but also remarkably short-sighted. As a former medical school dean, I know the liberal arts are not only more relevant than ever; they are critical to the future of health and health care in America.
Today’s students, looking to justify the cost of tuition, are choosing college majors based on the likelihood of gainful employment upon graduation. “Fewer than one in 10 college graduates obtained humanities degrees in 2020, down 25 percent since 2012,” the Hechinger Report, an education publication, reported in 2021. In 2023, the New Yorker published a feature titled “The End of the English Major.”