Stephanie Wortel, the director of education for the Global STEM Alliance at the New York Academy of Sciences, recently wrote a letter to The New York Times in which she stressed that “If we want to prepare the next generation of students for a STEM-centered workplace, we need to begin with the scientific method. The development of critical thinking skills will not be possible without exposure to conclusions—even those regarding climate change—derived from scientific inquiry.” This gives voice to a very real concern that the politics dominating Washington are leading us toward tragedy.
There’s a strong need in the United States for analytical thinking to be taught as an antidote for just about every social malady in the news, especially those that could have catastrophic global consequences. And while we may disagree on the economics, logistics, and assumptions that hold up our institutions, most educated people agree that teaching evidence-based scientific inquiry is necessary. Unfortunately, in our zeal to protect the scientific method from self-serving politicians who may see it as unprofitable, we risk going too far and making the sciences the only lens through which we are willing to view human experience and the world.
Not everyone is prescient enough to keep from falling into the trap of “scientism”—the belief that the scientific method is appropriate for all disciplines, including the humanities and the social sciences. “STEM” stands for “Science, Technology, Engineering, and Math,” fields that, when practiced scientistically, dovetail perfectly with the kind of profit-driven pragmatism that coerces impressionable college students into majoring in biology, information technology, or finance instead of theater arts or philosophy.
There’s a solid argument to be made that if the planet is about to die, good costumery and the ability to explain Tractatus Logico-Philosophicus aren’t going to save us. But such absurd utilitarianism isn’t what the new humanities major hears from a sneering entrepreneur relative when he goes home on his first Christmas break. He hears, “Victorian literature? What are you going to do with that?” as if we all must agree that Dickens failed in his ultimate goal to design a more efficient toaster.
Similarly, I’ve no doubt that medical science will someday cure most forms of cancer. Let that day come soon. But I continue to wonder whether curing cancer is of any value if we have no idea what it means to us as human beings to have cancer and then have it cured. It reminds me of an old joke:
Engineering major after college: you want fries with that?
Psychology major after college: why do you want fries with that?
Philosophy major after college: what is the nature of wanting?
A fresh bucket of fries means nothing if you don’t know what it is, why you might want it, or who it is that wants. Human experience is constructed out of different kinds of meaning, different perspectives on what it is to be alive and to bear witness to the world. We need Hamlet just as much as Microsoft Word. We need Daniel Day-Lewis to show us something different than what Elon Musk will show, even if they’re both next to each other on YouTube. Without that broad spectrum of meaning, we’re less alive.
So I completely agree with Stephanie Wortel when she argues, “A comprehensive STEM education teaches students to make evidence-based decisions using data collected in accordance with the scientific method. Such process-driven skills are in high demand by employers in all industries, and we only harm our children—and their future career prospects—by withholding or limiting such instruction.” But I think advocating vehemently for STEM verges on scientism when it overlooks the need for other forms of education, other ways of knowing.
As someone who spent two decades in academia, I think the humanities are in deep trouble in the United States, perhaps even more than the sciences. This is partly due to the misguided fetishization of STEM, which has drawn away funding, talent, and cultural approbation. When academic hiring freezes and the non-renewal of tenure-track lines make everyone afraid; when departments are primarily staffed by desperate adjunct instructors; when, in order to make ends meet, universities accept and graduate hordes of PhDs, flooding the job market with specializations that have no value outside the academy—then arts and humanities degrees become frivolous distractions for those with nothing better to do for a few years.
In my creative writing MFA and English PhD years, in which I was a student and then an instructor at various levels, I saw the same personality types come and go. The most gifted tended to be the writers who saw their work as an experimental internal process—not one that was necessarily analytical or scholarly but highly intuitive and introspective—while pushing themselves to acquire as much narrative craft as possible. Doing that in a few short years is hard in the best of circumstances and is both heroic and thankless in our current economy.
The worst students—the ones who inspired you to marvel at the human capacity to make erroneous life choices—were typically the children of the elite, who were studying creative writing as a clever way to hide out for a while without creating a resumé gap. They all planned to “get serious” eventually, but these are the wages of scientism in our culture. “Serious” comes down to science, industry, and money. Goofing off is the answer to “What are you going to do with that?”
Now the Millennial and post-Millennial trend seems to be more in favor of going to work for an INGO as a form of poverty tourism. You can seem concerned with social justice and actually do as little as possible. Then the time comes for you to return and take up the family business and say you did good before you started doing well. In the meantime, you might start a blog. You’ll certainly do a lot of shopping and have a lot of interesting dinners. You’ll make a lot of new friends. It’ll be tough but, in the end, you’ll feel all grown up.
This is unquestionably a generalization. There are exceptions to the rule. But the rule was starkly visible in the more prestigious creative writing programs of the late 1990s and early 2000s. As someone who spent a lot of time interacting with MFA students from programs across the country for about 12 years, I saw it over and over. One simply didn’t take most of the well-connected students very seriously—even if the invisible Machiavellian web of patronage that kept the faculty published and fed had already ensured certain professors would sing their praises no matter what.
Still, as the old programming saying goes: garbage in, garbage out. At the end of a two-year writing program, if all you’ve done is attend literary readings, go on road trips, and throw parties while skating through your course load with the least effort possible, what do you have? You could have taken that tuition money, gone to Europe, and done what they used to call the Grand Tour. At least the cocktails would’ve been better and the baguettes more plentiful. But your parents were tax attorneys, managed a hedge fund, or owned a chain of appliance stores in lower New Jersey, and the money kept flowing as long as you kept pretending that you were doing something beyond a bachelor’s degree.
It goes without saying that all the hiding out and gaming the system often caused a deeper harm in those privileged enough to be insulated from the hard-nosed reductive forces that allowed them to take their protracted vacation in the first place: they started to think of themselves as aesthetes. Sure, they could write. After all, can’t anybody? But their real gift was their impeccable taste. This is the great existential turn where such a student starts to speak in the voice of a mother lecturing the designer about the new ottoman. It’s also the moment where art no longer means as much as the ability to choose it.
From there, the developmental arc is just about complete. They’re just about ready to go home and assume their rightful place in the one percent. They “did a writing program” the way kids today who’ve done a little entry-level work for Mercy Corps or Oxfam say, “Yeah, I did Senegal.” No, my child, Senegal did you and you don’t even know it. You could’ve accomplished something. Instead, you hung out a lot, got romantically involved with a co-worker, bought some souvenirs, and drank beers with pictures of endangered animals on the label.
I saw this more than once in 25-year-old trust funders, who might’ve created amazing works of literary art if overwhelming privilege and reductive materialistic values (that hold STEM as a primary driver of industry) hadn’t neutered them before they had a chance. And it always made me sad because I’m among that dwindling minority that thinks the humanities in general and MFA creative writing programs in particular teach uniquely valuable things.
After years of leading the writing life in and out of academia, I’ve gotten over feeling bewildered by the extent to which the publishing industry has evolved in this brave new scientistic world. It has made use of privileged students like this, subverting and repurposing MFA programs in a kind of Manhattan provincialism, making the “New Yorker story” code for a kind of literary Xanax, and turning out a cultural elite unable to do much more than choose new interpretations of the same old commercially successful books. But let’s go study engineering. I’m sure the market can be trusted to provide us with the best art, music, film, dance, theater, and literature.
I wonder whether I could entreat Stephanie Wortel to take up the cause of English studies in just one excellent letter to The New York Times. I wonder whether it would cause more exciting fiction to appear in magazines and on the shelves. I wonder whether it would influence Uncle Larry the Laundromat King to stop asking his niece what she expects to do with a degree in Restoration drama. And I wonder if art schools would then be attended by fewer dilettante princelings and more individuals who know the humanities are a bad career choice but need it like they need the air.