
A Guide to Modern Nonfiction and Its Dangers

Gladwell did steal the secret to writing sizzling histories from Foucault.


Like many people drawn to beauty in youth and to power in middle age, I’ve watched my reading habits shift. I used to read almost nothing but novels and poems; now I read books that establish the “rules” for doing things and the “secrets” nobody but the author (and the people he interviewed) knows exist. As a regular contributor to Splice Today, I sometimes review this type of McJournalism when I can skim through it fast enough. Recently, I began to blurrily scan the pages of Kevin Roose’s new book Futureproof: 9 Rules for Humans in the Age of Automation. My attitude, as I began reading, was fairly neutral. The book was looking for readers who like following rules, which is bad, in my opinion. But it came recommended by The New York Times, in a review that promised I wouldn’t feel patronized by Roose’s exciting new guidelines for being alive. So the bad title didn’t faze me.

Then I ran smack into this: “Recently, I was at a party in San Francisco when a man approached me and introduced himself as the founder of a small AI start-up.” Instantly, I knew what was coming: Roose was about to deliver A Meaningful Anecdote. Nowadays, meaningful anecdotes are virtually the only allowable way to begin works of nonfiction; we hardly even blink. They’re the modern equivalent of telling a joke to start a speech at an annual trade convention. But it wasn’t always this way. The sound of someone lighting their fuse with an anecdote used to be startling. When the first such explosion was heard, towards the end of the 1950s, it issued from somewhere deep in the bright, amiable hallways of the University of Paris. (We know it as the Sorbonne.) There, a gifted and difficult student named Michel Foucault decided to write about people like himself: that is, people labeled as insane. With little more than a heap of neglected asylum records, a striking diary from France’s first official “schizophrenic,” and a hunch about the connection between asylums and leprosy, Foucault began work on a treatise that would emerge, three years later, as the definitive history of madness in our time.

Foucault wasn’t mentally ill; he simply identified with the marginalized victims of social norms, especially when those norms were moving targets like “sanity.” I’ve no idea whether the hallways of the Sorbonne are brightly lit. I assume they are. I don’t have a single piece of evidence that Foucault was “difficult,” nor do I know what research materials surrounded him when he started work. Here’s the curious thing: it doesn’t matter. Foucault’s actual text, published in English as Madness and Civilization, is easy to obtain. If you order it, you’ll find that it does start with a lively story about leper colonies that French authorities converted into madhouses. That’s the important bit. Foucault was a dyed-in-the-wool academic, making a serious argument about madness. Most people writing those kinds of books were dull; Foucault was the exception. His way of telling that story came to matter more, to modern intellectuals, than his radical ideas. It was entertaining. It was suspenseful. The introductory anecdote practically shouted, to a dubious reader thinking of buying the book, how much reality was to be found between its modest covers.

“Foucault came along, and just like that, you couldn’t write the same old way anymore. I saw that at once.” Malcolm Gladwell is pulling leaves and bits of old paper out of his wiry and uncontainable hair; he’s also telling me about his life as a student at the University of Toronto. “We’d have maybe one professor assigning this guy to us, and even he was like, ‘I think it’s pretty weird, but I also think it’s genius.’ To everyone else Foucault was obscure. There was this feeling coming from the big lecture classes that his project was somehow provincial, and too European. But we thought it was the only game in town. It sounded real to us and it galvanized us to write.” Gladwell needed the inspiration; in 1984, Toronto resembled nothing so much as a sleepy town overgrown into a city. Local officials were busily touting a new tweed factory that had automated the process of sewing on new buttons; meanwhile, the university languished.

No, it didn’t. I owe you an apology. Gladwell has never met with me, much less given me an interview while simultaneously fixing his hair. He doesn’t talk about Foucault like some stoned cultural theorist trying to describe the impact of the Beatles. Even what I wrote about Toronto in the 1980s is patently untrue. But Gladwell did steal the secret to writing sizzling histories from Foucault. Here’s Foucault’s first sentence, from 1961: “At the end of the Middle Ages, leprosy disappeared from the Western world.” Compare that with Gladwell, breaking his story about “the tipping point” in 2000: “In the mid-1990s, the city of Baltimore was attacked by an epidemic of syphilis.” Of course, the disease has been changed. We’re certainly not in Paris anymore, either; “this is Baltimore, lady!” But the rest is identical; it’s mysterious when illnesses vanish, and mysterious when they attack. You can imagine whole cities crying out, like a person with syphilis, or lying prone in the silent aftermath of leprosy.

Foucault’s ominous, inscrutable chapter title, “Stultifera Navis,” would get its due from Gladwell later. Gladwell’s book Outliers, for instance, includes the marvelously undecodable title “Die Like A Man, Like Your Brother Did!” Meanwhile, for The Tipping Point, Gladwell went a different route. He wanted to create something easy to remember… something bounded, schematic, and complete. He wanted some rules. Chapter 1, “The Three Rules of Epidemics,” is Ground Zero for everything Jordan Peterson and Ray Dalio have put people through since then.

Other titans of nonfiction have grown up alongside Gladwell: Walter Isaacson, for instance, America’s poet laureate of “great men” and geniuses, or James Gleick, the honorary professor of chaos theory and other formerly inexplicable things. We’ve lived through Freakonomics and at least one big popular tome (like Thinking, Fast and Slow) from everyone who’s ever won a Nobel Prize. Introverts have their own Gladwell imitation (Quiet), as do dilettantes (Range). There’s even a Gladwellian book for people whose idea of “cultural criticism” is yelling at their television after work. It’s called The Signal and the Noise, and it’s available for download to your Kindle right now.

What these books share, beyond an annoying debt to The New Yorker, is a set of underlying assumptions about society that should bother us tremendously. I can’t fault Foucault for making his history of madness vividly readable. Once it gets going, it’s a work of original and provocative philosophy. But I can take issue with Gladwell and the deluge that followed him. That entire school of journalism exists to produce the intellectual equivalent of carbon monoxide. You know why carbon monoxide asphyxiates people? Because your blood cells can’t distinguish it from oxygen. It binds to them, blocking the transport of oxygen, even while you seem to be breathing normally. Have you ever been to a party, in San Francisco, where the CEO of a dangerous AI startup started shouting in your ear? Of course not. That doesn’t happen to ordinary people, as Kevin Roose knows perfectly well. Despite his Jimmy Stewart modesty—the anecdote really stresses how trapped and awkward it feels to be sought after like this—his real goal is to impress you with the many things he’s been lucky enough to overhear. That night in San Francisco is just one example. Roose has been partying nonstop for our sakes: “This conversation… was happening privately among elites and engineers… at night, after their public events [in Davos] were over.” (I’m not even getting this from the first chapter; it all comes tumbling out in Roose’s breathless introduction.) I bet one or two drunken engineers were even kind enough to slur at Roose that they were speaking “strictly off-the-record,” just to make sure he would quote them in his book—that daring, unauthorized transcript of his years with the “elites.”

That shameless elitism is one problem; another big error is equating success or talent with morality. When I tried reading Walter Isaacson’s book Steve Jobs, I stalled out, because I couldn’t stand the way Isaacson tiptoed around the cracks in his subject’s inane facade. Take, for example, Jobs’ numerous failed relationships. They failed because Jobs treated his lovers terribly once his infatuation with them faded. It’s not a staggering flaw, and it’s largely unrelated to Jobs’ career as a digital innovator. But it does give the lie to any half-baked notion that Jobs was morally profound. He didn’t invent the iPod or the iMac because he was deep; he invented them for the opposite reason, because he wasn’t. Jobs liked clean, shiny, antiseptic surfaces, and he put a classy, luminescent shell around all the messy tangles of chips and code that scared laypeople away from computers. It’s possible to appreciate such path-breaking work without buying into everything you hear, later, from the dying CEO. Jobs always wanted to be more than what he was: an exterior decorator for certain electronics. Most people want to be more than their occupation, and fail in the attempt. That’s a story worth telling, but it goes places Isaacson fears to tread.

The great deception perpetrated by modern nonfiction is a persistent attempt to make history personal in the telling. Steve Jobs goes to India, and we get a touchscreen. Sleazy profiteers go to Davos, networking about automation, and we get unemployment. The Tipping Point slowly condenses into a narrative about newly coined personality types—Mavens, Salesmen, and Connectors—as though everyone (or, perhaps, everyone who matters) can fit neatly or happily into one of those molds. All the pointless “color” these authors use to frame their “exclusive interviews” is actually supposed to be telling: just look at The Big Short, by Michael Lewis. To Lewis, if a man has a strange stock portfolio, there’s always a story behind it. It’s the story of his iTunes playlist, his years at medical school, and his disconcerting glass eye.

No doubt Lewis is right, in a sense; everybody goes through life trying to make their uniqueness compatible with society’s demands. A select few succeed. But that doesn’t make them self-made men. In the documentary Spellbound, which has a frightening lucidity, the filmmakers spend a large amount of time telling stories. They’re studying children who’ve made it to the final, national round of America’s biggest spelling bee. These children and their families all have some way of explaining what their success means. For one child, spelling tough words correctly is a glimpse of a better life. For another, it’s the crowning achievement of a perfectly regulated existence. Hey, that Harry kid seems to be autistic: is that why he’s good at spelling? What seems, at first, to be a celebration of American diversity gradually starts to corrode the whole project of explaining success. It gets tough to watch each successive family shoehorn its own cherished values into the rote, random task of spelling words. If the child gets the word correct, the myth is proved right. If the child messes up, the family sweeps up behind them, stuck with a salvage job. But you can always find meaningful explanations, for both success and failure, if you never let yourself ask bigger questions about the task itself.

“In this world, there is one awful thing,” one man wearily declares, in a film by Jean Renoir, “and that is that everybody has their reasons.” He appears to be talking about what’s wrong with people, but he’s really saying something is wrong with the world. In an attempt to reassure us, I think, Kevin Roose says “the truth about the AI revolution” is that “it’s just people, deciding what kind of society we want.” But that’s not what he found at the peaks of wealth and influence. There he heard the sound, as Agent Smith would call it, of inevitability:

I hear the “automation is destiny” argument all the time—especially in Silicon Valley, where people tend to talk about technological progress as a speeding train we either have to climb aboard or get run over by—and I get why it’s tempting to believe. 

Successful people don’t want to change the world. They want to see its destiny fulfilled. It’s tempting to believe in them, because the alternative feels like believing in nothing. But let me tell you a story.

In 2001, around the same time Malcolm Gladwell was pondering how to follow up The Tipping Point, I was working at a desk in the Employment Development Department in Sacramento, California. My job as an analyst was to track and document about $80 million in state funding for unemployment insurance claims. My work was collated and merged with similar output from everyone else at my level. There was no possible way to do an especially good job; after a while, I automated most of my deliverables using Microsoft Excel. After that, the whole job took about one day of work per week. I spent the other four days reading surreptitiously.

Once in a while, we’d hear that the department’s unfathomable river of data was about to have a new home: a single, statewide database, run by the Oracle Corporation, that would contain every single information stream we made or used. We’d receive vague memoranda instructing us to use new, database-friendly formatting. This new system, which always seemed to be right around the corner, was all anyone “higher up” could talk about—for months, it seemed like. And then, one day, nothing happened.

I mean, that’s how it felt. Actually, there’d been no progress on anything related to the Oracle database for a very long time. But one day I looked up from Project Gutenberg (I was reading their free copy of Dracula) and realized something had to be wrong. It struck me that the database was never going to debut, and that perhaps Oracle had never intended to make a functional product in the first place. That made me wonder why they’d received such a handsome, unprecedented contract. I knew the deal had come from the governor’s office; such were the perks of living in Sacramento. Frankly, I knew more about Oracle than I wanted to know. I just wasn’t sure how Gray Davis, the governor, might be connected to them. So I started Googling contributors to his most recent campaign. Oracle’s generous donations were there, right at the top of the list; at one point they’d made a single donation of $300,000.

A week passed. I didn’t tell anyone what I’d found, because it scared me a little. Two weeks later, the news was public knowledge; Gray Davis had been making corrupt deals with Oracle, among others, to pay them back for his election. He was voted out of office in 2003, in a special recall election. I’ve had two realities to contend with ever since. First of all, I failed. I had the goods on the governor of California; it was surely my job to blow the whistle. Second, it didn’t matter that I failed. Somebody else saw the same pattern, and they made it public. The truth about Oracle in California wasn’t some back-room revelation, accessible only to well-connected people schmoozing behind a velvet rope. I solved it by putting some water cooler gossip together with the public records I found online. I was, to quote Knives Out, “following arcs like lobbed rocks.”

The man who says that, Benoit Blanc, is talking about what he calls “the inevitability of truth.” Blanc goes on to say, “the complexity and the gray lies not in the truth but what you do with the truth once you have it.” It would be both exciting and maddening if the truth about AI and our machine-tooled future were something remote, available to just a few noble, dissembling reporters who take vodka shots with Elon Musk while on assignment. Then knowing would be everything; it would perhaps even be the difference between surviving and becoming obsolete. I’d be practically on fire to know what Kevin Roose heard at Davos late at night. If I wanted to understand Steve Jobs, I’d turn to Walter Isaacson, who sat down and talked with the man. But when I tried that, and started in on Isaacson’s book, I found out that Jobs was exactly who I thought he was. Is Roose really trying to convince us that factory owners want to replace their workers with robots? Because I just can’t figure out who it is, exactly, that wouldn’t believe him. We’re all in the position once held by a single man: Doctor Faust, to whom the Devil granted knowledge of all things, knowing the complexity and the gray would remain. I’m indebted to Mephistopheles for the advice he gives Faust next. “I wish to share the whole experience of mankind,” Faust declares. “I think you’d better,” says the demon, “go and hire a poet.” But Faust demurs; if poetry must be my refuge, he complains, “what am I then?” So Mephistopheles offers him something else, instead. Rules.
