I have some bad news. Your brain is not a quantum computer, and your decisions do not create fissures in the universal timeline.
Last night, I finished Season 1 of Undone, a show on Amazon Prime about a funny, alienated woman who discovers (with the help of her dead father) that she can move about freely in time. The show is ambiguous about her mental health; she may be a shaman with mystical, indigenous ancestors, or she may just be schizophrenic.
Let me take that back. The show pretends to be ambiguous, but it's really as duplicitous as Breaking Bad became while pretending to disapprove of Walter White. There's no chance, at all, that the protagonist is merely schizophrenic; the people who love her just don't understand her. Come to think of it, nobody understood Donnie Darko, either—but he still jumped through a wormhole and saved his entire town. In fact, according to these stories, and the acclaimed video game Life Is Strange, approximately 100,000 people in the United States alone will be misdiagnosed with schizophrenia by the end of this year. The vast majority of them will be forcibly prevented, by a mixture of inpatient care and antipsychotics, from traveling through time to fix various problems in their local areas. I'm sure it won’t surprise you to learn, like Keanu Reeves in The Matrix: Resurrections, that the most effective prescriptions (for hiding otherwise visible, functioning tesseracts) are oblong and blue.
Not everyone who makes a leap through a wormhole is schizophrenic, however. Loki isn't; he's just a megalomaniac. Nor are any of the bureaucrats who hold meetings with each other to fill time in his acclaimed Disney+ show. Loki's friends the Avengers defeat Thanos by making quantum leaps. Kirk and Spock (in the reboot of Star Trek) first join forces while battling the disoriented villains that a stray tesseract keeps spewing out like a dropped garden hose. There's a wormhole in Interstellar; there’s a wormhole in Arrival. Tesseracts are becoming so common, at this point, that only blind luck prevented Jeff Bezos, Richard Branson, and/or Elon Musk from being sucked into a different era as soon as they reached space—an ironic fate, since Branson was originally born in 1890, and Musk is an emissary from a vaguely annoying future. (Bezos, however, is right where he's supposed to be. He's 100 percent our fault.)
It's overwhelming. Time travel has been around for a while, but only more recently—if I can even put it that way—has every single blockbuster franchise (plus a thousand other franchises, like The Tomorrow War, that nobody even noticed) started tripping on this same vial of well-reviewed acid. For every show or movie about time travel, there's another one about "the multiverse," where timelines split and diverge whenever somebody makes a decision, gets hit by a car, or ends their current relationship. (For example, in one timeline, Nick Offerman is hilarious. In a different timeline, explored by the FX show Devs, he's not.) If our current, as-yet unproven theories about time are correct, all these "parallel" universes exist simultaneously, side-by-side, and all of them are real.
Why is this happening? You might explain all these wormholes and loopholes and special military projects as a matter of natural curiosity—physicists have said bizarre, fascinating things about spacetime, and the mainstream has started taking notice. That's the kind of silly, cheerful shrug we get from the screenwriters themselves, and the directors, who tell us such spacetime events are "theoretically possible" or "matters of current speculation." That way, nobody has to ask themselves why we're so insistent that these fantasies are plausible.
Tesseracts are never, ever going to work like freeway tunnels, or expand to let a giant spacecraft through. I don't need to walk you through tons of quantum physics to assert that "time travel," in quantum mechanics, is just a convenient way of picturing the behavior of certain subatomic particles, to the extent we're able to describe their behavior or nature at all. Furthermore, such quantum exceptions to linear time don’t alter the typical behavior of atoms or molecules. But what about our desire to ponder science's strangest frontiers? Even if they’re remote, or just wishful thinking, it’s still worth asking why so many modern stories risk everything for the sake of a few quantum speculations.
Call it a panic button. Sometimes the writers in question really are worried about mental illness. Naturalist Barry Lopez has written movingly about the growing number of people who can’t have a decent life, in the First World, without taking psychoactive drugs. Even then, in most cases, medication only eases things; many people with “successfully managed” disorders have painful symptoms nonetheless. The pills, meanwhile, often become a life sentence—everything from mood stabilizers to antidepressants to stimulants merely compensates for some chronic imbalance. They don’t change the patient’s fundamental neurology, so they never stop being necessary. The quiet desperation people feel, once they’re locked into this kind of regimen, boils over in epic fictions about crazies who save the world.
Even so, the reality is still that there’s no upside, nor any superpowers, set aside just for neurodiverse people. We don’t even feel especially bad for them. For every messiah in hiding, like Dan Stevens in Legion, there’s an asylum full of other patients who aren’t urgently needed by our radically imperiled world. The show brutally ignores most of them; those few it does notice are just there to provide comic relief. In short, we can’t solve the problem of mental illness with fanciful spacetime operas, any more than we solved it back in 1962 by getting R. P. McMurphy involved.
More broadly, the panic we try to escape by "rewriting history" has been sparked by scary predictions about the future. Movies where some assassin—or, more efficiently, just his isotope-rich bullets—travel back in time are fables about climate change, population growth, depleted resources... everything bad that might be lying in wait for us. The science (and increasingly blatant signs) of global climate change has us pretty freaked out. Even so, we haven’t taken action. In most countries, greenhouse-gas emissions have continued to increase; we’ve done more in the past five years to ruin our climate than we did over the twenty years before that.
Greta Thunberg, the young Swedish activist, is currently blazing a trail that many others will surely follow. She became famous by committing herself totally to the single cause of climate change. She organized her entire life around it, starting when she began to boycott school. After she gained a following, she gave some really nice, optimistic speeches about organizing a global movement to save the world. Her star rose even faster after that; unfortunately, however, she was alone. There are cute climate-related stunts, now and then, but there isn’t a mass movement fighting for our climate anywhere on Earth. Thunberg has become a Cassandra despite her best efforts. She’s tried everything. She's tried giving hope, and she’s tried sulking; she's even tried downplaying her own fame. She's tried big tent stuff and targeted messages; nobody rallied. She’s done time on late-night talk shows and she’s given speeches at the United Nations. All for naught.
Thunberg's real enemies aren’t greedy billionaires or insincere politicians. She’s fighting against our very nature. Human beings mostly ignore predictions about the future, which is good, because they’re usually wrong. Basically unheralded events—like the Great Recession, Covid-19, or Donald Trump's victory over Hillary Clinton—have had a larger, more visible impact than anything climate change has done so far. Those of us who believe that our climate is about to be “weaponized” should keep all those recent, nasty surprises in mind. If we did, we might be more patient with “climate deniers”—people who just can’t believe their biggest worry should be the glaciers receding in Greenland. When neuroscientists tell us that it's hard, for evolutionary reasons, for us to make rational decisions about climate change, they often seem profound. But they'd be on much firmer ground if they admitted that the same "rational" thinkers who have made a cause célèbre of climate change have simultaneously failed, on an almost daily basis, to help us dodge a lot of other, very consequential bullets.
Furthermore, neuroscientists are wrong about our brains; at least, they're wrong that we're wired to prioritize short-term gains and desires ahead of everything else. There are plenty of things that compel people to take certain predictions more seriously than any of their immediate circumstances: these include moral values, personal goals, and religious beliefs. When deluded believers gathered around would-be prophet Jim Jones, and voluntarily committed suicide in Guyana, they were doing something that modern science can’t explain. We can scan the brains of religious fanatics, but we can’t determine why, 40 years after Jonestown, nobody’s marching behind Greta Thunberg. It’s easier to generate zeal for something utterly bogus, apparently, than it is to find advocates for things everybody knows.
You can always fall back on neurological jargon about religion "rewiring" the brain, or on psychoanalytic models of people "identifying" with charismatic leaders. But facts should be able to rewire our brains, too, if they’re shocking enough. And aren’t there people who identify with Thunberg, Sweden’s youngest activist with Asperger’s? (For instance, everyone who cried at the end of the speech where Thunberg “addressed world leaders” and scolded them without mercy.) So the problem isn’t as simple as it seems; it’s not really about the difference between reason and belief. Thunberg isn’t too “rational” to succeed. She’s too cold; she’s a wet blanket, despite her faith in grassroots change.
Thunberg never explains what made it possible for her to suddenly stop going to school. It’s a point of principle with her. Whenever she's grilled about herself, she demurs, telling people that it’s all objective science: “I’m not the point here, I’m just the messenger.” But she’s precisely the point. She’s the Girl Who Cared! If we understood how her climate protests were inspired by living “on the spectrum,” that might help. If we understood her family better, or her childhood community, that might make a difference, too. But Thunberg feels compelled to disavow precisely those aspects of her life that could help us empathize with her radical course of action. Without some kind of origin story, there’s nothing to build a movement with. We can’t all skip school, as I argued previously on SpliceToday, especially when we can’t figure out how Thunberg herself worked up the courage for that first climate strike.
There’s much to be said, one of these days, about what Thunberg’s failure reveals about our need for stories. Facts stick to good stories like burrs. Any truth that doesn’t find a narrative big enough to freight it, on the other hand, “dies in darkness.” Even outright lies—if they’re part of a fantastic, compelling story—make a larger mark on history than truths left stranded without one. It’s here, at the intersection of the personal and the political, that most people begin frantically groping around, in the dark of their local movie theater, for some sign of a quantum multiverse. Specifically, they start to imagine a series of parallel universes where every decision they make has an equal and opposite counterpart. I mean… what if Greta Thunberg hadn’t boycotted school? Would the world be different, and a little bit worse? Because if climate awareness really did begin with a middle-schooler making a weird decision in Scandinavia, then we’re in a fix. Suddenly our “inspirational turning point” begins to look random. It’s like those stupid, factually wrong, high-flown sentences about Rosa Parks in history primers: “And if Ms. Parks had not been tired that day, or the bus driver hadn’t asked her to move, Alabama might never have witnessed the simple, eloquent protest that would soon inspire others like it… and that would eventually shake our nation to its core.”
It seems like we'll discover, any day now, that our brains are really quantum computers. This will solve the “Greta Thunberg” problem very nicely, since Greta will decide to boycott school in at least a few of our timelines. I’m sure you’ve heard how our amazing brains continuously force spacetime to “branch,” trillions of times daily, like some sad protozoan dividing at gunpoint. The reasoning goes like this: our “ideas” are patterns of electricity blazing pathways through our brains. Those electrons are quantum particles; therefore, when we think, we must be the victims of quantum fluctuations. (“The quanta made me do it!”) But we’re also the heroes of quantum multiplicity, when we act, because we’re splitting the universe in two. In one universe, minimum, we do the right thing. So that’s good. Of course, other versions of us will have to make the inferior choices we almost picked, then rejected. But that’s hardly our problem, right?
This is all lots of fun, and makes us seem like the (accidental) masters of the universe. Unfortunately, it’s pure fantasy. Quantum effects we know how to predict cancel each other out whenever something happens on the scale of human life. That includes any "current" of streaming electrons and any kind of brain activity. There's no "quantum fluctuation" that's ever going to affect your decisions… and remember, in order for us to fork the multiverse, we need to make something quantum happen. Otherwise we’re back to one timeline: this ordinary, discouraging, linear world that we know.
“Fine,” you respond, “let's make this even weirder.” Nobody really knows what consciousness "is," right? Maybe our brains actually reproduce the quantum structure of the universe on a larger scale. Maybe that's why nobody can ever decide what flavor of ice cream to order: because their neurons are in a "superstate" where more than one truth can exist, with multiple possible universes hanging suspended in the balance.
When the cognitive scientist Daniel Dennett wrote Consciousness Explained in 1991, he already had a pretty good explanation for the sensations we feel when we're facing tough decisions. According to Dennett, our brains do come up with multiple, simultaneous, incompatible responses to the sensory inputs we receive. Dennett calls this the "multiple drafts hypothesis," and he explains it well:
These fragmentary drafts of “narrative” play short-lived roles in the modulation of current activity...augmented, and sometimes even overwhelmed in importance, by microhabits of thought that are developed in the individual, partly idiosyncratic results of self-exploration and partly the predesigned gifts of culture. Thousands of memes, mostly borne by language, but also by wordless “images” and other data structures, take up residence in an individual brain, shaping its tendencies and thereby turning it into a mind.
Dennett's theory has fared well in experiments; in the three decades since it first appeared, it has more or less become accepted wisdom. On its surface, it seems to confirm the mystical idea that human beings turn dilemmas into quantum lotteries. This is largely because Dennett never adequately explains, in that first book, or any other, how our brains settle on a single version of reality after first auditioning a swarm of different possibilities. Dennett would almost rather ignore that selection process entirely; he prefers melodramatic statements about the Multiple Drafts Model like this one: "The result...is bedlam." I always picture him experiencing the world like it's one long, incomprehensible music video made in 1991. Who knows what our brains will serve up next! Multiple Drafts, man! Freaky stuff!
The answer to Dennett is Daniel Kahneman. Thinking, Fast and Slow synthesized decades of research into the foibles of human decision-making. It proved that people make bad decisions, which wasn’t a huge surprise to any readers over the age of 14. But it also explained why so many of the decisions we make cause us pain and regret later on. And it explained how we resolve all competing narratives into a single, perceptible train of thought, and commit ourselves to a single course of action.
Our behavior follows, in general, the principle of cognitive ease. In order to judge new experiences, we compare them to what we already know. In order to explain the changing world around us, we abridge it, making some of it look cozy and familiar, and then filtering out the rest. Most importantly, everything we do has a logic to it. Sometimes that logic is the result of sustained, deliberative attention; in those instances, we consciously "think through" the matter at hand. Other times, we apply “heuristics”: rules of thumb based on a loosely-sorted mix of relevant memories and our present environment. We do this instantaneously, mostly without making any significant effort, or second-guessing our perceptions, even when we should.
This doesn’t mean, however, that we end up taking action at random. We’re still the architects of our "microhabits of thought"—including every time we choose not to worry about them, acting "on impulse." Even when we’re oblivious to it, our actions are based on a single, triumphant "draft" of reality, built from theories we've developed either through conscious effort, or else in the vacuum of its absence. Either way, what freedom we have over the choices we make comes from the most difficult, most highly evolved thinking we've done. It's not built-in at the ground level, and it's mostly a rational process, especially in people who happen to value and practice being rational. This is good news for Greta Thunberg, who sat and pondered what a very young person might do about our climate. She came to the only possible conclusion she could have reached. We didn’t get lucky that one day; Thunberg wasn’t acting on a whim. The boycott was a fait accompli. If she’d somehow come down with chicken pox, that particular week, the boycott would’ve merely been delayed.
The point is simple enough: human thinking is, if you like, "deterministic." It’s either deliberative, on a conscious level, or else it’s axiomatic, following a "rule of thumb" we may only dimly recognize. Either way, neural activity is never random. It always presupposes some notion of benefit, some idea of what's in our best interest, whether we apply those criteria consciously or not.
Priding ourselves on our rationality, and trying to follow the dictates of conscience, doesn't begin to preserve us from error. As I wrote in 2014, on the subject of free will:
Over and over, we find ourselves with incomplete information, making choices that may not lead to their intended outcomes. It is a vast enterprise of trial-and-error, and since nobody has that much more information than anyone else, it is essential to let individuals try things out for themselves, including stupid things.
This means that an even better way of explaining human consciousness—better, I mean, than consenting to call it deterministic—is to borrow from a different scientific lexicon, in the same neighborhood as quanta. We're not quantum machines. We are, instead, small miracles of relativity. The unbridgeable gap between our perceptions and those of the people around us leaves us unconditionally free. We ought to be grateful for the stubborn intransigence of superstitious, irrational, perverse human beings. Here's why: if those people hadn't become idiots, and weren't busily justifying their inconsiderate, appalling deeds, we’d never be able to follow the better light of other stars either. We don't know our world perfectly, or even very well. In order to recover from our accidents, and failures, we resort to principles we can't ever fully prove.
If we could wander back in time, through a gaping wormhole, I suppose that our ignorance wouldn't matter. Trial-and-error wouldn’t be the adventure it is for us; it would be a tedious necessity. Every day would be Groundhog Day. As soon as we got one thing right, we'd be onto something else, endlessly trying to stitch up, or somehow avoid, all of time’s bloody little thorns.
Or suppose that multiple, parallel universes really did exist, branching all the time, relentlessly. Every variation on tragedy and loss would befall some version of you. You'd not only hear about all the little ways to poison your heart and the heart of your lover; you'd try them out, one by one, in “alternate timelines” that guttered out as junkyards, or ruins. Or maybe you’d survive some harrowing crisis, only to find yourself haunted afterward by visions of the “other you”: just a little more unlucky, and probably dead.
But wait. There's another possibility—we try to walk, pursuing all our defensible, difficult, "personal" goals. Alright, the world around us swerves. But surely our attempt, however clumsy, still amounts to a bold piece of history—to this trackless, undetermined universe, we may well seem like trespassers, or even vandals. There we go, at it again: hacking an unfixable path through time, as if there were no way back, nor any other choice we could possibly make.