The New Yorker has a grimly fascinating piece by Adam Entous and Ronan Farrow in its Feb. 18 & 25 issue, published online with the title “Private Mossad for Hire.” In print, the piece ran under the less evocative headline “Deception, Inc.” As “Private Mossad” suggests, it’s about ex-operatives of Israeli intelligence doing nefarious things in the private sector.
The article describes how a Tel Aviv company named Psy-Group brought covert-operations skills to a local hospital-board election in California. The company specialized in the use of avatars, or fake Internet personae, a tool initially developed for infiltrating jihadi recruitment chatrooms. In the California campaign, websites sprang up in which people purporting to be small-town residents—but whom locals had never heard of—spewed invective against a hospital-board candidate.
The effort failed, which is a bit of good news; the targeted candidate won by a large margin, and Psy-Group, which had also offered services to the Trump campaign, shut down in February 2018—“just as Mueller’s team began questioning employees,” note Entous and Farrow. Still, the culture of tech-empowered political dirty tricks clearly remains in an expansion phase.
Shortly before reading the article, I finished a new book, The Misinformation Age: How False Beliefs Spread, by Cailin O’Connor and James Owen Weatherall, both professors of logic and philosophy of science at the University of California, Irvine. The spread of fake news and other misinformation is a concern to everyone, and of professional interest to me, since much of my day-to-day work is as a fact checker for a major science magazine.
I thought the book would be about how the general public falls for false information, and it is, but an unexpected emphasis is on how incorrect beliefs can spread through the scientific community. O’Connor and Weatherall invoke historical debates, such as those over the risks of tobacco and ozone depletion, as well as mathematical models of network effects, to show how scientists, even when practicing a high level of rationality, can absorb and relay false beliefs.
Scientists were wrong about the ozone risk for a while, even after recognizing the basic problem that chlorofluorocarbons, or CFCs, a type of chemical present in hairsprays, refrigerators and other products, reduce the atmospheric ozone that shields against ultraviolet radiation. In the 1980s, when data collected in Antarctica showed the problem was worse than thought, many scientists were skeptical, since satellite data seemed to show otherwise. It turned out the satellite data had been misconstrued by software set to disregard extreme data points as presumed results of an instrument glitch.
There also was a campaign by the chemical industry to impede regulatory action on the grounds of scientific uncertainty. O’Connor and Weatherall argue that such industry efforts enable false information to persist in science, as represented in their network models by a “propagandist” who doesn’t even have to lie but rather just publicizes data selectively. Their solution, stated near book’s end, is that “we must abandon industry funding of research.” Stated with no qualifications or discussion of tradeoffs, this strikes me as extreme, an artifact of the same moment of left-wing exuberance that just promoted the Green New Deal.
O’Connor and Weatherall recount the prevalence of fake news in and around the 2016 election, exemplified by the incident of a man firing a weapon over “Pizzagate.” They argue government needs to be revamped to cope with the profusion of misinformation, as current democratic institutions aren’t up to the task. How they’d fix them they leave vague. “The challenge,” they write, “is to find new mechanisms for aggregating values that capture the ideals of democracy, without holding us all hostage to ignorance and manipulation.”
Both The Misinformation Age and “Private Mossad for Hire” left me with some sense of optimism that my particular skill set as a fact checker is not going to become obsolete soon. It’s not entirely clear who’ll be paying for fact checking in the future, given the journalism industry’s financial troubles; but somebody will. As it happens, O’Connor and Weatherall argue that major media should consider not debunking fake news (since calling attention to it could spread the falsehoods), leaving that instead to entities like Snopes.com.
In any case, just as scientists are not impervious to misinformation, neither are fact checkers. About a year ago, I saw on Twitter that Clint Eastwood had died, clicked through to what looked like a detailed CNN article saying so, and retweeted it. I then started looking around Twitter and the Web for what else was being said about the actor’s demise, and only then did I realize that what I’d seen and promoted was a well-crafted fake news story. Who wrote it, and why, I don’t know. Maybe some ex-Mossad officers were honing their business model.
—Kenneth Silber is author of In DeWitt’s Footsteps: Seeing History on the Erie Canal and is on Twitter: @kennethsilber