Space the Nation: Weaponizing data

Contributed by
May 8, 2018

The headlines:
How Trump consultants exploited the Facebook data of millions
Cambridge Analytica: How did it turn clicks into votes?
How Facebook likes could profile voters for manipulation

The Trope: Big Bad Data. It’s a kissing cousin to the Surveillance State, but usually tricked out with a disturbingly, dismayingly specific surface appeal.

Where you can find it: Unlike the Surveillance State, Big Bad Data appears to be a thoroughly modern invention. Humans have had an instinctual mistrust of authority figures’ ability to snoop on them since they were telling stories around campfires, but feeling queasy about private entities turning our consumer preferences against us is specific to the self-destructive horror show of late capitalism.

At first, the oldest example I could conjure (even with Twitter’s help) was Isaac Asimov’s introduction of “psychohistory” in his Foundation series. He posited that, given enough data, mathematicians could predict large-scale human events with highly specific accuracy (down to the year, if not the day). Still, psychohistory winds up being a kind of MacGuffin. It’s the reason things happen, and it raises some philosophical questions, sure, but Asimov is more interested in how people behave once they know psychohistory exists than in questioning the morality of psychohistory itself. The Foundation series is science fiction, but in some ways it reads more like ancient myth: instead of characters struggling with fates spun out by the gods, its characters face a fate spit out by machines. And, relevant to the ways Big Bad Data has shown up in recent headlines, Asimov was not much interested in the interaction between all that information and commerce.

Yet the notion that corporations could use information about us in a way that feels invasive and wrong is actually quite a bit older than Asimov’s benevolent Foundation. It turns up for the first time, as far as I can tell, in H.G. Wells’ 1899 novel When the Sleeper Wakes. It’s one of Wells’ lesser-known works, and one that may deserve re-examination today in light of its major concerns: income inequality and the rise of a technocratic elite with monopolistic control over banking and finance. The titular sleeper comes to in the year 2100 and is assaulted by commercials. They play on screens on the sides of buildings, fly across the sky on giant balloons, and burble between propagandistic headlines on ubiquitous “Babble Machines.” It sounds a lot like Blade Runner, actually. The refugee from the fin de siècle can barely handle it: “Everywhere was violent advertisement, until his brain swam at the tumult of light and color.”

Indeed, the rise of commercially dystopian futures seems hinged not to advances in technology so much as to the encroachment of advertising into all forms of culture. Mid-20th-century commentary emphasizes advertising’s insidiousness and the way the profit motive can overwhelm all other values, as in Robert Heinlein’s Podkayne of Mars. The book depicts a wholly owned Venus in which you can get away with murder if you pay off the “value” of the person, but you can’t get away from ads, which even appear in (of all places!) taxi cabs. Frederik Pohl and C.M. Kornbluth’s The Space Merchants gets even darker, depicting a world ruled by marketers selling addictive, toxic products. The “advertising corrupts” idea gained purchase (thank you) in popular science fiction in the 1980s, when the grotesque consumerism of the Go-Go Reagan years begged for both parody and expressions of horror. Thus you find the lethal “blipverts” of Max Headroom and Looker’s vision of humans surgically modified to a standard of beauty computer-calculated to generate sales.

In the intervening decades, humorists and futurists alike have seemed mostly focused on brainstorming the reach of advertising: coming up with ever more ridiculous and far-fetched scenarios in which advertising might be displayed or employed. Think of the mini-state franchises of Snow Crash (and the inventor who figures out how to advertise on chopsticks in The Diamond Age) or David Foster Wallace’s “Year of the Depend Undergarment” in Infinite Jest. It’s the underpinning joke of Idiocracy. And, sure, in real life we have seen logos come to dominate the visual environment, and product placement is now as unobjectionable as your favorite podcast, but it’s the data-driven models (ahem) of Looker that seem to have presaged our current state of alarm.

Micro-targeting and mass data-mining are, perhaps, less sexy to portray in fiction, or at least not as easy. In Minority Report, it’s a throwaway visual joke of sorts. But it shapes the world of the novel Feed, where having the wrong sort of consumer profile can sentence you to death. The background social contract of Infomocracy is that people have given up their privacy in exchange for the ease and safety provided by The Information, a Google-ish stand-in that is both corporation and governing platform. The idea that corporations know us too well seems to be the subtext of almost every episode of Black Mirror. And the season premiere of Westworld teased that the Delos corporation was keeping a record of guests’ experiences alongside a sample of their DNA: is that foreshadowing the ultimate tailored experience, or merely blackmail?

Crunching the numbers: We like to think of advertising as somehow intrinsic to the world, a naturally occurring and immersive element, like oxygen. We don’t think twice about the world-smashing strangeness of, say, an advertisement for toothpaste showing up in the middle of a documentary about human trafficking, or soccer jerseys bearing the logo of Deutsche Telekom. But there was a time when the promotion of goods and services outside the context of their use was both strange and offensive, and a lot of good commentary proceeds from reminding us of advertising’s essentially alien nature. It is far more difficult to conjure even fictional disastrous consequences for the scale of data harvested by Facebook and Google.

Underlying most dystopian visions of a hyper-consumerist future is the threat of privacy being invaded: of authorities finding out something about you that you don’t want known. Really, that is the same old fear we can trace back to legends about gods and monsters. But isn’t there something different about what Big Bad Data threatens in the real world? In news coverage, the panic is about leaks and “your data” being stolen; in reality, the danger posed by Cambridge Analytica’s data mining wasn’t anything connected to specific people. It was the company’s desire to weaponize voters’ attitudes, to create mobs rather than expose individuals.