Surveillance capitalism is profoundly antidemocratic and follows the logic of the quest for profit.
“The Googles and Facebooks of this world are shaping an antidemocratic world, in which every little detail of our lives becomes raw material for products that eliminate our freedom.”
The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power
The Age of Surveillance Capitalism by Shoshana Zuboff deserves a productive public and academic debate. It should be read and discussed by a broad audience, and one can only hope that scholars across disciplines engage with her argument, fine-tune it and improve it.
What is surveillance capitalism?
The main argument Zuboff develops is simple, elegant and powerful. Zuboff argues that Google – just like Henry Ford in the twentieth century – introduced a new form of capitalism: surveillance capitalism. The humans using the digital services of Google, Facebook and the like are not the product, as so many people now argue, and they are certainly not the customers. They are the stuff from which surveillance capitalists scrape their “raw material.”
Surveillance capitalism claims human experience as raw material for translation into behavioral data. Some of that data is used to improve digital products or services; most importantly, however, it is declared “proprietary behavioral surplus” and fed into “machine intelligence” manufacturing processes that produce “prediction products.” These prediction products are sold in a new type of market: the “behavioral futures market” (Zuboff, 2019: 8).
In the battle for market domination and profit maximization, surveillance capitalists are on an endless quest to acquire ever-more predictive sources of behavioral surplus. This, according to Zuboff, is the first economic imperative of surveillance capitalism: “the extraction imperative.” This imperative means that “raw material supplies must be produced at an ever-expanding scale” (Zuboff, 2019: 87).
Surveillance capitalism claims human experience as raw material for translation into behavioral data.
Surveillance capitalists are thus bound to strive for a total view: every little detail about humans should be scraped. Every society, every social relation and every key societal process is now a “fresh terrain for rendition, calculation, modification and prediction” (Zuboff, 2019: 399).
If we have to highlight one weak point of the book, it is precisely here that we can locate it. Zuboff, for some reason, tries to frame surveillance capitalism as “a new species of capitalism,” a dangerous one. Throughout the book, she never forgets to highlight that surveillance capitalism is a whole different animal. This argument is at odds with her general understanding of surveillance capitalism as the accumulation by dispossession of our behavioral surplus (Zuboff, 2019: 99).
Surveillance capitalism wants to accumulate “our behavioral data” at the lowest possible cost and put it to profitable use. Surveillance capitalism is first and foremost an economic phenomenon: it is driven by the desire to accumulate capital. Surveillance capitalism is, in the first instance, capitalism. It is the fact that digital technologies are embedded within a capitalist structure that puts accumulation at its core.
Surveillance capitalism is thus not a “new order” in the sense of something entirely different and new, but rather a new manifestation of capitalism. The extraction imperative is driven by the desire for huge profits: the more and the better a company extracts behavioral surplus, the more profit it can make.
Google and the birth of surveillance capitalism
It is the extraction imperative – and the desire for profit – that explains, for instance, Google's seemingly unconnected range of products: from Gmail to its search engine, its book projects, its Android software and its mapping of the world. What all these things have in common is that they contribute to and make possible “the extraction of raw material”: our voices, our search queries, our emails, our use of home sensors, our GPS locations. They all enable Google to extract “behavioral data” that is fabricated into prediction products.
Zuboff views Google as the inventor of surveillance capitalism. Surveillance capitalism, she stresses, should not be equated with digital technology. Surveillance capitalism was born digital, but is no longer confined to born-digital companies. Crucial to the success of surveillance capitalist companies is the dispossession of behavioral surplus: raw material supplies should be free and the law should be kept at bay (Zuboff, 2019: 175).
Surveillance capitalism is not just a technological achievement. “Sustainable dispossession requires a highly orchestrated and carefully phased amalgam of political, communicative, administrative, legal and material strategies that audaciously assert and tirelessly defend claims to new ground” (Zuboff, 2019: 130). Companies constantly have to reproduce and refine their extraction methods, along with their access to raw material. They therefore deploy a “don't ask” strategy. Zuboff distinguishes four stages in what she calls the “dispossession cycle”:
1. Incursion: surveillance capitalists act first and declare your data free for extraction and refabrication.
2. Habituation: when protest arises (think of the Google mail-scanning controversy or the controversy around Google Street View), they try to ignore it, in the hope that habituation will eventually kick in.
3. Adaptation: if protest is successful, a phase of adaptation is announced,
4. Redirection: after which a redirection of attention is set up.
What never changes is the fact that Google will extract data; only new routes towards that goal will be set up. This arrogant, antidemocratic and imperialist attitude reveals what is at stake.
The surveillance market is a hugely profitable one. When Google embarked on its surveillance capitalist journey in 2001, its net revenues jumped to $86 million (a 400 percent increase); in 2002 revenues rose to $347 million, then to $1.5 billion in 2003 and $3.5 billion in 2004. “The discovery of behavioral surplus had produced a stunning 3,590 percent increase in revenue in four years” (Zuboff, 2019: 87). No wonder, then, that Facebook, Microsoft and even internet providers were keen to join the party. And it is these profits that breed such invasive and imperialist companies.
Economies of scale, scope and action
Among surveillance capitalists, the real competition is about finding new “mines” of (free) raw material that guarantee continuous access. The first economic principle of surveillance capitalism – the extraction imperative – necessitates economies of scale. Prediction is only as good as (1) the amount of data, (2) the computing power and (3) the sophistication of the machine learning procedures.
This is why Google and other surveillance capitalists embark on all kinds of – at first sight – completely unrelated activities. All of Google's services – I surveyed some of them earlier – contribute to the revamping of all mediated computation into an extraction architecture.
It is important to remember that the actual product these surveillance capitalists are selling is not “your data” but prediction models. Not only data but also machine learning is thus crucial. To deal with such huge and rich data, enormous computing power is needed. And it is again no surprise that Google not only has the largest computing network in the world: it also focuses on hiring AI specialists and on developing new technologies such as the Tensor Processing Unit for “deep learning,” in order to enable its machines to learn more and faster.
It is the need to predict human behavior (or, at least, the claim that it needs to be predicted) that defines the second imperative of surveillance capitalism: the so-called prediction imperative. It is this imperative that pushes surveillance capitalists to supplement the economy of scale with economies of scope and of action (Zuboff, 2019: 195).
Economies of scope are based on the axiom that behavioral surplus must not only be vast; it should also be varied and in-depth. Extraction operations should thus not be limited to the online sphere, but should extend into offline life too: from your daily commute to your workout routines to the breakfast conversation you're having. All of it should be extracted.
Economies of scope are born out of the idea that “highly predictive, and therefore highly lucrative, behavioral surplus would be plumbed from the intimate patterns of the self” (Zuboff, 2019: 201). From mattresses to television sets, all these “things” are now used to gather the most intimate data: from our intonation to our faces to our emails, everything is scanned, analyzed and refabricated.
Such economies of scope and scale are necessary but insufficient to sustain a competitive edge in the highly competitive surveillance capitalist market: economies of action are also required. In order to achieve these, machine processes are configured to "nudge, tune, herd, manipulate, and modify behavior in specific directions" (Zuboff, 2019: 202). Zuboff (2019: 309-319) discusses Pokémon Go as an excellent example of such an economy of action.
Niantic, the company behind Pokémon Go, set the game up as an economy of action. Because Pokémon Go was born out of Google Maps and tracks the movement of gamers block by block, it has very detailed location-based social graphs. Niantic also has access to your camera, your contacts and audio fingerprinting, and it set no limits on the collection (and distribution) of that data. It knows how long you play, how you got to a certain location, and who is with you.
In Pokémon Go, all this data was used not only to predict your behavior but to direct you to very specific locations: pizzerias, tea houses, parks and coffee bars. One pizza bar in Queens paid around $10 to attract Pokémon to its premises, producing virtual creatures on bar stools and in bathroom stalls. In the first weekend, its revenue rose by 30 percent, later climbing to 70 percent.
Internet of things, instrumentarianism and politics
According to Zuboff, what is known as the "internet of things" or ubiquitous computing is better understood as "a twenty-first-century means of behavioral modification" (Zuboff, 2019: 202-203). The extraction architecture is combined with a new "execution architecture", and both function as a coherent whole. That whole infrastructure, the vision and the aims of the architects, according to Zuboff, constitute a turning point in the evolution of surveillance capitalism and conditions the future of our societies.
Throughout the book, Zuboff stresses that we need a specific vocabulary to describe new phenomena if we want a clear understanding of what we are up against. She stresses, for instance, that we should not understand surveillance capitalism as a totalitarian project, but as “instrumentarianism” (Zuboff, 2019: 376-397). Her main argument there is that the purpose of instrumentarianism is not the “perfection of society/species,” but “the automation of market/society for guaranteed outcomes” (Zuboff, 2019: 396).
It is at this point that some criticism is needed of two assumptions that run throughout the book: (1) that surveillance capitalism produces correct “knowledge” and thus really can “predict” and steer behavior, and (2) that there is a sharp distinction between economy and politics.
Knowledge and surveillance capitalism
(1) One of the key elements of Zuboff's argument is that surveillance capitalists not only have unprecedented access to our data, but that this access allows them to produce “knowledge” about us: knowledge that is seemingly so accurate that they can “predict” our behavior. Zuboff thereby reproduces one of the core tenets of big data science, namely that quantified data speaks for itself and that total surveillance is possible.
It is, or at least should be, clear by now that data science is neither perfect nor neutral. As the mathematician Cathy O'Neil states, we need to “get a grip on our techno-utopia, that unbounded and unwarranted hope in what algorithms and technology can accomplish” (O'Neil, 2016: 207-208). From this perspective, Zuboff herself contributes to a technological utopia, as she reinforces the idea that algorithms and machine learning do produce “knowledge” (see Varis & Hou, in press, for a detailed discussion of this assumption from a digital-ethnographic perspective).
In reality, we see that the vast architecture of extraction and prediction produces rather linear and crude behavioral scripts. A glance at all the “non-relevant” ads you encounter on a daily basis is banal proof of this. That fact, of course, does not diminish the massive impact that such means of behavioral modification have on society. Think of Trump's election as just one example.
Politics and surveillance capitalism
(2) The black-and-white distinction between “totalitarianism” and “instrumentarianism” is hard to maintain, and it betrays a very limited perspective on what politics is really about. Once you understand that politics is about the organization of society, and thus not solely about “politicians” and “government,” you realize that surveillance capitalism, just like capitalism, is all about politics.
Here we see the same weakness of the analysis popping up again. Zuboff's quest to safeguard the “free market” is based on a very limited understanding of politics. Zuboff seems to be arguing that capitalism as such is just fine. Her whole analysis is wrapped in a post-WWII discourse on freedom, free markets and democracy.
“Surveillance capitalism, just like capitalism, is all about politics.”
In doing so, she not only misses the historical continuity of contemporary “surveillance capitalism,” she also misses its “political dimensions” and fails to imagine alternative digital technological routes outside the capitalist paradigm (Wikipedia, for instance).
Politics cannot be separated from the economy, and vice versa. We thus best understand “surveillance capitalism” as political. The big digital companies and their charismatic leaders don't hide that they have “a vision” for society. The common-sense view among these visionaries (from Musk to Thiel) sees “democracy” as an anachronism. They all stress that “technologies” can be used to build an “ideal society.”
The “techno-utopianism” guiding Facebook, Google and many other companies and tech gurus is not just a collection of “thin theories,” as Zuboff (2019: 406) seems to understand them; it is the breeding ground for a flourishing ideological production in which technology itself is imagined as an alternative to democracy. In its most benevolent conception, technology is seen as inherently democratic; in its most reactionary conception, technology enables a Dark Enlightenment.
What seems clear is that the dominant techno-utopian conception today is quite authoritarian, neoliberal and antidemocratic at its core. The example of China's interest in and commitment to setting up a “social credit system” is probably the best counter-argument to the idea that “surveillance capitalism” is not political. What is more, we see today that politics and this vast digital architecture are deeply interconnected in every corner of the world: from election campaigns to the collaboration between security agencies and tech giants around the world (Greenwald, 2014).
Surveillance capitalism, power and democracy
Zuboff's book is chilling. Considering the strength of the argument, one can only suspect that ad hominem attacks will follow. Her personal position is not what should matter here; what matters is the argument she presents. Her analysis is groundbreaking and seminal. She presents the world with an enormously powerful argument, richly supported by a wealth of evidence. Her main argument is as simple as it is relevant and scary: the Googles and Facebooks of this world are shaping an antidemocratic world, in which every little detail of our online and offline lives becomes raw material in the production of prediction products that eliminate our freedom.
These products not only produce huge profits and unlimited surveillance; they also enable the modification and manipulation of our behavior in order to produce “guaranteed outcomes” and thus even larger profits. What we end up with is a massive, invasive, but largely opaque infrastructure dedicated to behavioral modification, not by repression but through “tuning,” modeling and suggesting. Surveillance capitalism is “profoundly antidemocratic,” and “its remarkable power” is the result “of its consistent and successful logic of accumulation” and the quest for profit (Zuboff, 2019: 192).
Zuboff rightfully stresses that surveillance capitalism was born in the neoliberal era. This is no coincidence: Western democracies were already on the ropes by then. Neoliberalism had created an economic Wild West. Surveillance capitalists “quickly learned to exploit the gathering momentum aimed at hollowing out democracy’s meaning and muscle” (Zuboff, 2019: 518). 9/11 and the war on terror not only gave governments room to breach democratic boundaries in the name of security; we now know, thanks to Zuboff, that they also gave birth to surveillance capitalism.
Surveillance capitalism takes an even more expansive turn towards domination and antidemocracy than neoliberalism did. Surveillance capitalism is not only an economic or instrumentarian project; it bears the seeds of a totalitarian one. Its behavioral modification apparatus gives rise to a new source of social inequality, attacks democracy and demeans human dignity.
The book should therefore also be read as an argument against voices, such as that of Anne Applebaum, presently claiming that anonymity is part of our antidemocratic problem. It is, in fact, the lack of anonymity that is the problem.
Zuboff ends the book with a spark of hope: hope that in the end democracy will prevail and that people will decide, “no more of this.” We can only subscribe to that hope, and add that a heated debate on Zuboff's thesis is perhaps the best chance we have of finding enough people to curtail surveillance capitalism.
Greenwald, G. (2014). No place to hide: Edward Snowden, the NSA and the surveillance state. London: Penguin Books.
O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. London: Allen Lane.
Varis, P. & Hou, M. (in press). Digital approaches in linguistic ethnography. In K. Tusting (Ed.), The Routledge Handbook of Linguistic Ethnography. Abingdon: Routledge.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. London: Profile Books.
Ico Maly is assistant professor at Tilburg University and editor-in-chief of Diggit Magazine.
This article previously appeared in Diggit Magazine and Portside.