In an interview with the Harvard Gazette, Zuboff defined surveillance capitalism as “the unilateral claiming of private human experience as free raw material for translation into behavioral data; these data are then computed and packaged as prediction products and sold into behavioral futures markets [to] business customers with a commercial interest in knowing what we will do now, soon, and later”, a practice that she goes on to describe as simply too lucrative to resist (Laidler, 2019). Surveillance capitalism can be understood as an economic system that places the commodification of personal data for profit at its heart. According to Zuboff, Google was a pioneering force in this field, as it was one of the first organizations to employ user data as a resource. At first, personal data was used to improve search results and the overall user experience, in what Zuboff calls the ‘behavioral value reinvestment cycle’. In this cycle, user data is a resource created by the user, for the user, forming a closed cycle of value creation that benefits the user but is not profitable for the organization (in this case, Google). Zuboff believes that the turning point that resulted in the creation of surveillance capitalism came when personal data was first used for ad-matching.
Zuboff first analyzed this new form of capitalism, along with its societal implications, in a 2015 article in which she examined the ways in which surveillance capitalism relies on a global, technologically mediated architecture that she named ‘Big Other’. This new globally distributed and largely uncontested form of power feeds into a web of mechanisms, hidden from the consumer, for the extraction, commodification, and control of personal data that, according to Zuboff, is incredibly harmful to freedom, privacy, and even democracy. Donnell Holloway (2019) identifies the foremost ‘Big Other’ actors as companies such as Google, Amazon, Facebook, and Apple, and also warns of the dangers they pose by turning the vast amounts of personal data at their disposal into incredibly lucrative products and services.
Zuboff identifies four key features of surveillance capitalism:
- An escalation in data extraction and analysis.
- The constant development of new forms of computer-monitoring and automation.
- The need to increasingly personalize and customize the services offered to users.
- The use of technological infrastructure to experiment on its users and consumers.
An Escalation in Data Extraction and Analysis
The interface between the real world and the digital environment is called the ‘extraction architecture’. In surveillance capitalism, extraction benefits only one side, that of the aforementioned corporations, and is not influenced by the other, the users, as the process is almost entirely automated; the data is no longer used to improve the user experience, but is instead sold to other corporations. This data is analyzed in order to predict behaviors and shape them toward a desired outcome; in the case of targeted ads, this is often the sale of a product. The data is extracted, and personal information is used to provide users with targeted ads or other similar experiences that lead to the further creation of data by the user, feeding a never-ending loop that has proven to be astoundingly profitable.
The extraction architecture has its limits, however. While the loop described above succeeds in creating more raw material, it is not only quantity that must be kept in consideration, but also quality, in order to predict behavior more accurately. As Zuboff says, the best predictions approximate observations. The data collected therefore needs to approximate observation as closely as possible. This need for a vast surplus of data, combined with high-quality data, is described as economies of scope. It is actualized in two steps:
- The data collection needs to be extended from the virtual world into the real world as the data collected in the former needs to mirror as closely as possible the experiences lived in the latter. So far, the most common ways to implement this extension happen online, in the form of user inputs such as likes and clicks. However, the implication is that this will result in efforts to gather data in offline settings as well. This would remove the need for active engagement on the part of the user and leave the data gathering to sensors or other devices that passively extract physical data.
- This form of extraction needs to reach a deeper and more intimate level of information about individuals. The economies of scope dictate that data gathered on user preferences must not be limited to the virtual environment; it is simply not enough. The ways in which data is currently collected are too limited and, as such, new techniques need to be developed and employed to obtain even more sensitive and personal data, ranging from facial recognition and affective computing to voice, gait, posture, and text analysis that lay bare personality, moods, emotions, lies, and vulnerabilities.
The economies of scope impose a strict escalation of extraction as limits are reached. This constant escalation is supported by what Zuboff calls economies of action.
The Constant Development of New Forms of Computer-Monitoring and Automation
Economies of action are distinctive to surveillance capitalism and its digital milieu and are defined by a process of behavior modification. Economies of action allow the Big Other not only to predict behavior, but to actively influence it. The idea of trying to influence customers’ behavior to promote revenue creation was not invented by the Big Other, but unlike earlier attempts to shape user behavior, such as priming, suggestion, or social comparison, these economies of action operate in a new and different way. This is due to the digital architecture in which they operate: a continuous network of data input that allows for uninterrupted monitoring and, as a result, shaping of user behavior. This is accomplished by nudging, herding, and conditioning individuals, larger groups, or populations in real time in subtle and hidden ways, such as inserting a specific phrase into your Facebook news feed, timing the appearance of a ‘buy’ button on your phone with the rise of your endorphins at the end of a run, shutting down your car engine when an insurance payment is late, or employing population-scale behavioral micro-targeting drawn from Facebook profiles.
If at first the extraction of user data was a way to improve the user experience, it is now a well-oiled machine with the reach and power to modify the behaviors of countless people. Just as industrial capitalism was characterized by the intensification of the means of production, surveillance capitalism now follows the same logic, applying it instead to the means of behavioral modification. These new systems and procedures take direct aim at individual autonomy, systematically replacing self-determined action with a range of hidden operations designed to shape behavior at the source. Zuboff calls this ability to shape human behavior instrumentarian power. In her words: “instrumentarianism, defined as the instrumentation and instrumentalization of human behavior for the purposes of modification, prediction, monetization, and control”. Instrumentarianism represents the crux of the democracy dilemma that surveillance capitalism presents and will be analyzed further in a subsequent section.
The Need to Increasingly Personalize and Customize the Services Offered to Users
While features that are designed to customize and create tailored experiences for users are not inherently associated with malevolent surveillance, things like cookies or the ability to access the location of individuals have been instrumental in shaping surveillance capitalism. It must be said that most of these features are usually classified as opt-in, meaning that an individual has to personally consent to them being activated, asking, in a way, for consent to be monitored. However, the design of these features is often purposefully obscure and cryptic, especially when it comes to communicating the extent to which a user is being monitored and the ways in which their data is used. Most websites will claim to collect behavioral or other personal data, such as location, to optimize or customize the user’s experience but, as was explored above, the ways in which personal data is employed under surveillance capitalism extend far beyond personalization. The clear asymmetry of knowledge between the Big Other and the individual creates a convenient state in which users allegedly provide informed consent to surveillance.
The Use of Technological Infrastructure to Experiment on Its Users and Consumers
It is only logical that the need to provide adaptive content and increasingly customized experiences would require continuous experimentation, in order to increase the accuracy of behavior prediction and modification. Users could be shown different products or recommendations to figure out what they respond to and the most effective way to manipulate them. New algorithms are constantly being created, trialed, and tested for the same reason. Zuboff uses Facebook as an example, in particular the way in which the company has been able to influence users’ emotions: emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.
Relevance for Democracy
The swift escalation of surveillance capitalism can be credited in no small part to exploitative agreements between users and the Big Other, such as the one mentioned above. These can be defined as unconscionable contracts, a form of agreement that strips users of the ability to negotiate terms and often puts them in a position where they have insufficient knowledge to truly provide informed consent about personal data collection and the ways in which said data is used. Nevertheless, the means through which surveillance capitalists amass personal data and profit from it are entirely permitted under the law, thanks to cryptically written end user license agreements and to data privacy and data protection policies that are made purposefully hard to find. It must also be said that the very nature of these policies and agreements is not to benefit the customer, but to protect the Big Other in its endeavor to collect personal data. User privacy, however, is only one of a number of values at risk under surveillance capitalism: autonomy, fairness, equality, due process, and democratic sovereignty are all threatened by corporate personal data practices.
Surveillance capitalism uses algorithms and other programs to analyze and monitor our lives online. The ways in which this happens have already been analyzed, but the implications this has for individuals outside of the virtual environment are not limited to buying a certain product because it was advertised to them on social media; it can in fact have quite sinister ramifications. Thanks to massive data files, corporations and governments, the Big Other, can profile users, judge them, predict their creditworthiness or tastes and their spending habits, and take action accordingly. This results in the creation of what Davidow (2014) calls ‘algorithmic prisons’. These prisons have the very real power to restrict individuals’ freedom and rights. Information that used to be private or hard to access can now cause certain people to no longer qualify for certain loans or be unable to cash a check. Credit card interest rates and car insurance premiums can also be influenced by surveillance capitalism’s predictions and profiling. Because of algorithmic predictions, some people may have difficulty finding employment, and some might be selected for government audits or subjected to increased scrutiny and screenings, such as at the airport. Davidow (2014) uses the airport as an example to showcase how far-reaching and terrifying the implications of surveillance capitalism are for personal rights and freedom. No-fly lists, screenings, and other controls are all influenced by algorithms rooted in surveillance. By combining data such as tax identification numbers, past travel itineraries, property records, physical characteristics, and law enforcement or intelligence information, the algorithm is expected to predict how likely a passenger is to be dangerous.
It isn’t entirely appropriate to refer to these algorithmic prisons as an invention of surveillance capitalism. Even before the advent of more modern technologies such as the internet, credit scores and rating agencies already existed. However, the unbridled scale of data production, commodification, and exploitation that characterizes surveillance capitalism brings this phenomenon to new heights; never before has so much information about individuals been so readily available, and never before has that information been harvested and monetized to this extent.
The concept of instrumentarianism analyzed above should also be seen as incredibly concerning, as it has the potential not only to harm our individual freedoms, but to undermine the concept of democracy itself. Privacy theories have demonstrated a link between privacy, understood as a necessity, and personal autonomy. Democratic theories likewise show a correlation between autonomy and democracy; that is to say, personal autonomy is an imperative condition for democracy to be effective. Thus, it can be inferred that a loss of autonomy, such as can occur as the result of the behavior modification that is instrumental to surveillance capitalism, will have a direct effect on effective democratic participation. When Zuboff talks about privacy rights, she defines them as decision rights that are redistributed. This redistribution must be seen in combination with the potential for manipulation that arises from instrumentarianism. As more behavioral data is collected, and as the services offered under surveillance capitalism become more and more personalized, they also become better able to predict and shape our behavior, much as in the case of the aforementioned algorithmic prisons. The right of choice might indeed be redistributed, but since the loss of privacy inherent in surveillance capitalism creates a loss of personal autonomy, in turn eroding one’s democratic participation, it can be argued that the right of choice is redistributed in a way that is harmful to democracy itself. In other words, the surveillance capitalism architecture strips individuals of their power of choice and autonomy. If we understand democratic participation as individuals placing power in the government, this redistribution of power means that it is no longer the individuals who bestow their power on the government, but the surveillance capitalism architecture itself.
Even more worrisome, the techniques employed to manipulate customers’ behavior are designed to evade individual awareness and thus bypass individual decision rights. Through this, customers are left blissfully unaware of their influence. Zuboff herself gives a real-life example of this in the case of Cambridge Analytica, a consultancy firm that influenced American elections and the Brexit vote in the United Kingdom by using targeted ads and other similar tactics.