

Chapter 10 – Make Them Dance


Economies of Action

Recall that in surveillance capitalism, behavioral surplus is extracted in order to better predict desired outcomes. And while algorithms that guess, analyze, and optimize for better outcomes are useful, being able to actually deliver those outcomes in the real world is infinitely more powerful! This is where economies of action come into the picture: the ability to modify users’ behavior through the same sensors that harvested the behavioral data in the first place. Zuboff separates these economies of action into three categories:

  • Tuning: designing situations that are pre-structured to channel attention and shape a specific action, outside of our awareness. The term “nudge” has been widely used to describe this concept (which, by the way, is already ever-present in our everyday lives: why do classroom desks face the teacher?).
  • Herding: modifying a person’s environment to steer them toward a specified action. This is the notification principle: an action in the real world designed to make the user carry out a certain behavior as a direct response.
  • Conditioning: applying reinforcements (rewards) to shape specific behavior. 

As seen before, these techniques could be used in the behavioral value reinvestment cycle to help users better themselves (eating habits, physical training, etc.), but surveillance capitalism dictates that they are instead used for the capitalists’ benefit, without users’ awareness.

Facebook Writes the Music

Facebook has experimented with behavior modification on its users since at least 2012. That year, for example, it found that encouraging Facebook users to vote in a political election was more effective if people were shown the profile pictures of some of their friends who had themselves already voted. It then turned to user behavior related to engagement on the platform: how could users be made to spend more time on Facebook more effectively, automatically, and economically? (Note that every single large website does this.) In a widely shared (and criticized) experiment, Facebook showed that it could manipulate users into writing happy posts by showing them a happy timeline, and sad posts by showing them a sad timeline.
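To make that last mechanism concrete, here is a minimal, purely hypothetical sketch of how a feed could be skewed toward one emotional tone. The post structure, sentiment scores, and bias parameter are all assumptions for illustration; this is not Facebook’s actual ranking system.

```python
import random
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    sentiment: float  # assumed score in [-1.0, 1.0]; negative = sad, positive = happy

def compose_timeline(posts, bias, size=10, seed=0):
    """Hypothetical feed filter: sample posts with probability weighted toward
    the desired emotional tone (bias > 0 favors happy posts, bias < 0 sad ones).
    Purely illustrative of emotional-contagion-style curation."""
    rng = random.Random(seed)
    weights = [max(0.05, 1.0 + bias * p.sentiment) for p in posts]
    return rng.choices(posts, weights=weights, k=min(size, len(posts)))

# The same pool of posts yields a happier or sadder feed depending only on an
# operator-chosen bias that the user never sees.
pool = [Post(f"post {i}", random.uniform(-1, 1)) for i in range(200)]
happy_feed = compose_timeline(pool, bias=0.9)
sad_feed = compose_timeline(pool, bias=-0.9)
print(sum(p.sentiment for p in happy_feed) / len(happy_feed))  # tends positive
print(sum(p.sentiment for p in sad_feed) / len(sad_feed))      # tends negative
```

The point of the toy is simply that the lever sits entirely on the operator’s side: the user sees a normal-looking feed while the emotional mix has been chosen for them.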

Zuboff highlights that this hijacking of people’s minds exploits their empathy and can result in actual harm to users. Unlike academic experimenters, however, private companies are not legally required to concern themselves with research principles such as “experiment subject welfare,” and they usually have very limited internal ethical oversight. Facebook did implement some new processes after the emotional-manipulation scandal, but Zuboff interprets this show of self-regulation as a meager effort to keep real regulators at bay. And rather than focusing on public research, Facebook went back to experimenting in private, for the benefit of its advertiser clients, continuing to exploit users’ will to take them down paths they did not explicitly choose themselves.

Pokemon! Go! Do!

What happens when a company realizes that games can be effectively used to achieve economies of action? Pokemon Go happens! The game built on gamification, a concept already widely explored for molding users’ actions, to get players to move around their physical space with their GPS on, actively sharing their position at all times. The tuning, herding, and conditioning techniques are used openly here, as gamers chase rewards while acting in a way that is doubly beneficial for the operator of the game: first by mapping out the real world of cities and suburbs for Google, and second by enabling location-based discounts, advertising, and the general optimization of marketers’ ploys to derive profits.

The logic behind the monetization was genius: real-world businesses (with a physical address) would pay to be “sponsored locations,” featured as important “places to be” in the in-game world and therefore attracting footfall to their premises, billed on a cost-per-visit basis. The game creators established and hosted a behavioral futures market for their customers: companies willing to exploit the behavioral surplus in order to increase their sales. In the process, in a turn of events that surprised absolutely nobody, the game collected large amounts of user data, not always related to the purpose of running the game, and with unclear data retention policies.
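As a rough illustration of that cost-per-visit logic, here is a minimal sketch under assumed parameters: the per-visit rate, the visit-attribution radius, and the data structures are invented for this example, and the game operator’s real billing pipeline is certainly more elaborate.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class SponsoredLocation:
    name: str
    lat: float
    lon: float
    cost_per_visit: float  # assumed flat rate the business pays per attributed visit

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance in meters between two GPS points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def bill_visits(locations, player_pings, radius_m=40.0):
    """Hypothetical attribution rule: any player GPS ping within radius_m of a
    sponsored location counts as one billable visit for that business."""
    invoice = {loc.name: 0.0 for loc in locations}
    for lat, lon in player_pings:
        for loc in locations:
            if distance_m(lat, lon, loc.lat, loc.lon) <= radius_m:
                invoice[loc.name] += loc.cost_per_visit
    return invoice

# Two sponsored shops and a handful of player location pings.
shops = [SponsoredLocation("Cafe A", 48.8584, 2.2945, 0.50),
         SponsoredLocation("Store B", 48.8606, 2.3376, 0.50)]
pings = [(48.8585, 2.2946), (48.8590, 2.3000), (48.8606, 2.3377)]
print(bill_visits(shops, pings))  # {'Cafe A': 0.5, 'Store B': 0.5}
```

Even in this toy version, the asymmetry is visible: the player’s location stream is the raw material, and the invoice is rendered to the business, not to the person whose movements generated it.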

Zuboff marvels at this manifestation of surveillance capitalism, which she thinks indicates where platforms are ultimately headed: “It follows the prediction imperative to its logical conclusion, in which data about us in scale and scope combine with actuation mechanisms that align our behavior with a new market cosmos.”

What Were the Means of Behavioral Modification?

Zuboff opens a parenthesis here to cover the emergence and spread of behavioral-modification techniques during the Cold War, when the CIA was tasked with researching and weaponizing them, even though the move was generally seen at the time as unethical. From there, the research and its applications progressively made their way into a range of institutions that all shared a mission to re-engineer the defective personalities of captive individuals in settings that offered “total control” (prisons, psychiatric wards, etc.).

In the 1970s, however, a Senate subcommittee vehemently attacked the behavioral-control technologies being developed in the US, singling them out as purely anti-democratic and as a denial of every human’s right to self-determination and self-control. The subcommittee issued a report that invoked a certain exceptionalism (the Cold War) to explain how those technologies had been allowed to come into being, an exceptionalism similar to the one that later brought about surveillance capitalism (the War on Terror). Furthermore, in a moment of prescience, the Senators had already identified the dangers of “devices worn by a subject constantly to monitor and control his behavior through a computer.”

The report contributed to some progress in the field, as the next few years saw the creation of oversight institutions and the establishment of the Common Rule for the ethical treatment of research subjects. But Zuboff observes that during those years of intense public debate, no one could imagine these “means of behavioral modification” being in the hands of anyone but the State. The mechanisms created to fight them therefore did not consider that private actors might possess them in the future, which explains why Facebook was not bound by the same ethical imperatives as an academic team would be.

At this stage, we sense that Zuboff is eager to dig deeper into the question of “who should decide who decides?”…

