

Chapter 9 – Rendition from the Depths


Personalization as Conquest

Personalizing the user’s experience is the next step in rendering data, and personal assistants (Cortana, Google Now, etc.) are the Trojan horses used to extract intimate behavioral data from the dark continent of one’s emotions, thoughts, desires, and other deeply personal information. Big tech claims that people will happily give up that data because the product they receive in exchange will be so good they won’t be able to resist. These assistants are presented as the new necessities everyone will want, just as the Ford Model T was in its time.

However, Zuboff sees these assistants as the next level of prediction products: they mix all of the behavioral data extracted about an individual in the real world (location data, email content, and more) and repackage it into a product that offers more surfaces to advertisers, but that also ultimately participates in modifying the user’s behavior in the real world. To make these assistants more acceptable to users, they are dressed up as “conversations,” luring users into believing they are interacting with a “Voice” they can trust rather than with what is really on the other side: an economic system made of sales, payment, distribution, and shipping processes, all of which disappear from the user’s view. Enter Google Assistant.

To provide the best service, the assistants need to know ever more about their users, increasing the depth of the data rendered through these tools. And because the medium is voice, tech companies are doubling down on interpreting what we say and how we say it, in a sort of arms race to collect voice samples and data. All of this data is then consumed by algorithms, and by the humans overseeing the supervised learning process. The extraction can go way overboard, as it did in the scandals involving smart-TV manufacturers Samsung, Vizio, and others, which recorded personal speech not aimed at the TV and used it for advertising purposes. Even worse: connected toys have functioned in exactly the same way.

Tech companies are competing so fiercely over voice assistants because this will again be a “winner take all” situation: one system will become the main (or only) interface between us and our houses, our cars, our shops, our world. And each of them wants to be the one extracting your real-life behavioral surplus in the process.

Rendition of the Self

Another area of rendition is the self: everything we put out on social media, and everything the platforms’ algorithms infer from it, none of which was ever intended for rendition into the shadow text. Platforms can render one’s personality, sexual orientation, intelligence, and more, to feed their prediction products. In fact, as early as 2011, researchers showed that metadata derived from what someone shares on Facebook is a better predictor of their personality than the information they provide themselves in their profile. This type of research, pairing Facebook users’ data with personality-trait questionnaires, was the basis of the Cambridge Analytica scandal that broke out much later.

While researchers in this field noted that there could be positive outcomes from all this data derived from users’ activity and profiles, they were also wary of the dangers posed by this level of privacy invasion (inferences about gender, race, political and religious views, and more intimate predictions). Zuboff focuses in particular on one team of researchers (Kosinski et al.), whose subsequent work concentrated on bringing down the cost and increasing the accuracy of those prediction algorithms. They obtained remarkable results (imagine deriving a user’s personality from the color saturation of their Instagram selfies) without access to any non-public data, so imagine what the platforms themselves can do!
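To make the mechanism concrete, here is a minimal, purely illustrative sketch of the kind of model this research describes: a few public engagement signals (posting frequency, likes given, photo saturation) combined by a simple weighted score into a personality-trait estimate. The feature names and weights below are invented for illustration; they are not taken from Kosinski et al. or any real platform.

```python
# Toy "personality from public metadata" sketch -- illustrative only.
# Feature names and weights are invented; real models are trained on large
# labeled datasets (questionnaire scores paired with platform activity).

def extroversion_score(features: dict) -> float:
    """Return a toy 0-1 'extroversion' estimate from public metadata."""
    weights = {
        "posts_per_day": 0.30,         # frequent posting
        "avg_likes_given": 0.25,       # outward engagement
        "selfie_saturation": 0.20,     # color saturation of profile photos
        "night_posting_ratio": -0.15,  # heavy late-night activity
    }
    raw = sum(w * features.get(name, 0.0) for name, w in weights.items())
    return max(0.0, min(1.0, 0.5 + raw))  # clamp to [0, 1]

# A user "rendered" entirely from public signals (each normalized to 0-1):
print(extroversion_score({
    "posts_per_day": 0.8,
    "avg_likes_given": 0.6,
    "selfie_saturation": 0.7,
    "night_posting_ratio": 0.2,
}))
```

The point of the toy is not the arithmetic but the asymmetry: none of these signals were volunteered as personality information, yet together they can say more about a person than their self-written profile.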

The kind of micro-targeting seen in the Cambridge Analytica story could also be turned to commercial ends: user data can help a car dealership identify the sales tactics most likely to convince a particular customer to buy the more expensive car. We are back to better prediction of real-world outcomes derived from online behavioral surplus, this time in the economic sphere. And that seems fine to most people, until we veer into the political sphere, which is what happened with Cambridge Analytica. Zuboff’s argument is that the same level of unwanted scrutiny happens routinely to 2+ billion people every day, their personalities rendered from the depths in order to predict and modify their real-life behaviors.

Machine Emotion

One step deeper than “personality” in the depths of rendition lie “emotions.” Using a range of biometric readers, including microphones and cameras, companies aim to analyze a person’s emotional state in order to predict behavior even better, capturing micro-signals too subtle for humans to notice and exploiting them with the appropriate nudge at exactly the right time. This is nothing new to the advertising business; what IS new is the scale of this general surveillance mechanism.

“Affective computing” was originally conceived with very positive aims, for a world where sufficient safeguards could be provided to users. Emotional state would only be used as part of a reinvestment cycle that improves users’ lives (for example, by letting them know they sound angry before they call their boss!). Notably, researchers like Rosalind Picard saw many medical and health applications for this line of research. The use that platforms chose for this behavioral data, however, was to analyze users’ reactions to the content served on those platforms, in order to optimize that content to each user’s taste, but also (of course) to optimize the efficiency of advertising.

The possible future depicted by Zuboff is perfectly abhorrent. “Emotion as a Service” would let a company simply film a user at any given moment, send the video to some central “AI” provider, and get back a full emotional profile derived from that footage. That system could ultimately lead to “Happiness as a Service,” which would add behavioral nudges generated from the emotional profile to steer the person toward a desired outcome. *shudders*
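As a thought experiment only, the interface such a service implies might look something like the sketch below: a client sends a video clip and receives back an emotional profile plus suggested “nudges.” The provider, data structures, and fields are entirely hypothetical and invented for illustration; neither Zuboff nor any vendor is being quoted.

```python
# Hypothetical "Emotion as a Service" exchange, for illustration only.
# No real API is described here; all names and fields are invented.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class EmotionProfile:
    dominant_emotion: str               # e.g. "frustration"
    confidence: float                   # model confidence, 0-1
    micro_signals: List[str] = field(default_factory=list)

@dataclass
class NudgeSuggestion:
    trigger: str                        # emotional state to act on
    action: str                         # the behavioral nudge to deliver
    delay_seconds: int                  # when to deliver it

def analyze_clip(video_bytes: bytes) -> Tuple[EmotionProfile, List[NudgeSuggestion]]:
    """Stand-in for the remote call: video in, emotional profile and nudges out."""
    profile = EmotionProfile("frustration", 0.82, ["jaw tension", "raised pitch"])
    nudges = [NudgeSuggestion("frustration", "serve comfort-purchase ad", 30)]
    return profile, nudges

profile, nudges = analyze_clip(b"...raw video frames...")
print(profile.dominant_emotion, [n.action for n in nudges])
```

The unsettling part is the second return value: the profile is not merely read, it is immediately coupled to an intervention, which is exactly the “Happiness as a Service” step Zuboff warns about.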

When They Come for My Truth

Zuboff argues that personality and emotions lie at a depth of rendition that is far too deep for comfort. The extraction imperative taken this deep is a threat to free will. Machines working against one’s will (or outside one’s awareness) to read one’s inner self, all for others’ economic ends, seem hellishly dystopian.

