My annotated highlights from The Age of Surveillance Capitalism 📚
This year I started reading more non-fiction, something I had historically not enjoyed and thus rarely did. As that has changed, I’ve noticed I wasn’t retaining much of the knowledge I’d pick up. I’d also highlight passages but never revisit the book. Inspired by Jason Kottke’s habit of posting annotated highlights from the books he reads, I decided to revisit my highlights after finishing a book, annotate why I think I highlighted each one or what it says to me now, and post them here to share.
The Spanish Data Protection Agency and later the European Court of Justice demonstrated the unbearable lightness of the inevitable, as both institutions declared what is at stake for a human future, beginning with the primacy of democratic institutions in shaping a healthy and just digital future.
A few things stuck out to me in this quote. First, I enjoy the use of “unbearable lightness,” and it reminded me to pick up “The Unbearable Lightness of Being” again. Second, I find this section’s attack on inevitability extremely enlightening. I’ve suffered from the assumption that our current technological trends are inevitable, and I realize now that this is a mistake. Finally, I was surprised by how long some of our governmental institutions have been fighting against this. It’s a relatively new concern for me, so it’s humbling to see how far the fight predates my own awareness of it.
Sandberg understood that through the artful manipulation of Facebook’s culture of intimacy and sharing, it would be possible to use behavioral surplus not only to satisfy demand but also to create demand.
This starts to touch on something I found both illuminating and hard to convey. The most interesting topic discussed in The Age of Surveillance Capitalism isn’t the depth of information collected, the secrecy with which it’s done, or the lengths these firms will go to in order to continue their surveillance; it’s the manipulation of behavior that bothers me the most.
This is one respect in which the surveillance capitalists are not unprecedented. Adam Winkler, a historian of corporate rights, reminds us, “Throughout American history, the nation’s most powerful corporations have persistently mobilized to use the Constitution to fight off unwanted government regulations.”
It’s always good to keep in mind that everyone will point to whatever written words agree with them and try to use those to push their agenda. We may try to treat something like the Constitution as a purely logical document, but we cannot remove our own biases when we read and interpret it.
It is important to say—and we will revisit this theme more than once—that regulatory interventions designed to constrain Google’s monopoly practices are likely to have little effect on the fundamental operations of this market form.
It’s good not to forget that surveillance capitalism, as described by this book, isn’t just something Google can do. Even if Google were to disappear tomorrow, other corporations could carry on the same practice.
- We claim human experience as raw material free for the taking. On the basis of this claim, we can ignore considerations of individuals’ rights, interests, awareness, or comprehension.
- On the basis of our claim, we assert the right to take an individual’s experience for translation into behavioral data.
- Our right to take, based on our claim of free raw material, confers the right to own the behavioral data derived from human experience.
- Our rights to take and to own confer the right to know what the data disclose.
- Our rights to take, to own, and to know confer the right to decide how we use our knowledge.
- Our rights to take, to own, to know, and to decide confer our rights to the conditions that preserve our rights to take, to own, to know, and to decide.
I enjoyed this rather succinct formulation of the rights that surveillance capital firms have created and assumed for themselves. It also helps to know what, specifically, we need to work against to control them and this new form of capitalism.
They know a great deal about us, but our access to their knowledge is sparse: hidden in the shadow text and read-only by the new priests, their bosses, and their machines.
The creepy power surveillance capital firms hold over people is impressive, but this imbalance of knowledge was a small detail I had overlooked and found surprising.
As things currently stand, it is the surveillance capitalist corporations that know. It is the market form that decides. It is the competitive struggle among surveillance capitalists that decides who decides.
I liked the framing the author gave of “Who knows, who decides, who decides who decides” and I found this a good use of it.
But in Varian’s scenario, what happens to the driver? What if there is a child in the car? Or a blizzard? Or a train to catch? Or a day-care center drop-off on the way to work? A mother on life support in the hospital still miles away? A son waiting to be picked up at school?
This section discusses how society is the alignment of many individuals. Contracts exist to help us reach collective agreement, but it’s human to allow ourselves to bend and break those agreements. Varian’s uncontract, however desirable it seems on its face, loses an important human aspect of social agreements. In this example, that loss can be downright deadly.
Imagine you have a hammer. That’s machine learning. It helped you climb a grueling mountain to reach the summit. That’s machine learning’s dominance of online data. On the mountaintop, you find a vast pile of nails, cheaper than anything previously imaginable. That’s the new smart sensor tech. An unbroken vista of virgin board stretches before you as far as you can see. That’s the whole dumb world. Then you learn that any time you plant a nail in a board with your machine learning hammer, you can extract value from that formerly dumb plank. That’s data monetization. What do you do? You start hammering like crazy and you never stop, unless somebody makes you stop. But there is nobody up here to make us stop. This is why the “internet of everything” is inevitable.
“When all you have is a hammer, everything looks like a nail” is a phrase I hear often, but this was a fun twist on it. I also liked that the quote admits it’s only “inevitable” because nobody is making them stop.
“The bank is something else than men. It happens that every man in a bank hates what the bank does, and yet the bank does it. The bank is something more than men, I tell you. It’s the monster. Men made it, but they can’t control it.”
I just really enjoyed this Steinbeck quote and it serves as a great analogy.
The goal of everything we do is to change people’s actual behavior at scale. We want to figure out the construction of changing a person’s behavior, and then we want to change how lots of people are making their day-to-day decisions. When people use our app, we can capture their behaviors and identify good and bad [ones]. Then we develop “treatments” or “data pellets” that select good behaviors. We can test how actionable our cues are for them and how profitable certain behaviors are for us.
Part of the mental shift I had in reading this book was going from just seeing surveillance capitalism as creepy to seeing it as problematic due to its control of behavior.
In this process of experimentation, economies of action are discovered, honed, and ultimately institutionalized in software programs and their algorithms that function automatically, continuously, ubiquitously, and pervasively to achieve economies of action. Facebook’s surplus is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.
Furthering the previous quote and my feelings about it, I liked this brief dissection of Facebook’s work to modify behavior for their profit.
The Facebook document detailed the many ways in which the corporation uses its stores of behavioral surplus to pinpoint the exact moment at which a young person needs a “confidence boost” and is therefore most vulnerable to a specific configuration of advertising cues and nudges: “By monitoring posts, pictures, interactions, and Internet activity, Facebook can work out when young people feel ‘stressed,’ ‘defeated,’ ‘overwhelmed,’ ‘anxious,’ ‘nervous,’ ‘stupid,’ ‘silly,’ ‘useless,’ and a ‘failure.’”
Again, a further description of Facebook’s behavioral control that solidifies my concerns about them and about other surveillance capital firms.
Another factor was the 1971 publication of B. F. Skinner’s incendiary social meditation Beyond Freedom & Dignity.
While reading The Age of Surveillance Capitalism, I was fascinated by B. F. Skinner, his views on this, and his writing on it. I’m interested in reading both this book, Beyond Freedom & Dignity, and the other book discussed, Walden Two.
When the founding fathers established our constitutional system of government, they based it on their fundamental belief in the sanctity of the individual.… They understood that self-determination is the source of individuality, and individuality is the mainstay of freedom.… Recently, however, technology has begun to develop new methods of behavior control capable of altering not just an individual’s actions but his very personality and manner of thinking… the behavioral technology being developed in the United States today touches upon the most basic sources of individuality and the very core of personal freedom… the most serious threat… is the power this technology gives one man to impose his views and values on another.… Concepts of freedom, privacy and self-determination inherently conflict with programs designed to control not just physical freedom, but the source of free thought as well.… The question becomes even more acute when these programs are conducted, as they are today, in the absence of strict controls. As disturbing as behavior modification may be on a theoretical level, the unchecked growth of the practical technology of behavior control is cause for even greater concern.”
This quote brings home why behavioral control on a mass scale is terrifying. While some examples given in The Age of Surveillance Capitalism sound positive, like encouraging people to vote, the problem is the power dynamic they introduce. Democracy, as flawed as it is, rests on the base assumption that people act independently to determine the outcome for the group. There has always been some amount of group manipulation in politics, but the new technologies at the center of this book increase the manipulation at play by at least an order of magnitude.
If industrial civilization flourished at the expense of nature and now threatens to cost us the Earth, an information civilization shaped by surveillance capitalism will thrive at the expense of human nature and threatens to cost us our humanity.
I found the various analogies in this book helpful, and this one stood out.
Forget the cliché that if it’s free, “You are the product.” You are not the product; you are the abandoned carcass. The “product” derives from the surplus that is ripped from your life.
This book regularly talks about and reframes the common “If it’s free, you’re the product” mantra, but this one was the most visceral quote I came across.
Big Other finally enables the universal technology of behavior that, as Skinner, Stuart MacKay, Mark Weiser, and Joe Paradiso each insisted, accomplishes its aims quietly and persistently, using methods that intentionally bypass our awareness, disappearing into the background of all things.
I think this is an interesting case of Hegel’s thesis, antithesis, synthesis. When Skinner came up with this sort of behavioral control, he saw it as a useful tool for building a utopian society. Humanity saw it as too powerful and pushed back. Over time, however, it wasn’t synthesized into a new form of the idea so much as taken over by capitalists who found a new way to profit from it.
I am quite curious to know what these researchers would think of the current applications of their initial ideas and how they’d feel about them.
In a 2015 murder case, police used data from a “smart” utility meter, an iPhone 6s Plus, and audio files captured by an Amazon Echo device to identify a suspect. In 2014 data from a Fitbit wristband were used in a personal injury case, and in 2017 police used data from a pacemaker to charge a man with arson and insurance fraud.
I primarily highlighted these so I could find the sources. For future reference, they are:
- Amazon Echo and the Hot Tub Murder
- Fitbit Data Now Being Used In The Courtroom
- When Fitbit Is the Expert Witness
- Cops use pacemaker data to charge homeowner with arson, insurance fraud
While I think they’re all pretty insightful and reveal that devices many of us trust can be used against us, the final one is the most eye-opening. A Fitbit tends to be a tool to encourage better fitness habits and an Amazon Echo is generally a small convenience, but a pacemaker is vital to someone’s life. For something so integral to you, something that’s literally part of you, to spy on you like that feels extremely worrying.
Oxford University China scholar Rogier Creemers, who translated some of the first documents on the social credit system, observes that “the trend towards social engineering and ‘nudging’ individuals towards ‘better’ behavior is also part of the Silicon Valley approach that holds that human problems can be solved once and for all through the disruptive power of technology.… In that sense, perhaps the most shocking element of the story is not the Chinese government’s agenda, but how similar it is to the path technology is taking elsewhere.”
Many Americans I speak to talk about the Chinese social credit system with disdain and are scared of such a system. What surprised me about this excerpt is seeing similar techniques being used here, except that instead of being aimed at improving society (not that that would make the behavioral manipulation ethical), they’re being used for corporate profit.
Indeed, surveillance capitalists such as Nadella, Page, and Zuckerberg conform to five of the six elements with which the great scholars of utopian thought, Frank and Fritzie Manuel, define the classic profile of the most ambitious modern utopianists:
- a tendency toward highly focused tunnel vision that simplifies the utopian challenge,
- an earlier and more trenchant grasp of a “new state of being” than other contemporaries,
- the obsessive pursuit and defense of an idée fixe,
- an unshakable belief in the inevitability of one’s ideas coming to fruition, and
- the drive for total reformation at the level of the species and the entire world system.
There’s power in writing observations like these down and making them so concise. I enjoyed this collection of traits that these new leaders of surveillance capitalism share.
No one has mapped the casino terrain more insightfully than MIT social anthropologist Natasha Dow Schüll in her fascinating examination of machine gambling in Las Vegas, Addiction by Design.
I mostly highlighted this to add the book to my reading list. Machine gambling seems like an interesting thing to study, and the book probably holds some great insights.
For example, Evil by Design author Chris Nodder, a user-experience consultant, explains that evil design aims to exploit human weakness by creating interfaces that “make users emotionally involved in doing something that benefits the designer more than them.” He coaches his readers in psychic numbing, urging them to accept the fact that such practices have become the standard, suggesting that consumers and designers find ways to “turn them to your advantage.”
This book also sounded interesting to read. The second half of this quote, the part about psychic numbing, is the most painful to read. It almost feels like the third stage of genocide as defined by Gregory Stanton’s “The 8 Stages of Genocide”: Dehumanization. By reframing and numbing yourself to the humanity of the enemy, you can commit atrocities. That’s on a completely different scale from what is being discussed here, but the mental numbing reminded me of it.
They concluded that because of a range of psychological and contextual factors, “People are often unaware of the information they are sharing, unaware of how it can be used, and even in the rare situations when they have full knowledge of the consequences of sharing, uncertain about their own preferences.…” The researchers cautioned that people are “easily influenced in what and how much they disclose. Moreover, what they share can be used to influence their emotions, thoughts, and behaviors.…” The result is alteration in “the balance of power between those holding the data and those who are the subjects of that data.”
While the start of this book exposed me to the behavioral manipulation that surveillance capital firms employ to increase profit, the end circled back to the more basic manipulation of simply getting us to share more data with them.
One consequence of the new density of social comparison triggers and their negative feedback loops is a psychological condition known as FOMO (“fear of missing out”). It is a form of social anxiety defined as “the uneasy and sometimes all-consuming feeling that… your peers are doing, in the know about, or in possession of more or something better than you.” It’s a young person’s affliction that is associated with negative mood and low levels of life satisfaction. Research has identified FOMO with compulsive Facebook use: FOMO sufferers obsessively checked their Facebook feeds—during meals, while driving, immediately upon waking or before sleeping, and so on. This compulsive behavior is intended to produce relief in the form of social reassurance, but it predictably breeds more anxiety and more searching.
I have some friends with intense FOMO, and I found this tidbit about compulsively turning to social media insightful. It’s also interesting from the poster’s perspective: I doubt many people share things on social media in an attempt to induce FOMO, yet it’s something so many people experience.
I wait to hear my husband’s breathing in syncopation with the muffled sighs of our beloved dog on the floor beside us as she sprints through her ecstatic dreams. I sense beyond to the dense envelope of our bedroom walls and listen to their lullaby of seclusion.
I liked this description of the author’s home in this section. She goes on to talk about how Big Other wants that seclusion removed for its own gain, while offering us small conveniences in return.
The real psychological truth is this: If you’ve got nothing to hide, you are nothing.
Zuboff’s retort to the common “what do you have to hide” mantra is great. Everyone has things they don’t share with everyone else. Hiding parts of oneself is normal and routine, and if you don’t do that, are you even human?
Until then, 1.5 billion of its users, including those in Africa, Asia, Australia, and Latin America, were governed by terms of service issued by the company’s international headquarters in Ireland, meaning that these terms fell under the EU framework. It was in late April that Facebook quietly issued new terms of service, placing those 1.5 billion users under US privacy laws and thus eliminating their ability to file claims in Irish courts.
This sneaky move isn’t surprising on Facebook’s part, especially given their regular slimy antics lately, but I hadn’t heard of it before.
Industrial capitalism commandeered nature only to saddle the coming generations with the burden of a burning planet. Will we add to this burden with surveillance capitalism’s invasion and conquest of human nature? Will we stand by as it subtly imposes the life of the hive while demanding the forfeit of sanctuary and the right to the future tense for the sake of its wealth and power?
We still haven’t completely tamed industrial capitalism, but the tools to do so are better known now. I think it’s important to recognize that we’re in a new stage of capitalism and to determine what tools we need to rein it in.
I reject inevitability, and it is my hope that as a result of our journey together, you will too.
This book is dense and weaves a thick web describing surveillance capitalism’s modus operandi, its problems, and what the future will look like if it goes unchecked. It can be a bit much. However, this call to action is, I think, the most important part. We must not accept the tale of inevitability we are fed.