<html><head><meta http-equiv=
"Content-Type" content=
"text/html; charset=UTF-8"><title>How to Destroy Surveillance Capitalism
</title><meta name=
"generator" content=
"DocBook XSL Stylesheets V1.79.1"><style type=
"text/css">
body { background-image: url('images/draft.png');
background-repeat: no-repeat;
background-position: top left;
/* The following properties make the watermark "fixed" on the page. */
/* I think that's just a bit too distracting for the reader... */
/* background-attachment: fixed; */
/* background-position: center center; */
}
</style></head><body bgcolor=
"white" text=
"black" link=
"#0000FF" vlink=
"#840084" alink=
"#0000FF"><div lang=
"en" class=
"article"><div class=
"titlepage"><div><div><h2 class=
"title"><a name=
"index"></a>How to Destroy Surveillance Capitalism
</h2></div><div><div class=
"authorgroup"><div class=
"author"><h3 class=
"author"><span class=
"firstname">Cory
</span> <span class=
"surname">Doctorow
</span></h3></div></div></div><div><p class=
"copyright">Copyright ©
2020 Cory Doctorow
</p></div><div><p class=
"copyright">Copyright ©
2020 Petter Reinholdtsen
</p></div><div><div class=
"legalnotice"><a name=
"idm18"></a><p>
How to Destroy Surveillance Capitalism by Cory Doctorow.
Published by Petter Reinholdtsen.
ISBN 978-82-93828-05-1 (hard cover)
ISBN 978-82-93828-06-8 (paperback)
ISBN 978-82-93828-07-5 (ePub)
<span class=
"inlinemediaobject"><img src=
"images/cc.png" align=
"middle" height=
"38" alt=
"Creative Commons, Some rights reserved"></span>
This book is licensed under a Creative Commons license. This
license permits any use of this work, so long as attribution is
given and no derivative material is distributed. For more
information about the license visit
<a class=
"ulink" href=
"https://creativecommons.org/licenses/by-nd/4.0/" target=
"_top">https://creativecommons.org/licenses/by-nd/4.0/</a>.
</p></div></div></div><hr></div><div class=
"toc"><p><b>Table of Contents
</b></p><dl class=
"toc"><dt><span class=
"sect1"><a href=
"#the-net-of-a-thousand-lies">The net of a thousand lies
</a></span></dt><dt><span class=
"sect1"><a href=
"#digital-rights-activism-a-quarter-century-on">Digital rights activism, a quarter-century on
</a></span></dt><dt><span class=
"sect1"><a href=
"#tech-exceptionalism-then-and-now">Tech exceptionalism, then and now
</a></span></dt><dt><span class=
"sect1"><a href=
"#dont-believe-the-hype">Don’t believe the hype
</a></span></dt><dt><span class=
"sect1"><a href=
"#what-is-persuasion">What is persuasion?
</a></span></dt><dd><dl><dt><span class=
"sect2"><a href=
"#segmenting">1. Segmenting
</a></span></dt><dt><span class=
"sect2"><a href=
"#deception">2. Deception
</a></span></dt><dt><span class=
"sect2"><a href=
"#domination">3. Domination
</a></span></dt><dt><span class=
"sect2"><a href=
"#bypassing-our-rational-faculties">4. Bypassing our rational faculties
</a></span></dt></dl></dd><dt><span class=
"sect1"><a href=
"#if-data-is-the-new-oil-then-surveillance-capitalisms-engine-has-a-leak">If data is the new oil, then surveillance capitalism’s engine
has a leak
</a></span></dt><dt><span class=
"sect1"><a href=
"#what-is-facebook">What is Facebook?
</a></span></dt><dt><span class=
"sect1"><a href=
"#monopoly-and-the-right-to-the-future-tense">Monopoly and the right to the future tense
</a></span></dt><dt><span class=
"sect1"><a href=
"#search-order-and-the-right-to-the-future-tense">Search order and the right to the future tense
</a></span></dt><dt><span class=
"sect1"><a href=
"#monopolists-can-afford-sleeping-pills-for-watchdogs">Monopolists can afford sleeping pills for watchdogs
</a></span></dt><dt><span class=
"sect1"><a href=
"#privacy-and-monopoly">Privacy and monopoly
</a></span></dt><dt><span class=
"sect1"><a href=
"#ronald-reagan-pioneer-of-tech-monopolism">Ronald Reagan, pioneer of tech monopolism
</a></span></dt><dt><span class=
"sect1"><a href=
"#steering-with-the-windshield-wipers">Steering with the windshield wipers
</a></span></dt><dt><span class=
"sect1"><a href=
"#surveillance-still-matters">Surveillance still matters
</a></span></dt><dt><span class=
"sect1"><a href=
"#dignity-and-sanctuary">Dignity and sanctuary
</a></span></dt><dt><span class=
"sect1"><a href=
"#afflicting-the-afflicted">Afflicting the afflicted
</a></span></dt><dt><span class=
"sect1"><a href=
"#any-data-you-collect-and-retain-will-eventually-leak">Any data you collect and retain will eventually leak
</a></span></dt><dt><span class=
"sect1"><a href=
"#critical-tech-exceptionalism-is-still-tech-exceptionalism">Critical tech exceptionalism is still tech
exceptionalism
</a></span></dt><dt><span class=
"sect1"><a href=
"#how-monopolies-not-mind-control-drive-surveillance-capitalism-the-snapchat-story">How monopolies, not mind control, drive surveillance
capitalism: The Snapchat story
</a></span></dt><dt><span class=
"sect1"><a href=
"#a-monopoly-over-your-friends">A monopoly over your friends
</a></span></dt><dt><span class=
"sect1"><a href=
"#fake-news-is-an-epistemological-crisis">Fake news is an epistemological crisis
</a></span></dt><dt><span class=
"sect1"><a href=
"#tech-is-different">Tech is different
</a></span></dt><dt><span class=
"sect1"><a href=
"#ownership-of-facts">Ownership of facts
</a></span></dt><dt><span class=
"sect1"><a href=
"#persuasion-works-slowly">Persuasion works… slowly
</a></span></dt><dt><span class=
"sect1"><a href=
"#paying-wont-help">Paying won’t help
</a></span></dt><dt><span class=
"sect1"><a href=
"#an-ecology-moment-for-trustbusting">An
<span class=
"quote">“
<span class=
"quote">ecology
</span>”
</span> moment for trustbusting
</a></span></dt><dt><span class=
"sect1"><a href=
"#make-big-tech-small-again">Make Big Tech small again
</a></span></dt><dt><span class=
"sect1"><a href=
"#goto-10">20 GOTO
10</a></span></dt><dt><span class=
"sect1"><a href=
"#up-and-through">Up and through
</a></span></dt></dl></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"the-net-of-a-thousand-lies"></a>The net of a thousand lies
</h2></div></div></div><p>
The most surprising thing about the rebirth of flat Earthers in the
21st century is just how widespread the evidence against them is.
You can understand how, centuries ago, people who’d never gained a
high-enough vantage point from which to see the Earth’s curvature
might come to the commonsense belief that the flat-seeming Earth
was, indeed, flat.
</p><p>
But today, when elementary schools routinely dangle GoPro cameras
from balloons and loft them high enough to photograph the Earth’s
curve — to say nothing of the unexceptional sight of the curved
Earth from an airplane window — it takes a heroic effort to maintain
the belief that the world is flat.
</p><p>
Likewise for white nationalism and eugenics: In an age where you can
become a computational genomics datapoint by swabbing your cheek and
mailing it to a gene-sequencing company along with a modest sum of
money,
<span class=
"quote">“
<span class=
"quote">race science
</span>”
</span> has never been easier to refute.
</p><p>
We are living through a golden age of both readily available facts
and denial of those facts. Terrible ideas that have lingered on the
fringes for decades or even centuries have gone mainstream seemingly
overnight.
</p><p>
When an obscure idea gains currency, there are only two things that
can explain its ascendance: Either the person expressing that idea
has gotten a lot better at stating their case, or the proposition
has become harder to deny in the face of mounting evidence. In other
words, if we want people to take climate change seriously, we can
get a bunch of Greta Thunbergs to make eloquent, passionate
arguments from podiums, winning our hearts and minds, or we can wait
for flood, fire, broiling sun, and pandemics to make the case for
us. In practice, we’ll probably have to do some of both: The more
we’re boiling and burning and drowning and wasting away, the easier
it will be for the Greta Thunbergs of the world to convince us.
</p><p>
The arguments for ridiculous beliefs in odious conspiracies like
anti-vaccination, climate denial, a flat Earth, and eugenics are no
better than they were a generation ago. Indeed, they’re worse
because they are being pitched to people who have at least a
background awareness of the refuting facts.
</p><p>
Anti-vax has been around since the first vaccines, but the early
anti-vaxxers were pitching people who were less equipped to
understand even the most basic ideas from microbiology, and
moreover, those people had not witnessed the extermination of
mass-murdering diseases like polio, smallpox, and measles. Today’s
anti-vaxxers are no more eloquent than their forebears, and they
have a much harder job.
</p><p>
So can these far-fetched conspiracy theorists really be succeeding
on the basis of superior arguments?
</p><p>
Some people think so. Today, there is a widespread belief that
machine learning and commercial surveillance can turn even the most
fumble-tongued conspiracy theorist into a svengali who can warp your
perceptions and win your belief by locating vulnerable people and
then pitching them with A.I.-refined arguments that bypass their
rational faculties and turn everyday people into flat Earthers,
anti-vaxxers, or even Nazis. When the RAND Corporation
<a class=
"ulink" href=
"https://www.rand.org/content/dam/rand/pubs/research_reports/RR400/RR453/RAND_RR453.pdf" target=
"_top">blames
Facebook for
<span class=
"quote">“
<span class=
"quote">radicalization
</span>”
</span></a> and when Facebook’s role in
spreading coronavirus misinformation is
<a class=
"ulink" href=
"https://secure.avaaz.org/campaign/en/facebook_threat_health/" target=
"_top">blamed
on its algorithm
</a>, the implicit message is that machine
learning and surveillance are causing the changes in our consensus
about what’s true.
</p><p>
After all, in a world where sprawling and incoherent conspiracy
theories like Pizzagate and its successor, QAnon, have widespread
followings,
<span class=
"emphasis"><em>something
</em></span> must be afoot.
</p><p>
But what if there’s another explanation? What if it’s the material
circumstances, and not the arguments, that are making the difference
for these conspiracy pitchmen? What if the trauma of living through
<span class=
"emphasis"><em>real conspiracies
</em></span> all around us — conspiracies
among wealthy people, their lobbyists, and lawmakers to bury
inconvenient facts and evidence of wrongdoing (these conspiracies
are commonly known as
<span class=
"quote">“
<span class=
"quote">corruption
</span>”
</span>) — is making people vulnerable to
110 If it’s trauma and not contagion — material conditions and not
111 ideology — that is making the difference today and enabling a rise
112 of repulsive misinformation in the face of easily observed facts,
113 that doesn’t mean our computer networks are blameless. They’re still
114 doing the heavy work of locating vulnerable people and guiding them
115 through a series of ever-more-extreme ideas and communities.
</p><p>
Belief in conspiracy is a raging fire that has done real damage and
poses real danger to our planet and species, from epidemics
<a class=
"ulink" href=
"https://www.cdc.gov/measles/cases-outbreaks.html" target=
"_top">kicked
off by vaccine denial
</a> to genocides
<a class=
"ulink" href=
"https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html" target=
"_top">kicked
off by racist conspiracies
</a> to planetary meltdown caused by
denial-inspired climate inaction. Our world is on fire, and so we
have to put the fires out — to figure out how to help people see the
truth of the world through the conspiracies they’ve been confused
by.
</p><p>
But firefighting is reactive. We need fire
<span class=
"emphasis"><em>prevention
</em></span>. We need to strike at the traumatic
material conditions that make people vulnerable to the contagion of
conspiracy. Here, too, tech has a role to play.
</p><p>
There’s no shortage of proposals to address this. From the EU’s
<a class=
"ulink" href=
"https://edri.org/tag/terreg/" target=
"_top">Terrorist Content
Regulation
</a>, which requires platforms to police and remove
<span class=
"quote">“
<span class=
"quote">extremist
</span>”
</span> content, to the U.S. proposals to
<a class=
"ulink" href=
"https://www.eff.org/deeplinks/2020/03/earn-it-act-violates-constitution" target=
"_top">force
tech companies to spy on their users
</a> and hold them liable
<a class=
"ulink" href=
"https://www.natlawreview.com/article/repeal-cda-section-230" target=
"_top">for
their users’ bad speech
</a>, there’s a lot of energy to force
tech companies to solve the problems they created.
</p><p>
There’s a critical piece missing from the debate, though. All these
solutions assume that tech companies are a fixture, that their
dominance over the internet is a permanent fact. Proposals to
replace Big Tech with a more diffused, pluralistic internet are
nowhere to be found. Worse: The
<span class=
"quote">“
<span class=
"quote">solutions
</span>”
</span> on the table today
<span class=
"emphasis"><em>require
</em></span> Big Tech to stay big because only the
very largest companies can afford to implement the systems these
laws demand.
</p><p>
Figuring out what we want our tech to look like is crucial if we’re
going to get out of this mess. Today, we’re at a crossroads where
we’re trying to figure out if we want to fix the Big Tech companies
that dominate our internet or if we want to fix the internet itself
by unshackling it from Big Tech’s stranglehold. We can’t do both, so
we have to choose.
</p><p>
I want us to choose wisely. Taming Big Tech is integral to fixing
the internet, and for that, we need digital rights activism.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"digital-rights-activism-a-quarter-century-on"></a>Digital rights activism, a quarter-century on
</h2></div></div></div><p>
Digital rights activism is more than 30 years old now. The
Electronic Frontier Foundation turned 30 this year; the Free
Software Foundation launched in 1985. For most of the history of the
movement, the most prominent criticism leveled against it was that
it was irrelevant: The real activist causes were real-world causes
(think of the skepticism when
<a class=
"ulink" href=
"https://www.loc.gov/law/foreign-news/article/finland-legal-right-to-broadband-for-all-citizens/#:~:text=Global%20Legal%20Monitor,-Home%20%7C%20Search%20%7C%20Browse&text=(July%206%2C%202010)%20On,connection%20100%20MBPS%20by%202015." target=
"_top">Finland
declared broadband a human right in 2010</a>), and real-world
activism was shoe-leather activism (think of Malcolm Gladwell’s
<a class=
"ulink" href=
"https://www.newyorker.com/magazine/2010/10/04/small-change-malcolm-gladwell" target=
"_top">contempt
for
<span class=
"quote">“
<span class=
"quote">clicktivism
</span>”
</span></a>). But as tech has grown more central to
our daily lives, these accusations of irrelevance have given way
first to accusations of insincerity (
<span class=
"quote">“
<span class=
"quote">You only care about tech because you’re
<a class=
"ulink" href=
"https://www.ipwatchdog.com/2018/06/04/report-engine-eff-shills-google-patent-reform/id=98007/" target=
"_top">shilling
for tech companies
</a></span>”
</span>) to accusations of negligence (
<span class=
"quote">“
<span class=
"quote">Why
didn’t you foresee that tech could be such a destructive force?
</span>”
</span>).
But digital rights activism is right where it’s always been: looking
out for the humans in a world where tech is inexorably taking over.
</p><p>
The latest version of this critique comes in the form of
<span class=
"quote">“
<span class=
"quote">surveillance capitalism,
</span>”
</span> a term coined by business professor
Shoshana Zuboff in her long and influential 2019 book,
<span class=
"emphasis"><em>The
Age of Surveillance Capitalism: The Fight for a Human Future at the
New Frontier of Power
</em></span>. Zuboff argues that
<span class=
"quote">“
<span class=
"quote">surveillance
capitalism
</span>”
</span> is a unique creature of the tech industry and that it is
unlike any other abusive commercial practice in history, one that is
<span class=
"quote">“
<span class=
"quote">constituted by unexpected and often illegible mechanisms of
extraction, commodification, and control that effectively exile
persons from their own behavior while producing new markets of
behavioral prediction and modification. Surveillance capitalism
challenges democratic norms and departs in key ways from the
centuries-long evolution of market capitalism.
</span>”
</span> It is a new and
deadly form of capitalism, a
<span class=
"quote">“
<span class=
"quote">rogue capitalism,
</span>”
</span> and our lack of
understanding of its unique capabilities and dangers represents an
existential, species-wide threat. She’s right that capitalism today
threatens our species, and she’s right that tech poses unique
challenges to our species and civilization, but she’s really wrong
about how tech is different and why it threatens our species.
</p><p>
What’s more, I think that her incorrect diagnosis will lead us down
a path that ends up making Big Tech stronger, not weaker. We need to
take down Big Tech, and to do that, we need to start by correctly
identifying the problem.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"tech-exceptionalism-then-and-now"></a>Tech exceptionalism, then and now
</h2></div></div></div><p>
Early critics of the digital rights movement — perhaps best
represented by campaigning organizations like the Electronic
Frontier Foundation, the Free Software Foundation, Public Knowledge,
and others that focused on preserving and enhancing basic human
rights in the digital realm — damned activists for practicing
<span class=
"quote">“
<span class=
"quote">tech
exceptionalism.
</span>”
</span> Around the turn of the millennium, serious people
ridiculed any claim that tech policy mattered in the
<span class=
"quote">“
<span class=
"quote">real world.
</span>”
</span>
Claims that tech rules had implications for speech, association,
privacy, search and seizure, and fundamental rights and equities
were treated as ridiculous, an elevation of the concerns of sad
nerds arguing about
<span class=
"emphasis"><em>Star Trek
</em></span> on bulletin board
systems above the struggles of the Freedom Riders, Nelson Mandela,
or the Warsaw ghetto uprising.
</p><p>
In the decades since, accusations of
<span class=
"quote">“
<span class=
"quote">tech exceptionalism
</span>”
</span> have only
sharpened as tech’s role in everyday life has expanded: Now that
tech has infiltrated every corner of our life and our online lives
have been monopolized by a handful of giants, defenders of digital
freedoms are accused of carrying water for Big Tech, providing cover
for its self-interested negligence (or worse, nefarious plots).
</p><p>
From my perspective, the digital rights movement has remained
stationary while the rest of the world has moved. From the earliest
days, the movement’s concern was users and the toolsmiths who
provided the code they needed to realize their fundamental rights.
Digital rights activists only cared about companies to the extent
that companies were acting to uphold users’ rights (or, just as
often, when companies were acting so foolishly that they threatened
to bring down new rules that would also make it harder for good
actors to help users).
</p><p>
The
<span class=
"quote">“
<span class=
"quote">surveillance capitalism
</span>”
</span> critique recasts the digital rights
movement in a new light again: not as alarmists who overestimate the
importance of their shiny toys nor as shills for big tech but as
serene deck-chair rearrangers whose long-standing activism is a
liability because it makes them incapable of perceiving novel
threats as they continue to fight the last century’s tech battles.
</p><p>
But tech exceptionalism is a sin no matter who practices it.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"dont-believe-the-hype"></a>Don’t believe the hype
</h2></div></div></div><p>
You’ve probably heard that
<span class=
"quote">“
<span class=
"quote">if you’re not paying for the product,
you’re the product.
</span>”
</span> As we’ll see below, that’s true, if incomplete.
But what is
<span class=
"emphasis"><em>absolutely
</em></span> true is that ad-driven
Big Tech’s customers are advertisers, and what companies like Google
and Facebook sell is their ability to convince
<span class=
"emphasis"><em>you
</em></span> to buy stuff. Big Tech’s product is
persuasion. The services — social media, search engines, maps,
messaging, and more — are delivery systems for persuasion.
</p><p>
The fear of surveillance capitalism starts from the (correct)
presumption that everything Big Tech says about itself is probably a
lie. But the surveillance capitalism critique makes an exception for
the claims Big Tech makes in its sales literature — the breathless
hype in the pitches to potential advertisers online and in ad-tech
seminars about the efficacy of its products: It assumes that Big
Tech is as good at influencing us as they claim they are when
they’re selling influencing products to credulous customers. That’s
a mistake because sales literature is not a reliable indicator of a
product’s efficacy.
</p><p>
Surveillance capitalism assumes that because advertisers buy a lot
of what Big Tech is selling, Big Tech must be selling something
real. But Big Tech’s massive sales could just as easily be the
result of a popular delusion or something even more pernicious:
monopolistic control over our communications and commerce.
</p><p>
Being watched changes your behavior, and not for the better. It
creates risks for our social progress. Zuboff’s book features
beautifully wrought explanations of these phenomena. But Zuboff also
claims that surveillance literally robs us of our free will — that
when our personal data is mixed with machine learning, it creates a
system of persuasion so devastating that we are helpless before it.
That is, Facebook uses an algorithm to analyze the data it
nonconsensually extracts from your daily life and uses it to
customize your feed in ways that get you to buy stuff. It is a
mind-control ray out of a 1950s comic book, wielded by mad
scientists whose supercomputers guarantee them perpetual and total
victory.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"what-is-persuasion"></a>What is persuasion?
</h2></div></div></div><p>
To understand why you shouldn’t worry about mind-control rays — but
why you
<span class=
"emphasis"><em>should
</em></span> worry about surveillance
<span class=
"emphasis"><em>and
</em></span> Big Tech — we must start by unpacking what
we mean by
<span class=
"quote">“
<span class=
"quote">persuasion.
</span>”
</span>
</p><p>
Google, Facebook, and other surveillance capitalists promise their
customers (the advertisers) that if they use machine-learning tools
trained on unimaginably large data sets of nonconsensually harvested
personal information, they will be able to uncover ways to bypass
the rational faculties of the public and direct their behavior,
creating a stream of purchases, votes, and other desired outcomes.
</p><div class=
"blockquote"><blockquote class=
"blockquote"><p>
The impact of dominance far exceeds the impact of manipulation and
should be central to our analysis and any remedies we seek.
</p></blockquote></div><p>
But there’s little evidence that this is happening. Instead, the
predictions that surveillance capitalism delivers to its customers
are much less impressive. Rather than finding ways to bypass our
rational faculties, surveillance capitalists like Mark Zuckerberg
mostly do one or more of three things:
</p><div class=
"sect2"><div class=
"titlepage"><div><div><h3 class=
"title"><a name=
"segmenting"></a>1. Segmenting
</h3></div></div></div><p>
If you’re selling diapers, you have better luck if you pitch them
to people in maternity wards. Not everyone who enters or leaves a
maternity ward just had a baby, and not everyone who just had a
baby is in the market for diapers. But having a baby is a really
reliable correlate of being in the market for diapers, and being
in a maternity ward is highly correlated with having a baby. Hence
diaper ads around maternity wards (and even pitchmen for baby
products, who haunt maternity wards with baskets full of
freebies).
</p><p>
Surveillance capitalism is segmenting times a billion. Diaper
vendors can go way beyond people in maternity wards (though they
can do that, too, with things like location-based mobile ads).
They can target you based on whether you’re reading articles about
child-rearing, diapers, or a host of other subjects, and data
mining can suggest unobvious keywords to advertise against. They
can target you based on the articles you’ve recently read. They
can target you based on what you’ve recently purchased. They can
target you based on whether you receive emails or private messages
about these subjects — or even if you speak aloud about them
(though Facebook and the like convincingly claim that’s not
how they work).
</p><p>
This is seriously creepy.
</p><p>
But it’s not mind control.
</p><p>
It doesn’t deprive you of your free will. It doesn’t trick you.
</p><p>
Think of how surveillance capitalism works in politics.
Surveillance capitalist companies sell political operatives the
power to locate people who might be receptive to their pitch.
Candidates campaigning on finance industry corruption seek people
struggling with debt; candidates campaigning on xenophobia seek
out racists. Political operatives have always targeted their
message whether their intentions were honorable or not: Union
organizers set up pitches at factory gates, and white supremacists
hand out fliers at John Birch Society meetings.
</p><p>
But this is an inexact and thus wasteful practice. The union
organizer can’t know which worker to approach on the way out of
the factory gates and may waste their time on a covert John Birch
Society member; the white supremacist doesn’t know which of the
Birchers are so delusional that making it to a meeting is as much
as they can manage and which ones might be convinced to cross the
country to carry a tiki torch through the streets of
Charlottesville, Virginia.
</p><p>
Because targeting improves the yields on political pitches, it can
accelerate the pace of political upheaval by making it possible
for everyone who has secretly wished for the toppling of an
autocrat — or just an 11-term incumbent politician — to find
everyone else who feels the same way at very low cost. This has
been critical to the rapid crystallization of recent political
movements including Black Lives Matter and Occupy Wall Street as
well as less savory players like the far-right white nationalist
movements that marched in Charlottesville.
</p><p>
It’s important to differentiate this kind of political organizing
from influence campaigns; finding people who secretly agree with
you isn’t the same as convincing people to agree with you. The
rise of phenomena like nonbinary or otherwise nonconforming gender
identities is often characterized by reactionaries as the result
of online brainwashing campaigns that convince impressionable
people that they have been secretly queer all along.
</p><p>
But the personal accounts of those who have come out tell a
different story where people who long harbored a secret about
their gender were emboldened by others coming forward and where
people who knew that they were different but lacked a vocabulary
for discussing that difference learned the right words from these
low-cost means of finding people and learning about their ideas.
</p></div><div class=
"sect2"><div class=
"titlepage"><div><div><h3 class=
"title"><a name=
"deception"></a>2. Deception
</h3></div></div></div><p>
Lies and fraud are pernicious, and surveillance capitalism
supercharges them through targeting. If you want to sell a
fraudulent payday loan or subprime mortgage, surveillance
capitalism can help you find people who are both desperate and
unsophisticated and thus receptive to your pitch. This accounts
for the rise of many phenomena, like multilevel marketing schemes,
in which deceptive claims about potential earnings and the
efficacy of sales techniques are targeted at desperate people by
advertising against search queries that indicate, for example,
someone struggling with ill-advised loans.
</p><p>
Surveillance capitalism also abets fraud by making it easy to
locate other people who have been similarly deceived, forming a
community of people who reinforce one another’s false beliefs.
Think of
<a class=
"ulink" href=
"https://www.vulture.com/2020/01/the-dream-podcast-review.html" target=
"_top">the
forums
</a> where people who are being victimized by multilevel
marketing frauds gather to trade tips on how to improve their luck
in peddling the product.
</p><p>
Sometimes, online deception involves replacing someone’s correct
beliefs with incorrect ones, as it does in the anti-vaccination
movement, whose victims are often people who start out believing
in vaccines but are convinced by seemingly plausible evidence that
leads them into the false belief that vaccines are harmful.
</p><p>
But it’s much more common for fraud to succeed when it doesn’t
have to displace a true belief. When my daughter contracted head
lice at daycare, one of the daycare workers told me I could get
rid of them by treating her hair and scalp with olive oil. I
didn’t know anything about head lice, and I assumed that the
daycare worker did, so I tried it (it didn’t work, and it doesn’t
work). It’s easy to end up with false beliefs when you simply
don’t know any better and when those beliefs are conveyed by
someone who seems to know what they’re doing.
</p><p>
This is pernicious and difficult — and it’s also the kind of thing
the internet can help guard against by making true information
available, especially in a form that exposes the underlying
deliberations among parties with sharply divergent views, such as
Wikipedia. But it’s not brainwashing; it’s fraud. In the
<a class=
"ulink" href=
"https://datasociety.net/library/data-voids/" target=
"_top">majority
of cases
</a>, the victims of these fraud campaigns have an
informational void filled in the customary way, by consulting a
seemingly reliable source. If I look up the length of the Brooklyn
Bridge and learn that it is 5,800 feet long, but in reality, it is
5,989 feet long, the underlying deception is a problem, but it’s a
problem with a simple remedy. It’s a very different problem from
the anti-vax issue in which someone’s true belief is displaced by
a false one by means of sophisticated persuasion.
</p></div><div class=
"sect2"><div class=
"titlepage"><div><div><h3 class=
"title"><a name=
"domination"></a>3. Domination
</h3></div></div></div><p>
Surveillance capitalism is the result of monopoly. Monopoly is the cause, and surveillance capitalism and its negative outcomes are the effects of monopoly. I’ll get into this in depth later, but for now, suffice it to say that the tech industry has grown up with a radical theory of antitrust that has allowed companies to grow by merging with their rivals, buying up their nascent competitors, and expanding to control whole market verticals.
One example of how monopolism aids in persuasion is through dominance: Google makes editorial decisions about its algorithms that determine the sort order of the responses to our queries. If a cabal of fraudsters have set out to trick the world into thinking that the Brooklyn Bridge is 5,800 feet long, and if Google gives a high search rank to this group in response to queries like <span class="quote">“<span class="quote">How long is the Brooklyn Bridge?</span>”</span> then the first eight or 10 screens’ worth of Google results could be wrong. And since most people don’t go beyond the first couple of results — let alone the first <span class="emphasis"><em>page</em></span> of results — Google’s choice means that many people will be deceived.
Google’s dominance over search — more than 86% of web searches are performed through Google — means that the way it orders its search results has an outsized effect on public beliefs. Ironically, Google claims this is why it can’t afford to have any transparency in its algorithm design: Google’s search dominance makes the results of its sorting too important to risk telling the world how it arrives at those results lest some bad actor discover a flaw in the ranking system and exploit it to push its point of view to the top of the search results. There’s an obvious remedy to a company that is too big to audit: break it up into smaller pieces.
Zuboff calls surveillance capitalism a <span class="quote">“<span class="quote">rogue capitalism</span>”</span> whose data-hoarding and machine-learning techniques rob us of our free will. But influence campaigns that seek to displace existing, correct beliefs with false ones have an effect that is small and temporary while monopolistic dominance over informational systems has massive, enduring effects. Controlling the results to the world’s search queries means controlling access both to arguments and their rebuttals and, thus, control over much of the world’s beliefs. If our concern is how corporations are foreclosing on our ability to make up our own minds and determine our own futures, the impact of dominance far exceeds the impact of manipulation and should be central to our analysis and any remedies we seek.
</p></div><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="bypassing-our-rational-faculties"></a>4. Bypassing our rational faculties</h3></div></div></div><p>
<span class="emphasis"><em>This</em></span> is the good stuff: using machine learning, <span class="quote">“<span class="quote">dark patterns,</span>”</span> engagement hacking, and other techniques to get us to do things that run counter to our better judgment. This is mind control.
Some of these techniques have proven devastatingly effective (if only in the short term). The use of countdown timers on a purchase completion page can create a sense of urgency that causes you to ignore the nagging internal voice suggesting that you should shop around or sleep on your decision. The use of people from your social graph in ads can provide <span class="quote">“<span class="quote">social proof</span>”</span> that a purchase is worth making. Even the auction system pioneered by eBay is calculated to play on our cognitive blind spots, letting us feel like we <span class="quote">“<span class="quote">own</span>”</span> something because we bid on it, thus encouraging us to bid again when we are outbid to ensure that <span class="quote">“<span class="quote">our</span>”</span> things stay <span class="quote">“<span class="quote">ours.</span>”</span>
Games are extraordinarily good at this. <span class="quote">“<span class="quote">Free to play</span>”</span> games manipulate us through many techniques, such as presenting players with a series of smoothly escalating challenges that create a sense of mastery and accomplishment but which sharply transition into a set of challenges that are impossible to overcome without paid upgrades. Add some social proof to the mix — a stream of notifications about how well your friends are faring — and before you know it, you’re buying virtual power-ups to get to the next level.
Companies have risen and fallen on these techniques, and the <span class="quote">“<span class="quote">fallen</span>”</span> part is worth paying attention to. In general, living things adapt to stimulus: Something that is very compelling or noteworthy when you first encounter it fades with repetition until you stop noticing it altogether. Consider the refrigerator hum that irritates you when it starts up but disappears into the background so thoroughly that you only notice it when it stops.
That’s why behavioral conditioning uses <span class="quote">“<span class="quote">intermittent reinforcement schedules.</span>”</span> Instead of giving you a steady drip of encouragement or setbacks, games and gamified services scatter rewards on a randomized schedule — often enough to keep you interested and random enough that you can never quite find the pattern that would make it boring.
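The randomized schedule described above is what behaviorists call a variable-ratio reinforcement schedule, and it can be sketched in a few lines of Python. This is an illustrative sketch only; the 15% reward probability is an arbitrary value chosen for the example, not a figure from any real game.

```python
import random

def should_reward(p: float = 0.15) -> bool:
    """Variable-ratio schedule: every action carries the same small,
    independent chance of a payoff, so no fixed pattern ever emerges
    for the player to learn and get bored of."""
    return random.random() < p

# Seeded only so the illustration is repeatable.
random.seed(42)
rewards = sum(should_reward() for _ in range(10_000))
print(f"{rewards} rewards in 10,000 actions")
```

Because each action's outcome is independent, rewards arrive often enough to sustain interest but at intervals that never settle into a predictable rhythm.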
Intermittent reinforcement is a powerful behavioral tool, but it also represents a collective action problem for surveillance capitalism. The <span class="quote">“<span class="quote">engagement techniques</span>”</span> invented by the behaviorists of surveillance capitalist companies are quickly copied across the whole sector so that what starts as a mysteriously compelling fillip in the design of a service — like <span class="quote">“<span class="quote">pull to refresh</span>”</span> or alerts when someone likes your posts or side quests that your characters get invited to while in the midst of main quests — quickly becomes dully ubiquitous. The impossible-to-nail-down nonpattern of randomized drips from your phone becomes a grey-noise wall of sound as every single app and site starts to make use of whatever seems to be working at the moment.
From the surveillance capitalist’s point of view, our adaptive capacity is like a harmful bacterium that deprives it of its food source — our attention — and novel techniques for snagging that attention are like new antibiotics that can be used to breach our defenses and destroy our self-determination. And there <span class="emphasis"><em>are</em></span> techniques like that. Who can forget the Great Zynga Epidemic, when all of our friends were caught in <span class="emphasis"><em>FarmVille</em></span>’s endless, mindless dopamine loops? But every new attention-commanding technique is jumped on by the whole industry and used so indiscriminately that antibiotic resistance sets in. Given enough repetition, almost all of us develop immunity to even the most powerful techniques — by 2013, two years after Zynga’s peak, its user base had halved.
Not everyone, of course. Some people never adapt to stimulus, just as some people never stop hearing the hum of the refrigerator. This is why most people who are exposed to slot machines play them for a while and then move on while a small and tragic minority liquidate their kids’ college funds, buy adult diapers, and position themselves in front of a machine until they collapse.
But surveillance capitalism’s margins on behavioral modification suck. Tripling the rate at which someone buys a widget sounds great <a class="ulink" href="https://www.forbes.com/sites/priceonomics/2018/03/09/the-advertising-conversion-rates-for-every-major-tech-platform/#2f6a67485957" target="_top">unless the base rate is way less than 1%</a> with an improved rate of… still less than 1%. Even penny slot machines pull down pennies for every spin while surveillance capitalism rakes in infinitesimal penny fractions.
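The base-rate arithmetic here is easy to check directly. The figure below is hypothetical, chosen only to illustrate the point, not a number drawn from the Forbes article linked above:

```python
# Hypothetical ad-conversion figures for illustration only.
base_rate = 0.004          # 0.4% of viewers buy the widget
tripled = base_rate * 3    # a 3x improvement sounds impressive...
print(f"{tripled:.1%}")    # ...but prints "1.2%": still almost nobody buys
```

A threefold improvement on a tiny base rate is still a tiny rate, which is the whole point: the absolute yield per viewer stays minuscule.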
Slot machines’ high returns mean that they can be profitable just by draining the fortunes of the small rump of people who are pathologically vulnerable to them and unable to adapt to their tricks. But surveillance capitalism can’t survive on the fractional pennies it brings down from that vulnerable sliver — that’s why, after the Great Zynga Epidemic had finally burned itself out, the small number of still-addicted players left behind couldn’t sustain it as a global phenomenon. And new powerful attention weapons aren’t easy to find, as is evidenced by the long years since the last time Zynga had a hit. Despite the hundreds of millions of dollars that Zynga has to spend on developing new tools to blast through our adaptation, it has never managed to repeat the lucky accident that let it snag so much of our attention for a brief moment in 2009. Powerhouses like Supercell have fared a little better, but they are rare and throw away many failures for every success.
The vulnerability of small segments of the population to dramatic, efficient corporate manipulation is a real concern that’s worthy of our attention and energy. But it’s not an existential threat to society.
</p></div></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="if-data-is-the-new-oil-then-surveillance-capitalisms-engine-has-a-leak"></a>If data is the new oil, then surveillance capitalism’s engine has a leak</h2></div></div></div><p>
This adaptation problem offers an explanation for one of surveillance capitalism’s most alarming traits: its relentless hunger for data and its endless expansion of data-gathering capabilities through the spread of sensors, online surveillance, and acquisition of data streams from third parties.
Zuboff observes this phenomenon and concludes that data must be very valuable if surveillance capitalism is so hungry for it. (In her words: <span class="quote">“<span class="quote">Just as industrial capitalism was driven to the continuous intensification of the means of production, so surveillance capitalists and their market players are now locked into the continuous intensification of the means of behavioral modification and the gathering might of instrumentarian power.</span>”</span>) But what if the voracious appetite is because data has such a short half-life — because people become inured so quickly to new, data-driven persuasion techniques — that the companies are locked in an arms race with our limbic system? What if it’s all a Red Queen’s race where they have to run ever faster — collect ever-more data — just to stay in the same spot?
Of course, all of Big Tech’s persuasion techniques work in concert with one another, and collecting data is useful beyond mere persuasion.
If someone wants to recruit you to buy a refrigerator or join a pogrom, they might use profiling and targeting to send messages to people they judge to be good sales prospects. The messages themselves may be deceptive, making claims about things you’re not very knowledgeable about (food safety and energy efficiency or eugenics and historical claims about racial superiority). They might use search engine optimization and/or armies of fake reviewers and commenters and/or paid placement to dominate the discourse so that any search for further information takes you back to their messages. And finally, they may refine the different pitches using machine learning and other techniques to figure out what kind of pitch works best on someone like you.
Each phase of this process benefits from surveillance: The more data they have, the more precisely they can profile you and target you with specific messages. Think of how you’d sell a fridge if you knew that the warranty on your prospect’s fridge just expired and that they were expecting a tax rebate in April.
Also, the more data they have, the better they can craft deceptive messages — if I know that you’re into genealogy, I might not try to feed you pseudoscience about genetic differences between <span class="quote">“<span class="quote">races,</span>”</span> sticking instead to conspiratorial secret histories of <span class="quote">“<span class="quote">demographic replacement</span>”</span> and the like.
Facebook also helps you locate people who have the same odious or antisocial views as you. It makes it possible to find other people who want to carry tiki torches through the streets of Charlottesville in Confederate cosplay. It can help you find other people who want to join your militia and go to the border to look for undocumented migrants to terrorize. It can help you find people who share your belief that vaccines are poison and that the Earth is flat.
There is one way in which targeted advertising uniquely benefits those advocating for socially unacceptable causes: It is invisible. Racism is widely geographically dispersed, and there are few places where racists — and only racists — gather. This is similar to the problem of selling refrigerators in that potential refrigerator purchasers are geographically dispersed and there are few places where you can buy an ad that will be primarily seen by refrigerator customers. But buying a refrigerator is socially acceptable while being a Nazi is not, so you can buy a billboard or advertise in the newspaper sports section for your refrigerator business, and the only potential downside is that your ad will be seen by a lot of people who don’t want refrigerators, resulting in a lot of wasted money.
But even if you wanted to advertise your Nazi movement on a billboard or prime-time TV or the sports section, you would struggle to find anyone willing to sell you the space for your ad partly because they disagree with your views and partly because they fear censure (boycott, reputational damage, etc.) from other people who disagree with your views.
Targeted ads solve this problem: On the internet, every ad unit can be different for every person, meaning that you can buy ads that are only shown to people who appear to be Nazis and not to people who hate Nazis. When there’s spillover — when someone who hates racism is shown a racist recruiting ad — there is some fallout; the platform or publication might get an angry public or private denunciation. But the nature of the risk assumed by an online ad buyer is different than the risks to a traditional publisher or billboard owner who might want to run a Nazi ad.
Online ads are placed by algorithms that broker between a diverse ecosystem of self-serve ad platforms that anyone can buy an ad through, so the Nazi ad that slips onto your favorite online publication isn’t seen as their moral failing but rather as a failure in some distant, upstream ad supplier. When a publication gets a complaint about an offensive ad that’s appearing in one of its units, it can take some steps to block that ad, but the Nazi might buy a slightly different ad from a different broker serving the same unit. And in any event, internet users increasingly understand that when they see an ad, it’s likely that the advertiser did not choose that publication and that the publication has no idea who its advertisers are.
These layers of indirection between advertisers and publishers serve as moral buffers: Today’s moral consensus is largely that publishers shouldn’t be held responsible for the ads that appear on their pages because they’re not actively choosing to put those ads there. Because of this, Nazis are able to overcome significant barriers to organizing their movement.
Data has a complex relationship with domination. Being able to spy on your customers can alert you to their preferences for your rivals and allow you to head off your rivals at the pass.
More importantly, if you can dominate the information space while also gathering data, then you make other deceptive tactics stronger because it’s harder to break out of the web of deceit you’re spinning. Domination — that is, ultimately becoming a monopoly — and not the data itself is the supercharger that makes every tactic worth pursuing because monopolistic domination deprives your target of an escape route.
If you’re a Nazi who wants to ensure that your prospects primarily see deceptive, confirming information when they search for more, you can improve your odds by seeding the search terms they use through your initial communications. You don’t need to own the top 10 results for <span class="quote">“<span class="quote">voter suppression</span>”</span> if you can convince your marks to confine their search terms to <span class="quote">“<span class="quote">voter fraud,</span>”</span> which throws up a very different set of search results.
Surveillance capitalists are like stage mentalists who claim that their extraordinary insights into human behavior let them guess the word that you wrote down and folded up in your pocket but who really use shills, hidden cameras, sleight of hand, and brute-force memorization to amaze you.
Or perhaps they’re more like pick-up artists, the misogynistic cult that promises to help awkward men have sex with women by teaching them <span class="quote">“<span class="quote">neurolinguistic programming</span>”</span> phrases, body language techniques, and psychological manipulation tactics like <span class="quote">“<span class="quote">negging</span>”</span> — offering unsolicited negative feedback to women to lower their self-esteem and prick their interest.
Some pick-up artists eventually manage to convince women to go home with them, but it’s not because these men have figured out how to bypass women’s critical faculties. Rather, pick-up artists’ <span class="quote">“<span class="quote">success</span>”</span> stories are a mix of women who were incapable of giving consent, women who were coerced, women who were intoxicated, self-destructive women, and a few women who were sober and in command of their faculties but who didn’t realize straightaway that they were with terrible men but rectified the error as soon as they could.
Pick-up artists <span class="emphasis"><em>believe</em></span> they have figured out a secret back door that bypasses women’s critical faculties, but they haven’t. Many of the tactics they deploy, like negging, became the butt of jokes (just like people joke about bad ad targeting), and there’s a good chance that anyone they try these tactics on will immediately recognize them and dismiss the men who use them as irredeemable losers.
Pick-up artists are proof that people can believe they have developed a system of mind control <span class="emphasis"><em>even when it doesn’t work</em></span>. Pick-up artists simply exploit the fact that one-in-a-million chances can come through for you if you make a million attempts, and then they assume that the other 999,999 times, they simply performed the technique incorrectly and commit themselves to doing better next time. There’s only one group of people who find pick-up artist lore reliably convincing: other would-be pick-up artists whose anxiety and insecurity make them vulnerable to scammers and delusional men who convince them that if they pay for tutelage and follow instructions, then they will someday succeed. Pick-up artists assume they fail to entice women because they are bad at being pick-up artists, not because pick-up artistry is bullshit. Pick-up artists are bad at selling themselves to women, but they’re much better at selling themselves to men who pay to learn the secrets of pick-up artistry.
Department store pioneer John Wanamaker is said to have lamented, <span class="quote">“<span class="quote">Half the money I spend on advertising is wasted; the trouble is I don’t know which half.</span>”</span> The fact that Wanamaker thought that only half of his advertising spending was wasted is a tribute to the persuasiveness of advertising executives, who are <span class="emphasis"><em>much</em></span> better at convincing potential clients to buy their services than they are at convincing the general public to buy their clients’ wares.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="what-is-facebook"></a>What is Facebook?</h2></div></div></div><p>
Facebook is heralded as the origin of all of our modern plagues, and it’s not hard to see why. Some tech companies want to lock their users in but make their money by monopolizing access to the market for apps for their devices and gouging them on prices rather than by spying on them (like Apple). Some companies don’t care about locking in users because they’ve figured out how to spy on them no matter where they are and what they’re doing and can turn that surveillance into money (Google). Facebook alone among the Western tech giants has built a business based on locking in its users <span class="emphasis"><em>and</em></span> spying on them all the time.
Facebook’s surveillance regime is really without parallel in the Western world. Though Facebook tries to prevent itself from being visible on the public web, hiding most of what goes on there from people unless they’re logged into Facebook, the company has nevertheless booby-trapped the entire web with surveillance tools in the form of Facebook <span class="quote">“<span class="quote">Like</span>”</span> buttons that web publishers include on their sites to boost their Facebook profiles. Facebook also makes various libraries and other useful code snippets available to web publishers that act as surveillance tendrils on the sites where they’re used, funneling information about visitors to the site — newspapers, dating sites, message boards — to Facebook.
</p><div class="blockquote"><blockquote class="blockquote"><p>
Big Tech is able to practice surveillance not just because it is tech but because it is <span class="emphasis"><em>big</em></span>.
</p></blockquote></div><p>
Facebook offers similar tools to app developers, so the apps — games, fart machines, business review services, apps for keeping abreast of your kid’s schooling — you use will send information about your activities to Facebook even if you don’t have a Facebook account and even if you don’t download or use Facebook apps. On top of all that, Facebook buys data from third-party brokers on shopping habits, physical location, use of <span class="quote">“<span class="quote">loyalty</span>”</span> programs, financial transactions, etc., and cross-references that with the dossiers it develops on activity on Facebook and with apps and the public web.
Though it’s easy to integrate the web with Facebook — linking to news stories and such — Facebook products are generally not available to be integrated back into the web itself. You can embed a tweet in a Facebook post, but if you embed a Facebook post in a tweet, you just get a link back to Facebook and must log in before you can see it. Facebook has used extreme technological and legal countermeasures to prevent rivals from allowing their users to embed Facebook snippets in competing services or to create alternative interfaces to Facebook that merge your Facebook inbox with those of other services that you use.
And Facebook is incredibly popular, with 2.3 billion claimed users (though many believe this figure to be inflated). Facebook has been used to organize genocidal pogroms, racist riots, anti-vaccination movements, flat Earth cults, and the political lives of some of the world’s ugliest, most brutal autocrats. There are some really alarming things going on in the world, and Facebook is implicated in many of them, so it’s easy to conclude that these bad things are the result of Facebook’s mind-control system, which it rents out to anyone with a few bucks to spend.
To understand what role Facebook plays in the formulation and mobilization of antisocial movements, we need to understand the dual nature of Facebook.
Because it has a lot of users and a lot of data about those users, Facebook is a very efficient tool for locating people with hard-to-find traits, the kinds of traits that are widely diffused in the population such that advertisers have historically struggled to find a cost-effective way to reach them. Think back to refrigerators: Most of us only replace our major appliances a few times in our entire lives. If you’re a refrigerator manufacturer or retailer, you have these brief windows in the life of a consumer during which they are pondering a purchase, and you have to somehow reach them. Anyone who’s ever registered a title change after buying a house can attest that appliance manufacturers are incredibly desperate to reach anyone who has even the slenderest chance of being in the market for a new fridge.
Facebook makes finding people shopping for refrigerators a <span class="emphasis"><em>lot</em></span> easier. It can target ads to people who’ve registered a new home purchase, to people who’ve searched for refrigerator buying advice, to people who have complained about their fridge dying, or any combination thereof. It can even target people who’ve recently bought <span class="emphasis"><em>other</em></span> kitchen appliances on the theory that someone who’s just replaced their stove and dishwasher might be in a fridge-buying kind of mood. The vast majority of people who are reached by these ads will not be in the market for a new fridge, but — crucially — the percentage of people who <span class="emphasis"><em>are</em></span> looking for fridges that these ads reach is <span class="emphasis"><em>much</em></span> larger than for any group that might be subjected to traditional, offline targeted refrigerator marketing.
Facebook also makes it a lot easier to find people who have the same rare disease as you, which might have been impossible in earlier eras — the closest fellow sufferer might otherwise be hundreds of miles away. It makes it easier to find people who went to the same high school as you even though decades have passed and your former classmates have all been scattered to the four corners of the Earth.
Facebook also makes it much easier to find people who hold the same rare political beliefs as you. If you’ve always harbored a secret affinity for socialism but never dared utter this aloud lest you be demonized by your neighbors, Facebook can help you discover other people who feel the same way (and it might just demonstrate to you that your affinity is more widespread than you ever suspected). It can make it easier to find people who share your sexual identity. And again, it can help you to understand that what you thought was a shameful secret that affected only you was really a widely shared trait, giving you both comfort and the courage to come out to the people in your life.
All of this presents a dilemma for Facebook: Targeting makes the company’s ads more effective than traditional ads, but it also lets advertisers see just how effective their ads are. While advertisers are pleased to learn that Facebook ads are more effective than ads on systems with less sophisticated targeting, advertisers can also see that in nearly every case, the people who see their ads ignore them. Or, at best, the ads work on a subconscious level, creating nebulous unmeasurables like <span class="quote">“<span class="quote">brand recognition.</span>”</span> This means that the price per ad is very low in nearly every case.
To make things worse, many Facebook groups spark precious little discussion. Your little-league soccer team, the people with the same rare disease as you, and the people you share a political affinity with may exchange the odd flurry of messages at critical junctures, but on a daily basis, there’s not much to say to your old high school chums or other hockey-card collectors.
With nothing but <span class="quote">“<span class="quote">organic</span>”</span> discussion, Facebook would not generate enough traffic to sell enough ads to make the money it needs to continually expand by buying up its competitors while returning handsome sums to its investors.
So Facebook has to gin up traffic by sidetracking its own forums: Every time Facebook’s algorithm injects controversial materials — inflammatory political articles, conspiracy theories, outrage stories — into a group, it can hijack that group’s nominal purpose with its desultory discussions and supercharge those discussions by turning them into bitter, unproductive arguments that drag on and on. Facebook is optimized for engagement, not happiness, and it turns out that automated systems are pretty good at figuring out things that people will get angry about.
Facebook <span class="emphasis"><em>can</em></span> modify our behavior but only in a couple of trivial ways. First, it can lock in all your friends and family members so that you check and check and check with Facebook to find out what they are up to; and second, it can make you angry and anxious. It can force you to choose between being interrupted constantly by updates — a process that breaks your concentration and makes it hard to be introspective — and staying in touch with your friends. This is a very limited form of mind control, and it can only really make us miserable, angry, and anxious.
This is why Facebook’s targeting systems — both the ones it shows to advertisers and the ones that let users find people who share their interests — are so next-gen and smooth and easy to use as well as why its message boards have a toolset that seems like it hasn’t changed since the mid-2000s. If Facebook delivered an equally flexible, sophisticated message-reading system to its users, those users could defend themselves against being nonconsensually eyeball-fucked with Donald Trump headlines.
The more time you spend on Facebook, the more ads it gets to show you. The solution to Facebook’s ads only working one in a thousand times is for the company to try to increase how much time you spend on Facebook by a factor of a thousand. Rather than thinking of Facebook as a company that has figured out how to show you exactly the right ad in exactly the right way to get you to do what its advertisers want, think of it as a company that has figured out how to make you slog through an endless torrent of arguments even though they make you miserable, spending so much time on the site that it eventually shows you at least one ad that you respond to.
941 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"monopoly-and-the-right-to-the-future-tense"></a>Monopoly and the right to the future tense
</h2></div></div></div><p>
942 Zuboff and her cohort are particularly alarmed at the extent to
943 which surveillance allows corporations to influence our decisions,
944 taking away something she poetically calls
<span class=
"quote">“
<span class=
"quote">the right to the future
945 tense
</span>”
</span> — that is, the right to decide for yourself what you will do in the future.
948 It’s true that advertising can tip the scales one way or another:
949 When you’re thinking of buying a fridge, a timely fridge ad might
950 end the search on the spot. But Zuboff puts enormous and undue
951 weight on the persuasive power of surveillance-based influence
952 techniques. Most of these don’t work very well, and the ones that do
953 won’t work for very long. The makers of these influence tools are
954 confident they will someday refine them into systems of total
955 control, but they are hardly unbiased observers, and the risks from
956 their dreams coming true are very speculative.
958 By contrast, Zuboff is rather sanguine about
40 years of lax
959 antitrust practice that has allowed a handful of companies to
960 dominate the internet, ushering in an information age with,
961 <a class=
"ulink" href=
"https://twitter.com/tveastman/status/1069674780826071040" target=
"_top">as
962 one person on Twitter noted
</a>, five giant websites each filled
963 with screenshots of the other four.
965 However, if we are to be alarmed that we might lose the right to
966 choose for ourselves what our future will hold, then monopoly’s
967 nonspeculative, concrete, here-and-now harms should be front and
968 center in our debate over tech policy.
970 Start with
<span class=
"quote">“
<span class=
"quote">digital rights management.
</span>”
</span> In
1998, Bill Clinton signed
971 the Digital Millennium Copyright Act (DMCA) into law. It’s a complex
972 piece of legislation with many controversial clauses but none more
973 so than Section
1201, the
<span class=
"quote">“
<span class=
"quote">anti-circumvention
</span>”
</span> rule.
975 This is a blanket ban on tampering with systems that restrict access
976 to copyrighted works. The ban is so thoroughgoing that it prohibits
977 removing a copyright lock even when no copyright infringement takes
978 place. This is by design: The activities that the DMCA’s Section
979 1201 sets out to ban are not copyright infringements; rather, they
980 are legal activities that frustrate manufacturers’ commercial plans.
982 For example, Section
1201’s first major application was on DVD
983 players as a means of enforcing the region coding built into those
984 devices. DVD-CCA, the body that standardized DVDs and DVD players,
985 divided the world into six regions and specified that DVD players
986 must check each disc to determine which regions it was authorized to
987 be played in. DVD players would have their own corresponding region
988 (a DVD player bought in the U.S. would be region
1 while one bought
989 in India would be region
5). If the player and the disc’s region
990 matched, the player would play the disc; otherwise, it would reject it.
993 However, watching a lawfully produced disc in a country other than
994 the one where you purchased it is not copyright infringement — it’s
995 the opposite. Copyright law imposes this duty on customers for a
996 movie: You must go into a store, find a licensed disc, and pay the
997 asking price. Do that — and
<span class=
"emphasis"><em>nothing else
</em></span> — and
998 you and copyright are square with one another.
1000 The fact that a movie studio wants to charge Indians less than
1001 Americans or release in Australia later than it releases in the U.K.
1002 has no bearing on copyright law. Once you lawfully acquire a DVD, it
1003 is no copyright infringement to watch it no matter where you happen to be.
1006 So DVD and DVD player manufacturers would not be able to use
1007 accusations of abetting copyright infringement to punish
1008 manufacturers who made noncompliant players that would play discs
1009 from any region or repair shops that modified players to let you
1010 watch out-of-region discs or software programmers who created
1011 programs to let you do this.
1013 That’s where Section
1201 of the DMCA comes in: By banning tampering
1014 with an
<span class=
"quote">“
<span class=
"quote">access control,
</span>”
</span> the rule gave manufacturers and rights
1015 holders standing to sue competitors who released superior products
1016 with lawful features that the market demanded (in this case,
1017 region-free players).
1019 This is an odious scam against consumers, but as time went by,
1020 Section
1201 grew to encompass a rapidly expanding constellation of
1021 devices and services as canny manufacturers have realized certain things:
1023 </p><div class=
"itemizedlist"><ul class=
"itemizedlist compact" style=
"list-style-type: disc; "><li class=
"listitem"><p>
1024 Any device with software in it contains a
<span class=
"quote">“
<span class=
"quote">copyrighted work
</span>”
</span> — i.e., the software.
1026 </p></li><li class=
"listitem"><p>
1027 A device can be designed so that reconfiguring the software
1028 requires bypassing an
<span class=
"quote">“
<span class=
"quote">access control for copyrighted works,
</span>”
</span>
1029 which is a potential felony under Section
1201.
1030 </p></li><li class=
"listitem"><p>
1031 Thus, companies can control their customers’ behavior after they
1032 take home their purchases by designing products so that all
1033 unpermitted uses require modifications that fall afoul of Section 1201.
1035 </p></li></ul></div><p>
1036 Section
1201 then becomes a means for manufacturers of all
1037 descriptions to force their customers to arrange their affairs to
1038 benefit the manufacturers’ shareholders instead of themselves.
1040 This manifests in many ways: from a new generation of inkjet
1041 printers that use countermeasures to prevent third-party ink that
1042 cannot be bypassed without legal risks to similar systems in
1043 tractors that prevent third-party technicians from swapping in the
1044 manufacturer’s own parts that are not recognized by the tractor’s
1045 control system until it is supplied with a manufacturer’s unlock code.
1048 Closer to home, Apple’s iPhones use these measures to prevent both
1049 third-party service and third-party software installation. This
1050 allows Apple, rather than the iPhone’s purchaser, to decide when an
1051 iPhone is beyond repair and must be shredded and landfilled. (Apple
1052 is notorious for its environmentally catastrophic policy of
1053 destroying old electronics rather than permitting them to be
1054 cannibalized for parts.) This is a very useful power to wield,
1055 especially in light of CEO Tim Cook’s January
2019 warning to
1056 investors that the company’s profits are endangered by customers
1057 choosing to hold onto their phones for longer rather than replacing them.
1060 Apple’s use of copyright locks also allows it to establish a
1061 monopoly over how its customers acquire software for their mobile
1062 devices. The App Store’s commercial terms guarantee Apple a share of
1063 all revenues generated by the apps sold there, meaning that Apple
1064 gets paid when you buy an app from its store and then continues to
1065 get paid every time you buy something using that app. This comes out
1066 of the bottom line of software developers, who must either charge
1067 more or accept lower profits for their products.
1069 Crucially, Apple’s use of copyright locks gives it the power to make
1070 editorial decisions about which apps you may and may not install on
1071 your own device. Apple has used this power to
1072 <a class=
"ulink" href=
"https://www.telegraph.co.uk/technology/apple/5982243/Apple-bans-dictionary-from-App-Store-over-swear-words.html" target=
"_top">reject
1073 dictionaries
</a> for containing obscene words; to
1074 <a class=
"ulink" href=
"https://www.vice.com/en_us/article/538kan/apple-just-banned-the-app-that-tracks-us-drone-strikes-again" target=
"_top">limit
1075 political speech
</a>, especially from apps that make sensitive
1076 political commentary such as an app that notifies you every time a
1077 U.S. drone kills someone somewhere in the world; and to
1078 <a class=
"ulink" href=
"https://www.eurogamer.net/articles/2016-05-19-palestinian-indie-game-must-not-be-called-a-game-apple-says" target=
"_top">object
1079 to a game
</a> that commented on the Israel-Palestine conflict.
1081 Apple often justifies monopoly power over software installation in
1082 the name of security, arguing that its vetting of apps for its store
1083 means that it can guard its users against apps that contain
1084 surveillance code. But this cuts both ways. In China, the government
1085 <a class=
"ulink" href=
"https://www.ft.com/content/ad42e536-cf36-11e7-b781-794ce08b24dc" target=
"_top">ordered
1086 Apple to prohibit the sale of privacy tools
</a> like VPNs with
1087 the exception of VPNs that had deliberately introduced flaws
1088 designed to let the Chinese state eavesdrop on users. Because Apple
1089 uses technological countermeasures — with legal backstops — to block
1090 customers from installing unauthorized apps, Chinese iPhone owners
1091 cannot readily (or legally) acquire VPNs that would protect them
1092 from Chinese state snooping.
1094 Zuboff calls surveillance capitalism a
<span class=
"quote">“
<span class=
"quote">rogue capitalism.
</span>”
</span>
1095 Theoreticians of capitalism claim that its virtue is that it
1096 <a class=
"ulink" href=
"https://en.wikipedia.org/wiki/Price_signal" target=
"_top">aggregates
1097 information in the form of consumers’ decisions
</a>, producing
1098 efficient markets. Surveillance capitalism’s supposed power to rob
1099 its victims of their free will through computationally supercharged
1100 influence campaigns means that our markets no longer aggregate
1101 customers’ decisions because we customers no longer decide — we are
1102 given orders by surveillance capitalism’s mind-control rays.
1104 If our concern is that markets cease to function when consumers can
1105 no longer make choices, then copyright locks should concern us at
1106 <span class=
"emphasis"><em>least
</em></span> as much as influence campaigns. An
1107 influence campaign might nudge you to buy a certain brand of phone;
1108 but the copyright locks on that phone absolutely determine where you
1109 get it serviced, which apps can run on it, and when you have to
1110 throw it away rather than fixing it.
1111 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"search-order-and-the-right-to-the-future-tense"></a>Search order and the right to the future tense
</h2></div></div></div><p>
1112 Markets are posed as a kind of magic: By discovering otherwise
1113 hidden information conveyed by the free choices of consumers, those
1114 consumers’ local knowledge is integrated into a self-correcting
1115 system that makes efficient allocations — more efficient than any
1116 computer could calculate. But monopolies are incompatible with that
1117 notion. When you only have one app store, the owner of the store —
1118 not the consumer — decides on the range of choices. As Boss Tweed
1119 once said,
<span class=
"quote">“
<span class=
"quote">I don’t care who does the electing, so long as I get to
1120 do the nominating.
</span>”
</span> A monopolized market is an election whose
1121 candidates are chosen by the monopolist.
1123 This ballot rigging is made more pernicious by the existence of
1124 monopolies over search order. Google’s search market share is about
1125 90%. When Google’s ranking algorithm puts a result for a popular
1126 search term in its top
10, that helps determine the behavior of
1127 millions of people. If Google’s answer to
<span class=
"quote">“
<span class=
"quote">Are vaccines dangerous?
</span>”
</span>
1128 is a page that rebuts anti-vax conspiracy theories, then a sizable
1129 portion of the public will learn that vaccines are safe. If, on the
1130 other hand, Google sends those people to a site affirming the
1131 anti-vax conspiracies, a sizable portion of those millions will come
1132 away convinced that vaccines are dangerous.
1134 Google’s algorithm is often tricked into serving disinformation as a
1135 prominent search result. But in these cases, Google isn’t persuading
1136 people to change their minds; it’s just presenting something untrue
1137 as fact when the user has no cause to doubt it.
1139 This is true whether the search is for
<span class=
"quote">“
<span class=
"quote">Are vaccines dangerous?
</span>”
</span> or
1140 <span class=
"quote">“
<span class=
"quote">best restaurants near me.
</span>”
</span> Most users will never look past the
1141 first page of search results, and when the overwhelming majority of
1142 people all use the same search engine, the ranking algorithm
1143 deployed by that search engine will determine myriad outcomes
1144 (whether to adopt a child, whether to have cancer surgery, where to
1145 eat dinner, where to move, where to apply for a job) to a degree
1146 that vastly outstrips any behavioral outcomes dictated by
1147 algorithmic persuasion techniques.
1149 Many of the questions we ask search engines have no empirically
1150 correct answers:
<span class=
"quote">“
<span class=
"quote">Where should I eat dinner?
</span>”
</span> is not an objective
1151 question. Even questions that do have correct answers (
<span class=
"quote">“
<span class=
"quote">Are vaccines
1152 dangerous?
</span>”
</span>) don’t have one empirically superior source for that
1153 answer. Many pages affirm the safety of vaccines, so which one goes
1154 first? Under conditions of competition, consumers can choose from
1155 many search engines and stick with the one whose algorithmic
1156 judgment suits them best, but under conditions of monopoly, we all
1157 get our answers from the same place.
1159 Google’s search dominance isn’t a matter of pure merit: The company
1160 has leveraged many tactics that would have been prohibited under
1161 classical, pre-Ronald-Reagan antitrust enforcement standards to
1162 attain its dominance. After all, this is a company that has
1163 developed two major products: a really good search engine and a
1164 pretty good Hotmail clone. Every other major success it’s had —
1165 Android, YouTube, Google Maps, etc. — has come through an
1166 acquisition of a nascent competitor. Many of the company’s key
1167 divisions, such as the advertising technology of DoubleClick,
1168 violate the historical antitrust principle of structural separation,
1169 which forbade firms from owning subsidiaries that competed with
1170 their customers. Railroads, for example, were barred from owning
1171 freight companies that competed with the shippers whose freight they carried.
1174 If we’re worried about giant companies subverting markets by
1175 stripping consumers of their ability to make free choices, then
1176 vigorous antitrust enforcement seems like an excellent remedy. If
1177 we’d denied Google the right to effect its many mergers, we would
1178 also have probably denied it its total search dominance. Without
1179 that dominance, the pet theories, biases, errors (and good judgment,
1180 too) of Google search engineers and product managers would not have
1181 such an outsized effect on consumer choice.
1183 This goes for many other companies. Amazon, a classic surveillance
1184 capitalist, is obviously the dominant tool for searching Amazon —
1185 though many people find their way to Amazon through Google searches
1186 and Facebook posts — and obviously, Amazon controls Amazon search.
1187 That means that Amazon’s own self-serving editorial choices — like
1188 promoting its own house brands over rival goods from its sellers as
1189 well as its own pet theories, biases, and errors — determine much of
1190 what we buy on Amazon. And since Amazon is the dominant e-commerce
1191 retailer outside of China and since it attained that dominance by
1192 buying up both large rivals and nascent competitors in defiance of
1193 historical antitrust rules, we can blame the monopoly for stripping
1194 consumers of their right to the future tense and the ability to
1195 shape markets by making informed choices.
1197 Not every monopolist is a surveillance capitalist, but that doesn’t
1198 mean they’re not able to shape consumer choices in wide-ranging
1199 ways. Zuboff lauds Apple for its App Store and iTunes Store,
1200 insisting that adding price tags to the features on its platforms
1201 has been the secret to resisting surveillance and thus creating
1202 markets. But Apple is the only retailer allowed to sell on its
1203 platforms, and it’s the second-largest mobile device vendor in the
1204 world. The independent software vendors that sell through Apple’s
1205 marketplace accuse the company of the same surveillance sins as
1206 Amazon and other big retailers: spying on its customers to find
1207 lucrative new products to launch, effectively using independent
1208 software vendors as free-market researchers, then forcing them out
1209 of any markets they discover.
1211 Because of its use of copyright locks, Apple’s mobile customers are
1212 not legally allowed to switch to a rival retailer for its apps if
1213 they want to do so on an iPhone. Apple, obviously, is the only
1214 entity that gets to decide how it ranks the results of search
1215 queries in its stores. These decisions ensure that some apps are
1216 often installed (because they appear on page one) and others are
1217 never installed (because they appear on page one million). Apple’s
1218 search-ranking design decisions have a vastly more significant
1219 effect on consumer behaviors than influence campaigns delivered by
1220 surveillance capitalism’s ad-serving bots.
1221 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"monopolists-can-afford-sleeping-pills-for-watchdogs"></a>Monopolists can afford sleeping pills for watchdogs
</h2></div></div></div><p>
1222 Only the most extreme market ideologues think that markets can
1223 self-regulate without state oversight. Markets need watchdogs —
1224 regulators, lawmakers, and other elements of democratic control — to
1225 keep them honest. When these watchdogs sleep on the job, then
1226 markets cease to aggregate consumer choices because those choices
1227 are constrained by illegitimate and deceptive activities that
1228 companies are able to get away with because no one is holding them to account.
1231 But this kind of regulatory capture doesn’t come cheap. In
1232 competitive sectors, where rivals are constantly eroding one
1233 another’s margins, individual firms lack the surplus capital to
1234 effectively lobby for laws and regulations that serve their ends.
1236 Many of the harms of surveillance capitalism are the result of weak
1237 or nonexistent regulation. Those regulatory vacuums spring from the
1238 power of monopolists to resist stronger regulation and to tailor
1239 what regulation exists to permit their existing businesses.
1241 Here’s an example: When firms over-collect and over-retain our data,
1242 they are at increased risk of suffering a breach — you can’t leak
1243 data you never collected, and once you delete all copies of that
1244 data, you can no longer leak it. For more than a decade, we’ve lived
1245 through an endless parade of ever-worsening data breaches, each one
1246 uniquely horrible in the scale of data breached and the sensitivity of that data.
1249 But still, firms continue to over-collect and over-retain our data for three reasons:
1252 <span class=
"strong"><strong>1. They are locked in the aforementioned
1253 limbic arms race with our capacity to shore up our attentional
1254 defense systems to resist their new persuasion
1255 techniques.
</strong></span> They’re also locked in an arms race with
1256 their competitors to find new ways to target people for sales
1257 pitches. As soon as they discover a soft spot in our attentional
1258 defenses (a counterintuitive, unobvious way to target potential
1259 refrigerator buyers), the public begins to wise up to the tactic,
1260 and their competitors leap on it, hastening the day in which all
1261 potential refrigerator buyers have been inured to the pitch.
1263 <span class=
"strong"><strong>2. They believe the surveillance capitalism
1264 story.
</strong></span> Data is cheap to aggregate and store, and both
1265 proponents and opponents of surveillance capitalism have assured
1266 managers and product designers that if you collect enough data, you
1267 will be able to perform sorcerous acts of mind control, thus
1268 supercharging your sales. Even if you never figure out how to profit
1269 from the data, someone else will eventually offer to buy it from you
1270 to give it a try. This is the hallmark of all economic bubbles:
1271 acquiring an asset on the assumption that someone else will buy it
1272 from you for more than you paid for it, often to sell to someone
1273 else at an even greater price.
1275 <span class=
"strong"><strong>3. The penalties for leaking data are
1276 negligible.
</strong></span> Most countries limit these penalties to
1277 actual damages, meaning that consumers who’ve had their data
1278 breached have to show actual monetary harms to get a reward. In
1279 2014, Home Depot disclosed that it had lost credit-card data for
53
1280 million of its customers, but it settled the matter by paying those
1281 customers about $0.34 each — and a third of that $0.34 wasn’t even
1282 paid in cash. It took the form of a credit to procure a largely
1283 ineffectual credit-monitoring service.
1285 But the harms from breaches are much more extensive than these
1286 actual-damages rules capture. Identity thieves and fraudsters are
1287 wily and endlessly inventive. All the vast breaches of our century
1288 are being continuously recombined, the data sets merged and mined
1289 for new ways to victimize the people whose data was present in them.
1290 Any reasonable, evidence-based theory of deterrence and compensation
1291 for breaches would not confine damages to actual damages but rather
1292 would allow users to claim these future harms.
1294 However, even the most ambitious privacy rules, such as the EU
1295 General Data Protection Regulation, fall far short of capturing the
1296 negative externalities of the platforms’ negligent over-collection
1297 and over-retention, and what penalties they do provide are not
1298 aggressively pursued by regulators.
1300 This tolerance of — or indifference to — data over-collection and
1301 over-retention can be ascribed in part to the sheer lobbying muscle
1302 of the platforms. They are so profitable that they can handily
1303 afford to divert gigantic sums to fight any real change — that is,
1304 change that would force them to internalize the costs of their
1305 surveillance activities.
1307 And then there’s state surveillance, which the surveillance
1308 capitalism story dismisses as a relic of another era when the big
1309 worry was being jailed for your dissident speech, not having your
1310 free will stripped away with machine learning.
1312 But state surveillance and private surveillance are intimately
1313 related. As we saw when Apple was conscripted by the Chinese
1314 government as a vital collaborator in state surveillance, the only
1315 really affordable and tractable way to conduct mass surveillance on
1316 the scale practiced by modern states — both
<span class=
"quote">“
<span class=
"quote">free
</span>”
</span> and autocratic
1317 states — is to suborn commercial services.
1319 Whether it’s Google being used as a location tracking tool by local
1320 law enforcement across the U.S. or the use of social media tracking
1321 by the Department of Homeland Security to build dossiers on
1322 participants in protests against Immigration and Customs
1323 Enforcement’s family separation practices, any hard limits on
1324 surveillance capitalism would hamstring the state’s own surveillance
1325 capability. Without Palantir, Amazon, Google, and other major tech
1326 contractors, U.S. cops would not be able to spy on Black people, ICE
1327 would not be able to manage the caging of children at the U.S.
1328 border, and state welfare systems would not be able to purge their
1329 rolls by dressing up cruelty as empiricism and claiming that poor
1330 and vulnerable people are ineligible for assistance. At least some
1331 of the states’ unwillingness to take meaningful action to curb
1332 surveillance should be attributed to this symbiotic relationship.
1333 There is no mass state surveillance without mass commercial surveillance.
1336 Monopolism is key to the project of mass state surveillance. It’s
1337 true that smaller tech firms are apt to be less well-defended than
1338 Big Tech, whose security experts are drawn from the tops of their
1339 field and who are given enormous resources to secure and monitor
1340 their systems against intruders. But smaller firms also have less to
1341 protect: fewer users whose data is more fragmented across more
1342 systems that have to be suborned one at a time by state actors.
1344 A concentrated tech sector that works with authorities is a much
1345 more powerful ally in the project of mass state surveillance than a
1346 fragmented one composed of smaller actors. The U.S. tech sector is
1347 small enough that all of its top executives fit around a single
1348 boardroom table in Trump Tower in
2017, shortly after Trump’s
1349 inauguration. Most of its biggest players bid to win JEDI, the
1350 Pentagon’s $10 billion Joint Enterprise Defense Infrastructure cloud
1351 contract. Like other highly concentrated industries, Big Tech
1352 rotates its key employees in and out of government service, sending
1353 them to serve in the Department of Defense and the White House, then
1354 hiring ex-Pentagon and ex-DOD top staffers and officers to work in
1355 their own government relations departments.
1357 They can even make a good case for doing this: After all, when there
1358 are only four or five big companies in an industry, everyone
1359 qualified to regulate those companies has served as an executive in
1360 at least a couple of them — because, likewise, when there are only
1361 five companies in an industry, everyone qualified for a senior role
1362 at any of them is by definition working at one of the other ones.
1363 </p><div class=
"blockquote"><blockquote class=
"blockquote"><p>
1364 While surveillance doesn’t cause monopolies, monopolies certainly abet surveillance.
1366 </p></blockquote></div><p>
1367 Industries that are competitive are fragmented — composed of
1368 companies that are at each other’s throats all the time and eroding
1369 one another’s margins in bids to steal their best customers. This
1370 leaves them with much more limited capital to use to lobby for
1371 favorable rules and a much harder job of getting everyone to agree
1372 to pool their resources to benefit the industry as a whole.
1374 Surveillance combined with machine learning is supposed to be an
1375 existential crisis, a species-defining moment at which our free will
1376 is just a few more advances in the field from being stripped away. I
1377 am skeptical of this claim, but I
<span class=
"emphasis"><em>do
</em></span> think that
1378 tech poses an existential threat to our society and possibly our species.
1381 But that threat grows out of monopoly.
1383 One of the consequences of tech’s regulatory capture is that it can
1384 shift liability for poor security decisions onto its customers and
1385 the wider society. It is absolutely normal in tech for companies to
1386 obfuscate the workings of their products, to make them deliberately
1387 hard to understand, and to threaten security researchers who seek to
1388 independently audit those products.
1390 IT is the only field in which this is practiced: No one builds a
1391 bridge or a hospital and keeps the composition of the steel or the
1392 equations used to calculate load stresses a secret. It is a frankly
1393 bizarre practice that leads, time and again, to grotesque security
1394 defects on farcical scales, with whole classes of devices being
1395 revealed as vulnerable long after they are deployed in the field and
1396 put into sensitive places.
1398 The monopoly power that keeps any meaningful consequences for
1399 breaches at bay means that tech companies continue to build terrible
1400 products that are insecure by design and that end up integrated into
1401 our lives, in possession of our data, and connected to our physical
1402 world. For years, Boeing has struggled with the aftermath of a
1403 series of bad technology decisions that made its
737 fleet a global
1404 pariah, a rare instance in which bad tech decisions have been
1405 seriously punished in the market.
1407 These bad security decisions are compounded yet again by the use of
1408 copyright locks to enforce business-model decisions against
1409 consumers. Recall that these locks have become the go-to means for
1410 shaping consumer behavior, making it technically impossible to use
1411 third-party ink, insulin, apps, or service depots in connection with
1412 your lawfully acquired property.
1414 Recall also that these copyright locks are backstopped by
1415 legislation (such as Section
1201 of the DMCA or Article
6 of the
1416 2001 EU Copyright Directive) that bans tampering with
1417 (
<span class=
"quote">“
<span class=
"quote">circumventing
</span>”
</span>) them, and these statutes have been used to
1418 threaten security researchers who make disclosures about
1419 vulnerabilities without permission from manufacturers.
1421 This amounts to a manufacturer’s veto over safety warnings and
1422 criticism. While this is far from the legislative intent of the DMCA
1423 and its sister statutes around the world, Congress has not
1424 intervened to clarify the statute nor will it because to do so would
1425 run counter to the interests of powerful, large firms whose lobbying
1426 muscle is unstoppable.
1428 Copyright locks are a double whammy: They create bad security
1429 decisions that can’t be freely investigated or discussed. If markets
1430 are supposed to be machines for aggregating information (and if
1431 surveillance capitalism’s notional mind-control rays are what make
1432 it a
<span class=
"quote">“
<span class=
"quote">rogue capitalism
</span>”
</span> because it denies consumers the power to
1433 make decisions), then a program of legally enforced ignorance of the
1434 risks of products makes monopolism even more of a
<span class=
"quote">“
<span class=
"quote">rogue capitalism
</span>”
</span>
1435 than surveillance capitalism’s influence campaigns.
1437 And unlike mind-control rays, enforced silence over security is an
1438 immediate, documented problem, and it
<span class=
"emphasis"><em>does
</em></span>
1439 constitute an existential threat to our civilization and possibly
1440 our species. The proliferation of insecure devices — especially
1441 devices that spy on us and especially when those devices also can
1442 manipulate the physical world by, say, steering your car or flipping
1443 a breaker at a power station — is a kind of technology debt.
In software design, <span class="quote">“<span class="quote">technology debt</span>”</span> refers to old, baked-in decisions that turn out to be bad ones in hindsight. Perhaps a long-ago developer decided to incorporate a networking protocol made by a vendor that has since stopped supporting it. But everything in the product still relies on that superannuated protocol, and so, with each revision, the product team has to work around this obsolete core, adding compatibility layers, surrounding it with security checks that try to shore up its defenses, and so on. These Band-Aid measures compound the debt because every subsequent revision has to make allowances for <span class="emphasis"><em>them</em></span>, too, like interest mounting on a predatory subprime loan. And like a subprime loan, the interest mounts faster than you can hope to pay it off: The product team has to put so much energy into maintaining this complex, brittle system that they don’t have any time left over to refactor the product from the ground up and <span class="quote">“<span class="quote">pay off the debt</span>”</span> once and for all.
Typically, technology debt results in a technological bankruptcy: The product gets so brittle and unsustainable that it fails catastrophically. Think of the antiquated COBOL-based banking and accounting systems that fell over at the start of the pandemic emergency when confronted with surges of unemployment claims. Sometimes that ends the product; sometimes it takes the company down with it. Being caught in the default of a technology debt is scary and traumatic, just like losing your house due to bankruptcy is scary and traumatic.
But the technology debt created by copyright locks isn’t individual debt; it’s systemic. Everyone in the world is exposed to this over-leverage, as was the case with the 2008 financial crisis. When that debt comes due — when we face a cascade of security breaches that threaten global shipping and logistics, the food supply, pharmaceutical production pipelines, emergency communications, and other critical systems that are accumulating technology debt in part due to the presence of deliberately insecure and deliberately unauditable copyright locks — it will indeed pose an existential threat.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"privacy-and-monopoly"></a>Privacy and monopoly
</h2></div></div></div><p>
Many tech companies are gripped by an orthodoxy that holds that if they just gather enough data on enough of our activities, everything else is possible — the mind control and endless profits. This is an unfalsifiable hypothesis: If data gives a tech company even a tiny improvement in behavior prediction and modification, the company declares that it has taken the first step toward global domination with no end in sight. If a company <span class="emphasis"><em>fails</em></span> to attain any improvements from gathering and analyzing data, it declares success to be just around the corner, attainable once more data is in hand.
Surveillance tech is far from the first industry to embrace a nonsensical, self-serving belief that harms the rest of the world, and it is not the first industry to profit handsomely from such a delusion. Long before hedge-fund managers were claiming (falsely) that they could beat the S&amp;P 500, there were plenty of other <span class="quote">“<span class="quote">respectable</span>”</span> industries that have since been revealed as quacks. From the makers of radium suppositories (a real thing!) to the cruel sociopaths who claimed they could <span class="quote">“<span class="quote">cure</span>”</span> gay people, history is littered with the formerly respectable titans of discredited industries.
This is not to say that there’s nothing wrong with Big Tech and its ideological addiction to data. While surveillance’s benefits are mostly overstated, its harms are, if anything, <span class="emphasis"><em>understated</em></span>.
There’s real irony here. The belief in surveillance capitalism as a <span class="quote">“<span class="quote">rogue capitalism</span>”</span> is driven by the belief that markets wouldn’t tolerate firms that are gripped by false beliefs. An oil company that has false beliefs about where the oil is will eventually go broke digging dry wells, after all.
But monopolists get to do terrible things for a long time before they pay the price. Think of how concentration in the finance sector allowed the subprime crisis to fester as bond-rating agencies, regulators, investors, and critics all fell under the sway of a false belief that complex mathematics could construct <span class="quote">“<span class="quote">fully hedged</span>”</span> debt instruments that could not possibly default. A small bank that engaged in this kind of malfeasance would simply have gone broke; it could not have outrun the inevitable crisis, much less grown so big that it averted the crisis altogether. But large banks were able to continue to attract investors, and when they finally <span class="emphasis"><em>did</em></span> come a-cropper, the world’s governments bailed them out. The worst offenders of the subprime crisis are bigger than they were in 2008, bringing home more profits and paying their execs even larger sums.
Big Tech is able to practice surveillance not just because it is tech but because it is <span class="emphasis"><em>big</em></span>. The reason every web publisher embeds a Facebook <span class="quote">“<span class="quote">Like</span>”</span> button is that Facebook dominates the internet’s social media referrals — and every one of those <span class="quote">“<span class="quote">Like</span>”</span> buttons spies on everyone who lands on a page that contains them (see also: Google Analytics embeds, Twitter buttons, etc.).
The reason the world’s governments have been slow to create meaningful penalties for privacy breaches is that Big Tech’s concentration produces huge profits that can be used to lobby against those penalties — and Big Tech’s concentration means that the companies involved are able to arrive at a unified negotiating position that supercharges the lobbying.
The reason that the smartest engineers in the world want to work for Big Tech is that Big Tech commands the lion’s share of tech industry jobs.
The reason people who are aghast at Facebook’s and Google’s and Amazon’s data-handling practices continue to use these services is that all their friends are on Facebook; Google dominates search; and Amazon has put all the local merchants out of business.
A competitive market would weaken the companies’ lobbying muscle by reducing their profits and pitting them against one another in regulatory forums. It would give customers other places to go to get their online services. It would make the companies small enough to regulate and pave the way to meaningful penalties for breaches. It would let engineers with ideas that challenged the surveillance orthodoxy raise capital to compete with the incumbents. It would give web publishers multiple ways to reach audiences and make the case against Facebook and Google and Twitter embeds.
In other words, while surveillance doesn’t cause monopolies, monopolies certainly abet surveillance.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"ronald-reagan-pioneer-of-tech-monopolism"></a>Ronald Reagan, pioneer of tech monopolism
</h2></div></div></div><p>
Technology exceptionalism is a sin, whether it’s practiced by technology’s blind proponents or by its critics. Both of these camps are prone to explaining away monopolistic concentration by citing some special characteristic of the tech industry, like network effects or first-mover advantage. The only real difference between these two groups is that the tech apologists say monopoly is inevitable so we should just let tech get away with its abuses while competition regulators in the U.S. and the EU say monopoly is inevitable so we should punish tech for its abuses but not try to break up the monopolies.
To understand how tech became so monopolistic, it’s useful to look at the dawn of the consumer tech industry: 1979, the year the Apple II Plus launched and became the first successful home computer. That also happens to be the year that Ronald Reagan hit the campaign trail for the 1980 presidential race — a race he won, leading to a radical shift in the way that antitrust concerns are handled in America. Reagan’s cohort of politicians — including Margaret Thatcher in the U.K., Brian Mulroney in Canada, Helmut Kohl in Germany, and Augusto Pinochet in Chile — went on to enact similar reforms that eventually spread around the world.
Antitrust’s story began nearly a century before all that with laws like the Sherman Act, which took aim at monopolists on the grounds that monopolies were bad in and of themselves — squeezing out competitors, creating <span class="quote">“<span class="quote">diseconomies of scale</span>”</span> (when a company is so big that its constituent parts go awry and it is seemingly helpless to address the problems), and capturing their regulators to such a degree that they can get away with a host of evils.
Then came a fabulist named Robert Bork, a former solicitor general whom Reagan appointed to the powerful U.S. Court of Appeals for the D.C. Circuit and who had created an alternate legislative history of the Sherman Act and its successors out of whole cloth. Bork insisted that these statutes were never targeted at monopolies (despite a wealth of evidence to the contrary, including the transcribed speeches of the acts’ authors) but, rather, that they were intended to prevent <span class="quote">“<span class="quote">consumer harm</span>”</span> — in the form of higher prices.
Bork was a crank, but he was a crank with a theory that rich people really liked. Monopolies are a great way to make rich people richer by allowing them to receive <span class="quote">“<span class="quote">monopoly rents</span>”</span> (that is, bigger profits) and capture regulators, leading to a weaker, more favorable regulatory environment with fewer protections for customers, suppliers, the environment, and workers.
Bork’s theories were especially palatable to the same power brokers who backed Reagan, and Reagan’s Department of Justice and other agencies began to incorporate Bork’s antitrust doctrine into their enforcement decisions (Reagan even put Bork up for a Supreme Court seat, but Bork flunked the Senate confirmation hearing so badly that, 40 years later, D.C. insiders use the term <span class="quote">“<span class="quote">borked</span>”</span> to refer to any catastrophically bad political performance).
Little by little, Bork’s theories entered the mainstream, and their backers began to infiltrate the legal education field, even putting on junkets where members of the judiciary were treated to lavish meals, fun outdoor activities, and seminars where they were indoctrinated into the consumer harm theory of antitrust. The more Bork’s theories took hold, the more money the monopolists were making — and the more surplus capital they had at their disposal to lobby for even more Borkian antitrust influence campaigns.
The history of Bork’s antitrust theories is a really good example of the kind of covertly engineered shifts in public opinion that Zuboff warns us against, where fringe ideas become mainstream orthodoxy. But Bork didn’t change the world overnight. He played a very long game, for over a generation, and he had a tailwind because the same forces that backed oligarchic antitrust theories also backed many other oligarchic shifts in public opinion — the idea that taxation is theft, that wealth is a sign of virtue, and so on. All of these theories meshed to form a coherent ideology that elevated inequality to a virtue.
Today, many fear that machine learning allows surveillance capitalism to sell <span class="quote">“<span class="quote">Bork-as-a-Service,</span>”</span> at internet speeds, so that you can contract a machine-learning company to engineer <span class="emphasis"><em>rapid</em></span> shifts in public sentiment without needing the capital to sustain a multipronged, multigenerational project working at the local, state, national, and global levels in business, law, and philosophy. I do not believe that such a project is plausible, though I agree that this is basically what the platforms claim to be selling. They’re just lying about it. Big Tech lies all the time, <span class="emphasis"><em>including</em></span> in their sales literature.
The idea that tech forms <span class="quote">“<span class="quote">natural monopolies</span>”</span> (monopolies that are the inevitable result of the realities of an industry, such as the monopolies that accrue to the first company to run long-haul phone lines or rail lines) is belied by tech’s own history: In the absence of anti-competitive tactics, Google was able to unseat AltaVista and Yahoo; Facebook was able to head off Myspace. There are some advantages to gathering mountains of data, but those mountains of data also have disadvantages: liability (from leaking), diminishing returns (from old data), and institutional inertia (big companies, like science, progress one funeral at a time).
Indeed, the birth of the web saw a mass-extinction event for the existing giant, wildly profitable proprietary technologies that had capital, network effects, and walls and moats surrounding their businesses. The web showed that when a new industry is built around a protocol, rather than a product, the combined might of everyone who uses the protocol to reach their customers or users or communities outweighs even the most massive products. CompuServe, AOL, MSN, and a host of other proprietary walled gardens learned this lesson the hard way: Each believed it could stay separate from the web, offering <span class="quote">“<span class="quote">curation</span>”</span> and a guarantee of consistency and quality instead of the chaos of an open system. Each was wrong and ended up being absorbed into the public web.
Yes, tech is heavily monopolized and is now closely associated with industry concentration, but this has more to do with timing than with any intrinsically monopolistic tendency of tech. Tech was born at the moment that antitrust enforcement was being dismantled, and tech fell into exactly the same pathologies that antitrust was supposed to guard against. To a first approximation, it is reasonable to assume that tech’s monopolies are the result of a lack of anti-monopoly action and not the much-touted unique characteristics of tech, such as network effects, first-mover advantage, and so on.
In support of this thesis, I offer the concentration that every <span class="emphasis"><em>other</em></span> industry has undergone over the same period. From professional wrestling to consumer packaged goods to commercial property leasing to banking to sea freight to oil to record labels to newspaper ownership to theme parks, <span class="emphasis"><em>every</em></span> industry has undergone a massive shift toward concentration. There are no obvious network effects or first-mover advantages at play in these industries. However, in every case, these industries attained their concentrated status through tactics that were prohibited before Bork’s triumph: merging with major competitors, buying out innovative new market entrants, horizontal and vertical integration, and a suite of anti-competitive tactics that were once illegal but no longer are.
Again: When you change the laws intended to prevent monopolies and then monopolies form in exactly the way the law was supposed to prevent, it is reasonable to suppose that these facts are related. Tech’s concentration can be readily explained without recourse to radical theories of network effects — but only if you’re willing to indict unregulated markets as tending toward monopoly. Just as a lifelong smoker can give you a hundred reasons why their smoking didn’t cause their cancer (<span class="quote">“<span class="quote">It was the environmental toxins</span>”</span>), true believers in unregulated markets have a whole suite of unconvincing explanations for monopoly in tech that leave capitalism intact.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"steering-with-the-windshield-wipers"></a>Steering with the windshield wipers
</h2></div></div></div><p>
It’s been 40 years since Bork’s project to rehabilitate monopolies achieved liftoff, and that is a generation and a half, which is plenty of time to take a common idea and make it seem outlandish and vice versa. Before the 1940s, affluent Americans dressed their baby boys in pink while baby girls wore blue (a <span class="quote">“<span class="quote">delicate and dainty</span>”</span> color). While gendered colors are obviously totally arbitrary, many still greet this news with amazement and find it hard to imagine a time when pink connoted masculinity.
After 40 years of studiously ignoring antitrust analysis and enforcement, it’s not surprising that we’ve all but forgotten that antitrust exists, that in living memory, growth through mergers and acquisitions was largely prohibited under law, and that market-cornering strategies like vertical integration could land a company in court.
Antitrust is a market society’s steering wheel, the control of first resort to keep would-be masters of the universe in their lanes. But Bork and his cohort ripped out our steering wheel 40 years ago. The car is still barreling along, and so we’re yanking as hard as we can on all the <span class="emphasis"><em>other</em></span> controls in the car as well as desperately flapping the doors and rolling the windows up and down in the hopes that one of these other controls can be repurposed to let us choose where we’re heading before we careen off a cliff.
It’s like a 1960s science-fiction plot come to life: People stuck in a <span class="quote">“<span class="quote">generation ship,</span>”</span> plying its way across the stars, a ship once piloted by their ancestors; and now, after a great cataclysm, the ship’s crew have forgotten that they’re in a ship at all and no longer remember where the control room is. Adrift, the ship is racing toward its extinction, and unless we can seize the controls and execute an emergency course correction, we’re all headed for a fiery death in the heart of a sun.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"surveillance-still-matters"></a>Surveillance still matters
</h2></div></div></div><p>
None of this is to minimize the problems with surveillance. Surveillance matters, and Big Tech’s use of surveillance <span class="emphasis"><em>is</em></span> an existential risk to our species, but that’s not because surveillance and machine learning rob us of our free will.
Surveillance has become <span class="emphasis"><em>much</em></span> more efficient thanks to Big Tech. In 1989, the Stasi — the East German secret police — had the whole country under surveillance, a massive undertaking that recruited one out of every 60 people to serve as an informant or intelligence operative.
Today, we know that the NSA is spying on a significant fraction of the entire world’s population, and its ratio of surveillance operatives to the surveilled is more like 1:10,000 (that’s probably on the low side since it assumes that every American with top-secret clearance is working for the NSA on this project — we don’t know how many of those cleared people are involved in NSA spying, but it’s definitely not all of them).
How did the ratio of surveillance operatives to the surveilled stretch from 1:60 to 1:10,000 in less than 30 years? It’s thanks to Big Tech. Our devices and services gather most of the data that the NSA mines for its surveillance project. We pay for these devices and the services they connect to, and then we painstakingly perform the data-entry tasks associated with logging facts about our lives, opinions, and preferences. This mass surveillance project has been largely useless for fighting terrorism: The NSA can <a class="ulink" href="https://www.washingtonpost.com/world/national-security/nsa-cites-case-as-success-of-phone-data-collection-program/2013/08/08/fc915e5a-feda-11e2-96a8-d3b921c0924a_story.html" target="_top">only point to a single minor success story</a> in which it used its data collection program to foil an attempt by a U.S. resident to wire a few thousand dollars to an overseas terror group. It’s ineffective for much the same reason that commercial surveillance projects are largely ineffective at targeting advertising: The people who want to commit acts of terror, like people who want to buy a refrigerator, are extremely rare. If you’re trying to detect a phenomenon whose base rate is one in a million with an instrument whose accuracy is only 99%, then every true positive will come at the cost of 9,999 false positives.
Let me explain that again: If one in a million people is a terrorist, then there will only be about one terrorist in a random sample of one million people. If your test for detecting terrorists is 99% accurate, it will identify 10,000 terrorists in your million-person sample (1% of one million is 10,000). For every true positive, you’ll get 9,999 false positives.
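The arithmetic in that paragraph can be restated in a few lines of Python. This is an illustrative sketch, not anything from the original text: the variable names are mine, and the test is modeled the way the passage models it, as flagging a flat 1% of the population.

```python
# Base-rate arithmetic from the passage: a test that errs 1% of the time,
# applied to a million people of whom exactly one is a terrorist.
population = 1_000_000
terrorists = 1                           # base rate: one in a million
error_rate = 0.01                        # the test is "99% accurate"

flagged = int(population * error_rate)   # 10,000 people flagged in total
true_positives = terrorists              # the one real terrorist
false_positives = flagged - true_positives

print(flagged, true_positives, false_positives)  # 10000 1 9999
```

Improving the instrument barely changes the picture: at 99.9% accuracy, the same arithmetic still yields 999 false alarms for the single real positive.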
In reality, the accuracy of algorithmic terrorism detection falls far short of the 99% mark, as does refrigerator ad targeting. The difference is that being falsely accused of wanting to buy a fridge is a minor nuisance while being falsely accused of planning a terror attack can destroy your life and the lives of everyone you love.
Mass state surveillance is only feasible because of surveillance capitalism and its extremely low-yield ad-targeting systems, which require a constant feed of personal data to remain barely viable. Surveillance capitalism’s primary failure mode is mistargeted ads while mass state surveillance’s primary failure mode is grotesque human rights abuses, tending toward totalitarianism.
State surveillance is no mere parasite on Big Tech, sucking up its data and giving nothing in return. In truth, the two are symbiotes: Big Tech sucks up our data for spy agencies, and spy agencies ensure that governments don’t limit Big Tech’s activities so severely that it would no longer serve the spy agencies’ needs. There is no firm distinction between state surveillance and surveillance capitalism; they are dependent on one another.
To see this at work today, look no further than Amazon’s home surveillance device, the Ring doorbell, and its associated app, Neighbors. Ring — a product that Amazon acquired and did not develop in house — makes a camera-enabled doorbell that streams footage from your front door to your mobile device. The Neighbors app allows you to form a neighborhood-wide surveillance grid with your fellow Ring owners through which you can share clips of <span class="quote">“<span class="quote">suspicious characters.</span>”</span> If you’re thinking that this sounds like a recipe for letting curtain-twitching racists supercharge their suspicions of people with brown skin who walk down their blocks, <a class="ulink" href="https://www.eff.org/deeplinks/2020/07/amazons-ring-enables-over-policing-efforts-some-americas-deadliest-law-enforcement" target="_top">you’re right</a>. Ring has become a <span class="emphasis"><em>de facto,</em></span> off-the-books arm of the police without any of the pesky oversight or laws.
In mid-2019, a series of public records requests revealed that Amazon had struck confidential deals with more than 400 local law enforcement agencies through which the agencies would promote Ring and Neighbors and in exchange get access to footage from Ring cameras. In theory, cops would need to request this footage through Amazon (and internal documents reveal that Amazon devotes substantial resources to coaching cops on how to spin a convincing story when doing so), but in practice, when a Ring customer turns down a police request, Amazon only requires the agency to formally request the footage from the company, which it will then produce.
Ring and law enforcement have found many ways to intertwine their activities. Ring strikes secret deals to acquire real-time access to 911 dispatch and then streams alarming crime reports to Neighbors users, which serve as convincers for anyone who’s contemplating a surveillance doorbell but isn’t sure whether their neighborhood is dangerous enough to warrant it.
The more the cops buzz-market the surveillance capitalist Ring, the more surveillance capability the state gets. Cops who rely on private entities for law-enforcement roles then brief against any controls on the deployment of that technology while the companies return the favor by lobbying against rules requiring public oversight of police surveillance technology. The more the cops rely on Ring and Neighbors, the harder it will be to pass laws to curb them. The fewer laws there are against them, the more the cops will rely on them.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"dignity-and-sanctuary"></a>Dignity and sanctuary
</h2></div></div></div><p>
But even if we could exercise democratic control over our states and force them to stop raiding surveillance capitalism’s reservoirs of behavioral data, surveillance capitalism would still harm us.
This is an area where Zuboff shines. Her chapter on <span class="quote">“<span class="quote">sanctuary</span>”</span> — the feeling of being unobserved — is a beautiful hymn to introspection, calmness, mindfulness, and tranquility.
When you are watched, something changes. Anyone who has ever raised a child knows this. You might look up from your book (or more realistically, from your phone) and catch your child in a moment of profound realization and growth, a moment where they are learning something that is right at the edge of their abilities, requiring their entire ferocious concentration. For a moment, you’re transfixed, watching that rare and beautiful moment of focus playing out before your eyes, and then your child looks up and sees you seeing them, and the moment collapses. To grow, you need to be and expose your authentic self, and in that moment, you are vulnerable like a hermit crab scuttling from one shell to the next. The tender, unprotected tissues you expose in that moment are too delicate to reveal in the presence of another, even someone you trust as implicitly as a child trusts their parent.
In the digital age, our authentic selves are inextricably tied to our digital lives. Your search history is a running ledger of the questions you’ve pondered. Your location history is a record of the places you’ve sought out and the experiences you’ve had there. Your social graph reveals the different facets of your identity, the people you’ve connected with.
To be observed in these activities is to lose the sanctuary of your authentic self.
There’s another way in which surveillance capitalism robs us of our capacity to be our authentic selves: by making us anxious. Surveillance capitalism isn’t really a mind-control ray, but you don’t need a mind-control ray to make someone anxious. After all, another word for anxiety is agitation, and to make someone experience agitation, you need merely to agitate them. To poke them and prod them and beep at them and buzz at them and bombard them on an intermittent schedule that is just random enough that our limbic systems never quite become inured to it.
Our devices and services are <span class="quote">“<span class="quote">general purpose</span>”</span> in that they can connect anything or anyone to anything or anyone else and that they can run any program that can be written. This means that the distraction rectangles in our pockets hold our most precious moments with our most beloved people and their most urgent or time-sensitive communications (from <span class="quote">“<span class="quote">running late can you get the kid?</span>”</span> to <span class="quote">“<span class="quote">doctor gave me bad news and I need to talk to you RIGHT NOW</span>”</span>) as well as ads for refrigerators and recruiting messages from Nazis.
All day and all night, our pockets buzz, shattering our concentration and tearing apart the fragile webs of connection we spin as we think through difficult ideas. If you locked someone in a cell and agitated them like this, we’d call it <span class="quote">“<span class="quote">sleep deprivation torture,</span>”</span> and it would be <a class="ulink" href="https://www.youtube.com/watch?v=1SKpRbvnx6g" target="_top">a war crime under the Geneva Conventions</a>.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"afflicting-the-afflicted"></a>Afflicting the afflicted
</h2></div></div></div><p>
The effects of surveillance on our ability to be our authentic selves are not equal for all people. Some of us are lucky enough to live in a time and place in which all the most important facts of our lives are widely and roundly socially acceptable and can be publicly displayed without the risk of social consequence.
But for many of us, this is not true. Recall that in living memory, many of the ways of being that we think of as socially acceptable today were once cause for dire social sanction or even imprisonment. If you are 65 years old, you have lived through a time in which people living in <span class="quote">“<span class="quote">free societies</span>”</span> could be imprisoned or sanctioned for engaging in homosexual activity, for falling in love with a person whose skin was a different color than their own, or for smoking weed.
Today, these activities aren’t just decriminalized in much of the world, they’re considered normal, and the fallen prohibitions are viewed as shameful, regrettable relics of the past.
How did we get from prohibition to normalization? Through private, personal activity: People who were secretly gay or secret pot-smokers or who secretly loved someone with a different skin color were vulnerable to retaliation if they made their true selves known and were limited in how much they could advocate for their own right to exist in the world and be true to themselves. But because there was a private sphere, these people could form alliances with their friends and loved ones who did not share their disfavored traits by having private conversations in which they came out, disclosing their true selves to the people around them and bringing them to their cause one conversation at a time.
1949 The right to choose the time and manner of these conversations was
1950 key to their success. It’s one thing to come out to your dad while
1951 you’re on a fishing trip away from the world and another thing
1952 entirely to blurt it out over the Christmas dinner table while your
1953 racist Facebook uncle is there to make a scene.
1955 Without a private sphere, there’s a chance that none of these
1956 changes would have come to pass and that the people who benefited
1957 from these changes would have either faced social sanction for
1958 coming out to a hostile world or would have never been able to
1959 reveal their true selves to the people they love.
1961 The corollary is that, unless you think that our society has
1962 attained social perfection — that your grandchildren in
50 years
1963 will ask you to tell them the story of how, in
2020, every injustice
1964 had been righted and no further change had to be made — then you
1965 should expect that right now, at this minute, there are people you
1966 love, whose happiness is key to your own, who have a secret in their
1967 hearts that stops them from ever being their authentic selves with
1968 you. These people are sorrowing and will go to their graves with
1969 that secret sorrow in their hearts, and the source of that sorrow
1970 will be the falsity of their relationship to you.
1972 A private realm is necessary for human progress.
1973 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"any-data-you-collect-and-retain-will-eventually-leak"></a>Any data you collect and retain will eventually leak
</h2></div></div></div><p>
The lack of a private life can rob vulnerable people of the chance to be their authentic selves and constrain our actions by depriving us of sanctuary, but there is another risk that is borne by everyone, not just people with a secret: crime.
</p><p>
Personally identifying information is of very limited use for the purpose of controlling people’s minds, but identity theft — really a catchall term for a whole constellation of terrible criminal activities that can destroy your finances, compromise your personal integrity, ruin your reputation, or even expose you to physical danger — thrives on it.
</p><p>
Attackers are not limited to using data from one breached source, either. Multiple services have suffered breaches that exposed names, addresses, phone numbers, passwords, sexual tastes, school grades, work performance, brushes with the criminal justice system, family details, genetic information, fingerprints and other biometrics, reading habits, search histories, literary tastes, pseudonymous identities, and other sensitive information. Attackers can merge data from these different breaches to build up extremely detailed dossiers on random subjects and then use different parts of the data for different criminal purposes.
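As a purely illustrative sketch — with invented, hypothetical records and field names, not data from any real breach — the merging step might look like this in Python:

```python
# Hypothetical breach dumps from two unrelated services. The records and
# field names here are invented for illustration only.
shopping_breach = [
    {"email": "jdoe@example.com", "password": "hunter2", "address": "12 Elm St"},
]
fitness_breach = [
    {"email": "jdoe@example.com", "phone": "555-0100", "gym_visits": 312},
]

def merge_dossiers(*dumps):
    """Combine records from separate dumps into per-person dossiers."""
    dossiers = {}
    for dump in dumps:
        for record in dump:
            # The shared email address links otherwise separate identities,
            # so each breach enriches the same dossier.
            dossiers.setdefault(record["email"], {}).update(record)
    return dossiers

dossiers = merge_dossiers(shopping_breach, fitness_breach)
# The merged record now holds a password, a street address, a phone
# number, and behavioral data that no single breach exposed on its own.
print(dossiers["jdoe@example.com"])
```

The point of the sketch is that the join key (here, an email address) does all the work: each additional dump adds fields to an existing dossier rather than starting a new one.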
</p><p>
For example, attackers can use leaked username and password combinations to hijack whole fleets of commercial vehicles that <a class="ulink" href="https://www.vice.com/en_us/article/zmpx4x/hacker-monitor-cars-kill-engine-gps-tracking-apps" target="_top">have been fitted with anti-theft GPS trackers and immobilizers</a> or to hijack baby monitors in order to <a class="ulink" href="https://www.washingtonpost.com/technology/2019/04/23/how-nest-designed-keep-intruders-out-peoples-homes-effectively-allowed-hackers-get/?utm_term=.15220e98c550" target="_top">terrorize toddlers with the audio tracks from pornography</a>. Attackers use leaked data to trick phone companies into giving them your phone number, then they intercept SMS-based two-factor authentication codes in order to take over your email, bank account, and/or cryptocurrency wallets.
</p><p>
Attackers are endlessly inventive in the pursuit of creative ways to weaponize leaked data. One common use of leaked data is to penetrate companies in order to access <span class="emphasis"><em>more</em></span> data.
</p><p>
Like spies, online fraudsters are totally dependent on companies over-collecting and over-retaining our data. Spy agencies sometimes pay companies for access to their data or intimidate them into giving it up, but sometimes they work just like criminals do — by <a class="ulink" href="https://www.bbc.com/news/world-us-canada-24751821" target="_top">sneaking data out of companies’ databases</a>.
</p><p>
The over-collection of data has a host of terrible social consequences, from the erosion of our authentic selves to the undermining of social progress, from state surveillance to an epidemic of online crime. Commercial surveillance is also a boon to people running influence campaigns, but that’s the least of our problems.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="critical-tech-exceptionalism-is-still-tech-exceptionalism"></a>Critical tech exceptionalism is still tech exceptionalism</h2></div></div></div><p>
Big Tech has long practiced technology exceptionalism: the idea that it should not be subject to the mundane laws and norms of <span class="quote">“<span class="quote">meatspace.</span>”</span> Mottoes like Facebook’s <span class="quote">“<span class="quote">move fast and break things</span>”</span> attracted justifiable scorn for the companies’ self-serving rhetoric.
</p><p>
Tech exceptionalism got us all into a lot of trouble, so it’s ironic and distressing to see Big Tech’s critics committing the same sin.
</p><p>
Big Tech is not a <span class="quote">“<span class="quote">rogue capitalism</span>”</span> that cannot be cured through the traditional anti-monopoly remedies of trustbusting (forcing companies to divest of competitors they have acquired) and bans on mergers to monopoly and other anti-competitive tactics. Big Tech does not have the power to use machine learning to influence our behavior so thoroughly that markets lose the ability to punish bad actors and reward superior competitors. Big Tech has no rule-writing mind-control ray that necessitates ditching our old toolbox.
</p><p>
The thing is, people have been claiming to have perfected mind-control rays for centuries, and every time, it turned out to be a con — though sometimes the con artists were also conning themselves.
</p><p>
For generations, the advertising industry has been steadily improving its ability to sell advertising services to businesses while only making marginal gains in selling those businesses’ products to prospective customers. John Wanamaker’s lament that <span class="quote">“<span class="quote">50% of my advertising budget is wasted, I just don’t know which 50%</span>”</span> is a testament to the triumph of <span class="emphasis"><em>ad executives</em></span>, who successfully convinced Wanamaker that only half of the money he spent went to waste.
</p><p>
The tech industry has made enormous improvements in the science of convincing businesses that they’re good at advertising, while its actual improvements to advertising — as opposed to targeting — have been pretty ho-hum. The vogue for machine learning — and the mystical invocation of <span class="quote">“<span class="quote">artificial intelligence</span>”</span> as a synonym for straightforward statistical inference techniques — has greatly boosted the efficacy of Big Tech’s sales pitch as marketers have exploited potential customers’ lack of technical sophistication to get away with breathtaking acts of overpromising and underdelivering.
</p><p>
It’s tempting to think that if businesses are willing to pour billions into a venture, the venture must be a good one. Yet there are plenty of times when this rule of thumb has led us astray. For example, it’s virtually unheard of for managed investment funds to outperform simple index funds, and investors who put their money into the hands of expert money managers overwhelmingly fare worse than those who entrust their savings to index funds. But managed funds still account for the majority of the money invested in the markets, and they are patronized by some of the richest, most sophisticated investors in the world. Their vote of confidence in an underperforming sector is a parable about the role of luck in wealth accumulation, not a sign that managed funds are a good buy.
</p><p>
The claims of Big Tech’s mind-control system are full of tells that the enterprise is a con. For example, consider <a class="ulink" href="https://www.frontiersin.org/articles/10.3389/fpsyg.2020.01415/full" target="_top">the reliance on the <span class="quote">“<span class="quote">Big Five</span>”</span> personality traits</a> as a primary means of influencing people, even though the <span class="quote">“<span class="quote">Big Five</span>”</span> theory is unsupported by any large-scale, peer-reviewed studies and is <a class="ulink" href="https://www.wired.com/story/the-noisy-fallacies-of-psychographic-targeting/" target="_top">mostly the realm of marketing hucksters and pop psych</a>.
</p><p>
Big Tech’s promotional materials also claim that its algorithms can accurately perform <span class="quote">“<span class="quote">sentiment analysis</span>”</span> or detect people’s moods based on their <span class="quote">“<span class="quote">microexpressions,</span>”</span> but <a class="ulink" href="https://www.npr.org/2018/09/12/647040758/advertising-on-facebook-is-it-worth-it" target="_top">these are marketing claims, not scientific ones</a>. These methods are largely untested by independent scientific experts, and where they have been tested, they’ve been found sorely wanting. Microexpressions are particularly suspect, as the companies that specialize in training people to detect them <a class="ulink" href="https://theintercept.com/2017/02/08/tsas-own-files-show-doubtful-science-behind-its-behavior-screening-program/" target="_top">have been shown</a> to underperform relative to random chance.
</p><p>
Big Tech has been so good at marketing its own supposed superpowers that it’s easy to believe it can market everything else with similar acumen, but it’s a mistake to believe the hype. Any statement a company makes about the quality of its products is clearly not impartial. The fact that we distrust all the things that Big Tech says about its data handling, compliance with privacy laws, etc., is only reasonable — but why on Earth would we treat Big Tech’s marketing literature as the gospel truth? Big Tech lies about just about <span class="emphasis"><em>everything</em></span>, including how well its machine-learning-fueled persuasion systems work.
</p><p>
That skepticism should infuse all of our evaluations of Big Tech and its supposed abilities, including our perusal of its patents. Zuboff vests these patents with enormous significance, pointing out that Google claimed extensive new persuasion capabilities in <a class="ulink" href="https://patents.google.com/patent/US20050131762A1/en" target="_top">its patent filings</a>. These claims are doubly suspect: first, because they are so self-serving, and second, because the patent itself is so notoriously an invitation to exaggeration.
</p><p>
Patent applications take the form of a series of claims and range from broad to narrow. A typical patent starts out by claiming that its authors have invented a method or system for doing every conceivable thing that anyone might do, ever, with any tool or device. Then it narrows that claim in successive stages until we get to the actual <span class="quote">“<span class="quote">invention</span>”</span> that is the true subject of the patent. The hope is that the patent examiner — who is almost certainly overworked and underinformed — will miss the fact that some or all of these claims are ridiculous, or at least suspect, and grant the patent’s broader claims. Patents for unpatentable things are still incredibly useful because they can be wielded against competitors who might license that patent or steer clear of its claims rather than endure the lengthy, expensive process of contesting it.
</p><p>
What’s more, software patents are routinely granted even though the filer doesn’t have any evidence that they can do the thing claimed by the patent. That is, you can patent an <span class="quote">“<span class="quote">invention</span>”</span> that you haven’t actually made and that you don’t know how to make.
</p><p>
With these considerations in hand, it becomes obvious that the fact that a Big Tech company has patented what it <span class="emphasis"><em>says</em></span> is an effective mind-control ray is largely irrelevant to whether Big Tech can in fact control our minds.
</p><p>
Big Tech collects our data for many reasons, including the diminishing returns on existing stores of data. But many tech companies also collect data out of a mistaken tech exceptionalist belief in the network effects of data. Network effects occur when each new user in a system increases its value. The classic example is fax machines: A single fax machine is of no use, two fax machines are of limited use, but every new fax machine that’s put to use adds a possible link to every machine already in service, so the number of possible fax-to-fax links grows with the square of the number of machines.
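The fax-machine arithmetic can be sketched in a few lines of Python (a toy illustration of my own, not anything from the essay): with n machines, every pair of machines is one possible link, so there are n * (n - 1) / 2 links in total.

```python
# Toy sketch of fax-machine network effects: with n machines, each pair of
# machines is one possible fax-to-fax link.
def possible_links(n):
    # Each of the n machines can link to the other n - 1 machines; divide
    # by 2 because the link from A to B is the same link as B to A.
    return n * (n - 1) // 2

for n in (1, 2, 3, 10, 100):
    print(f"{n} machines -> {possible_links(n)} possible links")
```

One machine allows zero links and two machines allow one, while the hundredth machine alone adds 99 new links — each new node is worth more than the last, which is exactly what makes network effects so seductive.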
</p><p>
Data mined for predictive systems doesn’t necessarily produce these dividends. Think of Netflix: The predictive value of the data mined from a million English-speaking Netflix viewers is hardly improved by the addition of one more user’s viewing data. Most of the data Netflix acquires after that first minimum viable sample duplicates existing data and produces only minimal gains. Meanwhile, retraining models with new data gets progressively more expensive as the number of data points increases, and manual tasks like labeling and validating data do not get cheaper at scale.
</p><p>
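The diminishing-returns point admits a back-of-the-envelope sketch (my own illustration, not a model of Netflix’s actual systems): for many simple statistical estimates, error shrinks roughly like 1 / sqrt(n), so each additional data point is worth less than the one before it.

```python
import math

# Rough sketch of diminishing returns from more data: for many simple
# estimates, statistical error shrinks roughly like 1 / sqrt(n).
def relative_error(n):
    return 1.0 / math.sqrt(n)

# Growing a small sample 100-fold cuts the error by a factor of about 10...
improvement_small = relative_error(100) / relative_error(10_000)

# ...but adding one more viewer on top of a million changes almost nothing.
improvement_large = relative_error(1_000_000) / relative_error(1_000_001)

print(improvement_small)  # about 10
print(improvement_large)  # barely above 1
```

Under this (simplified) model, the millionth-and-first viewer improves the estimate by about one part in two million, while the costs of storing, labeling, and retraining on that data do not shrink at all.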
Businesses pursue fads to the detriment of their profits all the time, especially when the businesses and their investors are not motivated by the prospect of becoming profitable but rather by the prospect of being acquired by a Big Tech giant or by having an IPO. For these firms, ticking faddish boxes like <span class="quote">“<span class="quote">collects as much data as possible</span>”</span> might realize a bigger return on investment than <span class="quote">“<span class="quote">collects a business-appropriate quantity of data.</span>”</span>
</p><p>
This is another harm of tech exceptionalism: The belief that more data always produces more profits in the form of more insights that can be translated into better mind-control rays drives firms to over-collect and over-retain data beyond all rationality. And since the firms are behaving irrationally, a good number of them will go out of business and become ghost ships whose cargo holds are stuffed full of data that can harm people in myriad ways — but which no one is responsible for any longer. Even if the companies don’t go under, the data they collect is maintained behind the minimum viable security — just enough security to keep the company viable while it waits to get bought out by a tech giant, an amount calculated to spend not one penny more than is necessary on protecting data.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="how-monopolies-not-mind-control-drive-surveillance-capitalism-the-snapchat-story"></a>How monopolies, not mind control, drive surveillance capitalism: The Snapchat story</h2></div></div></div><p>
For the first decade of its existence, Facebook competed with the social media giants of the day (Myspace, Orkut, etc.) by presenting itself as the pro-privacy alternative. Indeed, Facebook justified its walled garden — which let users bring in data from the web but blocked web services like Google Search from indexing and caching Facebook pages — as a pro-privacy measure that protected users from the surveillance-happy winners of the social media wars like Myspace.
</p><p>
Despite frequent promises that it would never collect or analyze its users’ data, Facebook periodically created initiatives that did just that, like the creepy, ham-fisted Beacon tool, which spied on you as you moved around the web and then added your online activities to your public timeline, allowing your friends to monitor your browsing habits. Beacon sparked a user revolt. Every time, Facebook backed off from its surveillance initiative, but not all the way; inevitably, the new Facebook would be more surveilling than the old Facebook, though not quite as surveilling as the intermediate Facebook following the launch of the new product or service.
</p><p>
The pace at which Facebook ramped up its surveillance efforts seems to have been set by Facebook’s competitive landscape. The more competitors Facebook had, the better it behaved. Every time a major competitor foundered, Facebook’s behavior <a class="ulink" href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247362" target="_top">got worse</a>.
</p><p>
All the while, Facebook was prodigiously acquiring companies, including a company called Onavo. Nominally, Onavo made a battery-monitoring mobile app. But the permissions that Onavo required were so expansive that the app was able to gather fine-grained telemetry on everything users did with their phones, including which apps they used and how they were using them.
</p><p>
Through Onavo, Facebook discovered that it was losing market share to Snapchat, an app that — like Facebook a decade before — billed itself as the pro-privacy alternative to the status quo. Through Onavo, Facebook was able to mine data from the devices of Snapchat users, including both current and former Snapchat users. This spurred Facebook to acquire Instagram — some features of which competed with Snapchat — and then allowed Facebook to fine-tune Instagram’s features and sales pitch to erode Snapchat’s gains and ensure that Facebook would not have to face the kinds of competitive pressures it had earlier inflicted on Myspace and Orkut.
</p><p>
The story of how Facebook crushed Snapchat reveals the relationship between monopoly and surveillance capitalism. Facebook combined surveillance with lax antitrust enforcement to spot the competitive threat of Snapchat on its horizon and then take decisive action against it. Facebook’s surveillance capitalism let it avert competitive pressure with anti-competitive tactics. Facebook users still want privacy — Facebook hasn’t used surveillance to brainwash them out of it — but they can’t get it because Facebook’s surveillance lets it destroy any hope of a rival service emerging that competes on privacy features.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="a-monopoly-over-your-friends"></a>A monopoly over your friends</h2></div></div></div><p>
A decentralization movement has tried to erode the dominance of Facebook and other Big Tech companies by fielding <span class="quote">“<span class="quote">indieweb</span>”</span> alternatives — Mastodon as a Twitter alternative, Diaspora as a Facebook alternative, etc. — but these efforts have failed to attain any kind of liftoff.
</p><p>
Fundamentally, each of these services is hamstrung by the same problem: Every potential user of a Facebook or Twitter alternative has to convince all their friends to follow them to a decentralized web alternative in order to continue to realize the benefit of social media. For many of us, the only reason to have a Facebook account is that our friends have Facebook accounts, and the reason they have Facebook accounts is that <span class="emphasis"><em>we</em></span> have Facebook accounts.
</p><p>
All of this has conspired to make Facebook — and other dominant platforms — into <span class="quote">“<span class="quote">kill zones</span>”</span> that investors will not fund new entrants in.
</p><p>
And yet, all of today’s tech giants came into existence despite the entrenched advantage of the companies that came before them. To understand how that happened, you have to understand both interoperability and adversarial interoperability.
</p><div class="blockquote"><blockquote class="blockquote"><p>
The hard problem of our species is coordination.
</p></blockquote></div><p>
<span class="quote">“<span class="quote">Interoperability</span>”</span> is the ability of two technologies to work with one another: Anyone can make an LP that will play on any record player, anyone can make a filter you can install in your stove’s extractor fan, anyone can make gasoline for your car, anyone can make a USB phone charger that fits in your car’s cigarette lighter receptacle, anyone can make a light bulb that works in your light socket, anyone can make bread that will toast in your toaster.
</p><p>
Interoperability is often a source of innovation and consumer benefit: Apple made the first commercially successful PC, but millions of independent software vendors made interoperable programs that ran on the Apple II Plus. The simple analog antenna inputs on the back of TVs first allowed cable operators to connect directly to TVs, then they allowed game console companies and then personal computer companies to use standard televisions as displays. Standard RJ-11 telephone jacks allowed for the production of phones from a variety of vendors in a variety of forms, from the free football-shaped phone that came with a <span class="emphasis"><em>Sports Illustrated</em></span> subscription to business phones with speakers, hold functions, and so on, and then answering machines and finally modems, paving the way for the internet revolution.
</p><p>
<span class="quote">“<span class="quote">Interoperability</span>”</span> is often used interchangeably with <span class="quote">“<span class="quote">standardization,</span>”</span> which is the process by which manufacturers and other stakeholders hammer out a set of agreed-upon rules for implementing a technology, such as the electrical plug on your wall, the CAN bus used by your car’s computer systems, or the HTML instructions that your browser interprets.
</p><p>
But interoperability doesn’t require standardization — indeed, standardization often proceeds from the chaos of ad hoc interoperability measures. The inventor of the cigarette-lighter USB charger didn’t need to get permission from car manufacturers or even the manufacturers of the dashboard lighter subcomponent. The automakers didn’t take any countermeasures to prevent the use of these aftermarket accessories by their customers, but they also didn’t do anything to make life easier for the chargers’ manufacturers. This is a kind of <span class="quote">“<span class="quote">neutral interoperability.</span>”</span>
</p><p>
Beyond neutral interoperability, there is <span class="quote">“<span class="quote">adversarial interoperability.</span>”</span> That’s when a manufacturer makes a product that interoperates with another manufacturer’s product <span class="emphasis"><em>despite the second manufacturer’s objections</em></span> and <span class="emphasis"><em>even if that means bypassing a security system designed to prevent interoperability</em></span>.
</p><p>
Probably the most familiar form of adversarial interoperability is third-party printer ink. Printer manufacturers claim that they sell printers below cost and that the only way they can recoup the losses they incur is by charging high markups on ink. To prevent the owners of printers from buying ink elsewhere, the printer companies deploy a suite of anti-customer security systems that detect and reject both refilled and third-party cartridges.
</p><p>
Owners of printers take the position that HP and Epson and Brother are not charities and that customers for their wares have no obligation to help them survive, and so if the companies choose to sell their products at a loss, that’s their foolish choice and their consequences to live with. Likewise, competitors who make ink or refill kits observe that they don’t owe printer companies anything, and their erosion of printer companies’ margins is the printer companies’ problem, not their competitors’. After all, the printer companies shed no tears when they drive a refiller out of business, so why should the refillers concern themselves with the economic fortunes of the printer companies?
</p><p>
Adversarial interoperability has played an outsized role in the history of the tech industry: from the founding of the <span class="quote">“<span class="quote">alt.*</span>”</span> Usenet hierarchy (which was started against the wishes of Usenet’s maintainers and which grew to be bigger than all of Usenet combined) to the browser wars (when Netscape and Microsoft devoted massive engineering efforts to making their browsers incompatible with the other’s special commands and peccadilloes) to Facebook (whose success was built in part by helping its new users stay in touch with friends they’d left behind on Myspace, because Facebook supplied them with a tool that scraped waiting messages from Myspace and imported them into Facebook, effectively creating a Facebook-based Myspace reader).
</p><p>
Today, incumbency is seen as an unassailable advantage. Facebook is where all of your friends are, so no one can start a Facebook competitor. But adversarial compatibility reverses the competitive advantage: If you were allowed to compete with Facebook by providing a tool that imported all your users’ waiting Facebook messages into an environment that competed on lines that Facebook couldn’t cross, like eliminating surveillance and ads, then Facebook would be at a huge disadvantage. It would have assembled all possible ex-Facebook users into a single, easy-to-find service; it would have educated them on how a Facebook-like service worked and what its potential benefits were; and it would have provided an easy means for disgruntled Facebook users to tell their friends where they might expect better treatment.
</p><p>
Adversarial interoperability was once the norm and a key contributor to the dynamic, vibrant tech scene, but now it is stuck behind a thicket of laws and regulations that add legal risks to the tried-and-true tactics of adversarial interoperability. New rules and new interpretations of existing rules mean that a would-be adversarial interoperator needs to steer clear of claims under copyright, terms of service, trade secrecy, tortious interference, and patent.
</p><p>
In the absence of a competitive market, lawmakers have resorted to assigning expensive, state-like duties to Big Tech firms, such as automatically filtering user contributions for copyright infringement or terrorist and extremist content, or detecting and preventing harassment in real time, or controlling access to sexual material.
</p><p>
These measures put a floor under how small we can make Big Tech because only the very largest companies can afford the humans and automated filters needed to perform these duties.
</p><p>
But that’s not the only way in which making platforms responsible for policing their users undermines competition. A platform that is expected to police its users’ conduct must prevent many vital adversarial interoperability techniques lest these subvert its policing measures. For example, if someone using a Twitter replacement like Mastodon is able to push messages into Twitter and read messages out of Twitter, they could avoid being caught by automated systems that detect and prevent harassment (such as systems that use the timing of messages or IP-based rules to make guesses about whether someone is a harasser).
</p><p>
To the extent that we are willing to let Big Tech police itself — rather than making Big Tech small enough that users can leave bad platforms for better ones and small enough that a regulation that simply puts a platform out of business will not destroy billions of users’ access to their communities and data — we build the case that Big Tech should be able to block its competitors and make it easier for Big Tech to demand legal enforcement tools to ban and punish attempts at adversarial interoperability.
</p><p>
Ultimately, we can try to fix Big Tech by making it responsible for bad acts by its users, or we can try to fix the internet by cutting Big Tech down to size. But we can’t do both. To replace today’s giant products with pluralistic protocols, we need to clear the legal thicket that prevents adversarial interoperability so that tomorrow’s nimble, personal, small-scale products can federate themselves with giants like Facebook, allowing the users who’ve left to continue to communicate with users who haven’t left yet, reaching tendrils over Facebook’s garden wall that Facebook’s trapped users can use to scale the walls and escape to the global, open web.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="fake-news-is-an-epistemological-crisis"></a>Fake news is an epistemological crisis</h2></div></div></div><p>
Tech is not the only industry that has undergone massive concentration since the Reagan era. Virtually every major industry — from oil to newspapers to meatpacking to sea freight to eyewear to online pornography — has become a clubby oligarchy that just a few players dominate.
</p><p>
At the same time, every industry has become something of a tech industry as general-purpose computers and general-purpose networks and the promise of efficiencies through data-driven analysis infuse every device, process, and firm with tech.
</p><p>
This phenomenon of industrial concentration is part of a wider story about wealth concentration overall, as a smaller and smaller number of people own more and more of our world. This concentration of both wealth and industries means that our political outcomes are increasingly beholden to the parochial interests of the people and companies with all the money.
2434 That means that whenever a regulator asks a question with an
2435 obvious, empirical answer (
<span class=
"quote">“
<span class=
"quote">Are humans causing climate change?
</span>”
</span> or
2436 <span class=
"quote">“
<span class=
"quote">Should we let companies conduct commercial mass surveillance?
</span>”
</span> or
2437 <span class=
"quote">“
<span class=
"quote">Does society benefit from allowing network neutrality
2438 violations?
</span>”
</span>), the answer that comes out is only correct if that
2439 correctness meets with the approval of rich people and the
2440 industries that made them so wealthy.
Rich people have always played an outsized role in politics and more so since the Supreme Court’s <span class="emphasis"><em>Citizens United</em></span> decision eliminated key controls over political spending. Widening inequality and wealth concentration means that the very richest people are now a lot richer and can afford to spend a lot more money on political projects than ever before. Think of the Koch brothers or George Soros or Bill Gates.
</p><p>
But the policy distortions of rich individuals pale in comparison to the policy distortions that concentrated industries are capable of. The companies in highly concentrated industries are much more profitable than companies in competitive industries — no competition means not having to reduce prices or improve quality to win customers — leaving them with bigger capital surpluses to spend on lobbying.
Concentrated industries also find it easier to collaborate on policy objectives than competitive ones. When all the top execs from your industry can fit around a single boardroom table, they often do. And <span class="emphasis"><em>when</em></span> they do, they can forge a consensus position on regulation.
</p><p>
Rising through the ranks in a concentrated industry generally means working at two or three of the big companies. When there are only relatively few companies in a given industry, each company has a more ossified executive rank, leaving ambitious execs with fewer paths to higher positions unless they are recruited to a rival. This means that the top execs in concentrated industries are likely to have been colleagues at some point and socialize in the same circles — connected through social ties or, say, serving as trustees for each other’s estates. These tight social bonds foster a collegial, rather than competitive, attitude.
</p><p>
Highly concentrated industries also present a regulatory conundrum. When an industry is dominated by just four or five companies, the only people who are likely to truly understand the industry’s practices are its veteran executives. This means that top regulators are often former execs of the companies they are supposed to be regulating. These turns in government are often tacitly understood to be leaves of absence from industry, with former employers welcoming their erstwhile watchdogs back into their executive ranks once their terms have expired.
</p><p>
All this is to say that the tight social bonds, small number of firms, and regulatory capture of concentrated industries give the companies that comprise them the power to dictate many, if not all, of the regulations that bind them.
This is increasingly obvious. Whether it’s payday lenders <a class="ulink" href="https://www.washingtonpost.com/business/2019/02/25/how-payday-lending-industry-insider-tilted-academic-research-its-favor/" target="_top">winning the right to practice predatory lending</a> or Apple <a class="ulink" href="https://www.vice.com/en_us/article/mgxayp/source-apple-will-fight-right-to-repair-legislation" target="_top">winning the right to decide who can fix your phone</a> or Google and Facebook winning the right to breach your private data without suffering meaningful consequences or victories for pipeline companies or impunity for opioid manufacturers or massive tax subsidies for incredibly profitable dominant businesses, it’s increasingly apparent that many of our official, evidence-based truth-seeking processes are, in fact, auctions for sale to the highest bidder.
It’s really impossible to overstate what a terrifying prospect this is. We live in an incredibly high-tech society, and none of us could acquire the expertise to evaluate every technological proposition that stands between us and our untimely, horrible deaths. You might devote your life to acquiring the media literacy to distinguish good scientific journals from corrupt pay-for-play lookalikes and the statistical literacy to evaluate the quality of the analysis in the journals as well as the microbiology and epidemiology knowledge to determine whether you can trust claims about the safety of vaccines — but that would still leave you unqualified to judge whether the wiring in your home will give you a lethal shock <span class="emphasis"><em>and</em></span> whether your car’s brakes’ software will cause them to fail unpredictably <span class="emphasis"><em>and</em></span> whether the hygiene standards at your butcher are sufficient to keep you from dying after you finish your dinner.
</p><p>
In a world as complex as this one, we have to defer to authorities, and we keep them honest by making those authorities accountable to us and binding them with rules to prevent conflicts of interest. We can’t possibly acquire the expertise to adjudicate conflicting claims about the best way to make the world safe and prosperous, but we <span class="emphasis"><em>can</em></span> determine whether the adjudication process itself is trustworthy.
</p><p>
Right now, it’s obviously not.
The past 40 years of rising inequality and industry concentration, together with increasingly weak accountability and transparency for expert agencies, has created an increasingly urgent sense of impending doom, the sense that there are vast conspiracies afoot that operate with tacit official approval despite the likelihood they are working to better themselves by ruining the rest of us.
</p><p>
For example, it’s been decades since Exxon’s own scientists concluded that its products would render the Earth uninhabitable by humans. And yet those decades were lost to us, in large part because Exxon lobbied governments and sowed doubt about the dangers of its products and did so with the cooperation of many public officials. When the survival of you and everyone you love is threatened by conspiracies, it’s not unreasonable to start questioning the things you think you know in an attempt to determine whether they, too, are the outcome of another conspiracy.
</p><p>
The collapse of the credibility of our systems for divining and upholding truths has left us in a state of epistemological chaos. Once, most of us might have assumed that the system was working and that our regulations reflected our best understanding of the empirical truths of the world as they were best understood — now we have to find our own experts to help us sort the true from the false.
If you’re like me, you probably believe that vaccines are safe, but you (like me) probably also can’t explain the microbiology or statistics. Few of us have the math skills to review the literature on vaccine safety and describe why its statistical reasoning is sound. Likewise, few of us can review the stats in the (now discredited) literature on opioid safety and explain how those stats were manipulated. Both vaccines and opioids were embraced by medical authorities, after all, and one is safe while the other could ruin your life. You’re left with a kind of inchoate constellation of rules of thumb about which experts you trust to fact-check controversial claims and then to explain how all those respectable doctors with their peer-reviewed research on opioid safety <span class="emphasis"><em>were</em></span> an aberration and then how you know that the doctors writing about vaccine safety are <span class="emphasis"><em>not</em></span> an aberration.
</p><p>
I’m 100% certain that vaccinating is safe and effective, but I’m also at something of a loss to explain exactly, <span class="emphasis"><em>precisely,</em></span> why I believe this, given all the corruption I know about and the many times the stamp of certainty has turned out to be a parochial lie told to further enrich the super rich.
Fake news — conspiracy theories, racist ideologies, scientific denialism — has always been with us. What’s changed today is not the mix of ideas in the public discourse but the popularity of the worst ideas in that mix. Conspiracy and denial have skyrocketed in lockstep with the growth of Big Inequality, which has also tracked the rise of Big Tech and Big Pharma and Big Wrestling and Big Car and Big Movie Theater and Big Everything Else.
</p><p>
No one can say for certain why this has happened, but the two dominant camps are idealism (the belief that the people who argue for these conspiracies have gotten better at explaining them, maybe with the help of machine-learning tools) or materialism (the ideas have become more attractive because of material conditions in the world).
</p><p>
I’m a materialist. I’ve been exposed to the arguments of conspiracy theorists all my life, and I have not experienced any qualitative leap in the quality of those arguments.
</p><p>
The major difference is in the world, not the arguments. In a time where actual conspiracies are commonplace, conspiracy theories acquire a ring of plausibility.
</p><p>
We have always had disagreements about what’s true, but today, we have a disagreement over how we know whether something is true. This is an epistemological crisis, not a crisis over belief. It’s a crisis over the credibility of our truth-seeking exercises, from scientific journals (in an era where the biggest journal publishers have been caught producing pay-to-play journals for junk science) to regulations (in an era where regulators are routinely cycling in and out of business) to education (in an era where universities are dependent on corporate donations to keep their lights on).
</p><p>
Targeting — surveillance capitalism — makes it easier to find people who are undergoing this epistemological crisis, but it doesn’t create the crisis. For that, you need to look to corruption.
And, conveniently enough, it’s corruption that allows surveillance capitalism to grow by dismantling monopoly protections, by permitting reckless collection and retention of personal data, by allowing ads to be targeted in secret, and by foreclosing on the possibility of going somewhere else where you might continue to enjoy your friends without subjecting yourself to commercial surveillance.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="tech-is-different"></a>Tech is different</h2></div></div></div><p>
I reject both iterations of technological exceptionalism. I reject the idea that tech is uniquely terrible and led by people who are greedier or worse than the leaders of other industries, and I reject the idea that tech is so good — or so intrinsically prone to concentration — that it can’t be blamed for its present-day monopolistic status.
</p><p>
I think tech is just another industry, albeit one that grew up in the absence of real monopoly constraints. It may have been first, but it isn’t the worst, nor will it be the last.
</p><p>
But there’s one way in which I <span class="emphasis"><em>am</em></span> a tech exceptionalist. I believe that online tools are the key to overcoming problems that are much more urgent than tech monopolization: climate change, inequality, misogyny, and discrimination on the basis of race, gender identity, and other factors. The internet is how we will recruit people to fight those fights, and how we will coordinate their labor. Tech is not a substitute for democratic accountability, the rule of law, fairness, or stability — but it’s a means to achieve these things.
</p><p>
The hard problem of our species is coordination. Everything from climate change to social change to running a business to making a family work can be viewed as a collective action problem.
</p><p>
The internet makes it easier than at any time before to find people who want to work on a project with you — hence the success of free and open-source software, crowdfunding, and racist terror groups — and easier than ever to coordinate the work you do.
The internet and the computers we connect to it also possess an exceptional quality: general-purposeness. The internet is designed to allow any two parties to communicate any data, using any protocol, without permission from anyone else. The only production design we have for computers is the general-purpose, <span class="quote">“<span class="quote">Turing complete</span>”</span> computer that can run every program we can express in symbolic logic.
</p><p>
This means that every time someone with a special communications need invests in infrastructure and techniques to make the internet faster, cheaper, and more robust, this benefit redounds to everyone else who is using the internet to communicate. And this also means that every time someone with a special computing need invests to make computers faster, cheaper, and more robust, every other computing application is a potential beneficiary of this work.
For these reasons, every type of communication is gradually absorbed into the internet, and every type of device — from airplanes to pacemakers — eventually becomes a computer in a fancy case.
</p><p>
While these considerations don’t preclude regulating networks and computers, they do call for gravitas and caution when doing so because changes to regulatory frameworks could ripple out to have unintended consequences in many, many other domains.
</p><p>
The upshot of this is that our best hope of solving the big coordination problems — climate change, inequality, etc. — is with free, fair, and open tech. Our best hope of keeping tech free, fair, and open is to exercise caution in how we regulate tech and to attend closely to the ways in which interventions to solve one problem might create problems in other domains.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="ownership-of-facts"></a>Ownership of facts</h2></div></div></div><p>
Big Tech has a funny relationship with information. When you’re generating information — anything from the location data streaming off your mobile device to the private messages you send to friends on a social network — it claims the rights to make unlimited use of it.
</p><p>
But when you have the audacity to turn the tables — to use a tool that blocks ads or slurps your waiting updates out of a social network and puts them in another app that lets you set your own priorities and suggestions or crawls their system to allow you to start a rival business — they claim that you’re stealing from them.
</p><p>
The thing is, information is a very bad fit for any kind of private property regime. Property rights are useful for establishing markets that can lead to the effective development of fallow assets. These markets depend on clear titles to ensure that the things being bought and sold in them can, in fact, be bought and sold.
</p><p>
Information rarely has such a clear title. Take phone numbers: There’s clearly something going wrong when Facebook slurps up millions of users’ address books and uses the phone numbers it finds in them to plot out social graphs and fill in missing information about other users.
But the phone numbers Facebook nonconsensually acquires in this transaction are not the <span class="quote">“<span class="quote">property</span>”</span> of the users they’re taken from nor do they belong to the people whose phones ring when you dial those numbers. The numbers are mere integers, 10 digits in the U.S. and Canada, and they appear in millions of places, including somewhere deep in pi as well as numerous other contexts. Giving people ownership titles to integers is an obviously terrible idea.
</p><p>
Likewise for the facts that Facebook and other commercial surveillance operators acquire about us, like that we are the children of our parents or the parents to our children or that we had a conversation with someone else or went to a public place. These data points can’t be property in the sense that your house or your shirt is your property because the title to them is intrinsically muddy: Does your mom own the fact that she is your mother? Do you? Do both of you? What about your dad — does he own this fact too, or does he have to license the fact from you (or your mom or both of you) in order to use this fact? What about the hundreds or thousands of other people who know these facts?
</p><p>
If you go to a Black Lives Matter demonstration, do the other demonstrators need your permission to post their photos from the event? The online fights over <a class="ulink" href="https://www.wired.com/story/how-to-take-photos-at-protests/" target="_top">when and how to post photos from demonstrations</a> reveal a nuanced, complex issue that cannot be easily hand-waved away by giving one party a property right that everyone else in the mix has to respect.
</p><p>
The fact that information isn’t a good fit with property and markets doesn’t mean that it’s not valuable. Babies aren’t property, but they’re inarguably valuable. In fact, we have a whole set of rules just for babies as well as a subset of those rules that apply to humans more generally. Someone who argues that babies won’t be truly valuable until they can be bought and sold like loaves of bread would be instantly and rightfully condemned as a monster.
It’s tempting to reach for the property hammer when Big Tech treats your information like a nail — not least because Big Tech are such prolific abusers of property hammers when it comes to <span class="emphasis"><em>their</em></span> information. But this is a mistake. If we allow markets to dictate the use of our information, then we’ll find that we’re sellers in a buyers’ market where the Big Tech monopolies set a price for our data that is so low as to be insignificant or, more likely, set at a nonnegotiable price of zero in a click-through agreement that you don’t have the opportunity to modify.
</p><p>
Meanwhile, establishing property rights over information will create insurmountable barriers to independent data processing. Imagine that we require a license to be negotiated when a translated document is compared with its original, something Google has done and continues to do billions of times to train its automated language translation tools. Google can afford this, but independent third parties cannot. Google can staff a clearances department to negotiate one-time payments to the likes of the EU (one of the major repositories of translated documents) while independent watchdogs wanting to verify that the translations are well-prepared, or to root out bias in translations, will find themselves needing a staffed-up legal department and millions for licenses before they can even get started.
The same goes for things like search indexes of the web or photos of people’s houses, which have become contentious thanks to Google’s Street View project. Whatever problems may exist with Google’s photographing of street scenes, resolving them by letting people decide who can take pictures of the facades of their homes from a public street will surely create even worse ones. Think of how street photography is important for newsgathering — including informal newsgathering, like photographing abuses of authority — and how being able to document housing and street life are important for contesting eminent domain, advocating for social aid, reporting planning and zoning violations, documenting discriminatory and unequal living conditions, and more.
</p><p>
The ownership of facts is antithetical to many kinds of human progress. It’s hard to imagine a rule that limits Big Tech’s exploitation of our collective labors without inadvertently banning people from gathering data on online harassment or compiling indexes of changes in language or simply investigating how the platforms are shaping our discourse — all of which require scraping data that other people have created and subjecting it to scrutiny and analysis.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="persuasion-works-slowly"></a>Persuasion works… slowly</h2></div></div></div><p>
The platforms may oversell their ability to persuade people, but obviously, persuasion works sometimes. Whether it’s the private realm that LGBTQ people used to recruit allies and normalize sexual diversity or the decadeslong project to convince people that markets are the only efficient way to solve complicated resource allocation problems, it’s clear that our societal attitudes <span class="emphasis"><em>can</em></span> change.
</p><p>
The project of shifting societal attitudes is a game of inches and years. For centuries, svengalis have purported to be able to accelerate this process, but even the most brutal forms of propaganda have struggled to make permanent changes. Joseph Goebbels was able to subject Germans to daily, mandatory, hourslong radio broadcasts, to round up and torture and murder dissidents, and to seize full control over their children’s education while banning any literature, broadcasts, or films that did not comport with his worldview.
</p><p>
Yet, after 12 years of terror, once the war ended, Nazi ideology was largely discredited in both East and West Germany, and a program of national truth and reconciliation was put in its place. Racism and authoritarianism were never fully abolished in Germany, but neither were the majority of Germans irrevocably convinced of Nazism — and the rise of racist authoritarianism in Germany today tells us that the liberal attitudes that replaced Nazism were no more permanent than Nazism itself.
Racism and authoritarianism have also always been with us. Anyone who’s reviewed the kind of messages and arguments that racists put forward today would be hard-pressed to say that they have gotten better at presenting their ideas. The same pseudoscience, appeals to fear, and circular logic that racists presented in the 1980s, when the cause of white supremacy was on the wane, are to be found in the communications of leading white nationalists today.
</p><p>
If racists haven’t gotten more convincing in the past decade, then how is it that more people were convinced to be openly racist at that time? I believe that the answer lies in the material world, not the world of ideas. The ideas haven’t gotten more convincing, but people have become more afraid. Afraid that the state can’t be trusted to act as an honest broker in life-or-death decisions, from those regarding the management of the economy to the regulation of painkillers to the rules for handling private information. Afraid that the world has become a game of musical chairs in which the chairs are being taken away at a never-before-seen rate. Afraid that justice for others will come at their expense. Monopolism isn’t the cause of these fears, but the inequality and material desperation and policy malpractice that monopolism contributes to is a significant contributor to these conditions. Inequality creates the conditions for both conspiracies and violent racist ideologies, and then surveillance capitalism lets opportunists target the fearful and the conspiracy-minded.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="paying-wont-help"></a>Paying won’t help</h2></div></div></div><p>
As the old saw goes, <span class="quote">“<span class="quote">If you’re not paying for the product, you’re the product.</span>”</span>
</p><p>
It’s a commonplace belief today that the advent of free, ad-supported media was the original sin of surveillance capitalism. The reasoning is that the companies that charged for access couldn’t <span class="quote">“<span class="quote">compete with free</span>”</span> and so they were driven out of business. Their ad-supported competitors, meanwhile, declared open season on their users’ data in a bid to improve their ad targeting and make more money and then resorted to the most sensationalist tactics to generate clicks on those ads. If only we’d pay for media again, we’d have a better, more responsible, more sober discourse that would be better for democracy.
</p><p>
But the degradation of news products long precedes the advent of ad-supported online news. Long before newspapers were online, lax antitrust enforcement had opened the door for unprecedented waves of consolidation and roll-ups in newsrooms. Rival newspapers were merged, reporters and ad sales staff were laid off, physical plants were sold and leased back, leaving the companies loaded up with debt through leveraged buyouts and subsequent profit-taking by the new owners. In other words, it wasn’t merely shifts in the classified advertising market, which was long held to be the primary driver in the decline of the traditional newsroom, that made news companies unable to adapt to the internet — it was monopolism.
Then, as news companies <span class="emphasis"><em>did</em></span> come online, the ad revenues they commanded dropped even as the number of internet users (and thus potential online readers) increased. That shift was a function of consolidation in the ad sales market, with Google and Facebook emerging as duopolists who made more money every year from advertising while paying less and less of it to the publishers whose work the ads appeared alongside. Monopolism created a buyer’s market for ad inventory with Facebook and Google acting as gatekeepers.
</p><p>
Paid services continue to exist alongside free ones, and often it is these paid services — anxious to prevent people from bypassing their paywalls or sharing paid media with freeloaders — that exert the most control over their customers. Apple’s iTunes and App Stores are paid services, but to maximize their profitability, Apple has to lock its platforms so that third parties can’t make compatible software without permission. These locks allow the company to exercise both editorial control (enabling it to exclude <a class="ulink" href="https://ncac.org/news/blog/does-apples-strict-app-store-content-policy-limit-freedom-of-expression" target="_top">controversial political material</a>) and technological control, including control over who can repair the devices it makes. If we’re worried that ad-supported products deprive people of their right to self-determination by using persuasion techniques to nudge their purchase decisions a few degrees in one direction or the other, then the near-total control a single company holds over the decision of who gets to sell you software, parts, and service for your iPhone should have us very worried indeed.
We shouldn’t just be concerned about payment and control: The idea that paying will improve discourse is also dangerously wrong. The poor success rate of targeted advertising means that the platforms have to incentivize you to <span class="quote">“<span class="quote">engage</span>”</span> with posts at extremely high levels to generate enough pageviews to safeguard their profits. As discussed earlier, to increase engagement, platforms like Facebook use machine learning to guess which messages will be most inflammatory and make a point of shoving those into your eyeballs at every turn so that you will hate-click and argue with people.
</p><p>
Perhaps paying would fix this, the reasoning goes. If platforms could be economically viable even if you stopped clicking on them once your intellectual and social curiosity had been slaked, then they would have no reason to algorithmically enrage you to get more clicks out of you, right?
</p><p>
There may be something to that argument, but it still ignores the wider economic and political context of the platforms and the world that allowed them to grow so dominant.
</p><p>
Platforms are world-spanning and all-encompassing because they are monopolies, and they are monopolies because we have gutted our most important and reliable anti-monopoly rules. Antitrust was neutered as a key part of the project to make the wealthy wealthier, and that project has worked. The vast majority of people on Earth have a negative net worth, and even the dwindling middle class is in a precarious state, undersaved for retirement, underinsured for medical disasters, and undersecured against climate and technology shocks.
</p><p>
In this wildly unequal world, paying doesn’t improve the discourse; it simply prices discourse out of the range of the majority of people. Paying for the product is dandy, if you can afford it.
If you think today’s filter bubbles are a problem for our discourse, imagine what they’d be like if rich people inhabited free-flowing Athenian marketplaces of ideas where you have to pay for admission while everyone else lives in online spaces that are subsidized by wealthy benefactors who relish the chance to establish conversational spaces where the <span class="quote">“<span class="quote">house rules</span>”</span> forbid questioning the status quo. That is, imagine if the rich seceded from Facebook, and then, instead of running ads that made money for shareholders, Facebook became a billionaire’s vanity project that also happened to ensure that nobody talked about whether it was fair that only billionaires could afford to hang out in the rarified corners of the internet.
</p><p>
Behind the idea of paying for access is a belief that free markets will address Big Tech’s dysfunction. After all, to the extent that people have a view of surveillance at all, it is generally an unfavorable one, and the longer and more thoroughly one is surveilled, the less one tends to like it. Same goes for lock-in: If HP’s ink or Apple’s App Store were really obviously fantastic, they wouldn’t need technical measures to prevent users from choosing a rival’s product. The only reason these technical countermeasures exist is that the companies don’t believe their customers would <span class="emphasis"><em>voluntarily</em></span> submit to their terms, and they want to deprive them of the choice to take their business elsewhere.
2955 Advocates for markets laud their ability to aggregate the diffused
2956 knowledge of buyers and sellers across a whole society through
2957 demand signals, price signals, and so on. The argument for
2958 surveillance capitalism being a
<span class=
"quote">“
<span class=
"quote">rogue capitalism
</span>”
</span> is that
2959 machine-learning-driven persuasion techniques distort
2960 decision-making by consumers, leading to incorrect signals —
2961 consumers don’t buy what they prefer, they buy what they’re tricked
2962 into preferring. It follows that the monopolistic practices of
2963 lock-in, which do far more to constrain consumers’ free choices, are
2964 even more of a
<span class=
"quote">“
<span class=
"quote">rogue capitalism.
</span>”
</span>
2966 The profitability of any business is constrained by the possibility
2967 that its customers will take their business elsewhere. Both
2968 surveillance and lock-in are anti-features that no customer wants.
2969 But monopolies can capture their regulators, crush their
2970 competitors, insert themselves into their customers’ lives, and
2971 corral people into
<span class=
"quote">“
<span class=
"quote">choosing
</span>”
</span> their services regardless of whether
2972 they want them — it’s fine to be terrible when there is no alternative.
2975 Ultimately, surveillance and lock-in are both simply business
2976 strategies that monopolists can choose. Surveillance companies like
2977 Google are perfectly capable of deploying lock-in technologies —
2978 just look at the onerous Android licensing terms that require
2979 device-makers to bundle in Google’s suite of applications. And
2980 lock-in companies like Apple are perfectly capable of subjecting
2981 their users to surveillance if it means keeping the Chinese
2982 government happy and preserving ongoing access to Chinese markets.
2983 Monopolies may be made up of good, ethical people, but as
2984 institutions, they are not your friend — they will do whatever they
2985 can get away with to maximize their profits, and the more
2986 monopolistic they are, the more they
<span class=
"emphasis"><em>can
</em></span> get away with.
2988 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"an-ecology-moment-for-trustbusting"></a>An
<span class=
"quote">“
<span class=
"quote">ecology
</span>”
</span> moment for trustbusting
</h2></div></div></div><p>
2989 If we’re going to break Big Tech’s death grip on our digital lives,
2990 we’re going to have to fight monopolies. That may sound pretty
2991 mundane and old-fashioned, something out of the New Deal era, while
2992 ending the use of automated behavioral modification feels like the
2993 plotline of a really cool cyberpunk novel.
2995 Meanwhile, breaking up monopolies is something we seem to have
2996 forgotten how to do. There is a bipartisan, trans-Atlantic consensus
2997 that breaking up companies is a fool’s errand at best — liable to
2998 mire your federal prosecutors in decades of litigation — and
2999 counterproductive at worst, eroding the
<span class=
"quote">“
<span class=
"quote">consumer benefits
</span>”
</span> of large
3000 companies with massive efficiencies of scale.
3002 But trustbusters once strode the nation, brandishing law books,
3003 terrorizing robber barons, and shattering the illusion of
3004 monopolies’ all-powerful grip on our society. The trustbusting era
3005 could not begin until we found the political will — until the people
3006 convinced politicians they’d have their backs when they went up
3007 against the richest, most powerful men in the world.
3009 Could we find that political will again?
3011 Copyright scholar James Boyle has described how the term
<span class=
"quote">“
<span class=
"quote">ecology
</span>”
</span>
3012 marked a turning point in environmental activism. Prior to the
3013 adoption of this term, people who wanted to preserve whale
3014 populations didn’t necessarily see themselves as fighting the same
3015 battle as people who wanted to protect the ozone layer or fight
3016 freshwater pollution or beat back smog or acid rain.
3018 But the term
<span class=
"quote">“
<span class=
"quote">ecology
</span>”
</span> welded these disparate causes together into a
3019 single movement, and the members of this movement found solidarity
3020 with one another. The people who cared about smog signed petitions
3021 circulated by the people who wanted to end whaling, and the
3022 anti-whalers marched alongside the people demanding action on acid
3023 rain. This uniting behind a common cause completely changed the
3024 dynamics of environmentalism, setting the stage for today’s climate
3025 activism and the sense that preserving the habitability of the
3026 planet Earth is a shared duty among all people.
3028 I believe we are on the verge of a new
<span class=
"quote">“
<span class=
"quote">ecology
</span>”
</span> moment dedicated to
3029 combating monopolies. After all, tech isn’t the only concentrated
3030 industry nor is it even the
<span class=
"emphasis"><em>most
</em></span> concentrated of industries.
3033 You can find partisans for trustbusting in every sector of the
3034 economy. Everywhere you look, you can find people who’ve been
3035 wronged by monopolists who’ve trashed their finances, their health,
3036 their privacy, their educations, and the lives of people they love.
3037 Those people have the same cause as the people who want to break up
3038 Big Tech and the same enemies. When most of the world’s wealth is in
3039 the hands of a very few, it follows that nearly every large company
3040 will have overlapping shareholders.
3042 That’s the good news: With a little bit of work and a little bit of
3043 coalition building, we have more than enough political will to break
3044 up Big Tech and every other concentrated industry besides. First we
3045 take Facebook, then we take AT
&amp;T/WarnerMedia.
3047 But here’s the bad news: Much of what we’re doing to tame Big Tech
3048 <span class=
"emphasis"><em>instead
</em></span> of breaking up the big companies also
3049 forecloses on the possibility of breaking them up later.
3051 Big Tech’s concentration currently means that their inaction on
3052 harassment, for example, leaves users with an impossible choice:
3053 absent themselves from public discourse by, say, quitting Twitter or
3054 endure vile, constant abuse. Big Tech’s over-collection and
3055 over-retention of data results in horrific identity theft. And their
3056 inaction on extremist recruitment means that white supremacists who
3057 livestream their shooting rampages can reach an audience of
3058 billions. The combination of tech concentration and media
3059 concentration means that artists’ incomes are falling even as the
3060 revenue generated by their creations is increasing.
3062 Yet governments confronting all of these problems inevitably
3063 converge on the same solution: deputize the Big Tech giants to
3064 police their users and render them liable for their users’ bad
3065 actions. The drive to force Big Tech to use automated filters to
3066 block everything from copyright infringement to sex-trafficking to
3067 violent extremism means that tech companies will have to allocate
3068 hundreds of millions to run these compliance systems.
3070 These rules — the EU’s new Directive on Copyright, Australia’s new
3071 terror regulation, America’s FOSTA/SESTA sex-trafficking law and
3072 more — are not just death warrants for small, upstart competitors
3073 that might challenge Big Tech’s dominance but that lack the deep
3074 pockets of established incumbents to pay for all these automated
3075 systems. Worse still, these rules put a floor under how small we can
3076 hope to make Big Tech.
3078 That’s because any move to break up Big Tech and cut it down to size
3079 will have to cope with the hard limit of not making these companies
3080 so small that they can no longer afford to perform these duties —
3081 and it’s
<span class=
"emphasis"><em>expensive
</em></span> to invest in those automated
3082 filters and outsource content moderation. It’s already going to be
3083 hard to unwind these deeply concentrated, chimeric behemoths that
3084 have been welded together in the pursuit of monopoly profits. Doing
3085 so while simultaneously finding some way to fill the regulatory void
3086 that will be left behind if these self-policing rulers were forced
3087 to suddenly abdicate will be much, much harder.
3089 Allowing the platforms to grow to their present size has given them
3090 a dominance that is nearly insurmountable — deputizing them with
3091 public duties to redress the pathologies created by their size makes
3092 it virtually impossible to reduce that size. Lather, rinse, repeat:
3093 If the platforms don’t get smaller, they will get larger, and as
3094 they get larger, they will create more problems, which will give
3095 rise to more public duties for the companies, which will make them bigger still.
3098 We can work to fix the internet by breaking up Big Tech and
3099 depriving them of monopoly profits, or we can work to fix Big Tech
3100 by making them spend their monopoly profits on governance. But we
3101 can’t do both. We have to choose between a vibrant, open internet or
3102 a dominated, monopolized internet commanded by Big Tech giants that
3103 we struggle with constantly to get them to behave themselves.
3104 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"make-big-tech-small-again"></a>Make Big Tech small again
</h2></div></div></div><p>
3105 Trustbusting is hard. Breaking big companies into smaller ones is
3106 expensive and time-consuming. So time-consuming that by the time
3107 you’re done, the world has often moved on and rendered years of
3108 litigation irrelevant. From
1969 to
1982, the U.S. government
3109 pursued an antitrust case against IBM over its dominance of
3110 mainframe computing — but the case collapsed in
1982 because
3111 mainframes were being speedily replaced by PCs.
3112 </p><div class=
"blockquote"><blockquote class=
"blockquote"><p>
3113 A future U.S. president could simply direct their attorney general
3114 to enforce the law as it was written.
3115 </p></blockquote></div><p>
3116 It’s far easier to prevent concentration than to fix it, and
3117 reinstating the traditional contours of U.S. antitrust enforcement
3118 will, at the very least, prevent further concentration. That means
3119 bans on mergers between large companies, on big companies acquiring
3120 nascent competitors, and on platform companies competing directly
3121 with the companies that rely on the platforms.
3123 These powers are all in the plain language of U.S. antitrust laws,
3124 so in theory, a future U.S. president could simply direct their
3125 attorney general to enforce the law as it was written. But after
3126 decades of judicial
<span class=
"quote">“
<span class=
"quote">education
</span>”
</span> in the benefits of monopolies, after
3127 multiple administrations that have packed the federal courts with
3128 lifetime-appointed monopoly cheerleaders, it’s not clear that mere
3129 administrative action would do the trick.
3131 If the courts frustrate the Justice Department and the president,
3132 the next stop would be Congress, which could eliminate any doubt
3133 about how antitrust law should be enforced in the U.S. by passing
3134 new laws that boil down to saying,
<span class=
"quote">“
<span class=
"quote">Knock it off. We all know what
3135 the Sherman Act says. Robert Bork was a deranged fantasist. For
3136 avoidance of doubt,
<span class=
"emphasis"><em>fuck that guy
</em></span>.
</span>”
</span> In other
3137 words, the problem with monopolies is
3138 <span class=
"emphasis"><em>monopolism
</em></span> — the concentration of power into
3139 too few hands, which erodes our right to self-determination. If
3140 there is a monopoly, the law wants it gone, period. Sure, get rid of
3141 monopolies that create
<span class=
"quote">“
<span class=
"quote">consumer harm
</span>”
</span> in the form of higher prices,
3142 but also,
<span class=
"emphasis"><em>get rid of other monopolies, too.
</em></span>
3144 But this only prevents things from getting worse. To help them get
3145 better, we will have to build coalitions with other activists in the
3146 anti-monopoly ecology movement — a pluralism movement or a
3147 self-determination movement — and target existing monopolies in
3148 every industry for breakup and structural separation rules that
3149 prevent, for example, the giant eyewear monopolist Luxottica from
3150 dominating both the sale and the manufacture of spectacles.
3152 In an important sense, it doesn’t matter which industry the breakups
3153 begin in. Once they start, shareholders in
3154 <span class=
"emphasis"><em>every
</em></span> industry will start to eye their
3155 investments in monopolists skeptically. As trustbusters ride into
3156 town and start making lives miserable for monopolists, the debate
3157 around every corporate boardroom’s table will shift. People within
3158 corporations who’ve always felt uneasy about monopolism will gain a
3159 powerful new argument to fend off their evil rivals in the corporate
3160 hierarchy:
<span class=
"quote">“
<span class=
"quote">If we do it my way, we make less money; if we do it your
3161 way, a judge will fine us billions and expose us to ridicule and
3162 public disapprobation. So even though I get that it would be really
3163 cool to do that merger, lock out that competitor, or buy that little
3164 company and kill it before it can threaten us, we really shouldn’t —
3165 not if we don’t want to get tied to the DOJ’s bumper and get dragged
3166 up and down Trustbuster Road for the next
10 years.
</span>”
</span>
3167 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"goto-10"></a>20 GOTO
10</h2></div></div></div><p>
3168 Fixing Big Tech will require a lot of iteration. As cyber lawyer
3169 Lawrence Lessig wrote in his
1999 book,
<span class=
"emphasis"><em>Code and Other
3170 Laws of Cyberspace
</em></span>, our lives are regulated by four
3171 forces: law (what’s legal), code (what’s technologically possible),
3172 norms (what’s socially acceptable), and markets (what’s profitable).
3174 If you could wave a wand and get Congress to pass a law that
3175 re-fanged the Sherman Act tomorrow, you could use the impending
3176 breakups to convince venture capitalists to fund competitors to
3177 Facebook, Google, Twitter, and Apple that would be waiting in the
3178 wings after they were cut down to size.
3180 But getting Congress to act will require a massive normative shift,
3181 a mass movement of people who care about monopolies — and pulling them apart.
3184 Getting people to care about monopolies will take technological
3185 interventions that help them to see what a world free from Big Tech
3186 might look like. Imagine if someone could make a beloved (but
3187 unauthorized) third-party Facebook or Twitter client that dampens
3188 the anxiety-producing algorithmic drumbeat and still lets you talk
3189 to your friends without being spied upon — something that made
3190 social media more sociable and less toxic. Now imagine that it gets
3191 shut down in a brutal legal battle. It’s always easier to convince
3192 people that something must be done to save a thing they love than it
3193 is to excite them about something that doesn’t even exist yet.
3195 Neither tech nor law nor code nor markets are sufficient to reform
3196 Big Tech. But a profitable competitor to Big Tech could bankroll a
3197 legislative push; legal reform can embolden a toolsmith to make a
3198 better tool; the tool can create customers for a potential business
3199 who value the benefits of the internet but want them delivered
3200 without Big Tech; and that business can get funded and divert some
3201 of its profits to legal reform.
20 GOTO
10 (or lather, rinse,
3202 repeat). Do it again, but this time, get farther! After all, this
3203 time you’re starting with weaker Big Tech adversaries, a
3204 constituency that understands things can be better, Big Tech rivals
3205 who’ll help ensure their own future by bankrolling reform, and code
3206 that other programmers can build on to weaken Big Tech even further.
3208 The surveillance capitalism hypothesis — that Big Tech’s products
3209 really work as well as they say they do and that’s why everything is
3210 so screwed up — is way too easy on surveillance and even easier on
3211 capitalism. Companies spy because they believe their own BS, and
3212 companies spy because governments let them, and companies spy
3213 because any advantage from spying is so short-lived and minor that
3214 they have to do more and more of it just to stay in place.
3216 As to why things are so screwed up? Capitalism. Specifically, the
3217 monopolism that creates inequality and the inequality that creates
3218 monopolism. It’s a form of capitalism that rewards sociopaths who
3219 destroy the real economy to inflate the bottom line, and they get
3220 away with it for the same reason companies get away with spying:
3221 because our governments are in thrall to both the ideology that says
3222 monopolies are actually just fine and in thrall to the ideology that
3223 says that in a monopolistic world, you’d better not piss off the monopolists.
3226 Surveillance doesn’t make capitalism rogue. Capitalism’s unchecked
3227 rule begets surveillance. Surveillance isn’t bad because it lets
3228 people manipulate us. It’s bad because it crushes our ability to be
3229 our authentic selves — and because it lets the rich and powerful
3230 figure out who might be thinking of building guillotines and what
3231 dirt they can use to discredit those embryonic guillotine-builders
3232 before they can even get to the lumberyard.
3233 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"up-and-through"></a>Up and through
</h2></div></div></div><p>
3234 With all the problems of Big Tech, it’s tempting to imagine solving
3235 the problem by returning to a world without tech at all. Resist that temptation.
3238 The only way out of our Big Tech problem is up and through. If our
3239 future is not reliant upon high tech, it will be because
3240 civilization has fallen. Big Tech wired together a planetary,
3241 species-wide nervous system that, with the proper reforms and course
3242 corrections, is capable of seeing us through the existential
3243 challenge of our species and planet. Now it’s up to us to seize the
3244 means of computation, putting that electronic nervous system under
3245 democratic, accountable control.
3247 I am, secretly, despite what I have said earlier, a tech
3248 exceptionalist. Not in the sense of thinking that tech should be
3249 given a free pass to monopolize because it has
<span class=
"quote">“
<span class=
"quote">economies of scale
</span>”
</span>
3250 or some other nebulous feature. I’m a tech exceptionalist because I
3251 believe that getting tech right matters and that getting it wrong
3252 will be an unmitigated catastrophe — and doing it right can give us
3253 the power to work together to save our civilization, our species, and our planet.
3255 </p></div></div></body></html>