<html><head><meta http-equiv=
"Content-Type" content=
"text/html; charset=ISO-8859-1"><title>How to Destroy Surveillance Capitalism
</title><meta name=
"generator" content=
"DocBook XSL Stylesheets V1.79.1"><style type=
"text/css">
body { background-image: url('images/draft.png');
       background-repeat: no-repeat;
       background-position: top left;
       /* The following properties make the watermark "fixed" on the page. */
       /* I think that's just a bit too distracting for the reader... */
       /* background-attachment: fixed; */
       /* background-position: center center; */
}
</style></head><body bgcolor=
"white" text=
"black" link=
"#0000FF" vlink=
"#840084" alink=
"#0000FF"><div class=
"article"><div class=
"titlepage"><div><div><h2 class=
"title"><a name=
"idm1"></a>How to Destroy Surveillance Capitalism
</h2></div><div><div class=
"authorgroup"><div class=
"author"><h3 class=
"author"><span class=
"firstname">Cory
</span> <span class=
"surname">Doctorow
</span></h3></div></div></div></div><hr></div><div class=
"toc"><p><b>Table of Contents
</b></p><dl class=
"toc"><dt><span class=
"sect1"><a href=
"#the-net-of-a-thousand-lies">The net of a thousand lies
</a></span></dt><dt><span class=
"sect1"><a href=
"#digital-rights-activism-a-quarter-century-on">Digital rights activism, a quarter-century on
</a></span></dt><dt><span class=
"sect1"><a href=
"#tech-exceptionalism-then-and-now">Tech exceptionalism, then and now
</a></span></dt><dt><span class=
"sect1"><a href=
"#dont-believe-the-hype">Don
’t believe the hype
</a></span></dt><dt><span class=
"sect1"><a href=
"#what-is-persuasion">What is persuasion?
</a></span></dt><dd><dl><dt><span class=
"sect2"><a href=
"#segmenting">1. Segmenting
</a></span></dt><dt><span class=
"sect2"><a href=
"#deception">2. Deception
</a></span></dt><dt><span class=
"sect2"><a href=
"#domination">3. Domination
</a></span></dt><dt><span class=
"sect2"><a href=
"#bypassing-our-rational-faculties">4. Bypassing our rational faculties
</a></span></dt></dl></dd><dt><span class=
"sect1"><a href=
"#if-data-is-the-new-oil-then-surveillance-capitalisms-engine-has-a-leak">If data is the new oil, then surveillance capitalism
’s engine
10 has a leak
</a></span></dt><dt><span class=
"sect1"><a href=
"#what-is-facebook">What is Facebook?
</a></span></dt><dt><span class=
"sect1"><a href=
"#monopoly-and-the-right-to-the-future-tense">Monopoly and the right to the future tense
</a></span></dt><dt><span class=
"sect1"><a href=
"#search-order-and-the-right-to-the-future-tense">Search order and the right to the future tense
</a></span></dt><dt><span class=
"sect1"><a href=
"#monopolists-can-afford-sleeping-pills-for-watchdogs">Monopolists can afford sleeping pills for watchdogs
</a></span></dt><dt><span class=
"sect1"><a href=
"#privacy-and-monopoly">Privacy and monopoly
</a></span></dt><dt><span class=
"sect1"><a href=
"#ronald-reagan-pioneer-of-tech-monopolism">Ronald Reagan, pioneer of tech monopolism
</a></span></dt><dt><span class=
"sect1"><a href=
"#steering-with-the-windshield-wipers">Steering with the windshield wipers
</a></span></dt><dt><span class=
"sect1"><a href=
"#surveillance-still-matters">Surveillance still matters
</a></span></dt><dt><span class=
"sect1"><a href=
"#dignity-and-sanctuary">Dignity and sanctuary
</a></span></dt><dt><span class=
"sect1"><a href=
"#afflicting-the-afflicted">Afflicting the afflicted
</a></span></dt><dt><span class=
"sect1"><a href=
"#any-data-you-collect-and-retain-will-eventually-leak">Any data you collect and retain will eventually leak
</a></span></dt><dt><span class=
"sect1"><a href=
"#critical-tech-exceptionalism-is-still-tech-exceptionalism">Critical tech exceptionalism is still tech
11 exceptionalism
</a></span></dt><dt><span class=
"sect1"><a href=
"#how-monopolies-not-mind-control-drive-surveillance-capitalism-the-snapchat-story">How monopolies, not mind control, drive surveillance
12 capitalism: The Snapchat story
</a></span></dt><dt><span class=
"sect1"><a href=
"#a-monopoly-over-your-friends">A monopoly over your friends
</a></span></dt><dt><span class=
"sect1"><a href=
"#fake-news-is-an-epistemological-crisis">Fake news is an epistemological crisis
</a></span></dt><dt><span class=
"sect1"><a href=
"#tech-is-different">Tech is different
</a></span></dt><dt><span class=
"sect1"><a href=
"#ownership-of-facts">Ownership of facts
</a></span></dt><dt><span class=
"sect1"><a href=
"#persuasion-works-slowly">Persuasion works
… slowly
</a></span></dt><dt><span class=
"sect1"><a href=
"#paying-wont-help">Paying won
’t help
</a></span></dt><dt><span class=
"sect1"><a href=
"#an-ecology-moment-for-trustbusting">An
<span class=
"quote">“<span class=
"quote">ecology
</span>”</span> moment for trustbusting
</a></span></dt><dt><span class=
"sect1"><a href=
"#make-big-tech-small-again">Make Big Tech small again
</a></span></dt><dt><span class=
"sect1"><a href=
"#goto-10">20 GOTO
10</a></span></dt><dt><span class=
"sect1"><a href=
"#up-and-through">Up and through
</a></span></dt></dl></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"the-net-of-a-thousand-lies"></a>The net of a thousand lies
</h2></div></div></div><p>
The most surprising thing about the rebirth of flat Earthers in the
21st century is just how widespread the evidence against them is.
You can understand how, centuries ago, people who’d never gained a
high-enough vantage point from which to see the Earth’s curvature
might come to the commonsense belief that the flat-seeming Earth
was, indeed, flat.
</p><p>
But today, when elementary schools routinely dangle GoPro cameras
from balloons and loft them high enough to photograph the Earth’s
curve — to say nothing of the unexceptional sight of the curved
Earth from an airplane window — it takes a heroic effort to maintain
the belief that the world is flat.
</p><p>
Likewise for white nationalism and eugenics: In an age where you can
become a computational genomics datapoint by swabbing your cheek and
mailing it to a gene-sequencing company along with a modest sum of
money,
<span class=
"quote">“<span class=
"quote">race science
</span>”</span> has never been easier to refute.
</p><p>
We are living through a golden age of both readily available facts
and denial of those facts. Terrible ideas that have lingered on the
fringes for decades or even centuries have gone mainstream seemingly
overnight.
</p><p>
When an obscure idea gains currency, there are only two things that
can explain its ascendance: Either the person expressing that idea
has gotten a lot better at stating their case, or the proposition
has become harder to deny in the face of mounting evidence. In other
words, if we want people to take climate change seriously, we can
get a bunch of Greta Thunbergs to make eloquent, passionate
arguments from podiums, winning our hearts and minds, or we can wait
for flood, fire, broiling sun, and pandemics to make the case for
us. In practice, we’ll probably have to do some of both: The more
we’re boiling and burning and drowning and wasting away, the easier
it will be for the Greta Thunbergs of the world to convince us.
</p><p>
The arguments for ridiculous beliefs in odious conspiracies like
anti-vaccination, climate denial, a flat Earth, and eugenics are no
better than they were a generation ago. Indeed, they’re worse
because they are being pitched to people who have at least a
background awareness of the refuting facts.
</p><p>
Anti-vax has been around since the first vaccines, but the early
anti-vaxxers were pitching people who were less equipped to
understand even the most basic ideas from microbiology, and
moreover, those people had not witnessed the extermination of
mass-murdering diseases like polio, smallpox, and measles. Today’s
anti-vaxxers are no more eloquent than their forebears, and they
have a much harder job.
</p><p>
So can these far-fetched conspiracy theorists really be succeeding
on the basis of superior arguments?
</p><p>
Some people think so. Today, there is a widespread belief that
machine learning and commercial surveillance can turn even the most
fumble-tongued conspiracy theorist into a svengali who can warp your
perceptions and win your belief by locating vulnerable people and
then pitching them with A.I.-refined arguments that bypass their
rational faculties and turn everyday people into flat Earthers,
anti-vaxxers, or even Nazis. When the RAND Corporation
<a class=
"ulink" href=
"https://www.rand.org/content/dam/rand/pubs/research_reports/RR400/RR453/RAND_RR453.pdf" target=
"_top">blames
Facebook for
<span class=
"quote">“<span class=
"quote">radicalization
</span>”</span></a> and when Facebook’s role in
spreading coronavirus misinformation is
<a class=
"ulink" href=
"https://secure.avaaz.org/campaign/en/facebook_threat_health/" target=
"_top">blamed
on its algorithm
</a>, the implicit message is that machine
learning and surveillance are causing the changes in our consensus
about what’s true.
</p><p>
After all, in a world where sprawling and incoherent conspiracy
theories like Pizzagate and its successor, QAnon, have widespread
followings,
<span class=
"emphasis"><em>something
</em></span> must be afoot.
</p><p>
But what if there’s another explanation? What if it’s the material
circumstances, and not the arguments, that are making the difference
for these conspiracy pitchmen? What if the trauma of living through
87 <span class=
"emphasis"><em>real conspiracies
</em></span> all around us — conspiracies
among wealthy people, their lobbyists, and lawmakers to bury
inconvenient facts and evidence of wrongdoing (these conspiracies
are commonly known as
<span class=
"quote">“<span class=
"quote">corruption
</span>”</span>) — is making people vulnerable to
the contagion of conspiracy?
</p><p>
If it’s trauma and not contagion — material conditions and not
ideology — that is making the difference today and enabling a rise
of repulsive misinformation in the face of easily observed facts,
that doesn’t mean our computer networks are blameless. They’re still
doing the heavy work of locating vulnerable people and guiding them
through a series of ever-more-extreme ideas and communities.
</p><p>
Belief in conspiracy is a raging fire that has done real damage and
poses real danger to our planet and species, from epidemics
<a class=
"ulink" href=
"https://www.cdc.gov/measles/cases-outbreaks.html" target=
"_top">kicked
off by vaccine denial
</a> to genocides
<a class=
"ulink" href=
"https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html" target=
"_top">kicked
off by racist conspiracies
</a> to planetary meltdown caused by
denial-inspired climate inaction. Our world is on fire, and so we
have to put the fires out — to figure out how to help people see the
truth of the world through the conspiracies they’ve been confused
by.
</p><p>
But firefighting is reactive. We need fire
<span class=
"emphasis"><em>prevention
</em></span>. We need to strike at the traumatic
material conditions that make people vulnerable to the contagion of
conspiracy. Here, too, tech has a role to play.
</p><p>
There’s no shortage of proposals to address this. From the EU’s
<a class=
"ulink" href=
"https://edri.org/tag/terreg/" target=
"_top">Terrorist Content
Regulation
</a>, which requires platforms to police and remove
<span class=
"quote">“<span class=
"quote">extremist
</span>”</span> content, to the U.S. proposals to
<a class=
"ulink" href=
"https://www.eff.org/deeplinks/2020/03/earn-it-act-violates-constitution" target=
"_top">force
tech companies to spy on their users
</a> and hold them liable
<a class=
"ulink" href=
"https://www.natlawreview.com/article/repeal-cda-section-230" target=
"_top">for
their users’ bad speech
</a>, there’s a lot of energy to force
tech companies to solve the problems they created.
</p><p>
There’s a critical piece missing from the debate, though. All these
solutions assume that tech companies are a fixture, that their
dominance over the internet is a permanent fact. Proposals to
replace Big Tech with a more diffused, pluralistic internet are
nowhere to be found. Worse: The
<span class=
"quote">“<span class=
"quote">solutions
</span>”</span> on the table today
<span class=
"emphasis"><em>require
</em></span> Big Tech to stay big because only the
very largest companies can afford to implement the systems these
laws demand.
</p><p>
Figuring out what we want our tech to look like is crucial if we’re
going to get out of this mess. Today, we’re at a crossroads where
we’re trying to figure out if we want to fix the Big Tech companies
that dominate our internet or if we want to fix the internet itself
by unshackling it from Big Tech’s stranglehold. We can’t do both, so
we have to choose.
</p><p>
I want us to choose wisely. Taming Big Tech is integral to fixing
the internet, and for that, we need digital rights activism.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"digital-rights-activism-a-quarter-century-on"></a>Digital rights activism, a quarter-century on
</h2></div></div></div><p>
Digital rights activism is more than 30 years old now. The
Electronic Frontier Foundation turned 30 this year; the Free
Software Foundation launched in 1985. For most of the history of the
movement, the most prominent criticism leveled against it was that
it was irrelevant: The real activist causes were real-world causes
(think of the skepticism when
<a class=
"ulink" href=
"https://www.loc.gov/law/foreign-news/article/finland-legal-right-to-broadband-for-all-citizens/#:~:text=Global%20Legal%20Monitor,-Home%20%7C%20Search%20%7C%20Browse&text=(July%206%2C%202010)%20On,connection%20100%20MBPS%20by%202015." target=
"_top">Finland
declared broadband a human right in 2010</a>), and real-world
activism was shoe-leather activism (think of Malcolm Gladwell’s
<a class=
"ulink" href=
"https://www.newyorker.com/magazine/2010/10/04/small-change-malcolm-gladwell" target=
"_top">contempt
for
<span class=
"quote">“<span class=
"quote">clicktivism
</span>”</span></a>). But as tech has grown more central to
our daily lives, these accusations of irrelevance have given way
first to accusations of insincerity (
<span class=
"quote">“<span class=
"quote">You only care about tech
<a class=
"ulink" href=
"https://www.ipwatchdog.com/2018/06/04/report-engine-eff-shills-google-patent-reform/id=98007/" target=
"_top">shilling
for tech companies
</a></span>”</span>) to accusations of negligence (
<span class=
"quote">“<span class=
"quote">Why
didn’t you foresee that tech could be such a destructive force?
</span>”</span>).
But digital rights activism is right where it’s always been: looking
out for the humans in a world where tech is inexorably taking over.
</p><p>
The latest version of this critique comes in the form of
<span class=
"quote">“<span class=
"quote">surveillance capitalism,
</span>”</span> a term coined by business professor
Shoshana Zuboff in her long and influential 2019 book,
<span class=
"emphasis"><em>The
Age of Surveillance Capitalism: The Fight for a Human Future at the
New Frontier of Power
</em></span>. Zuboff argues that
<span class=
"quote">“<span class=
"quote">surveillance
capitalism
</span>”</span> is a unique creature of the tech industry and that it is
unlike any other abusive commercial practice in history, one that is
<span class=
"quote">“<span class=
"quote">constituted by unexpected and often illegible mechanisms of
extraction, commodification, and control that effectively exile
persons from their own behavior while producing new markets of
behavioral prediction and modification. Surveillance capitalism
challenges democratic norms and departs in key ways from the
centuries-long evolution of market capitalism.
</span>”</span> It is a new and
deadly form of capitalism, a
<span class=
"quote">“<span class=
"quote">rogue capitalism,
</span>”</span> and our lack of
understanding of its unique capabilities and dangers represents an
existential, species-wide threat. She’s right that capitalism today
threatens our species, and she’s right that tech poses unique
challenges to our species and civilization, but she’s really wrong
about how tech is different and why it threatens our species.
</p><p>
What’s more, I think that her incorrect diagnosis will lead us down
a path that ends up making Big Tech stronger, not weaker. We need to
take down Big Tech, and to do that, we need to start by correctly
identifying the problem.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"tech-exceptionalism-then-and-now"></a>Tech exceptionalism, then and now
</h2></div></div></div><p>
Early critics of the digital rights movement — perhaps best
represented by campaigning organizations like the Electronic
Frontier Foundation, the Free Software Foundation, Public Knowledge,
and others that focused on preserving and enhancing basic human
rights in the digital realm — damned activists for practicing
<span class=
"quote">“<span class=
"quote">tech
exceptionalism.
</span>”</span> Around the turn of the millennium, serious people
ridiculed any claim that tech policy mattered in the
<span class=
"quote">“<span class=
"quote">real world.
</span>”</span>
Claims that tech rules had implications for speech, association,
privacy, search and seizure, and fundamental rights and equities
were treated as ridiculous, an elevation of the concerns of sad
nerds arguing about
<span class=
"emphasis"><em>Star Trek
</em></span> on bulletin board
systems above the struggles of the Freedom Riders, Nelson Mandela,
or the Warsaw ghetto uprising.
</p><p>
In the decades since, accusations of
<span class=
"quote">“<span class=
"quote">tech exceptionalism
</span>”</span> have only
sharpened as tech’s role in everyday life has expanded: Now that
tech has infiltrated every corner of our life and our online lives
have been monopolized by a handful of giants, defenders of digital
freedoms are accused of carrying water for Big Tech, providing cover
for its self-interested negligence (or worse, nefarious plots).
</p><p>
From my perspective, the digital rights movement has remained
stationary while the rest of the world has moved. From the earliest
days, the movement’s concern was users and the toolsmiths who
provided the code they needed to realize their fundamental rights.
Digital rights activists only cared about companies to the extent
that companies were acting to uphold users’ rights (or, just as
often, when companies were acting so foolishly that they threatened
to bring down new rules that would also make it harder for good
actors to help users).
</p><p>
The
<span class=
"quote">“<span class=
"quote">surveillance capitalism
</span>”</span> critique recasts the digital rights
movement in a new light again: not as alarmists who overestimate the
importance of their shiny toys nor as shills for big tech but as
serene deck-chair rearrangers whose long-standing activism is a
liability because it makes them incapable of perceiving novel
threats as they continue to fight the last century’s tech battles.
</p><p>
But tech exceptionalism is a sin no matter who practices it.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"dont-believe-the-hype"></a>Don
’t believe the hype
</h2></div></div></div><p>
You’ve probably heard that
<span class=
"quote">“<span class=
"quote">if you
’re not paying for the product,
231 you
’re the product.
</span>”</span> As we
’ll see below, that
’s true, if incomplete.
But what is
<span class=
"emphasis"><em>absolutely
</em></span> true is that ad-driven
Big Tech’s customers are advertisers, and what companies like Google
and Facebook sell is their ability to convince
<span class=
"emphasis"><em>you
</em></span> to buy stuff. Big Tech’s product is
persuasion. The services — social media, search engines, maps,
messaging, and more — are delivery systems for persuasion.
</p><p>
The fear of surveillance capitalism starts from the (correct)
presumption that everything Big Tech says about itself is probably a
lie. But the surveillance capitalism critique makes an exception for
the claims Big Tech makes in its sales literature — the breathless
hype in the pitches to potential advertisers online and in ad-tech
seminars about the efficacy of its products: It assumes that Big
Tech is as good at influencing us as they claim they are when
they’re selling influencing products to credulous customers. That’s
a mistake because sales literature is not a reliable indicator of a
product’s efficacy.
</p><p>
Surveillance capitalism assumes that because advertisers buy a lot
of what Big Tech is selling, Big Tech must be selling something
real. But Big Tech’s massive sales could just as easily be the
result of a popular delusion or something even more pernicious:
monopolistic control over our communications and commerce.
</p><p>
Being watched changes your behavior, and not for the better. It
creates risks for our social progress. Zuboff’s book features
beautifully wrought explanations of these phenomena. But Zuboff also
claims that surveillance literally robs us of our free will — that
when our personal data is mixed with machine learning, it creates a
system of persuasion so devastating that we are helpless before it.
That is, Facebook uses an algorithm to analyze the data it
nonconsensually extracts from your daily life and uses it to
customize your feed in ways that get you to buy stuff. It is a
mind-control ray out of a 1950s comic book, wielded by mad
scientists whose supercomputers guarantee them perpetual and total
victory.
</p><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"what-is-persuasion"></a>What is persuasion?
</h2></div></div></div><p>
To understand why you shouldn’t worry about mind-control rays — but
why you
<span class=
"emphasis"><em>should
</em></span> worry about surveillance
<span class=
"emphasis"><em>and
</em></span> Big Tech — we must start by unpacking what
we mean by
<span class=
"quote">“<span class=
"quote">persuasion.
</span>”</span>
</p><p>
Google, Facebook, and other surveillance capitalists promise their
customers (the advertisers) that if they use machine-learning tools
trained on unimaginably large data sets of nonconsensually harvested
personal information, they will be able to uncover ways to bypass
the rational faculties of the public and direct their behavior,
creating a stream of purchases, votes, and other desired outcomes.
</p><div class=
"blockquote"><blockquote class=
"blockquote"><p>
The impact of dominance far exceeds the impact of manipulation and
should be central to our analysis and any remedies we seek.
</p></blockquote></div><p>
But there’s little evidence that this is happening. Instead, the
predictions that surveillance capitalism delivers to its customers
are much less impressive. Rather than finding ways to bypass our
rational faculties, surveillance capitalists like Mark Zuckerberg
mostly do one or more of three things:
</p><div class=
"sect2"><div class=
"titlepage"><div><div><h3 class=
"title"><a name=
"segmenting"></a>1. Segmenting
</h3></div></div></div><p>
If you’re selling diapers, you have better luck if you pitch them
to people in maternity wards. Not everyone who enters or leaves a
maternity ward just had a baby, and not everyone who just had a
baby is in the market for diapers. But having a baby is a really
reliable correlate of being in the market for diapers, and being
in a maternity ward is highly correlated with having a baby. Hence
diaper ads around maternity wards (and even pitchmen for baby
products, who haunt maternity wards with baskets full of
freebies).
</p><p>
Surveillance capitalism is segmenting times a billion. Diaper
vendors can go way beyond people in maternity wards (though they
can do that, too, with things like location-based mobile ads).
They can target you based on whether you’re reading articles about
child-rearing, diapers, or a host of other subjects, and data
mining can suggest unobvious keywords to advertise against. They
can target you based on the articles you’ve recently read. They
can target you based on what you’ve recently purchased. They can
target you based on whether you receive emails or private messages
about these subjects — or even if you speak aloud about them
(though Facebook and the like convincingly claim that’s not
happening — yet).
</p><p>
This is seriously creepy.
</p><p>
But it’s not mind control.
</p><p>
It doesn’t deprive you of your free will. It doesn’t trick you.
</p><p>
Think of how surveillance capitalism works in politics.
Surveillance capitalist companies sell political operatives the
power to locate people who might be receptive to their pitch.
Candidates campaigning on finance industry corruption seek people
struggling with debt; candidates campaigning on xenophobia seek
out racists. Political operatives have always targeted their
message whether their intentions were honorable or not: Union
organizers set up pitches at factory gates, and white supremacists
hand out fliers at John Birch Society meetings.
</p><p>
But this is an inexact and thus wasteful practice. The union
organizer can’t know which worker to approach on the way out of
the factory gates and may waste their time on a covert John Birch
Society member; the white supremacist doesn’t know which of the
Birchers are so delusional that making it to a meeting is as much
as they can manage and which ones might be convinced to cross the
country to carry a tiki torch through the streets of
Charlottesville, Virginia.
</p><p>
Because targeting improves the yields on political pitches, it can
accelerate the pace of political upheaval by making it possible
for everyone who has secretly wished for the toppling of an
autocrat — or just an 11-term incumbent politician — to find
everyone else who feels the same way at very low cost. This has
been critical to the rapid crystallization of recent political
movements including Black Lives Matter and Occupy Wall Street as
well as less savory players like the far-right white nationalist
movements that marched in Charlottesville.
</p><p>
It’s important to differentiate this kind of political organizing
from influence campaigns; finding people who secretly agree with
you isn’t the same as convincing people to agree with you. The
rise of phenomena like nonbinary or otherwise nonconforming gender
identities is often characterized by reactionaries as the result
of online brainwashing campaigns that convince impressionable
people that they have been secretly queer all along.
</p><p>
But the personal accounts of those who have come out tell a
different story where people who long harbored a secret about
their gender were emboldened by others coming forward and where
people who knew that they were different but lacked a vocabulary
for discussing that difference learned the right words from these
low-cost means of finding people and learning about their ideas.
</p></div><div class=
"sect2"><div class=
"titlepage"><div><div><h3 class=
"title"><a name=
"deception"></a>2. Deception
</h3></div></div></div><p>
Lies and fraud are pernicious, and surveillance capitalism
supercharges them through targeting. If you want to sell a
fraudulent payday loan or subprime mortgage, surveillance
capitalism can help you find people who are both desperate and
unsophisticated and thus receptive to your pitch. This accounts
for the rise of many phenomena, like multilevel marketing schemes,
in which deceptive claims about potential earnings and the
efficacy of sales techniques are targeted at desperate people by
advertising against search queries that indicate, for example,
someone struggling with ill-advised loans.
</p><p>
Surveillance capitalism also abets fraud by making it easy to
locate other people who have been similarly deceived, forming a
community of people who reinforce one another’s false beliefs.
Think of
<a class=
"ulink" href=
"https://www.vulture.com/2020/01/the-dream-podcast-review.html" target=
"_top">the
forums
</a> where people who are being victimized by multilevel
marketing frauds gather to trade tips on how to improve their luck
in peddling the product.
</p><p>
Sometimes, online deception involves replacing someone’s correct
beliefs with incorrect ones, as it does in the anti-vaccination
movement, whose victims are often people who start out believing
in vaccines but are convinced by seemingly plausible evidence that
leads them into the false belief that vaccines are harmful.
</p><p>
But it’s much more common for fraud to succeed when it doesn’t
have to displace a true belief. When my daughter contracted head
lice at daycare, one of the daycare workers told me I could get
rid of them by treating her hair and scalp with olive oil. I
didn’t know anything about head lice, and I assumed that the
daycare worker did, so I tried it (it didn’t work, and it doesn’t
work). It’s easy to end up with false beliefs when you simply
don’t know any better and when those beliefs are conveyed by
someone who seems to know what they’re doing.
</p><p>
This is pernicious and difficult — and it’s also the kind of thing
the internet can help guard against by making true information
available, especially in a form that exposes the underlying
deliberations among parties with sharply divergent views, such as
Wikipedia. But it’s not brainwashing; it’s fraud. In the
<a class=
"ulink" href=
"https://datasociety.net/library/data-voids/" target=
"_top">majority
of cases
</a>, the victims of these fraud campaigns have an
informational void filled in the customary way, by consulting a
seemingly reliable source. If I look up the length of the Brooklyn
Bridge and learn that it is 5,800 feet long, but in reality, it is
5,989 feet long, the underlying deception is a problem, but it’s a
problem with a simple remedy. It’s a very different problem from
the anti-vax issue in which someone’s true belief is displaced by
a false one by means of sophisticated persuasion.
</p></div><div class=
"sect2"><div class=
"titlepage"><div><div><h3 class=
"title"><a name=
"domination"></a>3. Domination
</h3></div></div></div><p>
Surveillance capitalism is the result of monopoly. Monopoly is the
cause, and surveillance capitalism and its negative outcomes are
the effects of monopoly. I’ll get into this in depth later, but
for now, suffice it to say that the tech industry has grown up
with a radical theory of antitrust that has allowed companies to
grow by merging with their rivals, buying up their nascent
competitors, and expanding to control whole market verticals.
</p><p>
One example of how monopolism aids in persuasion is through
dominance: Google makes editorial decisions about its algorithms
that determine the sort order of the responses to our queries. If
a cabal of fraudsters has set out to trick the world into
thinking that the Brooklyn Bridge is 5,800 feet long, and if
Google gives a high search rank to this group in response to
queries like
<span class=
"quote">“<span class=
"quote">How long is the Brooklyn Bridge?
</span>”</span> then the first
429 eight or
10 screens
’ worth of Google results could be wrong. And
430 since most people don
’t go beyond the first couple of results
—
431 let alone the first
<span class=
"emphasis"><em>page
</em></span> of results
—
432 Google
’s choice means that many people will be deceived.
434 Google
’s dominance over search
— more than
86% of web searches are
435 performed through Google
— means that the way it orders its search
436 results has an outsized effect on public beliefs. Ironically,
437 Google claims this is why it can
’t afford to have any transparency
438 in its algorithm design: Google
’s search dominance makes the
439 results of its sorting too important to risk telling the world how
440 it arrives at those results lest some bad actor discover a flaw in
441 the ranking system and exploit it to push its point of view to the
442 top of the search results. There
’s an obvious remedy to a company
443 that is too big to audit: break it up into smaller pieces.
445 Zuboff calls surveillance capitalism a
<span class=
"quote">“<span class=
"quote">rogue capitalism
</span>”</span> whose
446 data-hoarding and machine-learning techniques rob us of our free
447 will. But influence campaigns that seek to displace existing,
448 correct beliefs with false ones have an effect that is small and
449 temporary while monopolistic dominance over informational systems
450 has massive, enduring effects. Controlling the results to the
451 world
’s search queries means controlling access both to arguments
452 and their rebuttals and, thus, control over much of the world
’s
453 beliefs. If our concern is how corporations are foreclosing on our
454 ability to make up our own minds and determine our own futures,
455 the impact of dominance far exceeds the impact of manipulation and
456 should be central to our analysis and any remedies we seek.
457 </p></div><div class=
"sect2"><div class=
"titlepage"><div><div><h3 class=
"title"><a name=
"bypassing-our-rational-faculties"></a>4. Bypassing our rational faculties
</h3></div></div></div><p>
<span class="emphasis"><em>This</em></span> is the good stuff: using machine
learning, <span class="quote">“<span class="quote">dark patterns,</span>”</span> engagement hacking, and other
techniques to get us to do things that run counter to our better
judgment. This is mind control.
</p><p>
Some of these techniques have proven devastatingly effective (if
only in the short term). The use of countdown timers on a purchase
completion page can create a sense of urgency that causes you to
ignore the nagging internal voice suggesting that you should shop
around or sleep on your decision. The use of people from your
social graph in ads can provide <span class="quote">“<span class="quote">social proof</span>”</span> that a purchase is
worth making. Even the auction system pioneered by eBay is
calculated to play on our cognitive blind spots, letting us feel
like we <span class="quote">“<span class="quote">own</span>”</span> something because we bid on it, thus encouraging us
to bid again when we are outbid to ensure that <span class="quote">“<span class="quote">our</span>”</span> things stay
<span class="quote">“<span class="quote">ours.</span>”</span>
</p><p>
Games are extraordinarily good at this. <span class="quote">“<span class="quote">Free to play</span>”</span> games
manipulate us through many techniques, such as presenting players
with a series of smoothly escalating challenges that create a
sense of mastery and accomplishment but which sharply transition
into a set of challenges that are impossible to overcome without
paid upgrades. Add some social proof to the mix — a stream of
notifications about how well your friends are faring — and before
you know it, you’re buying virtual power-ups to get to the next
level.
</p><p>
Companies have risen and fallen on these techniques, and the
<span class="quote">“<span class="quote">fallen</span>”</span> part is worth paying attention to. In general, living
things adapt to stimulus: Something that is very compelling or
noteworthy when you first encounter it fades with repetition until
you stop noticing it altogether. Consider the refrigerator hum
that irritates you when it starts up but disappears into the
background so thoroughly that you only notice it when it stops.
</p><p>
That’s why behavioral conditioning uses <span class="quote">“<span class="quote">intermittent
reinforcement schedules.</span>”</span> Instead of giving you a steady drip of
encouragement or setbacks, games and gamified services scatter
rewards on a randomized schedule — often enough to keep you
interested and random enough that you can never quite find the
pattern that would make it boring.
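The randomized schedule described above can be sketched in a few lines of Python. This is a toy illustration of an intermittent reinforcement schedule, not any real product’s code; the reward probability `p=0.3` is an arbitrary assumption:

```python
import random

def intermittent_reward(p=0.3, rng=random.random):
    """Return True when a reward fires on this action.

    A fixed per-action probability means the player can never learn
    a pattern: rewards arrive often enough to keep them hooked but
    on no predictable schedule.
    """
    return rng() < p

# Simulate 20 actions; rewards land at irregular, unlearnable intervals.
random.seed(1)
rewarded_actions = [i for i in range(20) if intermittent_reward()]
```

Because each action is an independent coin flip, the gap between rewards varies unpredictably — exactly the property that defeats our habituation.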
</p><p>
Intermittent reinforcement is a powerful behavioral tool, but it
also represents a collective action problem for surveillance
capitalism. The <span class="quote">“<span class="quote">engagement techniques</span>”</span> invented by the
behaviorists of surveillance capitalist companies are quickly
copied across the whole sector so that what starts as a
mysteriously compelling fillip in the design of a service — like
<span class="quote">“<span class="quote">pull to refresh</span>”</span> or alerts when someone likes your posts or side
quests that your characters get invited to while in the midst of
main quests — quickly becomes dully ubiquitous. The
impossible-to-nail-down nonpattern of randomized drips from your
phone becomes a grey-noise wall of sound as every single app and
site starts to make use of whatever seems to be working at the
moment.
</p><p>
From the surveillance capitalist’s point of view, our adaptive
capacity is like a harmful bacterium that deprives it of its food
source — our attention — and novel techniques for snagging that
attention are like new antibiotics that can be used to breach our
defenses and destroy our self-determination. And there
<span class="emphasis"><em>are</em></span> techniques like that. Who can forget the
Great Zynga Epidemic, when all of our friends were caught in
<span class="emphasis"><em>FarmVille</em></span>’s endless, mindless dopamine loops?
But every new attention-commanding technique is jumped on by the
whole industry and used so indiscriminately that antibiotic
resistance sets in. Given enough repetition, almost all of us
develop immunity to even the most powerful techniques — by 2013,
two years after Zynga’s peak, its user base had halved.
</p><p>
Not everyone, of course. Some people never adapt to stimulus, just
as some people never stop hearing the hum of the refrigerator.
This is why most people who are exposed to slot machines play them
for a while and then move on while a small and tragic minority
liquidate their kids’ college funds, buy adult diapers, and
position themselves in front of a machine until they collapse.
</p><p>
But surveillance capitalism’s margins on behavioral modification
suck. Tripling the rate at which someone buys a widget sounds
great, <a class="ulink" href="https://www.forbes.com/sites/priceonomics/2018/03/09/the-advertising-conversion-rates-for-every-major-tech-platform/#2f6a67485957" target="_top">unless
the base rate is way less than 1%</a> with an improved rate
of … still less than 1%. Even penny slot machines pull down pennies
for every spin while surveillance capitalism rakes in
infinitesimal penny fractions.
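The arithmetic behind those margins is worth making concrete. The rates below are invented for illustration, chosen only to match the “way less than 1%” scale the linked Forbes piece describes:

```python
# Hypothetical numbers: tripling a tiny conversion rate still yields
# a tiny conversion rate, and per-impression revenue stays microscopic.
base_rate = 0.002                  # 0.2% of viewers buy the widget
tripled_rate = base_rate * 3       # a dramatic-sounding 3x lift...
                                   # ...is still only ~0.6%, under 1%

margin_per_sale = 0.50             # assume the seller clears 50 cents a widget
revenue_per_view = tripled_rate * margin_per_sale  # fractions of a penny per view
```

A “300% improvement” headline and a sub-penny payout per impression describe the same campaign.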
</p><p>
Slot machines’ high returns mean that they can be profitable just
by draining the fortunes of the small rump of people who are
pathologically vulnerable to them and unable to adapt to their
tricks. But surveillance capitalism can’t survive on the
fractional pennies it brings down from that vulnerable sliver —
that’s why, after the Great Zynga Epidemic had finally burned
itself out, the small number of still-addicted players left behind
couldn’t sustain it as a global phenomenon. And new powerful
attention weapons aren’t easy to find, as is evidenced by the long
years since the last time Zynga had a hit. Despite the hundreds of
millions of dollars that Zynga has to spend on developing new
tools to blast through our adaptation, it has never managed to
repeat the lucky accident that let it snag so much of our
attention for a brief moment in 2009. Powerhouses like Supercell
have fared a little better, but they are rare and throw away many
failures for every success.
</p><p>
The vulnerability of small segments of the population to dramatic,
efficient corporate manipulation is a real concern that’s worthy
of our attention and energy. But it’s not an existential threat to
society.
</p></div></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"if-data-is-the-new-oil-then-surveillance-capitalisms-engine-has-a-leak"></a>If data is the new oil, then surveillance capitalism
’s engine
has a leak
</h2></div></div></div><p>
This adaptation problem offers an explanation for one of
surveillance capitalism’s most alarming traits: its relentless
hunger for data and its endless expansion of data-gathering
capabilities through the spread of sensors, online surveillance, and
acquisition of data streams from third parties.
</p><p>
Zuboff observes this phenomenon and concludes that data must be very
valuable if surveillance capitalism is so hungry for it. (In her
words: <span class="quote">“<span class="quote">Just as industrial capitalism was driven to the continuous
intensification of the means of production, so surveillance
capitalists and their market players are now locked into the
continuous intensification of the means of behavioral modification
and the gathering might of instrumentarian power.</span>”</span>) But what if the
voracious appetite is because data has such a short half-life —
because people become inured so quickly to new, data-driven
persuasion techniques — that the companies are locked in an arms
race with our limbic system? What if it’s all a Red Queen’s race
where they have to run ever faster — collect ever-more data — just
to stay in the same spot?
</p><p>
Of course, all of Big Tech’s persuasion techniques work in concert
with one another, and collecting data is useful beyond mere
persuasion.
</p><p>
If someone wants to recruit you to buy a refrigerator or join a
pogrom, they might use profiling and targeting to send messages to
people they judge to be good sales prospects. The messages
themselves may be deceptive, making claims about things you’re not
very knowledgeable about (food safety and energy efficiency or
eugenics and historical claims about racial superiority). They might
use search engine optimization and/or armies of fake reviewers and
commenters and/or paid placement to dominate the discourse so that
any search for further information takes you back to their messages.
And finally, they may refine the different pitches using machine
learning and other techniques to figure out what kind of pitch works
best on someone like you.
</p><p>
Each phase of this process benefits from surveillance: The more data
they have, the more precisely they can profile you and target you
with specific messages. Think of how you’d sell a fridge if you knew
that the warranty on your prospect’s fridge just expired and that
they were expecting a tax rebate in April.
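The profiling-and-targeting loop described here can be sketched as a toy scorer: each surveillance signal bumps a prospect’s score, and only high scorers get the pitch. Every signal name, weight, and threshold below is invented for illustration; real ad platforms use far more elaborate models:

```python
# Toy rule-based audience profiler (all signals and weights invented).
SIGNALS = {
    "warranty_expired": 3,   # their current fridge is out of warranty
    "tax_rebate_due": 2,     # cash arriving in April
    "searched_fridges": 4,   # they looked up fridge-buying advice
}

def score(prospect: dict) -> int:
    """Sum the weights of every signal present in the prospect's dossier."""
    return sum(w for sig, w in SIGNALS.items() if prospect.get(sig))

def good_prospects(prospects, threshold=5):
    """Keep only the people worth paying to reach."""
    return [p["name"] for p in prospects if score(p) >= threshold]

people = [
    {"name": "alice", "warranty_expired": True, "searched_fridges": True},
    {"name": "bob", "tax_rebate_due": True},
]
# alice scores 7, bob scores 2 — only alice sees the fridge ad.
```

The point of the sketch is the data dependency: every extra dossier field is another potential signal, which is why more surveillance makes each targeting pass sharper.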
</p><p>
Also, the more data they have, the better they can craft deceptive
messages — if I know that you’re into genealogy, I might not try to
feed you pseudoscience about genetic differences between <span class="quote">“<span class="quote">races,</span>”</span>
sticking instead to conspiratorial secret histories of <span class="quote">“<span class="quote">demographic
replacement</span>”</span> and the like.
</p><p>
Facebook also helps you locate people who have the same odious or
antisocial views as you. It makes it possible to find other people
who want to carry tiki torches through the streets of
Charlottesville in Confederate cosplay. It can help you find other
people who want to join your militia and go to the border to look
for undocumented migrants to terrorize. It can help you find people
who share your belief that vaccines are poison and that the Earth is
flat.
</p><p>
There is one way in which targeted advertising uniquely benefits
those advocating for socially unacceptable causes: It is invisible.
Racism is widely geographically dispersed, and there are few places
where racists — and only racists — gather. This is similar to the
problem of selling refrigerators in that potential refrigerator
purchasers are geographically dispersed and there are few places
where you can buy an ad that will be primarily seen by refrigerator
customers. But buying a refrigerator is socially acceptable while
being a Nazi is not, so you can buy a billboard or advertise in the
newspaper sports section for your refrigerator business, and the
only potential downside is that your ad will be seen by a lot of
people who don’t want refrigerators, resulting in a lot of wasted
money.
</p><p>
But even if you wanted to advertise your Nazi movement on a
billboard or prime-time TV or the sports section, you would struggle
to find anyone willing to sell you the space for your ad partly
because they disagree with your views and partly because they fear
censure (boycott, reputational damage, etc.) from other people who
disagree with your views.
</p><p>
Targeted ads solve this problem: On the internet, every ad unit can
be different for every person, meaning that you can buy ads that are
only shown to people who appear to be Nazis and not to people who
hate Nazis. When there’s spillover — when someone who hates racism
is shown a racist recruiting ad — there is some fallout; the
platform or publication might get an angry public or private
denunciation. But the nature of the risk assumed by an online ad
buyer is different than the risks to a traditional publisher or
billboard owner who might want to run a Nazi ad.
</p><p>
Online ads are placed by algorithms that broker between a diverse
ecosystem of self-serve ad platforms that anyone can buy an ad
through, so the Nazi ad that slips onto your favorite online
publication isn’t seen as their moral failing but rather as a
failure in some distant, upstream ad supplier. When a publication
gets a complaint about an offensive ad that’s appearing in one of
its units, it can take some steps to block that ad, but the Nazi
might buy a slightly different ad from a different broker serving
the same unit. And in any event, internet users increasingly
understand that when they see an ad, it’s likely that the advertiser
did not choose that publication and that the publication has no idea
who its advertisers are.
</p><p>
These layers of indirection between advertisers and publishers serve
as moral buffers: Today’s moral consensus is largely that publishers
shouldn’t be held responsible for the ads that appear on their pages
because they’re not actively choosing to put those ads there.
Because of this, Nazis are able to overcome significant barriers to
organizing their movement.
</p><p>
Data has a complex relationship with domination. Being able to spy
on your customers can alert you to their preferences for your rivals
and allow you to head off your rivals at the pass.
</p><p>
More importantly, if you can dominate the information space while
also gathering data, then you make other deceptive tactics stronger
because it’s harder to break out of the web of deceit you’re
spinning. Domination — that is, ultimately becoming a monopoly — and
not the data itself is the supercharger that makes every tactic
worth pursuing because monopolistic domination deprives your target
of an escape route.
</p><p>
If you’re a Nazi who wants to ensure that your prospects primarily
see deceptive, confirming information when they search for more, you
can improve your odds by seeding the search terms they use through
your initial communications. You don’t need to own the top 10
results for <span class="quote">“<span class="quote">voter suppression</span>”</span> if you can convince your marks to
confine their search terms to <span class="quote">“<span class="quote">voter fraud,</span>”</span> which throws up a very
different set of search results.
</p><p>
Surveillance capitalists are like stage mentalists who claim that
their extraordinary insights into human behavior let them guess the
word that you wrote down and folded up in your pocket but who really
use shills, hidden cameras, sleight of hand, and brute-force
memorization to amaze you.
</p><p>
Or perhaps they’re more like pick-up artists, the misogynistic cult
that promises to help awkward men have sex with women by teaching
them <span class="quote">“<span class="quote">neurolinguistic programming</span>”</span> phrases, body language
techniques, and psychological manipulation tactics like <span class="quote">“<span class="quote">negging</span>”</span> —
offering unsolicited negative feedback to women to lower their
self-esteem and prick their interest.
</p><p>
Some pick-up artists eventually manage to convince women to go home
with them, but it’s not because these men have figured out how to
bypass women’s critical faculties. Rather, pick-up artists’
<span class="quote">“<span class="quote">success</span>”</span> stories are a mix of women who were incapable of giving
consent, women who were coerced, women who were intoxicated,
self-destructive women, and a few women who were sober and in
command of their faculties but who didn’t realize straightaway that
they were with terrible men but rectified the error as soon as they
could.
</p><p>
Pick-up artists <span class="emphasis"><em>believe</em></span> they have figured out a
secret back door that bypasses women’s critical faculties, but they
haven’t. Many of the tactics they deploy, like negging, became the
butt of jokes (just like people joke about bad ad targeting), and
there’s a good chance that anyone they try these tactics on will
immediately recognize them and dismiss the men who use them as
losers.
</p><p>
Pick-up artists are proof that people can believe they have
developed a system of mind control <span class="emphasis"><em>even when it doesn’t
work</em></span>. Pick-up artists simply exploit the fact that
one-in-a-million chances can come through for you if you make a
million attempts, and then they assume that the other 999,999 times,
they simply performed the technique incorrectly and commit
themselves to doing better next time. There’s only one group of
people who find pick-up artist lore reliably convincing: other
would-be pick-up artists whose anxiety and insecurity make them
vulnerable to scammers and delusional men who convince them that if
they pay for tutelage and follow instructions, then they will
someday succeed. Pick-up artists assume they fail to entice women
because they are bad at being pick-up artists, not because pick-up
artistry is bullshit. Pick-up artists are bad at selling themselves
to women, but they’re much better at selling themselves to men who
pay to learn the secrets of pick-up artistry.
</p><p>
Department store pioneer John Wanamaker is said to have lamented,
<span class="quote">“<span class="quote">Half the money I spend on advertising is wasted; the trouble is I
don’t know which half.</span>”</span> The fact that Wanamaker thought that only
half of his advertising spending was wasted is a tribute to the
persuasiveness of advertising executives, who are
<span class="emphasis"><em>much</em></span> better at convincing potential clients to
buy their services than they are at convincing the general public to
buy their clients’ wares.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"what-is-facebook"></a>What is Facebook?
</h2></div></div></div><p>
Facebook is heralded as the origin of all of our modern plagues, and
it’s not hard to see why. Some tech companies want to lock their
users in but make their money by monopolizing access to the market
for apps for their devices and gouging them on prices rather than by
spying on them (like Apple). Some companies don’t care about locking
in users because they’ve figured out how to spy on them no matter
where they are and what they’re doing and can turn that surveillance
into money (Google). Facebook alone among the Western tech giants
has built a business based on locking in its users
<span class="emphasis"><em>and</em></span> spying on them all the time.
</p><p>
Facebook’s surveillance regime is really without parallel in the
Western world. Though Facebook tries to prevent itself from being
visible on the public web, hiding most of what goes on there from
people unless they’re logged into Facebook, the company has
nevertheless booby-trapped the entire web with surveillance tools in
the form of Facebook <span class="quote">“<span class="quote">Like</span>”</span> buttons that web publishers include on
their sites to boost their Facebook profiles. Facebook also makes
various libraries and other useful code snippets available to web
publishers that act as surveillance tendrils on the sites where
they’re used, funneling information about visitors to the site —
newspapers, dating sites, message boards — to Facebook.
</p><div class="blockquote"><blockquote class="blockquote"><p>
Big Tech is able to practice surveillance not just because it is
tech but because it is <span class="emphasis"><em>big</em></span>.
</p></blockquote></div><p>
Facebook offers similar tools to app developers, so the apps —
games, fart machines, business review services, apps for keeping
abreast of your kid’s schooling — you use will send information
about your activities to Facebook even if you don’t have a Facebook
account and even if you don’t download or use Facebook apps. On top
of all that, Facebook buys data from third-party brokers on shopping
habits, physical location, use of <span class="quote">“<span class="quote">loyalty</span>”</span> programs, financial
transactions, etc., and cross-references that with the dossiers it
develops on activity on Facebook and with apps and the public web.
</p><p>
Though it’s easy to integrate the web with Facebook — linking to
news stories and such — Facebook products are generally not
available to be integrated back into the web itself. You can embed a
tweet in a Facebook post, but if you embed a Facebook post in a
tweet, you just get a link back to Facebook and must log in before
you can see it. Facebook has used extreme technological and legal
countermeasures to prevent rivals from allowing their users to embed
Facebook snippets in competing services or to create alternative
interfaces to Facebook that merge your Facebook inbox with those of
other services that you use.
</p><p>
And Facebook is incredibly popular, with 2.3 billion claimed users
(though many believe this figure to be inflated). Facebook has been
used to organize genocidal pogroms, racist riots, anti-vaccination
movements, flat Earth cults, and the political lives of some of the
world’s ugliest, most brutal autocrats. There are some really
alarming things going on in the world, and Facebook is implicated in
many of them, so it’s easy to conclude that these bad things are the
result of Facebook’s mind-control system, which it rents out to
anyone with a few bucks to spend.
</p><p>
To understand what role Facebook plays in the formulation and
mobilization of antisocial movements, we need to understand the dual
nature of Facebook.
</p><p>
Because it has a lot of users and a lot of data about those users,
Facebook is a very efficient tool for locating people with
hard-to-find traits, the kinds of traits that are widely diffused in
the population such that advertisers have historically struggled to
find a cost-effective way to reach them. Think back to
refrigerators: Most of us only replace our major appliances a few
times in our entire lives. If you’re a refrigerator manufacturer or
retailer, you have these brief windows in the life of a consumer
during which they are pondering a purchase, and you have to somehow
reach them. Anyone who’s ever registered a title change after buying
a house can attest that appliance manufacturers are incredibly
desperate to reach anyone who has even the slenderest chance of
being in the market for a new fridge.
</p><p>
Facebook makes finding people shopping for refrigerators a
<span class="emphasis"><em>lot</em></span> easier. It can target ads to people who’ve
registered a new home purchase, to people who’ve searched for
refrigerator buying advice, to people who have complained about
their fridge dying, or any combination thereof. It can even target
people who’ve recently bought <span class="emphasis"><em>other</em></span> kitchen
appliances on the theory that someone who’s just replaced their
stove and dishwasher might be in a fridge-buying kind of mood. The
vast majority of people who are reached by these ads will not be in
the market for a new fridge, but — crucially — the percentage of
people who <span class="emphasis"><em>are</em></span> looking for fridges that these
ads reach is <span class="emphasis"><em>much</em></span> larger than for
any group that might be subjected to traditional, offline targeted
refrigerator marketing.
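That base-rate claim is easy to make concrete with invented numbers (neither audience figure comes from Facebook or any real campaign):

```python
# Invented audience sizes: even when 90% of a targeted audience is
# not in the market, targeting still concentrates in-market buyers
# far more than a broad channel like a billboard does.
def in_market_share(audience_size, in_market_buyers):
    """Fraction of an ad's audience actually shopping for a fridge."""
    return in_market_buyers / audience_size

billboard = in_market_share(1_000_000, 2_000)  # 0.2% of passersby
targeted = in_market_share(50_000, 5_000)      # 10% of a targeted audience
precision_gain = targeted / billboard          # ~50x fewer wasted impressions
```

The targeted campaign still mostly misses, but it misses far less expensively per genuine prospect.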
</p><p>
Facebook also makes it a lot easier to find people who have the same
rare disease as you, which might have been impossible in earlier
eras — the closest fellow sufferer might otherwise be hundreds of
miles away. It makes it easier to find people who went to the same
high school as you even though decades have passed and your former
classmates have all been scattered to the four corners of the Earth.
</p><p>
Facebook also makes it much easier to find people who hold the same
rare political beliefs as you. If you’ve always harbored a secret
affinity for socialism but never dared utter this aloud lest you be
demonized by your neighbors, Facebook can help you discover other
people who feel the same way (and it might just demonstrate to you
that your affinity is more widespread than you ever suspected). It
can make it easier to find people who share your sexual identity.
And again, it can help you to understand that what you thought was a
shameful secret that affected only you was really a widely shared
trait, giving you both comfort and the courage to come out to the
people around you.
</p><p>
All of this presents a dilemma for Facebook: Targeting makes the
company’s ads more effective than traditional ads, but it also lets
advertisers see just how effective their ads are. While advertisers
are pleased to learn that Facebook ads are more effective than ads
on systems with less sophisticated targeting, advertisers can also
see that in nearly every case, the people who see their ads ignore
them. Or, at best, the ads work on a subconscious level, creating
nebulous unmeasurables like <span class="quote">“<span class="quote">brand recognition.</span>”</span> This means that the
price per ad is very low in nearly every case.
</p><p>
To make things worse, many Facebook groups spark precious little
discussion. Your little-league soccer team, the people with the same
rare disease as you, and the people you share a political affinity
with may exchange the odd flurry of messages at critical junctures,
but on a daily basis, there’s not much to say to your old high
school chums or other hockey-card collectors.
</p><p>
With nothing but <span class="quote">“<span class="quote">organic</span>”</span> discussion, Facebook would not generate
enough traffic to sell enough ads to make the money it needs to
continually expand by buying up its competitors while returning
handsome sums to its investors.
</p><p>
So Facebook has to gin up traffic by sidetracking its own forums:
Every time Facebook’s algorithm injects controversial materials —
inflammatory political articles, conspiracy theories, outrage
stories — into a group, it can hijack that group’s nominal purpose
with its desultory discussions and supercharge those discussions by
turning them into bitter, unproductive arguments that drag on and
on. Facebook is optimized for engagement, not happiness, and it
turns out that automated systems are pretty good at figuring out
things that people will get angry about.
</p><p>
Facebook <span class="emphasis"><em>can</em></span> modify our behavior but only in a
couple of trivial ways. First, it can lock in all your friends and
family members so that you check and check and check with Facebook
to find out what they are up to; and second, it can make you angry
and anxious. It can force you to choose between being interrupted
constantly by updates — a process that breaks your concentration and
makes it hard to be introspective — and staying in touch with your
friends. This is a very limited form of mind control, and it can
only really make us miserable, angry, and anxious.
</p><p>
This is why Facebook’s targeting systems — both the ones it shows to
advertisers and the ones that let users find people who share their
interests — are so next-gen and smooth and easy to use as well as
why its message boards have a toolset that seems like it hasn’t
changed since the mid-2000s. If Facebook delivered an equally
flexible, sophisticated message-reading system to its users, those
users could defend themselves against being nonconsensually
eyeball-fucked with Donald Trump headlines.
</p><p>
The more time you spend on Facebook, the more ads it gets to show
you. The solution to Facebook’s ads only working one in a thousand
times is for the company to try to increase how much time you spend
on Facebook by a factor of a thousand. Rather than thinking of
Facebook as a company that has figured out how to show you exactly
the right ad in exactly the right way to get you to do what its
advertisers want, think of it as a company that has figured out how
to make you slog through an endless torrent of arguments even though
they make you miserable, spending so much time on the site that it
eventually shows you at least one ad that you respond to.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"monopoly-and-the-right-to-the-future-tense"></a>Monopoly and the right to the future tense
</h2></div></div></div><p>
925 Zuboff and her cohort are particularly alarmed at the extent to
926 which surveillance allows corporations to influence our decisions,
927 taking away something she poetically calls
<span class=
"quote">“<span class=
"quote">the right to the future
928 tense
</span>”</span> — that is, the right to decide for yourself what you will do
</p><p>
It’s true that advertising can tip the scales one way or another: When you’re thinking of buying a fridge, a timely fridge ad might end the search on the spot. But Zuboff puts enormous and undue weight on the persuasive power of surveillance-based influence techniques. Most of these don’t work very well, and the ones that do won’t work for very long. The makers of these influence tools are confident they will someday refine them into systems of total control, but they are hardly unbiased observers, and the risks from their dreams coming true are very speculative.
</p><p>
By contrast, Zuboff is rather sanguine about 40 years of lax antitrust practice that has allowed a handful of companies to dominate the internet, ushering in an information age with, <a class="ulink" href="https://twitter.com/tveastman/status/1069674780826071040" target="_top">as one person on Twitter noted</a>, five giant websites each filled with screenshots of the other four.
</p><p>
However, if we are to be alarmed that we might lose the right to choose for ourselves what our future will hold, then monopoly’s nonspeculative, concrete, here-and-now harms should be front and center in our debate over tech policy.
</p><p>
Start with <span class="quote">“<span class="quote">digital rights management.</span>”</span> In 1998, Bill Clinton signed the Digital Millennium Copyright Act (DMCA) into law. It’s a complex piece of legislation with many controversial clauses but none more so than Section 1201, the <span class="quote">“<span class="quote">anti-circumvention</span>”</span> rule.
</p><p>
This is a blanket ban on tampering with systems that restrict access to copyrighted works. The ban is so thoroughgoing that it prohibits removing a copyright lock even when no copyright infringement takes place. This is by design: The activities that the DMCA’s Section 1201 sets out to ban are not copyright infringements; rather, they are legal activities that frustrate manufacturers’ commercial plans.
</p><p>
For example, Section 1201’s first major application was on DVD players as a means of enforcing the region coding built into those devices. DVD-CCA, the body that standardized DVDs and DVD players, divided the world into six regions and specified that DVD players must check each disc to determine which regions it was authorized to be played in. DVD players would have their own corresponding region (a DVD player bought in the U.S. would be region 1 while one bought in India would be region 5). If the player and the disc’s region matched, the player would play the disc; otherwise, it would reject it.
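The check described above is mechanically simple. The sketch below is a hypothetical illustration only (the function name and the set-based region mask are assumptions for clarity, not the DVD-CCA’s actual firmware logic): each disc carries the set of regions it may play in, each player carries one region, and playback hinges on a membership test.

```python
# Hypothetical sketch of DVD region coding; not actual player firmware.
# The DVD-CCA divided the world into six regions. Each disc is modeled
# here as a set of authorized regions; each player has a single region.

def check_disc(player_region: int, disc_regions: set[int]) -> str:
    """Play the disc only if the player's region is in the disc's mask."""
    if not 1 <= player_region <= 6:
        raise ValueError("DVD regions run from 1 to 6")
    return "authorized" if player_region in disc_regions else "rejected"

# A disc mastered for the U.S. (region 1) plays on a U.S. player...
assert check_disc(1, {1}) == "authorized"
# ...but a player bought in India (region 5) rejects that same disc,
# even though watching it would infringe no copyright.
assert check_disc(5, {1}) == "rejected"
```

The point of the sketch is how little is being protected: the "access control" is a membership test, and Section 1201 is what makes it illegal to remove.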
</p><p>
However, watching a lawfully produced disc in a country other than the one where you purchased it is not copyright infringement — it’s the opposite. Copyright law imposes this duty on customers for a movie: You must go into a store, find a licensed disc, and pay the asking price. Do that — and <span class="emphasis"><em>nothing else</em></span> — and you and copyright are square with one another.
</p><p>
The fact that a movie studio wants to charge Indians less than Americans or release in Australia later than it releases in the U.K. has no bearing on copyright law. Once you lawfully acquire a DVD, it is no copyright infringement to watch it no matter where you happen to be.
</p><p>
So DVD and DVD player manufacturers would not be able to use accusations of abetting copyright infringement to punish manufacturers who made noncompliant players that would play discs from any region or repair shops that modified players to let you watch out-of-region discs or software programmers who created programs to let you do this.
</p><p>
That’s where Section 1201 of the DMCA comes in: By banning tampering with an <span class="quote">“<span class="quote">access control,</span>”</span> the rule gave manufacturers and rights holders standing to sue competitors who released superior products with lawful features that the market demanded (in this case, region-free players).
</p><p>
This is an odious scam against consumers, but as time went by, Section 1201 grew to encompass a rapidly expanding constellation of devices and services as canny manufacturers have realized certain things:
</p><div class=
"itemizedlist"><ul class=
"itemizedlist compact" style=
"list-style-type: disc; "><li class=
"listitem"><p>
Any device with software in it contains a <span class="quote">“<span class="quote">copyrighted work</span>”</span> — i.e., the software.
</p></li><li class=
"listitem"><p>
A device can be designed so that reconfiguring the software requires bypassing an <span class="quote">“<span class="quote">access control for copyrighted works,</span>”</span> which is a potential felony under Section 1201.
</p></li><li class=
"listitem"><p>
Thus, companies can control their customers’ behavior after they take home their purchases by designing products so that all unpermitted uses require modifications that fall afoul of the DMCA.
</p></li></ul></div><p>
Section 1201 then becomes a means for manufacturers of all descriptions to force their customers to arrange their affairs to benefit the manufacturers’ shareholders instead of themselves.
</p><p>
This manifests in many ways: from a new generation of inkjet printers that use countermeasures to prevent third-party ink that cannot be bypassed without legal risks to similar systems in tractors that prevent third-party technicians from swapping in the manufacturer’s own parts that are not recognized by the tractor’s control system until it is supplied with a manufacturer’s unlock code.
</p><p>
Closer to home, Apple’s iPhones use these measures to prevent both third-party service and third-party software installation. This allows Apple to decide when an iPhone is beyond repair and must be shredded and landfilled as opposed to the iPhone’s purchaser. (Apple is notorious for its environmentally catastrophic policy of destroying old electronics rather than permitting them to be cannibalized for parts.) This is a very useful power to wield, especially in light of CEO Tim Cook’s January 2019 warning to investors that the company’s profits are endangered by customers choosing to hold onto their phones for longer rather than replacing them.
</p><p>
Apple’s use of copyright locks also allows it to establish a monopoly over how its customers acquire software for their mobile devices. The App Store’s commercial terms guarantee Apple a share of all revenues generated by the apps sold there, meaning that Apple gets paid when you buy an app from its store and then continues to get paid every time you buy something using that app. This comes out of the bottom line of software developers, who must either charge more or accept lower profits for their products.
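The squeeze on developers is simple arithmetic. As a sketch (the 30% commission rate below is a hypothetical placeholder for illustration; the text itself names no figure), a store-wide revenue share leaves a developer exactly two options: accept lower profits or raise prices.

```python
# Illustrative only: `commission` is a hypothetical rate, not a figure
# taken from the text. The store's cut applies to the initial purchase
# and to every sale made through the app afterward.

def developer_net(list_price: float, commission: float) -> float:
    """Revenue the developer keeps from one sale after the store's cut."""
    return list_price * (1 - commission)

def price_for_target(target_net: float, commission: float) -> float:
    """List price needed to still keep `target_net` per sale (charge more)."""
    return target_net / (1 - commission)

# With a hypothetical 30% commission, a $9.99 app nets the developer
# about $6.99 (accept lower profits)...
print(round(developer_net(9.99, 0.30), 2))    # 6.99
# ...and keeping $9.99 per sale would mean listing at about $14.27
# (charge more, pushing the cut onto customers).
print(round(price_for_target(9.99, 0.30), 2))  # 14.27
```

Either way the commission comes out of the transaction; the only question is how it is split between the developer’s margin and the customer’s price.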
</p><p>
Crucially, Apple’s use of copyright locks gives it the power to make editorial decisions about which apps you may and may not install on your own device. Apple has used this power to <a class="ulink" href="https://www.telegraph.co.uk/technology/apple/5982243/Apple-bans-dictionary-from-App-Store-over-swear-words.html" target="_top">reject dictionaries</a> for containing obscene words; to <a class="ulink" href="https://www.vice.com/en_us/article/538kan/apple-just-banned-the-app-that-tracks-us-drone-strikes-again" target="_top">limit political speech</a>, especially from apps that make sensitive political commentary such as an app that notifies you every time a U.S. drone kills someone somewhere in the world; and to <a class="ulink" href="https://www.eurogamer.net/articles/2016-05-19-palestinian-indie-game-must-not-be-called-a-game-apple-says" target="_top">object to a game</a> that commented on the Israel-Palestine conflict.
</p><p>
Apple often justifies monopoly power over software installation in the name of security, arguing that its vetting of apps for its store means that it can guard its users against apps that contain surveillance code. But this cuts both ways. In China, the government <a class="ulink" href="https://www.ft.com/content/ad42e536-cf36-11e7-b781-794ce08b24dc" target="_top">ordered Apple to prohibit the sale of privacy tools</a> like VPNs with the exception of VPNs that had deliberately introduced flaws designed to let the Chinese state eavesdrop on users. Because Apple uses technological countermeasures — with legal backstops — to block customers from installing unauthorized apps, Chinese iPhone owners cannot readily (or legally) acquire VPNs that would protect them from Chinese state snooping.
</p><p>
Zuboff calls surveillance capitalism a <span class="quote">“<span class="quote">rogue capitalism.</span>”</span> Theoreticians of capitalism claim that its virtue is that it <a class="ulink" href="https://en.wikipedia.org/wiki/Price_signal" target="_top">aggregates information in the form of consumers’ decisions</a>, producing efficient markets. Surveillance capitalism’s supposed power to rob its victims of their free will through computationally supercharged influence campaigns means that our markets no longer aggregate customers’ decisions because we customers no longer decide — we are given orders by surveillance capitalism’s mind-control rays.
</p><p>
If our concern is that markets cease to function when consumers can no longer make choices, then copyright locks should concern us at <span class="emphasis"><em>least</em></span> as much as influence campaigns. An influence campaign might nudge you to buy a certain brand of phone; but the copyright locks on that phone absolutely determine where you get it serviced, which apps can run on it, and when you have to throw it away rather than fixing it.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"search-order-and-the-right-to-the-future-tense"></a>Search order and the right to the future tense
</h2></div></div></div><p>
Markets are posed as a kind of magic: By discovering otherwise hidden information conveyed by the free choices of consumers, those consumers’ local knowledge is integrated into a self-correcting system that makes efficient allocations — more efficient than any computer could calculate. But monopolies are incompatible with that notion. When you only have one app store, the owner of the store — not the consumer — decides on the range of choices. As Boss Tweed once said, <span class="quote">“<span class="quote">I don’t care who does the electing, so long as I get to do the nominating.</span>”</span> A monopolized market is an election whose candidates are chosen by the monopolist.
</p><p>
This ballot rigging is made more pernicious by the existence of monopolies over search order. Google’s search market share is about 90%. When Google’s ranking algorithm puts a result for a popular search term in its top 10, that helps determine the behavior of millions of people. If Google’s answer to <span class="quote">“<span class="quote">Are vaccines dangerous?</span>”</span> is a page that rebuts anti-vax conspiracy theories, then a sizable portion of the public will learn that vaccines are safe. If, on the other hand, Google sends those people to a site affirming the anti-vax conspiracies, a sizable portion of those millions will come away convinced that vaccines are dangerous.
</p><p>
Google’s algorithm is often tricked into serving disinformation as a prominent search result. But in these cases, Google isn’t persuading people to change their minds; it’s just presenting something untrue as fact when the user has no cause to doubt it.
</p><p>
This is true whether the search is for <span class="quote">“<span class="quote">Are vaccines dangerous?</span>”</span> or <span class="quote">“<span class="quote">best restaurants near me.</span>”</span> Most users will never look past the first page of search results, and when the overwhelming majority of people all use the same search engine, the ranking algorithm deployed by that search engine will determine myriad outcomes (whether to adopt a child, whether to have cancer surgery, where to eat dinner, where to move, where to apply for a job) to a degree that vastly outstrips any behavioral outcomes dictated by algorithmic persuasion techniques.
</p><p>
Many of the questions we ask search engines have no empirically correct answers: <span class="quote">“<span class="quote">Where should I eat dinner?</span>”</span> is not an objective question. Even questions that do have correct answers (<span class="quote">“<span class="quote">Are vaccines dangerous?</span>”</span>) don’t have one empirically superior source for that answer. Many pages affirm the safety of vaccines, so which one goes first? Under conditions of competition, consumers can choose from many search engines and stick with the one whose algorithmic judgment suits them best, but under conditions of monopoly, we all get our answers from the same place.
</p><p>
Google’s search dominance isn’t a matter of pure merit: The company has leveraged many tactics that would have been prohibited under classical, pre-Ronald-Reagan antitrust enforcement standards to attain its dominance. After all, this is a company that has developed two major products: a really good search engine and a pretty good Hotmail clone. Every other major success it’s had — Android, YouTube, Google Maps, etc. — has come through an acquisition of a nascent competitor. Many of the company’s key divisions, such as the advertising technology of DoubleClick, violate the historical antitrust principle of structural separation, which forbade firms from owning subsidiaries that competed with their customers. Railroads, for example, were barred from owning freight companies that competed with the shippers whose freight they carried.
</p><p>
If we’re worried about giant companies subverting markets by stripping consumers of their ability to make free choices, then vigorous antitrust enforcement seems like an excellent remedy. If we’d denied Google the right to effect its many mergers, we would also have probably denied it its total search dominance. Without that dominance, the pet theories, biases, errors (and good judgment, too) of Google search engineers and product managers would not have such an outsized effect on consumer choice.
</p><p>
This goes for many other companies. Amazon, a classic surveillance capitalist, is obviously the dominant tool for searching Amazon — though many people find their way to Amazon through Google searches and Facebook posts — and obviously, Amazon controls Amazon search. That means that Amazon’s own self-serving editorial choices — like promoting its own house brands over rival goods from its sellers as well as its own pet theories, biases, and errors — determine much of what we buy on Amazon. And since Amazon is the dominant e-commerce retailer outside of China and since it attained that dominance by buying up both large rivals and nascent competitors in defiance of historical antitrust rules, we can blame the monopoly for stripping consumers of their right to the future tense and the ability to shape markets by making informed choices.
</p><p>
Not every monopolist is a surveillance capitalist, but that doesn’t mean they’re not able to shape consumer choices in wide-ranging ways. Zuboff lauds Apple for its App Store and iTunes Store, insisting that adding price tags to the features on its platforms has been the secret to resisting surveillance and thus creating markets. But Apple is the only retailer allowed to sell on its platforms, and it’s the second-largest mobile device vendor in the world. The independent software vendors that sell through Apple’s marketplace accuse the company of the same surveillance sins as Amazon and other big retailers: spying on its customers to find lucrative new products to launch, effectively using independent software vendors as free-market researchers, then forcing them out of any markets they discover.
</p><p>
Because of its use of copyright locks, Apple’s mobile customers are not legally allowed to switch to a rival retailer for its apps if they want to do so on an iPhone. Apple, obviously, is the only entity that gets to decide how it ranks the results of search queries in its stores. These decisions ensure that some apps are often installed (because they appear on page one) and others are never installed (because they appear on page one million). Apple’s search-ranking design decisions have a vastly more significant effect on consumer behaviors than influence campaigns delivered by surveillance capitalism’s ad-serving bots.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"monopolists-can-afford-sleeping-pills-for-watchdogs"></a>Monopolists can afford sleeping pills for watchdogs
</h2></div></div></div><p>
Only the most extreme market ideologues think that markets can self-regulate without state oversight. Markets need watchdogs — regulators, lawmakers, and other elements of democratic control — to keep them honest. When these watchdogs sleep on the job, then markets cease to aggregate consumer choices because those choices are constrained by illegitimate and deceptive activities that companies are able to get away with because no one is holding them to account.
</p><p>
But this kind of regulatory capture doesn’t come cheap. In competitive sectors, where rivals are constantly eroding one another’s margins, individual firms lack the surplus capital to effectively lobby for laws and regulations that serve their ends.
</p><p>
Many of the harms of surveillance capitalism are the result of weak or nonexistent regulation. Those regulatory vacuums spring from the power of monopolists to resist stronger regulation and to tailor what regulation exists to permit their existing businesses.
</p><p>
Here’s an example: When firms over-collect and over-retain our data, they are at increased risk of suffering a breach — you can’t leak data you never collected, and once you delete all copies of that data, you can no longer leak it. For more than a decade, we’ve lived through an endless parade of ever-worsening data breaches, each one uniquely horrible in the scale of data breached and the sensitivity of that data.
</p><p>
But still, firms continue to over-collect and over-retain our data for three reasons:
</p><p>
<span class="strong"><strong>1. They are locked in the aforementioned limbic arms race with our capacity to shore up our attentional defense systems to resist their new persuasion techniques.</strong></span> They’re also locked in an arms race with their competitors to find new ways to target people for sales pitches. As soon as they discover a soft spot in our attentional defenses (a counterintuitive, unobvious way to target potential refrigerator buyers), the public begins to wise up to the tactic, and their competitors leap on it, hastening the day in which all potential refrigerator buyers have been inured to the pitch.
</p><p>
<span class="strong"><strong>2. They believe the surveillance capitalism story.</strong></span> Data is cheap to aggregate and store, and both proponents and opponents of surveillance capitalism have assured managers and product designers that if you collect enough data, you will be able to perform sorcerous acts of mind control, thus supercharging your sales. Even if you never figure out how to profit from the data, someone else will eventually offer to buy it from you to give it a try. This is the hallmark of all economic bubbles: acquiring an asset on the assumption that someone else will buy it from you for more than you paid for it, often to sell to someone else at an even greater price.
</p><p>
<span class="strong"><strong>3. The penalties for leaking data are negligible.</strong></span> Most countries limit these penalties to actual damages, meaning that consumers who’ve had their data breached have to show actual monetary harms to get a reward. In 2014, Home Depot disclosed that it had lost credit-card data for 53 million of its customers, but it settled the matter by paying those customers about $0.34 each — and a third of that $0.34 wasn’t even paid in cash. It took the form of a credit to procure a largely ineffectual credit-monitoring service.
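The scale of that settlement is worth making concrete. A quick back-of-the-envelope calculation, using only the figures in the text (53 million customers, about $0.34 each, a third of it not paid in cash):

```python
# Back-of-the-envelope arithmetic from the figures in the text:
# 53 million customers, about $0.34 each, a third of that in
# credit-monitoring credit rather than cash.

customers = 53_000_000
per_customer = 0.34

total = customers * per_customer
cash = total * (2 / 3)  # a third of each $0.34 wasn't paid in cash

print(f"total settlement: ${total:,.0f}")  # total settlement: $18,020,000
print(f"paid in cash:     ${cash:,.0f}")   # paid in cash:     $12,013,333
```

Roughly $18 million, about $12 million of it in cash, for one of the largest credit-card breaches on record: the "penalty" rounds to noise on the balance sheet of a firm that size.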
</p><p>
But the harms from breaches are much more extensive than these actual-damages rules capture. Identity thieves and fraudsters are wily and endlessly inventive. All the vast breaches of our century are being continuously recombined, the data sets merged and mined for new ways to victimize the people whose data was present in them. Any reasonable, evidence-based theory of deterrence and compensation for breaches would not confine damages to actual damages but rather would allow users to claim these future harms.
</p><p>
However, even the most ambitious privacy rules, such as the EU General Data Protection Regulation, fall far short of capturing the negative externalities of the platforms’ negligent over-collection and over-retention, and what penalties they do provide are not aggressively pursued by regulators.
</p><p>
This tolerance of — or indifference to — data over-collection and over-retention can be ascribed in part to the sheer lobbying muscle of the platforms. They are so profitable that they can handily afford to divert gigantic sums to fight any real change — that is, change that would force them to internalize the costs of their surveillance activities.
</p><p>
And then there’s state surveillance, which the surveillance capitalism story dismisses as a relic of another era when the big worry was being jailed for your dissident speech, not having your free will stripped away with machine learning.
</p><p>
But state surveillance and private surveillance are intimately related. As we saw when Apple was conscripted by the Chinese government as a vital collaborator in state surveillance, the only really affordable and tractable way to conduct mass surveillance on the scale practiced by modern states — both <span class="quote">“<span class="quote">free</span>”</span> and autocratic states — is to suborn commercial services.
</p><p>
Whether it’s Google being used as a location tracking tool by local law enforcement across the U.S. or the use of social media tracking by the Department of Homeland Security to build dossiers on participants in protests against Immigration and Customs Enforcement’s family separation practices, any hard limits on surveillance capitalism would hamstring the state’s own surveillance capability. Without Palantir, Amazon, Google, and other major tech contractors, U.S. cops would not be able to spy on Black people, ICE would not be able to manage the caging of children at the U.S. border, and state welfare systems would not be able to purge their rolls by dressing up cruelty as empiricism and claiming that poor and vulnerable people are ineligible for assistance. At least some of the states’ unwillingness to take meaningful action to curb surveillance should be attributed to this symbiotic relationship. There is no mass state surveillance without mass commercial surveillance.
</p><p>
Monopolism is key to the project of mass state surveillance. It’s true that smaller tech firms are apt to be less well-defended than Big Tech, whose security experts are drawn from the tops of their field and who are given enormous resources to secure and monitor their systems against intruders. But smaller firms also have less to protect: fewer users, whose data is more fragmented across more systems that have to be suborned one at a time by state actors.
</p><p>
A concentrated tech sector that works with authorities is a much more powerful ally in the project of mass state surveillance than a fragmented one composed of smaller actors. The U.S. tech sector is small enough that all of its top executives fit around a single boardroom table in Trump Tower in 2017, shortly after Trump’s inauguration. Most of its biggest players bid to win JEDI, the Pentagon’s $10 billion Joint Enterprise Defense Infrastructure cloud contract. Like other highly concentrated industries, Big Tech rotates its key employees in and out of government service, sending them to serve in the Department of Defense and the White House, then hiring ex-Pentagon and ex-DOD top staffers and officers to work in their own government relations departments.
</p><p>
They can even make a good case for doing this: After all, when there are only four or five big companies in an industry, everyone qualified to regulate those companies has served as an executive in at least a couple of them — because, likewise, when there are only five companies in an industry, everyone qualified for a senior role at any of them is by definition working at one of the other ones.
</p><div class=
"blockquote"><blockquote class=
"blockquote"><p>
While surveillance doesn’t cause monopolies, monopolies certainly abet surveillance.
</p></blockquote></div><p>
Industries that are competitive are fragmented — composed of companies that are at each other’s throats all the time and eroding one another’s margins in bids to steal their best customers. This leaves them with much more limited capital to use to lobby for favorable rules and a much harder job of getting everyone to agree to pool their resources to benefit the industry as a whole.
</p><p>
Surveillance combined with machine learning is supposed to be an existential crisis, a species-defining moment at which our free will is just a few more advances in the field from being stripped away. I am skeptical of this claim, but I <span class="emphasis"><em>do</em></span> think that tech poses an existential threat to our society and possibly our species.
</p><p>
But that threat grows out of monopoly.
</p><p>
One of the consequences of tech’s regulatory capture is that it can shift liability for poor security decisions onto its customers and the wider society. It is absolutely normal in tech for companies to obfuscate the workings of their products, to make them deliberately hard to understand, and to threaten security researchers who seek to independently audit those products.
</p><p>
IT is the only field in which this is practiced: No one builds a bridge or a hospital and keeps the composition of the steel or the equations used to calculate load stresses a secret. It is a frankly bizarre practice that leads, time and again, to grotesque security defects on farcical scales, with whole classes of devices being revealed as vulnerable long after they are deployed in the field and put into sensitive places.
</p><p>
The monopoly power that keeps any meaningful consequences for breaches at bay means that tech companies continue to build terrible products that are insecure by design and that end up integrated into our lives, in possession of our data, and connected to our physical world. For years, Boeing has struggled with the aftermath of a series of bad technology decisions that made its 737 fleet a global pariah, a rare instance in which bad tech decisions have been seriously punished in the market.
</p><p>
These bad security decisions are compounded yet again by the use of copyright locks to enforce business-model decisions against consumers. Recall that these locks have become the go-to means for shaping consumer behavior, making it technically impossible to use third-party ink, insulin, apps, or service depots in connection with your lawfully acquired property.
</p><p>
Recall also that these copyright locks are backstopped by legislation (such as Section 1201 of the DMCA or Article 6 of the 2001 EU Copyright Directive) that bans tampering with (<span class="quote">“<span class="quote">circumventing</span>”</span>) them, and these statutes have been used to threaten security researchers who make disclosures about vulnerabilities without permission from manufacturers.
</p><p>
This amounts to a manufacturer’s veto over safety warnings and criticism. While this is far from the legislative intent of the DMCA and its sister statutes around the world, Congress has not intervened to clarify the statute, nor will it, because to do so would run counter to the interests of powerful, large firms whose lobbying muscle is unstoppable.
</p><p>
Copyright locks are a double whammy: They create bad security decisions that can’t be freely investigated or discussed. If markets are supposed to be machines for aggregating information (and if surveillance capitalism’s notional mind-control rays are what make it a <span class="quote">“<span class="quote">rogue capitalism</span>”</span> because it denies consumers the power to make decisions), then a program of legally enforced ignorance of the risks of products makes monopolism even more of a <span class="quote">“<span class="quote">rogue capitalism</span>”</span> than surveillance capitalism’s influence campaigns.
</p><p>
And unlike mind-control rays, enforced silence over security is an immediate, documented problem, and it <span class="emphasis"><em>does</em></span> constitute an existential threat to our civilization and possibly our species. The proliferation of insecure devices — especially devices that spy on us and especially when those devices also can manipulate the physical world by, say, steering your car or flipping a breaker at a power station — is a kind of technology debt.
1428 In software design,
<span class=
"quote">“<span class=
"quote">technology debt
</span>”</span> refers to old, baked-in
1429 decisions that turn out to be bad ones in hindsight. Perhaps a
1430 long-ago developer decided to incorporate a networking protocol made
1431 by a vendor that has since stopped supporting it. But everything in
1432 the product still relies on that superannuated protocol, and so,
1433 with each revision, the product team has to work around this
1434 obsolete core, adding compatibility layers, surrounding it with
1435 security checks that try to shore up its defenses, and so on. These
1436 Band-Aid measures compound the debt because every subsequent
1437 revision has to make allowances for
<span class=
"emphasis"><em>them
</em></span>, too,
1438 like interest mounting on a predatory subprime loan. And like a
1439 subprime loan, the interest mounts faster than you can hope to pay
1440 it off: The product team has to put so much energy into maintaining
1441 this complex, brittle system that they don
’t have any time left over
1442 to refactor the product from the ground up and
<span class=
"quote">“<span class=
"quote">pay off the debt
</span>”</span> once and for all.
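The wrapper-upon-wrapper dynamic described above can be sketched in a few lines of Python (a hypothetical illustration; all names are invented, not drawn from any real product):

```python
# Hypothetical sketch of technology debt: a product whose core still depends
# on an unsupported vendor protocol, with each revision adding another
# wrapper around it instead of replacing it.

def legacy_vendor_send(payload: bytes) -> bytes:
    """Stand-in for the superannuated protocol baked into the core."""
    return b"LEGACY:" + payload

def compat_send(message: str) -> bytes:
    """Revision 1: a compatibility layer translating modern input to the old API."""
    return legacy_vendor_send(message.encode("ascii", errors="replace"))

def checked_send(message: str) -> bytes:
    """Revision 2: a security check shoring up validation the old core lacks."""
    if len(message) > 512:  # the legacy core misbehaves on long input
        raise ValueError("message too long for legacy core")
    return compat_send(message)

# Every later revision must now thread through *both* wrappers -- the
# interest mounting on the debt -- rather than a refactored modern core.
assert checked_send("hello") == b"LEGACY:hello"
```

Each new layer is individually reasonable, but together they are the Band-Aid measures the text describes: every future change must account for all of them.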
1445 Typically, technology debt results in a technological bankruptcy:
1446 The product gets so brittle and unsustainable that it fails
1447 catastrophically. Think of the antiquated COBOL-based banking and
1448 accounting systems that fell over at the start of the pandemic
1449 emergency when confronted with surges of unemployment claims.
1450 Sometimes that ends the product; sometimes it takes the company down
1451 with it. Being caught in the default of a technology debt is scary
1452 and traumatic, just like losing your house due to bankruptcy is
1453 scary and traumatic.
1455 But the technology debt created by copyright locks isn
’t individual
1456 debt; it
’s systemic. Everyone in the world is exposed to this
1457 over-leverage, as was the case with the
2008 financial crisis. When
1458 that debt comes due
— when we face a cascade of security breaches
1459 that threaten global shipping and logistics, the food supply,
1460 pharmaceutical production pipelines, emergency communications, and
1461 other critical systems that are accumulating technology debt in part
1462 due to the presence of deliberately insecure and deliberately
1463 unauditable copyright locks
— it will indeed pose an existential threat.
1465 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"privacy-and-monopoly"></a>Privacy and monopoly
</h2></div></div></div><p>
1466 Many tech companies are gripped by an orthodoxy that holds that if
1467 they just gather enough data on enough of our activities, everything
1468 else is possible
— the mind control and endless profits. This is an
1469 unfalsifiable hypothesis: If data gives a tech company even a tiny
1470 improvement in behavior prediction and modification, the company
1471 declares that it has taken the first step toward global domination
1472 with no end in sight. If a company
<span class=
"emphasis"><em>fails
</em></span> to
1473 attain any improvements from gathering and analyzing data, it
1474 declares success to be just around the corner, attainable once more data is in hand.
1477 Surveillance tech is far from the first industry to embrace a
1478 nonsensical, self-serving belief that harms the rest of the world,
1479 and it is not the first industry to profit handsomely from such a
1480 delusion. Long before hedge-fund managers were claiming (falsely)
1481 that they could beat the S
&amp;P
500, there were plenty of other
1482 <span class=
"quote">“<span class=
"quote">respectable
</span>”</span> industries that have been revealed as quacks in
1483 hindsight. From the makers of radium suppositories (a real thing!)
1484 to the cruel sociopaths who claimed they could
<span class=
"quote">“<span class=
"quote">cure
</span>”</span> gay people,
1485 history is littered with the formerly respectable titans of
1486 discredited industries.
1488 This is not to say that there
’s nothing wrong with Big Tech and its
1489 ideological addiction to data. While surveillance
’s benefits are
1490 mostly overstated, its harms are, if anything,
1491 <span class=
"emphasis"><em>understated
</em></span>.
1493 There
’s real irony here. The belief in surveillance capitalism as a
1494 <span class=
"quote">“<span class=
"quote">rogue capitalism
</span>”</span> is driven by the belief that markets wouldn
’t
1495 tolerate firms that are gripped by false beliefs. An oil company
1496 that has false beliefs about where the oil is will eventually go
1497 broke digging dry wells, after all.
1499 But monopolists get to do terrible things for a long time before
1500 they pay the price. Think of how concentration in the finance sector
1501 allowed the subprime crisis to fester as bond-rating agencies,
1502 regulators, investors, and critics all fell under the sway of a
1503 false belief that complex mathematics could construct
<span class=
"quote">“<span class=
"quote">fully hedged
</span>”</span>
1504 debt instruments that could not possibly default. A small bank that
1505 engaged in this kind of malfeasance would simply go broke rather
1506 than outrunning the inevitable crisis, perhaps growing so big that
1507 it averted it altogether. But large banks were able to continue to
1508 attract investors, and when they finally
<span class=
"emphasis"><em>did
</em></span>
1509 come a-cropper, the world
’s governments bailed them out. The worst
1510 offenders of the subprime crisis are bigger than they were in
2008,
1511 bringing home more profits and paying their execs even larger sums.
1513 Big Tech is able to practice surveillance not just because it is
1514 tech but because it is
<span class=
"emphasis"><em>big
</em></span>. The reason every
1515 web publisher embeds a Facebook
<span class=
"quote">“<span class=
"quote">Like
</span>”</span> button is that Facebook
1516 dominates the internet
’s social media referrals
— and every one of
1517 those
<span class=
"quote">“<span class=
"quote">Like
</span>”</span> buttons spies on everyone who lands on a page that
1518 contains them (see also: Google Analytics embeds, Twitter buttons, etc.).
1521 The reason the world
’s governments have been slow to create
1522 meaningful penalties for privacy breaches is that Big Tech
’s
1523 concentration produces huge profits that can be used to lobby
1524 against those penalties
— and Big Tech
’s concentration means that
1525 the companies involved are able to arrive at a unified negotiating
1526 position that supercharges the lobbying.
1528 The reason that the smartest engineers in the world want to work for
1529 Big Tech is that Big Tech commands the lion
’s share of tech industry jobs.
1532 The reason people who are aghast at Facebook
’s and Google
’s and
1533 Amazon
’s data-handling practices continue to use these services is
1534 that all their friends are on Facebook; Google dominates search; and
1535 Amazon has put all the local merchants out of business.
1537 Competitive markets would weaken the companies
’ lobbying muscle by
1538 reducing their profits and pitting them against each other in
1539 regulatory forums. It would give customers other places to go to get
1540 their online services. It would make the companies small enough to
1541 regulate and pave the way to meaningful penalties for breaches. It
1542 would let engineers with ideas that challenged the surveillance
1543 orthodoxy raise capital to compete with the incumbents. It would
1544 give web publishers multiple ways to reach audiences and make the
1545 case against Facebook and Google and Twitter embeds.
1547 In other words, while surveillance doesn
’t cause monopolies,
1548 monopolies certainly abet surveillance.
1549 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"ronald-reagan-pioneer-of-tech-monopolism"></a>Ronald Reagan, pioneer of tech monopolism
</h2></div></div></div><p>
1550 Technology exceptionalism is a sin, whether it
’s practiced by
1551 technology
’s blind proponents or by its critics. Both of these camps
1552 are prone to explaining away monopolistic concentration by citing
1553 some special characteristic of the tech industry, like network
1554 effects or first-mover advantage. The only real difference between
1555 these two groups is that the tech apologists say monopoly is
1556 inevitable so we should just let tech get away with its abuses while
1557 competition regulators in the U.S. and the EU say monopoly is
1558 inevitable so we should punish tech for its abuses but not try to
1559 break up the monopolies.
1561 To understand how tech became so monopolistic, it
’s useful to look
1562 at the dawn of the consumer tech industry:
1979, the year the Apple
1563 II Plus launched and became the first successful home computer. That
1564 also happens to be the year that Ronald Reagan hit the campaign
1565 trail for the
1980 presidential race
— a race he won, leading to a
1566 radical shift in the way that antitrust concerns are handled in
1567 America. Reagan
’s cohort of politicians
— including Margaret
1568 Thatcher in the U.K., Brian Mulroney in Canada, Helmut Kohl in
1569 Germany, and Augusto Pinochet in Chile
— went on to enact similar
1570 reforms that eventually spread around the world.
1572 Antitrust
’s story began nearly a century before all that with laws
1573 like the Sherman Act, which took aim at monopolists on the grounds
1574 that monopolies were bad in and of themselves
— squeezing out
1575 competitors, creating
<span class=
"quote">“<span class=
"quote">diseconomies of scale
</span>”</span> (when a company is so
1576 big that its constituent parts go awry and it is seemingly helpless
1577 to address the problems), and capturing their regulators to such a
1578 degree that they can get away with a host of evils.
1580 Then came a fabulist named Robert Bork, a former solicitor general
1581 who Reagan appointed to the powerful U.S. Court of Appeals for the
1582 D.C. Circuit and who had created an alternate legislative history of
1583 the Sherman Act and its successors out of whole cloth. Bork insisted
1584 that these statutes were never targeted at monopolies (despite a
1585 wealth of evidence to the contrary, including the transcribed
1586 speeches of the acts
’ authors) but, rather, that they were intended
1587 to prevent
<span class=
"quote">“<span class=
"quote">consumer harm
</span>”</span> — in the form of higher prices.
1589 Bork was a crank, but he was a crank with a theory that rich people
1590 really liked. Monopolies are a great way to make rich people richer
1591 by allowing them to receive
<span class=
"quote">“<span class=
"quote">monopoly rents
</span>”</span> (that is, bigger
1592 profits) and capture regulators, leading to a weaker, more favorable
1593 regulatory environment with fewer protections for customers,
1594 suppliers, the environment, and workers.
1596 Bork
’s theories were especially palatable to the same power brokers
1597 who backed Reagan, and Reagan
’s Department of Justice and other
1598 agencies began to incorporate Bork
’s antitrust doctrine into their
1599 enforcement decisions (Reagan even put Bork up for a Supreme Court
1600 seat, but Bork flunked the Senate confirmation hearing so badly
1601 that,
40 years later, D.C. insiders use the term
<span class=
"quote">“<span class=
"quote">borked
</span>”</span> to refer
1602 to any catastrophically bad political performance).
1604 Little by little, Bork
’s theories entered the mainstream, and their
1605 backers began to infiltrate the legal education field, even putting
1606 on junkets where members of the judiciary were treated to lavish
1607 meals, fun outdoor activities, and seminars where they were
1608 indoctrinated into the consumer harm theory of antitrust. The more
1609 Bork
’s theories took hold, the more money the monopolists were
1610 making
— and the more surplus capital they had at their disposal to
1611 lobby for even more Borkian antitrust influence campaigns.
1613 The history of Bork
’s antitrust theories is a really good example of
1614 the kind of covertly engineered shifts in public opinion that Zuboff
1615 warns us against, where fringe ideas become mainstream orthodoxy.
1616 But Bork didn
’t change the world overnight. He played a very long
1617 game, for over a generation, and he had a tailwind because the same
1618 forces that backed oligarchic antitrust theories also backed many
1619 other oligarchic shifts in public opinion. For example, the idea
1620 that taxation is theft, that wealth is a sign of virtue, and so on
—
1621 all of these theories meshed to form a coherent ideology that
1622 elevated inequality to a virtue.
1624 Today, many fear that machine learning allows surveillance
1625 capitalism to sell
<span class=
"quote">“<span class=
"quote">Bork-as-a-Service,
</span>”</span> at internet speeds, so that
1626 you can contract a machine-learning company to engineer
1627 <span class=
"emphasis"><em>rapid
</em></span> shifts in public sentiment without
1628 needing the capital to sustain a multipronged, multigenerational
1629 project working at the local, state, national, and global levels in
1630 business, law, and philosophy. I do not believe that such a project
1631 is plausible, though I agree that this is basically what the
1632 platforms claim to be selling. They
’re just lying about it. Big Tech
1633 lies all the time,
<span class=
"emphasis"><em>including
</em></span> in their sales literature.
1636 The idea that tech forms
<span class=
"quote">“<span class=
"quote">natural monopolies
</span>”</span> (monopolies that are
1637 the inevitable result of the realities of an industry, such as the
1638 monopolies that accrue to the first company to run long-haul phone
1639 lines or rail lines) is belied by tech
’s own history: In the absence
1640 of anti-competitive tactics, Google was able to unseat AltaVista and
1641 Yahoo; Facebook was able to head off Myspace. There are some
1642 advantages to gathering mountains of data, but those mountains of
1643 data also have disadvantages: liability (from leaking), diminishing
1644 returns (from old data), and institutional inertia (big companies,
1645 like science, progress one funeral at a time).
1647 Indeed, the birth of the web saw a mass-extinction event for the
1648 existing giant, wildly profitable proprietary technologies that had
1649 capital, network effects, and walls and moats surrounding their
1650 businesses. The web showed that when a new industry is built around
1651 a protocol, rather than a product, the combined might of everyone
1652 who uses the protocol to reach their customers or users or
1653 communities outweighs even the most massive products. CompuServe,
1654 AOL, MSN, and a host of other proprietary walled gardens learned
1655 this lesson the hard way: Each believed it could stay separate from
1656 the web, offering
<span class=
"quote">“<span class=
"quote">curation
</span>”</span> and a guarantee of consistency and
1657 quality instead of the chaos of an open system. Each was wrong and
1658 ended up being absorbed into the public web.
1660 Yes, tech is heavily monopolized and is now closely associated with
1661 industry concentration, but this has more to do with timing
1662 than with any intrinsically monopolistic tendencies. Tech was born
1663 at the moment that antitrust enforcement was being dismantled, and
1664 tech fell into exactly the same pathologies that antitrust was
1665 supposed to guard against. To a first approximation, it is
1666 reasonable to assume that tech
’s monopolies are the result of a lack
1667 of anti-monopoly action and not the much-touted unique
1668 characteristics of tech, such as network effects, first-mover
1669 advantage, and so on.
1671 In support of this thesis, I offer the concentration that every
1672 <span class=
"emphasis"><em>other
</em></span> industry has undergone over the same
1673 period. From professional wrestling to consumer packaged goods to
1674 commercial property leasing to banking to sea freight to oil to
1675 record labels to newspaper ownership to theme parks,
1676 <span class=
"emphasis"><em>every
</em></span> industry has undergone a massive shift
1677 toward concentration. There are no obvious network effects or
1678 first-mover advantages at play in these industries. However, in every
1679 case, these industries attained their concentrated status through
1680 tactics that were prohibited before Bork
’s triumph: merging with
1681 major competitors, buying out innovative new market entrants,
1682 horizontal and vertical integration, and a suite of anti-competitive
1683 tactics that were once illegal but are not any longer.
1685 Again: When you change the laws intended to prevent monopolies and
1686 then monopolies form in exactly the way the law was supposed to
1687 prevent, it is reasonable to suppose that these facts are related.
1688 Tech
’s concentration can be readily explained without recourse to
1689 radical theories of network effects
— but only if you
’re willing to
1690 indict unregulated markets as tending toward monopoly. Just as a
1691 lifelong smoker can give you a hundred reasons why their smoking
1692 didn
’t cause their cancer (
<span class=
"quote">“<span class=
"quote">It was the environmental toxins
</span>”</span>), true
1693 believers in unregulated markets have a whole suite of unconvincing
1694 explanations for monopoly in tech that leave capitalism intact.
1695 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"steering-with-the-windshield-wipers"></a>Steering with the windshield wipers
</h2></div></div></div><p>
1696 It
’s been
40 years since Bork
’s project to rehabilitate monopolies
1697 achieved liftoff, and that is a generation and a half, which is
1698 plenty of time to take a common idea and make it seem outlandish and
1699 vice versa. Before the
1940s, affluent Americans dressed their baby
1700 boys in pink while baby girls wore blue (a
<span class=
"quote">“<span class=
"quote">delicate and dainty
</span>”</span>
1701 color). While gendered colors are obviously totally arbitrary, many
1702 still greet this news with amazement and find it hard to imagine a
1703 time when pink connoted masculinity.
1705 After
40 years of studiously ignoring antitrust analysis and
1706 enforcement, it
’s not surprising that we
’ve all but forgotten that
1707 antitrust exists, that in living memory, growth through mergers and
1708 acquisitions was largely prohibited under law, that
1709 market-cornering strategies like vertical integration could land a company in court.
1712 Antitrust is a market society
’s steering wheel, the control of first
1713 resort to keep would-be masters of the universe in their lanes. But
1714 Bork and his cohort ripped out our steering wheel
40 years ago. The
1715 car is still barreling along, and so we
’re yanking as hard as we can
1716 on all the
<span class=
"emphasis"><em>other
</em></span> controls in the car as well as
1717 desperately flapping the doors and rolling the windows up and down
1718 in the hopes that one of these other controls can be repurposed to
1719 let us choose where we
’re heading before we careen off a cliff.
1721 It
’s like a
1960s science-fiction plot come to life: People stuck in
1722 a
<span class=
"quote">“<span class=
"quote">generation ship,
</span>”</span> plying its way across the stars, a ship once
1723 piloted by their ancestors; and now, after a great cataclysm, the
1724 ship
’s crew have forgotten that they
’re in a ship at all and no
1725 longer remember where the control room is. Adrift, the ship is
1726 racing toward its extinction, and unless we can seize the controls
1727 and execute emergency course correction, we
’re all headed for a
1728 fiery death in the heart of a sun.
1729 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"surveillance-still-matters"></a>Surveillance still matters
</h2></div></div></div><p>
1730 None of this is to minimize the problems with surveillance.
1731 Surveillance matters, and Big Tech
’s use of surveillance
1732 <span class=
"emphasis"><em>is
</em></span> an existential risk to our species, but
1733 that
’s not because surveillance and machine learning rob us of our free will.
1736 Surveillance has become
<span class=
"emphasis"><em>much
</em></span> more efficient
1737 thanks to Big Tech. In
1989, the Stasi
— the East German secret
1738 police
— had the whole country under surveillance, a massive
1739 undertaking that recruited one out of every
60 people to serve as an
1740 informant or intelligence operative.
1742 Today, we know that the NSA is spying on a significant fraction of
1743 the entire world
’s population, and its ratio of surveillance
1744 operatives to the surveilled is more like
1:
10,
000 (that
’s probably
1745 on the low side since it assumes that every American with top-secret
1746 clearance is working for the NSA on this project
— we don
’t know how
1747 many of those cleared people are involved in NSA spying, but it
’s
1748 definitely not all of them).
1750 How did the ratio of surveillable citizens expand from
1:
60 to
1751 1:
10,
000 in less than
30 years? It
’s thanks to Big Tech. Our devices
1752 and services gather most of the data that the NSA mines for its
1753 surveillance project. We pay for these devices and the services they
1754 connect to, and then we painstakingly perform the data-entry tasks
1755 associated with logging facts about our lives, opinions, and
1756 preferences. This mass surveillance project has been largely useless
1757 for fighting terrorism: The NSA can
1758 <a class=
"ulink" href=
"https://www.washingtonpost.com/world/national-security/nsa-cites-case-as-success-of-phone-data-collection-program/2013/08/08/fc915e5a-feda-11e2-96a8-d3b921c0924a_story.html" target=
"_top">only
1759 point to a single minor success story
</a> in which it used its
1760 data collection program to foil an attempt by a U.S. resident to
1761 wire a few thousand dollars to an overseas terror group. It
’s
1762 ineffective for much the same reason that commercial surveillance
1763 projects are largely ineffective at targeting advertising: The
1764 people who want to commit acts of terror, like people who want to
1765 buy a refrigerator, are extremely rare. If you
’re trying to detect a
1766 phenomenon whose base rate is one in a million with an instrument
1767 whose accuracy is only
99%, then every true positive will come at
1768 the cost of
9,
999 false positives.
1770 Let me explain that again: If one in a million people is a
1771 terrorist, then there will only be about one terrorist in a random
1772 sample of one million people. If your test for detecting terrorists
1773 is
99% accurate, it will identify
10,
000 terrorists in your
1774 million-person sample (
1% of one million is
10,
000). For every true
1775 positive, you
’ll get
9,
999 false positives.
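The base-rate arithmetic above can be checked in a few lines (a minimal sketch using the numbers from the text; the 99% figure is treated as both the hit rate and the false-alarm rate):

```python
# Base-rate fallacy: a 99%-accurate test for a one-in-a-million phenomenon.
population = 1_000_000
actual_positives = 1          # base rate: about one terrorist per million
accuracy = 0.99               # test accuracy, per the text

innocents = population - actual_positives
false_positives = innocents * (1 - accuracy)   # innocents wrongly flagged
true_positives = actual_positives * accuracy   # the one real case, if caught

# Roughly 10,000 people get flagged, essentially all of them false alarms:
precision = true_positives / (true_positives + false_positives)
print(f"people flagged: {false_positives + true_positives:.0f}")
print(f"chance a given flag is real: {precision:.6f}")   # about 1 in 10,000
```

The precision comes out near 0.0001: even a test far more accurate than any real terrorism detector buries its single true positive under thousands of false alarms.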
1777 In reality, the accuracy of algorithmic terrorism detection falls
1778 far short of the
99% mark, as does refrigerator ad targeting. The
1779 difference is that being falsely accused of wanting to buy a fridge
1780 is a minor nuisance while being falsely accused of planning a terror
1781 attack can destroy your life and the lives of everyone you love.
1783 Mass state surveillance is only feasible because of surveillance
1784 capitalism and its extremely low-yield ad-targeting systems, which
1785 require a constant feed of personal data to remain barely viable.
1786 Surveillance capitalism
’s primary failure mode is mistargeted ads
1787 while mass state surveillance
’s primary failure mode is grotesque
1788 human rights abuses, tending toward totalitarianism.
1790 State surveillance is no mere parasite on Big Tech, sucking up its
1791 data and giving nothing in return. In truth, the two are symbiotes:
1792 Big Tech sucks up our data for spy agencies, and spy agencies ensure
1793 that governments don
’t limit Big Tech
’s activities so severely that
1794 it would no longer serve the spy agencies
’ needs. There is no firm
1795 distinction between state surveillance and surveillance capitalism;
1796 they are dependent on one another.
1798 To see this at work today, look no further than Amazon
’s home
1799 surveillance device, the Ring doorbell, and its associated app,
1800 Neighbors. Ring
— a product that Amazon acquired and did not develop
1801 in house
— makes a camera-enabled doorbell that streams footage from
1802 your front door to your mobile device. The Neighbors app allows you
1803 to form a neighborhood-wide surveillance grid with your fellow Ring
1804 owners through which you can share clips of
<span class=
"quote">“<span class=
"quote">suspicious characters.
</span>”</span>
1805 If you
’re thinking that this sounds like a recipe for letting
1806 curtain-twitching racists supercharge their suspicions of people
1807 with brown skin who walk down their blocks,
1808 <a class=
"ulink" href=
"https://www.eff.org/deeplinks/2020/07/amazons-ring-enables-over-policing-efforts-some-americas-deadliest-law-enforcement" target=
"_top">you
’re
1809 right
</a>. Ring has become a
<span class=
"emphasis"><em>de facto,
</em></span>
1810 off-the-books arm of the police without any of the pesky oversight.
1813 In mid-
2019, a series of public records requests revealed that
1814 Amazon had struck confidential deals with more than
400 local law
1815 enforcement agencies through which the agencies would promote Ring
1816 and Neighbors and in exchange get access to footage from Ring
1817 cameras. In theory, cops would need to request this footage through
1818 Amazon (and internal documents reveal that Amazon devotes
1819 substantial resources to coaching cops on how to spin a convincing
1820 story when doing so), but in practice, when a Ring customer turns
1821 down a police request, Amazon only requires the agency to formally
1822 request the footage from the company, which it will then produce.
1824 Ring and law enforcement have found many ways to intertwine their
1825 activities. Ring strikes secret deals to acquire real-time access to
1826 911 dispatch and then streams alarming crime reports to Neighbors
1827 users, which serve as convincers for anyone who
’s contemplating a
1828 surveillance doorbell but isn
’t sure whether their neighborhood is
1829 dangerous enough to warrant it.
1831 The more the cops buzz-market the surveillance capitalist Ring, the
1832 more surveillance capability the state gets. Cops who rely on
1833 private entities for law-enforcement roles then brief against any
1834 controls on the deployment of that technology while the companies
1835 return the favor by lobbying against rules requiring public
1836 oversight of police surveillance technology. The more the cops rely
1837 on Ring and Neighbors, the harder it will be to pass laws to curb
1838 them. The fewer laws there are against them, the more the cops will rely on them.
1840 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"dignity-and-sanctuary"></a>Dignity and sanctuary
</h2></div></div></div><p>
1841 But even if we could exercise democratic control over our states and
1842 force them to stop raiding surveillance capitalism
’s reservoirs of
1843 behavioral data, surveillance capitalism would still harm us.
1845 This is an area where Zuboff shines. Her chapter on
<span class=
"quote">“<span class=
"quote">sanctuary
</span>”</span> —
1846 the feeling of being unobserved
— is a beautiful hymn to
1847 introspection, calmness, mindfulness, and tranquility.
1849 When you are watched, something changes. Anyone who has ever raised
1850 a child knows this. You might look up from your book (or more
1851 realistically, from your phone) and catch your child in a moment of
1852 profound realization and growth, a moment where they are learning
1853 something that is right at the edge of their abilities, requiring
1854 their entire ferocious concentration. For a moment, you
’re
1855 transfixed, watching that rare and beautiful moment of focus playing
1856 out before your eyes, and then your child looks up and sees you
1857 seeing them, and the moment collapses. To grow, you need to be and
1858 expose your authentic self, and in that moment, you are vulnerable
1859 like a hermit crab scuttling from one shell to the next. The tender,
1860 unprotected tissues you expose in that moment are too delicate to
1861 reveal in the presence of another, even someone you trust as
1862 implicitly as a child trusts their parent.
1864 In the digital age, our authentic selves are inextricably tied to
1865 our digital lives. Your search history is a running ledger of the
1866 questions you
’ve pondered. Your location history is a record of the
1867 places you
’ve sought out and the experiences you
’ve had there. Your
1868 social graph reveals the different facets of your identity, the
1869 people you
’ve connected with.
1871 To be observed in these activities is to lose the sanctuary of your authentic self.
1874 There
’s another way in which surveillance capitalism robs us of our
1875 capacity to be our authentic selves: by making us anxious.
1876 Surveillance capitalism isn
’t really a mind-control ray, but you
1877 don
’t need a mind-control ray to make someone anxious. After all,
1878 another word for anxiety is agitation, and to make someone
1879 experience agitation, you need merely to agitate them. To poke them
1880 and prod them and beep at them and buzz at them and bombard them on
1881 an intermittent schedule that is just random enough that our limbic
1882 systems never quite become inured to it.
1884 Our devices and services are
<span class=
"quote">“<span class=
"quote">general purpose
</span>”</span> in that they can
1885 connect anything or anyone to anything or anyone else and that they
1886 can run any program that can be written. This means that the
1887 distraction rectangles in our pockets hold our most precious moments
1888 with our most beloved people and their most urgent or time-sensitive
1889 communications (from
<span class=
"quote">“<span class=
"quote">running late can you get the kid?
</span>”</span> to
<span class=
"quote">“<span class=
"quote">doctor
1890 gave me bad news and I need to talk to you RIGHT NOW
</span>”</span>) as well as
1891 ads for refrigerators and recruiting messages from Nazis.
1893 All day and all night, our pockets buzz, shattering our
1894 concentration and tearing apart the fragile webs of connection we
1895 spin as we think through difficult ideas. If you locked someone in a
1896 cell and agitated them like this, we
’d call it
<span class=
"quote">“<span class=
"quote">sleep deprivation
1897 torture,
</span>”</span> and it would be
1898 <a class=
"ulink" href=
"https://www.youtube.com/watch?v=1SKpRbvnx6g" target=
"_top">a war crime
1899 under the Geneva Conventions
</a>.
1900 </p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"afflicting-the-afflicted"></a>Afflicting the afflicted
</h2></div></div></div><p>
1901 The effects of surveillance on our ability to be our authentic
1902 selves are not equal for all people. Some of us are lucky enough to
1903 live in a time and place in which all the most important facts of
1904 our lives are widely and roundly socially acceptable and can be
1905 publicly displayed without the risk of social consequence.
1907 But for many of us, this is not true. Recall that in living memory,
1908 many of the ways of being that we think of as socially acceptable
1909 today were once cause for dire social sanction or even imprisonment.
1910 If you are
65 years old, you have lived through a time in which
1911 people living in
<span class=
"quote">“<span class=
"quote">free societies
</span>”</span> could be imprisoned or sanctioned
1912 for engaging in homosexual activity, for falling in love with a
1913 person whose skin was a different color than their own, or for smoking weed.
1916 Today, these activities aren
’t just decriminalized in much of the
1917 world, they
’re considered normal, and the fallen prohibitions are
1918 viewed as shameful, regrettable relics of the past.
</p><p>How did we get from prohibition to normalization? Through private, personal activity: People who were secretly gay or secret pot-smokers or who secretly loved someone with a different skin color were vulnerable to retaliation if they made their true selves known and were limited in how much they could advocate for their own right to exist in the world and be true to themselves. But because there was a private sphere, these people could form alliances with their friends and loved ones who did not share their disfavored traits by having private conversations in which they came out, disclosing their true selves to the people around them and bringing them to their cause one conversation at a time.
</p><p>The right to choose the time and manner of these conversations was key to their success. It’s one thing to come out to your dad while you’re on a fishing trip away from the world and another thing entirely to blurt it out over the Christmas dinner table while your racist Facebook uncle is there to make a scene.
</p><p>Without a private sphere, there’s a chance that none of these changes would have come to pass and that the people who benefited from these changes would have either faced social sanction for coming out to a hostile world or would never have been able to reveal their true selves to the people they love.
</p><p>The corollary is that, unless you think that our society has attained social perfection — that your grandchildren in 50 years will ask you to tell them the story of how, in 2020, every injustice had been righted and no further change had to be made — then you should expect that right now, at this minute, there are people you love, whose happiness is key to your own, who have a secret in their hearts that stops them from ever being their authentic selves with you. These people are sorrowing and will go to their graves with that secret sorrow in their hearts, and the source of that sorrow will be the falsity of their relationship to you.
</p><p>A private realm is necessary for human progress.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="any-data-you-collect-and-retain-will-eventually-leak"></a>Any data you collect and retain will eventually leak</h2></div></div></div><p>
The lack of a private life can rob vulnerable people of the chance to be their authentic selves and constrain our actions by depriving us of sanctuary, but there is another risk that is borne by everyone, not just people with a secret: crime.
</p><p>Personally identifying information is of very limited use for the purpose of controlling people’s minds, but identity theft — really a catchall term for a whole constellation of terrible criminal activities that can destroy your finances, compromise your personal integrity, ruin your reputation, or even expose you to physical danger — thrives on it.
</p><p>Attackers are not limited to using data from one breached source, either. Multiple services have suffered breaches that exposed names, addresses, phone numbers, passwords, sexual tastes, school grades, work performance, brushes with the criminal justice system, family details, genetic information, fingerprints and other biometrics, reading habits, search histories, literary tastes, pseudonymous identities, and other sensitive information. Attackers can merge data from these different breaches to build up extremely detailed dossiers on random subjects and then use different parts of the data for different criminal purposes.
</p><p>For example, attackers can use leaked username and password combinations to hijack whole fleets of commercial vehicles that <a class="ulink" href="https://www.vice.com/en_us/article/zmpx4x/hacker-monitor-cars-kill-engine-gps-tracking-apps" target="_top">have been fitted with anti-theft GPS trackers and immobilizers</a> or to hijack baby monitors in order to <a class="ulink" href="https://www.washingtonpost.com/technology/2019/04/23/how-nest-designed-keep-intruders-out-peoples-homes-effectively-allowed-hackers-get/?utm_term=.15220e98c550" target="_top">terrorize toddlers with the audio tracks from pornography</a>. Attackers use leaked data to trick phone companies into giving them your phone number, then they intercept SMS-based two-factor authentication codes in order to take over your email, bank account, and/or cryptocurrency wallets.
</p><p>Attackers are endlessly inventive in the pursuit of creative ways to weaponize leaked data. One common use of leaked data is to penetrate companies in order to access <span class="emphasis"><em>more</em></span> data.
</p><p>Like spies, online fraudsters are totally dependent on companies over-collecting and over-retaining our data. Spy agencies sometimes pay companies for access to their data or intimidate them into giving it up, but sometimes they work just like criminals do — by <a class="ulink" href="https://www.bbc.com/news/world-us-canada-24751821" target="_top">sneaking data out of companies’ databases</a>.
</p><p>The over-collection of data has a host of terrible social consequences, from the erosion of our authentic selves to the undermining of social progress, from state surveillance to an epidemic of online crime. Commercial surveillance is also a boon to people running influence campaigns, but that’s the least of our problems.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="critical-tech-exceptionalism-is-still-tech-exceptionalism"></a>Critical tech exceptionalism is still tech exceptionalism</h2></div></div></div><p>
Big Tech has long practiced technology exceptionalism: the idea that it should not be subject to the mundane laws and norms of <span class="quote">“<span class="quote">meatspace.</span>”</span> Mottoes like Facebook’s <span class="quote">“<span class="quote">move fast and break things</span>”</span> drew justifiable scorn for the companies’ self-serving rhetoric.
</p><p>Tech exceptionalism got us all into a lot of trouble, so it’s ironic and distressing to see Big Tech’s critics committing the same sin.
</p><p>Big Tech is not a <span class="quote">“<span class="quote">rogue capitalism</span>”</span> that cannot be cured through the traditional anti-monopoly remedies of trustbusting (forcing companies to divest of competitors they have acquired) and bans on mergers to monopoly and other anti-competitive tactics. Big Tech does not have the power to use machine learning to influence our behavior so thoroughly that markets lose the ability to punish bad actors and reward superior competitors. Big Tech has no rule-writing mind-control ray that necessitates ditching our old toolbox.
</p><p>The thing is, people have been claiming to have perfected mind-control rays for centuries, and every time, it turned out to be a con — though sometimes the con artists were also conning themselves.
</p><p>For generations, the advertising industry has been steadily improving its ability to sell advertising services to businesses while only making marginal gains in selling those businesses’ products to prospective customers. John Wanamaker’s lament that <span class="quote">“<span class="quote">50% of my advertising budget is wasted, I just don’t know which 50%</span>”</span> is a testament to the triumph of <span class="emphasis"><em>ad executives</em></span>, who successfully convinced Wanamaker that only half of the money he spent went to waste.
</p><p>The tech industry has made enormous improvements in the science of convincing businesses that they’re good at advertising while their actual improvements to advertising — as opposed to targeting — have been pretty ho-hum. The vogue for machine learning — and the mystical invocation of <span class="quote">“<span class="quote">artificial intelligence</span>”</span> as a synonym for straightforward statistical inference techniques — has greatly boosted the efficacy of Big Tech’s sales pitch as marketers have exploited potential customers’ lack of technical sophistication to get away with breathtaking acts of overpromising and underdelivering.
</p><p>It’s tempting to think that if businesses are willing to pour billions into a venture, then the venture must be a good one. Yet there are plenty of times when this rule of thumb has led us astray. For example, it’s virtually unheard of for managed investment funds to outperform simple index funds, and investors who put their money into the hands of expert money managers overwhelmingly fare worse than those who entrust their savings to index funds. But managed funds still account for the majority of the money invested in the markets, and they are patronized by some of the richest, most sophisticated investors in the world. Their vote of confidence in an underperforming sector is a parable about the role of luck in wealth accumulation, not a sign that managed funds are a good buy.
</p><p>The claims of Big Tech’s mind-control system are full of tells that the enterprise is a con. For example, <a class="ulink" href="https://www.frontiersin.org/articles/10.3389/fpsyg.2020.01415/full" target="_top">the reliance on the <span class="quote">“<span class="quote">Big Five</span>”</span> personality traits</a> as a primary means of influencing people even though the <span class="quote">“<span class="quote">Big Five</span>”</span> theory is unsupported by any large-scale, peer-reviewed studies and is <a class="ulink" href="https://www.wired.com/story/the-noisy-fallacies-of-psychographic-targeting/" target="_top">mostly the realm of marketing hucksters and pop psych</a>.
</p><p>Big Tech’s promotional materials also claim that their algorithms can accurately perform <span class="quote">“<span class="quote">sentiment analysis</span>”</span> or detect people’s moods based on their <span class="quote">“<span class="quote">microexpressions,</span>”</span> but <a class="ulink" href="https://www.npr.org/2018/09/12/647040758/advertising-on-facebook-is-it-worth-it" target="_top">these are marketing claims, not scientific ones</a>. These methods are largely untested by independent scientific experts, and where they have been tested, they’ve been found sorely wanting. Microexpressions are particularly suspect as the companies that specialize in training people to detect them <a class="ulink" href="https://theintercept.com/2017/02/08/tsas-own-files-show-doubtful-science-behind-its-behavior-screening-program/" target="_top">have been shown</a> to underperform relative to random chance.
</p><p>Big Tech has been so good at marketing its own supposed superpowers that it’s easy to believe that they can market everything else with similar acumen, but it’s a mistake to believe the hype. Any statement a company makes about the quality of its products is clearly not impartial. The fact that we distrust all the things that Big Tech says about its data handling, compliance with privacy laws, etc., is only reasonable — but why on Earth would we treat Big Tech’s marketing literature as the gospel truth? Big Tech lies about just about <span class="emphasis"><em>everything</em></span>, including how well its machine-learning-fueled persuasion systems work.
</p><p>That skepticism should infuse all of our evaluations of Big Tech and its supposed abilities, including our perusal of its patents. Zuboff vests these patents with enormous significance, pointing out that Google claimed extensive new persuasion capabilities in <a class="ulink" href="https://patents.google.com/patent/US20050131762A1/en" target="_top">its patent filings</a>. These claims are doubly suspect: first, because they are so self-serving, and second, because the patent itself is so notoriously an invitation to exaggeration.
</p><p>Patent applications take the form of a series of claims and range from broad to narrow. A typical patent starts out by claiming that its authors have invented a method or system for doing every conceivable thing that anyone might do, ever, with any tool or device. Then it narrows that claim in successive stages until we get to the actual <span class="quote">“<span class="quote">invention</span>”</span> that is the true subject of the patent. The hope is that the patent examiner — who is almost certainly overworked and underinformed — will miss the fact that some or all of these claims are ridiculous, or at least suspect, and grant the patent’s broader claims. Patents for unpatentable things are still incredibly useful because they can be wielded against competitors who might license that patent or steer clear of its claims rather than endure the lengthy, expensive process of contesting it.
</p><p>What’s more, software patents are routinely granted even though the filer doesn’t have any evidence that they can do the thing claimed by the patent. That is, you can patent an <span class="quote">“<span class="quote">invention</span>”</span> that you haven’t actually made and that you don’t know how to make.
</p><p>With these considerations in hand, it becomes obvious that the fact that a Big Tech company has patented what it <span class="emphasis"><em>says</em></span> is an effective mind-control ray is largely irrelevant to whether Big Tech can in fact control our minds.
</p><p>Big Tech collects our data for many reasons, including the diminishing returns on existing stores of data. But many tech companies also collect data out of a mistaken tech exceptionalist belief in the network effects of data. Network effects occur when each new user in a system increases its value. The classic example is fax machines: A single fax machine is of no use, two fax machines are of limited use, but every new fax machine that’s put to use adds a possible fax-to-fax link with every machine already on the network.
</p><p>Data mined for predictive systems doesn’t necessarily produce these dividends. Think of Netflix: The predictive value of the data mined from a million English-speaking Netflix viewers is hardly improved by the addition of one more user’s viewing data. Most of the data Netflix acquires after that first minimum viable sample duplicates existing data and produces only minimal gains. Meanwhile, retraining models with new data gets progressively more expensive as the number of data points increases, and manual tasks like labeling and validating data do not get cheaper at scale.
</p><p>Businesses pursue fads to the detriment of their profits all the time, especially when the businesses and their investors are not motivated by the prospect of becoming profitable but rather by the prospect of being acquired by a Big Tech giant or by having an IPO. For these firms, ticking faddish boxes like <span class="quote">“<span class="quote">collects as much data as possible</span>”</span> might realize a bigger return on investment than <span class="quote">“<span class="quote">collects a business-appropriate quantity of data.</span>”</span>
</p><p>This is another harm of tech exceptionalism: The belief that more data always produces more profits in the form of more insights that can be translated into better mind-control rays drives firms to over-collect and over-retain data beyond all rationality. And since the firms are behaving irrationally, a good number of them will go out of business and become ghost ships whose cargo holds are stuffed full of data that can harm people in myriad ways — but which no one is responsible for any longer. Even if the companies don’t go under, the data they collect is maintained behind the minimum viable security — just enough security to keep the company viable while it waits to get bought out by a tech giant, an amount calculated to spend not one penny more than is necessary on protecting data.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="how-monopolies-not-mind-control-drive-surveillance-capitalism-the-snapchat-story"></a>How monopolies, not mind control, drive surveillance capitalism: The Snapchat story</h2></div></div></div><p>
For the first decade of its existence, Facebook competed with the social media giants of the day (Myspace, Orkut, etc.) by presenting itself as the pro-privacy alternative. Indeed, Facebook justified its walled garden — which let users bring in data from the web but blocked web services like Google Search from indexing and caching Facebook pages — as a pro-privacy measure that protected users from the surveillance-happy winners of the social media wars like Myspace.
</p><p>Despite frequent promises that it would never collect or analyze its users’ data, Facebook periodically created initiatives that did just that, like the creepy, ham-fisted Beacon tool, which spied on you as you moved around the web and then added your online activities to your public timeline, allowing your friends to monitor your browsing habits. Beacon sparked a user revolt. Every time, Facebook backed off from its surveillance initiative, but not all the way; inevitably, the new Facebook would be more surveilling than the old Facebook, though not quite as surveilling as the intermediate Facebook following the launch of the new product or service.
</p><p>The pace at which Facebook ramped up its surveillance efforts seems to have been set by Facebook’s competitive landscape. The more competitors Facebook had, the better it behaved. Every time a major competitor foundered, Facebook’s behavior <a class="ulink" href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247362" target="_top">got worse</a>.
</p><p>All the while, Facebook was prodigiously acquiring companies, including a company called Onavo. Nominally, Onavo made a battery-monitoring mobile app. But the permissions that Onavo required were so expansive that the app was able to gather fine-grained telemetry on everything users did with their phones, including which apps they used and how they were using them.
</p><p>Through Onavo, Facebook discovered that it was losing market share to Snapchat, an app that — like Facebook a decade before — billed itself as the pro-privacy alternative to the status quo. Through Onavo, Facebook was able to mine data from the devices of Snapchat users, including both current and former Snapchat users. This spurred Facebook to acquire Instagram — some features of which competed with Snapchat — and then allowed Facebook to fine-tune Instagram’s features and sales pitch to erode Snapchat’s gains and ensure that Facebook would not have to face the kinds of competitive pressures it had earlier inflicted on Myspace and Orkut.
</p><p>The story of how Facebook crushed Snapchat reveals the relationship between monopoly and surveillance capitalism. Facebook combined surveillance with lax antitrust enforcement to spot the competitive threat of Snapchat on its horizon and then take decisive action against it. Facebook’s surveillance capitalism let it avert competitive pressure with anti-competitive tactics. Facebook users still want privacy — Facebook hasn’t used surveillance to brainwash them out of it — but they can’t get it because Facebook’s surveillance lets it destroy any hope of a rival service emerging that competes on privacy features.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="a-monopoly-over-your-friends"></a>A monopoly over your friends</h2></div></div></div><p>
A decentralization movement has tried to erode the dominance of Facebook and other Big Tech companies by fielding <span class="quote">“<span class="quote">indieweb</span>”</span> alternatives — Mastodon as a Twitter alternative, Diaspora as a Facebook alternative, etc. — but these efforts have failed to attain any kind of liftoff.
</p><p>Fundamentally, each of these services is hamstrung by the same problem: Every potential user for a Facebook or Twitter alternative has to convince all their friends to follow them to a decentralized web alternative in order to continue to realize the benefit of social media. For many of us, the only reason to have a Facebook account is that our friends have Facebook accounts, and the reason they have Facebook accounts is that <span class="emphasis"><em>we</em></span> have Facebook accounts.
</p><p>All of this has conspired to make Facebook — and other dominant platforms — into <span class="quote">“<span class="quote">kill zones</span>”</span> that investors will not fund new entrants to compete in.
</p><p>And yet, all of today’s tech giants came into existence despite the entrenched advantage of the companies that came before them. To understand how that happened, you have to understand both interoperability and adversarial interoperability.
</p><div class="blockquote"><blockquote class="blockquote"><p>The hard problem of our species is coordination.</p></blockquote></div><p>
<span class="quote">“<span class="quote">Interoperability</span>”</span> is the ability of two technologies to work with one another: Anyone can make an LP that will play on any record player, anyone can make a filter you can install in your stove’s extractor fan, anyone can make gasoline for your car, anyone can make a USB phone charger that fits in your car’s cigarette lighter receptacle, anyone can make a light bulb that works in your light socket, anyone can make bread that will toast in your toaster.
</p><p>Interoperability is often a source of innovation and consumer benefit: Apple made the first commercially successful PC, but millions of independent software vendors made interoperable programs that ran on the Apple II Plus. The simple analog antenna inputs on the back of TVs first allowed cable operators to connect directly to TVs, then they allowed game console companies and then personal computer companies to use standard televisions as displays. Standard RJ-11 telephone jacks allowed for the production of phones from a variety of vendors in a variety of forms, from the free football-shaped phone that came with a <span class="emphasis"><em>Sports Illustrated</em></span> subscription to business phones with speakers, hold functions, and so on and then answering machines and finally modems, paving the way for the internet revolution.
</p><p><span class="quote">“<span class="quote">Interoperability</span>”</span> is often used interchangeably with <span class="quote">“<span class="quote">standardization,</span>”</span> which is the process by which manufacturers and other stakeholders hammer out a set of agreed-upon rules for implementing a technology, such as the electrical plug on your wall, the CAN bus used by your car’s computer systems, or the HTML instructions that your browser interprets.
</p><p>But interoperability doesn’t require standardization — indeed, standardization often proceeds from the chaos of ad hoc interoperability measures. The inventor of the cigarette-lighter USB charger didn’t need to get permission from car manufacturers or even the manufacturers of the dashboard lighter subcomponent. The automakers didn’t take any countermeasures to prevent the use of these aftermarket accessories by their customers, but they also didn’t do anything to make life easier for the chargers’ manufacturers. This is a kind of <span class="quote">“<span class="quote">neutral interoperability.</span>”</span>
</p><p>Beyond neutral interoperability, there is <span class="quote">“<span class="quote">adversarial interoperability.</span>”</span> That’s when a manufacturer makes a product that interoperates with another manufacturer’s product <span class="emphasis"><em>despite the second manufacturer’s objections</em></span> and <span class="emphasis"><em>even if that means bypassing a security system designed to prevent interoperability</em></span>.
</p><p>Probably the most familiar form of adversarial interoperability is third-party printer ink. Printer manufacturers claim that they sell printers below cost and that the only way they can recoup the losses they incur is by charging high markups on ink. To prevent the owners of printers from buying ink elsewhere, the printer companies deploy a suite of anti-customer security systems that detect and reject both refilled and third-party cartridges.
</p><p>Owners of printers take the position that HP and Epson and Brother are not charities and that customers for their wares have no obligation to help them survive, and so if the companies choose to sell their products at a loss, that’s their foolish choice and their consequences to live with. Likewise, competitors who make ink or refill kits observe that they don’t owe printer companies anything, and their erosion of printer companies’ margins is the printer companies’ problem, not their competitors’. After all, the printer companies shed no tears when they drive a refiller out of business, so why should the refillers concern themselves with the economic fortunes of the printer companies?
</p><p>Adversarial interoperability has played an outsized role in the history of the tech industry: from the founding of the <span class="quote">“<span class="quote">alt.*</span>”</span> Usenet hierarchy (which was started against the wishes of Usenet’s maintainers and which grew to be bigger than all of Usenet combined) to the browser wars (when Netscape and Microsoft devoted massive engineering efforts to making their browsers incompatible with the other’s special commands and peccadilloes) to Facebook (whose success was built in part by helping its new users stay in touch with friends they’d left behind on Myspace because Facebook supplied them with a tool that scraped waiting messages from Myspace and imported them into Facebook, effectively creating a Facebook-based Myspace reader).
</p><p>Today, incumbency is seen as an unassailable advantage. Facebook is where all of your friends are, so no one can start a Facebook competitor. But adversarial compatibility reverses the competitive advantage: If you were allowed to compete with Facebook by providing a tool that imported all your users’ waiting Facebook messages into an environment that competed on lines that Facebook couldn’t cross, like eliminating surveillance and ads, then Facebook would be at a huge disadvantage. It would have assembled all possible ex-Facebook users into a single, easy-to-find service; it would have educated them on how a Facebook-like service worked and what its potential benefits were; and it would have provided an easy means for disgruntled Facebook users to tell their friends where they might expect better treatment.
</p><p>Adversarial interoperability was once the norm and a key contributor to the dynamic, vibrant tech scene, but now it is stuck behind a thicket of laws and regulations that add legal risks to the tried-and-true tactics of adversarial interoperability. New rules and new interpretations of existing rules mean that a would-be adversarial interoperator needs to steer clear of claims under copyright, terms of service, trade secrecy, tortious interference, and patent law.
</p><p>In the absence of a competitive market, lawmakers have resorted to assigning expensive, state-like duties to Big Tech firms, such as automatically filtering user contributions for copyright infringement or terrorist and extremist content or detecting and preventing harassment in real time or controlling access to sexual material.
</p><p>These measures put a floor under how small we can make Big Tech because only the very largest companies can afford the humans and automated filters needed to perform these duties.
</p><p>But that’s not the only way in which making platforms responsible for policing their users undermines competition. A platform that is expected to police its users’ conduct must prevent many vital adversarial interoperability techniques lest these subvert its policing measures. For example, if someone using a Twitter replacement like Mastodon is able to push messages into Twitter and read messages out of Twitter, they could avoid being caught by automated systems that detect and prevent harassment (such as systems that use the timing of messages or IP-based rules to make guesses about whether someone is a harasser).
</p><p>To the extent that we are willing to let Big Tech police itself — rather than making Big Tech small enough that users can leave bad platforms for better ones and small enough that a regulation that simply puts a platform out of business will not destroy billions of users’ access to their communities and data — we build the case that Big Tech should be able to block its competitors and make it easier for Big Tech to demand legal enforcement tools to ban and punish attempts at adversarial interoperability.
</p><p>Ultimately, we can try to fix Big Tech by making it responsible for bad acts by its users, or we can try to fix the internet by cutting Big Tech down to size. But we can’t do both. To replace today’s giant products with pluralistic protocols, we need to clear the legal thicket that prevents adversarial interoperability so that tomorrow’s nimble, personal, small-scale products can federate themselves with giants like Facebook, allowing the users who’ve left to continue to communicate with users who haven’t left yet, reaching tendrils over Facebook’s garden wall that Facebook’s trapped users can use to scale the walls and escape to the global, open web.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="fake-news-is-an-epistemological-crisis"></a>Fake news is an epistemological crisis</h2></div></div></div><p>
Tech is not the only industry that has undergone massive concentration since the Reagan era. Virtually every major industry — from oil to newspapers to meatpacking to sea freight to eyewear to online pornography — has become a clubby oligarchy that just a few companies dominate.
</p><p>At the same time, every industry has become something of a tech industry as general-purpose computers and general-purpose networks and the promise of efficiencies through data-driven analysis infuse every device, process, and firm with tech.
</p><p>This phenomenon of industrial concentration is part of a wider story about wealth concentration overall as a smaller and smaller number of people own more and more of our world. This concentration of both wealth and industries means that our political outcomes are increasingly beholden to the parochial interests of the people and companies with all the money.
</p><p>That means that whenever a regulator asks a question with an obvious, empirical answer (<span class="quote">“<span class="quote">Are humans causing climate change?</span>”</span> or <span class="quote">“<span class="quote">Should we let companies conduct commercial mass surveillance?</span>”</span> or <span class="quote">“<span class="quote">Does society benefit from allowing network neutrality violations?</span>”</span>), the answer that comes out is only correct if that correctness meets with the approval of rich people and the industries that made them so wealthy.
</p><p>
Rich people have always played an outsized role in politics, and even
more so since the Supreme Court’s <span class="emphasis"><em>Citizens United</em></span>
decision eliminated key controls over political spending. Widening
inequality and wealth concentration means that the very richest
people are now a lot richer and can afford to spend a lot more money
on political projects than ever before. Think of the Koch brothers
or George Soros or Bill Gates.
</p><p>
But the policy distortions of rich individuals pale in comparison to
the policy distortions that concentrated industries are capable of.
The companies in highly concentrated industries are much more
profitable than companies in competitive industries — no competition
means not having to reduce prices or improve quality to win
customers — leaving them with bigger capital surpluses to spend on
lobbying.
</p><p>
Concentrated industries also find it easier to collaborate on policy
objectives than competitive ones. When all the top execs from your
industry can fit around a single boardroom table, they often do. And
<span class="emphasis"><em>when</em></span> they do, they can forge a consensus
position on regulation.
</p><p>
Rising through the ranks in a concentrated industry generally means
working at two or three of the big companies. When there are only
relatively few companies in a given industry, each company has a
more ossified executive rank, leaving ambitious execs with fewer
paths to higher positions unless they are recruited to a rival. This
means that the top execs in concentrated industries are likely to
have been colleagues at some point and socialize in the same circles
— connected through social ties or, say, serving as trustees for
each other’s estates. These tight social bonds foster a collegial,
rather than competitive, attitude.
</p><p>
Highly concentrated industries also present a regulatory conundrum.
When an industry is dominated by just four or five companies, the
only people who are likely to truly understand the industry’s
practices are its veteran executives. This means that top regulators
are often former execs of the companies they are supposed to be
regulating. These turns in government are often tacitly understood
to be leaves of absence from industry, with former employers
welcoming their erstwhile watchdogs back into their executive ranks
once their terms have expired.
</p><p>
All this is to say that the tight social bonds, small number of
firms, and regulatory capture of concentrated industries give the
companies that comprise them the power to dictate many, if not all,
of the regulations that bind them.
</p><p>
This is increasingly obvious. Whether it’s payday lenders
<a class="ulink" href="https://www.washingtonpost.com/business/2019/02/25/how-payday-lending-industry-insider-tilted-academic-research-its-favor/" target="_top">winning
the right to practice predatory lending</a> or Apple
<a class="ulink" href="https://www.vice.com/en_us/article/mgxayp/source-apple-will-fight-right-to-repair-legislation" target="_top">winning
the right to decide who can fix your phone</a> or Google and
Facebook winning the right to breach your private data without
suffering meaningful consequences or victories for pipeline
companies or impunity for opioid manufacturers or massive tax
subsidies for incredibly profitable dominant businesses, it’s
increasingly apparent that many of our official, evidence-based
truth-seeking processes are, in fact, auctions for sale to the
highest bidder.
</p><p>
It’s really impossible to overstate what a terrifying prospect this
is. We live in an incredibly high-tech society, and none of us could
acquire the expertise to evaluate every technological proposition
that stands between us and our untimely, horrible deaths. You might
devote your life to acquiring the media literacy to distinguish good
scientific journals from corrupt pay-for-play lookalikes and the
statistical literacy to evaluate the quality of the analysis in the
journals as well as the microbiology and epidemiology knowledge to
determine whether you can trust claims about the safety of vaccines
— but that would still leave you unqualified to judge whether the
wiring in your home will give you a lethal shock
<span class="emphasis"><em>and</em></span> whether the software in your car’s brakes will
cause them to fail unpredictably <span class="emphasis"><em>and</em></span> whether
the hygiene standards at your butcher are sufficient to keep you
from dying after you finish your dinner.
</p><p>
In a world as complex as this one, we have to defer to authorities,
and we keep them honest by making those authorities accountable to
us and binding them with rules to prevent conflicts of interest. We
can’t possibly acquire the expertise to adjudicate conflicting
claims about the best way to make the world safe and prosperous, but
we <span class="emphasis"><em>can</em></span> determine whether the adjudication
process itself is trustworthy.
</p><p>
Right now, it’s obviously not.
</p><p>
The past 40 years of rising inequality and industry concentration,
together with increasingly weak accountability and transparency for
expert agencies, has created an increasingly urgent sense of
impending doom, the sense that there are vast conspiracies afoot
that operate with tacit official approval despite the likelihood
they are working to better themselves by ruining the rest of us.
</p><p>
For example, it’s been decades since Exxon’s own scientists
concluded that its products would render the Earth uninhabitable by
humans. And yet those decades were lost to us, in large part because
Exxon lobbied governments and sowed doubt about the dangers of its
products and did so with the cooperation of many public officials.
When the survival of you and everyone you love is threatened by
conspiracies, it’s not unreasonable to start questioning the things
you think you know in an attempt to determine whether they, too, are
the outcome of another conspiracy.
</p><p>
The collapse of the credibility of our systems for divining and
upholding truths has left us in a state of epistemological chaos.
Once, most of us might have assumed that the system was working and
that our regulations reflected our best understanding of the
empirical truths of the world as they were best understood — now we
have to find our own experts to help us sort the true from the
false.
</p><p>
If you’re like me, you probably believe that vaccines are safe, but
you (like me) probably also can’t explain the microbiology or
statistics. Few of us have the math skills to review the literature
on vaccine safety and describe why its statistical reasoning is
sound. Likewise, few of us can review the stats in the (now
discredited) literature on opioid safety and explain how those stats
were manipulated. Both vaccines and opioids were embraced by medical
authorities, after all, and one is safe while the other could ruin
your life. You’re left with a kind of inchoate constellation of
rules of thumb about which experts you trust to fact-check
controversial claims and then to explain how all those respectable
doctors with their peer-reviewed research on opioid safety
<span class="emphasis"><em>were</em></span> an aberration and then how you know that
the doctors writing about vaccine safety are
<span class="emphasis"><em>not</em></span> an aberration.
</p><p>
I’m 100% certain that vaccinating is safe and effective, but I’m
also at something of a loss to explain exactly,
<span class="emphasis"><em>precisely,</em></span> why I believe this, given all the
corruption I know about and the many times the stamp of certainty
has turned out to be a parochial lie told to further enrich the
already wealthy.
</p><p>
Fake news — conspiracy theories, racist ideologies, scientific
denialism — has always been with us. What’s changed today is not the
mix of ideas in the public discourse but the popularity of the worst
ideas in that mix. Conspiracy and denial have skyrocketed in
lockstep with the growth of Big Inequality, which has also tracked
the rise of Big Tech and Big Pharma and Big Wrestling and Big Car
and Big Movie Theater and Big Everything Else.
</p><p>
No one can say for certain why this has happened, but the two
dominant camps are idealism (the belief that the people who argue
for these conspiracies have gotten better at explaining them, maybe
with the help of machine-learning tools) or materialism (the ideas
have become more attractive because of material conditions in the
world).
</p><p>
I’m a materialist. I’ve been exposed to the arguments of conspiracy
theorists all my life, and I have not experienced any qualitative
leap in the quality of those arguments.
</p><p>
The major difference is in the world, not the arguments. In a time
when actual conspiracies are commonplace, conspiracy theories
acquire a ring of plausibility.
</p><p>
We have always had disagreements about what’s true, but today, we
have a disagreement over how we know whether something is true. This
is an epistemological crisis, not a crisis over belief. It’s a
crisis over the credibility of our truth-seeking exercises, from
scientific journals (in an era where the biggest journal publishers
have been caught producing pay-to-play journals for junk science) to
regulations (in an era where regulators are routinely cycling in and
out of business) to education (in an era where universities are
dependent on corporate donations to keep their lights on).
</p><p>
Targeting — surveillance capitalism — makes it easier to find people
who are undergoing this epistemological crisis, but it doesn’t
create the crisis. For that, you need to look to corruption.
</p><p>
And, conveniently enough, it’s corruption that allows surveillance
capitalism to grow by dismantling monopoly protections, by
permitting reckless collection and retention of personal data, by
allowing ads to be targeted in secret, and by foreclosing on the
possibility of going somewhere else where you might continue to
enjoy your friends without subjecting yourself to commercial
surveillance.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"tech-is-different"></a>Tech is different
</h2></div></div></div><p>
I reject both iterations of technological exceptionalism. I reject
the idea that tech is uniquely terrible and led by people who are
greedier or worse than the leaders of other industries, and I reject
the idea that tech is so good — or so intrinsically prone to
concentration — that it can’t be blamed for its present-day
monopolistic status.
</p><p>
I think tech is just another industry, albeit one that grew up in
the absence of real monopoly constraints. It may have been first,
but it isn’t the worst, nor will it be the last.
</p><p>
But there’s one way in which I <span class="emphasis"><em>am</em></span> a tech
exceptionalist. I believe that online tools are the key to
overcoming problems that are much more urgent than tech
monopolization: climate change, inequality, misogyny, and
discrimination on the basis of race, gender identity, and other
factors. The internet is how we will recruit people to fight those
fights, and how we will coordinate their labor. Tech is not a
substitute for democratic accountability, the rule of law, fairness,
or stability — but it’s a means to achieve these things.
</p><p>
The hard problem of our species is coordination. Everything from
climate change to social change to running a business to making a
family work can be viewed as a collective action problem.
</p><p>
The internet makes it easier than at any time before to find people
who want to work on a project with you — hence the success of free
and open-source software, crowdfunding, and racist terror groups —
and easier than ever to coordinate the work you do.
</p><p>
The internet and the computers we connect to it also possess an
exceptional quality: general-purposeness. The internet is designed
to allow any two parties to communicate any data, using any
protocol, without permission from anyone else. The only production
design we have for computers is the general-purpose, <span class="quote">“<span class="quote">Turing
complete</span>”</span> computer that can run every program we can express in
symbolic logic.
</p><p>
This means that every time someone with a special communications
need invests in infrastructure and techniques to make the internet
faster, cheaper, and more robust, this benefit redounds to everyone
else who is using the internet to communicate. And this also means
that every time someone with a special computing need invests to
make computers faster, cheaper, and more robust, every other
computing application is a potential beneficiary of this work.
</p><p>
For these reasons, every type of communication is gradually absorbed
into the internet, and every type of device — from airplanes to
pacemakers — eventually becomes a computer in a fancy case.
</p><p>
While these considerations don’t preclude regulating networks and
computers, they do call for gravitas and caution when doing so
because changes to regulatory frameworks could ripple out to have
unintended consequences in many, many other domains.
</p><p>
The upshot of this is that our best hope of solving the big
coordination problems — climate change, inequality, etc. — is with
free, fair, and open tech. Our best hope of keeping tech free, fair,
and open is to exercise caution in how we regulate tech and to
attend closely to the ways in which interventions to solve one
problem might create problems in other domains.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"ownership-of-facts"></a>Ownership of facts
</h2></div></div></div><p>
Big Tech has a funny relationship with information. When you’re
generating information — anything from the location data streaming
off your mobile device to the private messages you send to friends
on a social network — it claims the rights to make unlimited use of
it.
</p><p>
But when you have the audacity to turn the tables — to use a tool
that blocks ads or slurps your waiting updates out of a social
network and puts them in another app that lets you set your own
priorities and suggestions or crawls their system to allow you to
start a rival business — they claim that you’re stealing from them.
</p><p>
The thing is, information is a very bad fit for any kind of private
property regime. Property rights are useful for establishing markets
that can lead to the effective development of fallow assets. These
markets depend on clear titles to ensure that the things being
bought and sold in them can, in fact, be bought and sold.
</p><p>
Information rarely has such a clear title. Take phone numbers:
There’s clearly something going wrong when Facebook slurps up
millions of users’ address books and uses the phone numbers it finds
in them to plot out social graphs and fill in missing information
about other users.
</p><p>
But the phone numbers Facebook nonconsensually acquires in this
transaction are not the <span class="quote">“<span class="quote">property</span>”</span> of the users they’re taken from
nor do they belong to the people whose phones ring when you dial
those numbers. The numbers are mere integers, 10 digits in the U.S.
and Canada, and they appear in millions of places, including
somewhere deep in pi as well as numerous other contexts. Giving
people ownership titles to integers is an obviously terrible idea.
</p><p>
Likewise for the facts that Facebook and other commercial
surveillance operators acquire about us, like that we are the
children of our parents or the parents to our children or that we
had a conversation with someone else or went to a public place.
These data points can’t be property in the sense that your house or
your shirt is your property because the title to them is
intrinsically muddy: Does your mom own the fact that she is your
mother? Do you? Do both of you? What about your dad — does he own
this fact too, or does he have to license the fact from you (or your
mom or both of you) in order to use this fact? What about the
hundreds or thousands of other people who know these facts?
</p><p>
If you go to a Black Lives Matter demonstration, do the other
demonstrators need your permission to post their photos from the
event? The online fights over
<a class="ulink" href="https://www.wired.com/story/how-to-take-photos-at-protests/" target="_top">when
and how to post photos from demonstrations</a> reveal a nuanced,
complex issue that cannot be easily hand-waved away by giving one
party a property right that everyone else in the mix has to respect.
</p><p>
The fact that information isn’t a good fit with property and markets
doesn’t mean that it’s not valuable. Babies aren’t property, but
they’re inarguably valuable. In fact, we have a whole set of rules
just for babies as well as a subset of those rules that apply to
humans more generally. Someone who argues that babies won’t be truly
valuable until they can be bought and sold like loaves of bread
would be instantly and rightfully condemned as a monster.
</p><p>
It’s tempting to reach for the property hammer when Big Tech treats
your information like a nail — not least because Big Tech are such
prolific abusers of property hammers when it comes to
<span class="emphasis"><em>their</em></span> information. But this is a mistake. If we
allow markets to dictate the use of our information, then we’ll find
that we’re sellers in a buyers’ market where the Big Tech monopolies
set a price for our data that is so low as to be insignificant or,
more likely, set at a nonnegotiable price of zero in a click-through
agreement that we don’t have the opportunity to modify.
</p><p>
Meanwhile, establishing property rights over information will create
insurmountable barriers to independent data processing. Imagine that
we require a license to be negotiated when a translated document is
compared with its original, something Google has done and continues
to do billions of times to train its automated language translation
tools. Google can afford this, but independent third parties cannot.
Google can staff a clearances department to negotiate one-time
payments to the likes of the EU (one of the major repositories of
translated documents) while independent watchdogs wanting to verify
that the translations are well-prepared, or to root out bias in
translations, will find themselves needing a staffed-up legal
department and millions for licenses before they can even get
started.
</p><p>
The same goes for things like search indexes of the web or photos of
people’s houses, which have become contentious thanks to Google’s
Street View project. Whatever problems may exist with Google’s
photographing of street scenes, resolving them by letting people
decide who can take pictures of the facades of their homes from a
public street will surely create even worse ones. Think of how
street photography is important for newsgathering — including
informal newsgathering, like photographing abuses of authority — and
how being able to document housing and street life is important for
contesting eminent domain, advocating for social aid, reporting
planning and zoning violations, documenting discriminatory and
unequal living conditions, and more.
</p><p>
The ownership of facts is antithetical to many kinds of human
progress. It’s hard to imagine a rule that limits Big Tech’s
exploitation of our collective labors without inadvertently banning
people from gathering data on online harassment or compiling indexes
of changes in language or simply investigating how the platforms are
shaping our discourse — all of which require scraping data that
other people have created and subjecting it to scrutiny and
analysis.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"persuasion-works-slowly"></a>Persuasion works
… slowly
</h2></div></div></div><p>
The platforms may oversell their ability to persuade people, but
obviously, persuasion works sometimes. Whether it’s the private
realm that LGBTQ people used to recruit allies and normalize sexual
diversity or the decadeslong project to convince people that markets
are the only efficient way to solve complicated resource allocation
problems, it’s clear that our societal attitudes
<span class="emphasis"><em>can</em></span> change.
</p><p>
The project of shifting societal attitudes is a game of inches and
years. For centuries, svengalis have purported to be able to
accelerate this process, but even the most brutal forms of
propaganda have struggled to make permanent changes. Joseph Goebbels
was able to subject Germans to daily, mandatory, hourslong radio
broadcasts, to round up and torture and murder dissidents, and to
seize full control over their children’s education while banning any
literature, broadcasts, or films that did not comport with his
worldview.
</p><p>
Yet, after 12 years of terror, once the war ended, Nazi ideology was
largely discredited in both East and West Germany, and a program of
national truth and reconciliation was put in its place. Racism and
authoritarianism were never fully abolished in Germany, but neither
were the majority of Germans irrevocably convinced of Nazism — and
the rise of racist authoritarianism in Germany today tells us that
the liberal attitudes that replaced Nazism were no more permanent
than Nazism itself.
</p><p>
Racism and authoritarianism have also always been with us. Anyone
who’s reviewed the kind of messages and arguments that racists put
forward today would be hard-pressed to say that they have gotten
better at presenting their ideas. The same pseudoscience, appeals to
fear, and circular logic that racists presented in the 1980s, when
the cause of white supremacy was on the wane, are to be found in the
communications of leading white nationalists today.
</p><p>
If racists haven’t gotten more convincing in the past decade, then
how is it that more people were convinced to be openly racist at
that time? I believe that the answer lies in the material world, not
the world of ideas. The ideas haven’t gotten more convincing, but
people have become more afraid. Afraid that the state can’t be
trusted to act as an honest broker in life-or-death decisions, from
those regarding the management of the economy to the regulation of
painkillers to the rules for handling private information. Afraid
that the world has become a game of musical chairs in which the
chairs are being taken away at a never-before-seen rate. Afraid that
justice for others will come at their expense. Monopolism isn’t the
cause of these fears, but the inequality and material desperation
and policy malpractice that monopolism contributes to are
significant contributors to these conditions. Inequality creates the
conditions for both conspiracies and violent racist ideologies, and
then surveillance capitalism lets opportunists target the fearful
and the conspiracy-minded.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"paying-wont-help"></a>Paying won
’t help
</h2></div></div></div><p>
As the old saw goes, <span class="quote">“<span class="quote">If you’re not paying for the product, you’re
the product.</span>”</span>
</p><p>
It’s a commonplace belief today that the advent of free,
ad-supported media was the original sin of surveillance capitalism.
The reasoning is that the companies that charged for access couldn’t
<span class="quote">“<span class="quote">compete with free</span>”</span> and so they were driven out of business. Their
ad-supported competitors, meanwhile, declared open season on their
users’ data in a bid to improve their ad targeting and make more
money and then resorted to the most sensationalist tactics to
generate clicks on those ads. If only we’d pay for media again, we’d
have a better, more responsible, more sober discourse that would be
better for democracy.
</p><p>
But the degradation of news products long precedes the advent of
ad-supported online news. Long before newspapers were online, lax
antitrust enforcement had opened the door for unprecedented waves of
consolidation and roll-ups in newsrooms. Rival newspapers were
merged, reporters and ad sales staff were laid off, physical plants
were sold and leased back, leaving the companies loaded up with debt
through leveraged buyouts and subsequent profit-taking by the new
owners. In other words, it wasn’t merely shifts in the classified
advertising market, which was long held to be the primary driver in
the decline of the traditional newsroom, that made news companies
unable to adapt to the internet — it was monopolism.
</p><p>
Then, as news companies <span class="emphasis"><em>did</em></span> come online, the ad
revenues they commanded dropped even as the number of internet users
(and thus potential online readers) increased. That shift was a
function of consolidation in the ad sales market, with Google and
Facebook emerging as duopolists who made more money every year from
advertising while paying less and less of it to the publishers whose
work the ads appeared alongside. Monopolism created a buyer’s market
for ad inventory with Facebook and Google acting as gatekeepers.
</p><p>
Paid services continue to exist alongside free ones, and often it is
these paid services — anxious to prevent people from bypassing their
paywalls or sharing paid media with freeloaders — that exert the
most control over their customers. Apple’s iTunes and App Stores are
paid services, but to maximize their profitability, Apple has to
lock its platforms so that third parties can’t make compatible
software without permission. These locks allow the company to
exercise both editorial control (enabling it to exclude
<a class="ulink" href="https://ncac.org/news/blog/does-apples-strict-app-store-content-policy-limit-freedom-of-expression" target="_top">controversial
political material</a>) and technological control, including
control over who can repair the devices it makes. If we’re worried
that ad-supported products deprive people of their right to
self-determination by using persuasion techniques to nudge their
purchase decisions a few degrees in one direction or the other, then
the near-total control a single company holds over the decision of
who gets to sell you software, parts, and service for your iPhone
should have us very worried indeed.
</p><p>
We shouldn’t just be concerned about payment and control: The idea
that paying will improve discourse is also dangerously wrong. The
poor success rate of targeted advertising means that the platforms
have to incentivize you to <span class="quote">“<span class="quote">engage</span>”</span> with posts at extremely high
levels to generate enough pageviews to safeguard their profits. As
discussed earlier, to increase engagement, platforms like Facebook
use machine learning to guess which messages will be most
inflammatory and make a point of shoving those into your eyeballs at
every turn so that you will hate-click and argue with people.
</p><p>
Perhaps paying would fix this, the reasoning goes. If platforms
could be economically viable even if you stopped clicking on them
once your intellectual and social curiosity had been slaked, then
they would have no reason to algorithmically enrage you to get more
clicks out of you, right?
</p><p>
There may be something to that argument, but it still ignores the
wider economic and political context of the platforms and the world
that allowed them to grow so dominant.
</p><p>
Platforms are world-spanning and all-encompassing because they are
monopolies, and they are monopolies because we have gutted our most
important and reliable anti-monopoly rules. Antitrust was neutered
as a key part of the project to make the wealthy wealthier, and that
project has worked. The vast majority of people on Earth have a
negative net worth, and even the dwindling middle class is in a
precarious state, undersaved for retirement, underinsured for
medical disasters, and undersecured against climate and technology
shocks.
</p><p>
In this wildly unequal world, paying doesn’t improve the discourse;
it simply prices discourse out of the range of the majority of
people. Paying for the product is dandy, if you can afford it.
</p><p>
If you think today’s filter bubbles are a problem for our discourse,
imagine what they’d be like if rich people inhabited free-flowing
Athenian marketplaces of ideas where you have to pay for admission
while everyone else lives in online spaces that are subsidized by
wealthy benefactors who relish the chance to establish
conversational spaces where the <span class="quote">“<span class="quote">house rules</span>”</span> forbid questioning the
status quo. That is, imagine if the rich seceded from Facebook, and
then, instead of running ads that made money for shareholders,
Facebook became a billionaire’s vanity project that also happened to
ensure that nobody talked about whether it was fair that only
billionaires could afford to hang out in the rarified corners of the
internet.
</p><p>
Behind the idea of paying for access is a belief that free markets
will address Big Tech’s dysfunction. After all, to the extent that
people have a view of surveillance at all, it is generally an
unfavorable one, and the longer and more thoroughly one is
surveilled, the less one tends to like it. Same goes for lock-in: If
HP’s ink or Apple’s App Store were really obviously fantastic, they
wouldn’t need technical measures to prevent users from choosing a
rival’s product. The only reason these technical countermeasures
exist is that the companies don’t believe their customers would
<span class="emphasis"><em>voluntarily</em></span> submit to their terms, and they
want to deprive them of the choice to take their business elsewhere.
</p><p>
Advocates for markets laud their ability to aggregate the diffused
knowledge of buyers and sellers across a whole society through
demand signals, price signals, and so on. The argument for
surveillance capitalism being a <span class="quote">“<span class="quote">rogue capitalism</span>”</span> is that
machine-learning-driven persuasion techniques distort
decision-making by consumers, leading to incorrect signals —
consumers don’t buy what they prefer, they buy what they’re tricked
into preferring. It follows that the monopolistic practices of
lock-in, which do far more to constrain consumers’ free choices, are
even more of a <span class="quote">“<span class="quote">rogue capitalism.</span>”</span>
The profitability of any business is constrained by the possibility that its customers will take their business elsewhere. Both surveillance and lock-in are anti-features that no customer wants. But monopolies can capture their regulators, crush their competitors, insert themselves into their customers’ lives, and corral people into <span class="quote">“<span class="quote">choosing</span>”</span> their services regardless of whether they want them — it’s fine to be terrible when there is no competition.
</p><p>
Ultimately, surveillance and lock-in are both simply business strategies that monopolists can choose. Surveillance companies like Google are perfectly capable of deploying lock-in technologies — just look at the onerous Android licensing terms that require device-makers to bundle in Google’s suite of applications. And lock-in companies like Apple are perfectly capable of subjecting their users to surveillance if it means keeping the Chinese government happy and preserving ongoing access to Chinese markets. Monopolies may be made up of good, ethical people, but as institutions, they are not your friend — they will do whatever they can get away with to maximize their profits, and the more monopolistic they are, the more they <span class="emphasis"><em>can</em></span> get away with.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"an-ecology-moment-for-trustbusting"></a>An
<span class=
"quote">“<span class=
"quote">ecology
</span>”</span> moment for trustbusting
</h2></div></div></div><p>
If we’re going to break Big Tech’s death grip on our digital lives, we’re going to have to fight monopolies. That may sound pretty mundane and old-fashioned, something out of the New Deal era, while ending the use of automated behavioral modification feels like the plotline of a really cool cyberpunk novel.
</p><p>
Meanwhile, breaking up monopolies is something we seem to have forgotten how to do. There is a bipartisan, trans-Atlantic consensus that breaking up companies is a fool’s errand at best — liable to mire your federal prosecutors in decades of litigation — and counterproductive at worst, eroding the <span class="quote">“<span class="quote">consumer benefits</span>”</span> of large companies with massive efficiencies of scale.
</p><p>
But trustbusters once strode the nation, brandishing law books, terrorizing robber barons, and shattering the illusion of monopolies’ all-powerful grip on our society. The trustbusting era could not begin until we found the political will — until the people convinced politicians they’d have their backs when they went up against the richest, most powerful men in the world.
</p><p>
Could we find that political will again?
</p><p>
Copyright scholar James Boyle has described how the term <span class="quote">“<span class="quote">ecology</span>”</span> marked a turning point in environmental activism. Prior to the adoption of this term, people who wanted to preserve whale populations didn’t necessarily see themselves as fighting the same battle as people who wanted to protect the ozone layer or fight freshwater pollution or beat back smog or acid rain.
</p><p>
But the term <span class="quote">“<span class="quote">ecology</span>”</span> welded these disparate causes together into a single movement, and the members of this movement found solidarity with one another. The people who cared about smog signed petitions circulated by the people who wanted to end whaling, and the anti-whalers marched alongside the people demanding action on acid rain. This uniting behind a common cause completely changed the dynamics of environmentalism, setting the stage for today’s climate activism and the sense that preserving the habitability of the planet Earth is a shared duty among all people.
</p><p>
I believe we are on the verge of a new <span class="quote">“<span class="quote">ecology</span>”</span> moment dedicated to combating monopolies. After all, tech isn’t the only concentrated industry nor is it even the <span class="emphasis"><em>most</em></span> concentrated of industries.
</p><p>
You can find partisans for trustbusting in every sector of the economy. Everywhere you look, you can find people who’ve been wronged by monopolists who’ve trashed their finances, their health, their privacy, their educations, and the lives of people they love. Those people have the same cause as the people who want to break up Big Tech and the same enemies. When most of the world’s wealth is in the hands of a very few, it follows that nearly every large company will have overlapping shareholders.
</p><p>
That’s the good news: With a little bit of work and a little bit of coalition building, we have more than enough political will to break up Big Tech and every other concentrated industry besides. First we take Facebook, then we take AT&amp;T/WarnerMedia.
</p><p>
But here’s the bad news: Much of what we’re doing to tame Big Tech <span class="emphasis"><em>instead</em></span> of breaking up the big companies also forecloses on the possibility of breaking them up later.
</p><p>
Big Tech’s concentration currently means that their inaction on harassment, for example, leaves users with an impossible choice: absent themselves from public discourse by, say, quitting Twitter, or endure vile, constant abuse. Big Tech’s over-collection and over-retention of data results in horrific identity theft. And their inaction on extremist recruitment means that white supremacists who livestream their shooting rampages can reach an audience of billions. The combination of tech concentration and media concentration means that artists’ incomes are falling even as the revenue generated by their creations is increasing.
</p><p>
Yet governments confronting all of these problems inevitably converge on the same solution: deputize the Big Tech giants to police their users and render them liable for their users’ bad actions. The drive to force Big Tech to use automated filters to block everything from copyright infringement to sex-trafficking to violent extremism means that tech companies will have to allocate hundreds of millions to run these compliance systems.
</p><p>
These rules — the EU’s new Directive on Copyright, Australia’s new terror regulation, America’s FOSTA/SESTA sex-trafficking law, and more — are not just death warrants for small, upstart competitors that might challenge Big Tech’s dominance but who lack the deep pockets of established incumbents to pay for all these automated systems. Worse still, these rules put a floor under how small we can hope to make Big Tech.
</p><p>
That’s because any move to break up Big Tech and cut it down to size will have to cope with the hard limit of not making these companies so small that they can no longer afford to perform these duties — and it’s <span class="emphasis"><em>expensive</em></span> to invest in those automated filters and outsource content moderation. It’s already going to be hard to unwind these deeply concentrated, chimeric behemoths that have been welded together in the pursuit of monopoly profits. Doing so while simultaneously finding some way to fill the regulatory void that would be left behind if these self-policing rulers were forced to suddenly abdicate will be much, much harder.
</p><p>
Allowing the platforms to grow to their present size has given them a dominance that is nearly insurmountable — deputizing them with public duties to redress the pathologies created by their size makes it virtually impossible to reduce that size. Lather, rinse, repeat: If the platforms don’t get smaller, they will get larger, and as they get larger, they will create more problems, which will give rise to more public duties for the companies, which will make them bigger still.
</p><p>
We can work to fix the internet by breaking up Big Tech and depriving them of monopoly profits, or we can work to fix Big Tech by making them spend their monopoly profits on governance. But we can’t do both. We have to choose between a vibrant, open internet and a dominated, monopolized internet commanded by Big Tech giants that we constantly struggle to get to behave themselves.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"make-big-tech-small-again"></a>Make Big Tech small again
</h2></div></div></div><p>
Trustbusting is hard. Breaking big companies into smaller ones is expensive and time-consuming. So time-consuming that by the time you’re done, the world has often moved on and rendered years of litigation irrelevant. From 1969 to 1982, the U.S. government pursued an antitrust case against IBM over its dominance of mainframe computing — but the case collapsed in 1982 because mainframes were being speedily replaced by PCs.
</p><div class=
"blockquote"><blockquote class=
"blockquote"><p>
A future U.S. president could simply direct their attorney general to enforce the law as it was written.
</p></blockquote></div><p>
It’s far easier to prevent concentration than to fix it, and reinstating the traditional contours of U.S. antitrust enforcement will, at the very least, prevent further concentration. That means bans on mergers between large companies, on big companies acquiring nascent competitors, and on platform companies competing directly with the companies that rely on the platforms.
</p><p>
These powers are all in the plain language of U.S. antitrust laws, so in theory, a future U.S. president could simply direct their attorney general to enforce the law as it was written. But after decades of judicial <span class="quote">“<span class="quote">education</span>”</span> in the benefits of monopolies, after multiple administrations that have packed the federal courts with lifetime-appointed monopoly cheerleaders, it’s not clear that mere administrative action would do the trick.
</p><p>
If the courts frustrate the Justice Department and the president, the next stop would be Congress, which could eliminate any doubt about how antitrust law should be enforced in the U.S. by passing new laws that boil down to saying, <span class="quote">“<span class="quote">Knock it off. We all know what the Sherman Act says. Robert Bork was a deranged fantasist. For avoidance of doubt, <span class="emphasis"><em>fuck that guy</em></span>.</span>”</span> In other words, the problem with monopolies is <span class="emphasis"><em>monopolism</em></span> — the concentration of power into too few hands, which erodes our right to self-determination. If there is a monopoly, the law wants it gone, period. Sure, get rid of monopolies that create <span class="quote">“<span class="quote">consumer harm</span>”</span> in the form of higher prices, but also, <span class="emphasis"><em>get rid of other monopolies, too.</em></span>
</p><p>
But this only prevents things from getting worse. To help things get better, we will have to build coalitions with other activists in the anti-monopoly ecology movement — a pluralism movement or a self-determination movement — and target existing monopolies in every industry for breakup and structural separation rules that prevent, for example, the giant eyewear monopolist Luxottica from dominating both the sale and the manufacture of spectacles.
</p><p>
In an important sense, it doesn’t matter which industry the breakups begin in. Once they start, shareholders in <span class="emphasis"><em>every</em></span> industry will start to eye their investments in monopolists skeptically. As trustbusters ride into town and start making lives miserable for monopolists, the debate around every corporate boardroom’s table will shift. People within corporations who’ve always felt uneasy about monopolism will gain a powerful new argument to fend off their evil rivals in the corporate hierarchy: <span class="quote">“<span class="quote">If we do it my way, we make less money; if we do it your way, a judge will fine us billions and expose us to ridicule and public disapprobation. So even though I get that it would be really cool to do that merger, lock out that competitor, or buy that little company and kill it before it can threaten us, we really shouldn’t — not if we don’t want to get tied to the DOJ’s bumper and get dragged up and down Trustbuster Road for the next 10 years.</span>”</span>
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"goto-10"></a>20 GOTO
10</h2></div></div></div><p>
Fixing Big Tech will require a lot of iteration. As cyber lawyer Lawrence Lessig wrote in his 1999 book, <span class="emphasis"><em>Code and Other Laws of Cyberspace</em></span>, our lives are regulated by four forces: law (what’s legal), code (what’s technologically possible), norms (what’s socially acceptable), and markets (what’s profitable).
</p><p>
If you could wave a wand and get Congress to pass a law that re-fanged the Sherman Act tomorrow, you could use the impending breakups to convince venture capitalists to fund competitors to Facebook, Google, Twitter, and Apple that would be waiting in the wings after they were cut down to size.
</p><p>
But getting Congress to act will require a massive normative shift, a mass movement of people who care about monopolies — and pulling them apart.
</p><p>
Getting people to care about monopolies will take technological interventions that help them to see what a world free from Big Tech might look like. Imagine if someone could make a beloved (but unauthorized) third-party Facebook or Twitter client that dampens the anxiety-producing algorithmic drumbeat and still lets you talk to your friends without being spied upon — something that made social media more sociable and less toxic. Now imagine that it gets shut down in a brutal legal battle. It’s always easier to convince people that something must be done to save a thing they love than it is to excite them about something that doesn’t even exist yet.
</p><p>
Neither tech nor law nor code nor markets are sufficient to reform Big Tech. But a profitable competitor to Big Tech could bankroll a legislative push; legal reform can embolden a toolsmith to make a better tool; the tool can create customers for a potential business who value the benefits of the internet but want them delivered without Big Tech; and that business can get funded and divert some of its profits to legal reform. 20 GOTO 10 (or lather, rinse, repeat). Do it again, but this time, get farther! After all, this time you’re starting with weaker Big Tech adversaries, a constituency that understands things can be better, Big Tech rivals who’ll help ensure their own future by bankrolling reform, and code that other programmers can build on to weaken Big Tech even further.
</p><p>
The surveillance capitalism hypothesis — that Big Tech’s products really work as well as they say they do and that’s why everything is so screwed up — is way too easy on surveillance and even easier on capitalism. Companies spy because they believe their own BS, and companies spy because governments let them, and companies spy because any advantage from spying is so short-lived and minor that they have to do more and more of it just to stay in place.
</p><p>
As to why things are so screwed up? Capitalism. Specifically, the monopolism that creates inequality and the inequality that creates monopolism. It’s a form of capitalism that rewards sociopaths who destroy the real economy to inflate the bottom line, and they get away with it for the same reason companies get away with spying: because our governments are in thrall to both the ideology that says monopolies are actually just fine and the ideology that says that in a monopolistic world, you’d better not piss off the monopolists.
</p><p>
Surveillance doesn’t make capitalism rogue. Capitalism’s unchecked rule begets surveillance. Surveillance isn’t bad because it lets people manipulate us. It’s bad because it crushes our ability to be our authentic selves — and because it lets the rich and powerful figure out who might be thinking of building guillotines and what dirt they can use to discredit those embryonic guillotine-builders before they can even get to the lumberyard.
</p></div><div class=
"sect1"><div class=
"titlepage"><div><div><h2 class=
"title" style=
"clear: both"><a name=
"up-and-through"></a>Up and through
</h2></div></div></div><p>
With all the problems of Big Tech, it’s tempting to imagine solving the problem by returning to a world without tech at all. Resist that temptation.
</p><p>
The only way out of our Big Tech problem is up and through. If our future is not reliant upon high tech, it will be because civilization has fallen. Big Tech wired together a planetary, species-wide nervous system that, with the proper reforms and course corrections, is capable of seeing us through the existential challenge of our species and planet. Now it’s up to us to seize the means of computation, putting that electronic nervous system under democratic, accountable control.
</p><p>
I am, secretly, despite what I have said earlier, a tech exceptionalist. Not in the sense of thinking that tech should be given a free pass to monopolize because it has <span class="quote">“<span class="quote">economies of scale</span>”</span> or some other nebulous feature. I’m a tech exceptionalist because I believe that getting tech right matters and that getting it wrong will be an unmitigated catastrophe — and doing it right can give us the power to work together to save our civilization, our species, and our planet.
</p></div></div></body></html>