<html><head><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"><title>How to Destroy Surveillance Capitalism</title><meta name="generator" content="DocBook XSL Stylesheets V1.79.1"><style type="text/css">
body { background-image: url('images/draft.png');
       background-repeat: no-repeat;
       background-position: top left;
       /* The following properties make the watermark "fixed" on the page. */
       /* I think that's just a bit too distracting for the reader... */
       /* background-attachment: fixed; */
       /* background-position: center center; */
}</style></head><body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF"><div lang="en" class="article"><div class="titlepage"><div><div><h2 class="title"><a name="index"></a>How to Destroy Surveillance Capitalism</h2></div><div><div class="authorgroup"><div class="author"><h3 class="author"><span class="firstname">Cory</span> <span class="surname">Doctorow</span></h3></div></div></div><div><p class="copyright">Copyright © 2020 Cory Doctorow</p></div><div><p class="copyright">Copyright © 2020 Petter Reinholdtsen</p></div><div><div class="legalnotice"><a name="idm18"></a><p>
How to Destroy Surveillance Capitalism by Cory Doctorow.
</p><p>
Published by Petter Reinholdtsen.
</p><p>
ISBN 978-82-93828-05-1 (hard cover)
</p><p>
ISBN 978-82-93828-06-8 (paperback)
</p><p>
ISBN 978-82-93828-07-5 (ePub)
</p><p>
This book is available for purchase from
<a class="ulink" href="https://www.lulu.com/" target="_top">https://www.lulu.com/</a>.
</p><p>
<span class="inlinemediaobject"><img src="images/cc.png" align="middle" height="38" alt="Creative Commons, Some rights reserved"></span>
</p><p>
This book is licensed under a Creative Commons license. This
license permits any use of this work, so long as attribution is
given and no derivative material is distributed. For more
information about the license visit <a class="ulink" href="https://creativecommons.org/licenses/by-nd/4.0/" target="_top">https://creativecommons.org/licenses/by-nd/4.0/</a>.
</p></div></div></div><hr></div><div class="toc"><p><b>Table of Contents</b></p><dl class="toc"><dt><span class="sect1"><a href="#the-net-of-a-thousand-lies">The net of a thousand lies</a></span></dt><dt><span class="sect1"><a href="#digital-rights-activism-a-quarter-century-on">Digital rights activism, a quarter-century on</a></span></dt><dt><span class="sect1"><a href="#tech-exceptionalism-then-and-now">Tech exceptionalism, then and now</a></span></dt><dt><span class="sect1"><a href="#dont-believe-the-hype">Don’t believe the hype</a></span></dt><dt><span class="sect1"><a href="#what-is-persuasion">What is persuasion?</a></span></dt><dd><dl><dt><span class="sect2"><a href="#segmenting">1. Segmenting</a></span></dt><dt><span class="sect2"><a href="#deception">2. Deception</a></span></dt><dt><span class="sect2"><a href="#domination">3. Domination</a></span></dt><dt><span class="sect2"><a href="#bypassing-our-rational-faculties">4. Bypassing our rational faculties</a></span></dt></dl></dd><dt><span class="sect1"><a href="#if-data-is-the-new-oil-then-surveillance-capitalisms-engine-has-a-leak">If data is the new oil, then surveillance capitalism’s engine
has a leak</a></span></dt><dt><span class="sect1"><a href="#what-is-facebook">What is Facebook?</a></span></dt><dt><span class="sect1"><a href="#monopoly-and-the-right-to-the-future-tense">Monopoly and the right to the future tense</a></span></dt><dt><span class="sect1"><a href="#search-order-and-the-right-to-the-future-tense">Search order and the right to the future tense</a></span></dt><dt><span class="sect1"><a href="#monopolists-can-afford-sleeping-pills-for-watchdogs">Monopolists can afford sleeping pills for watchdogs</a></span></dt><dt><span class="sect1"><a href="#privacy-and-monopoly">Privacy and monopoly</a></span></dt><dt><span class="sect1"><a href="#ronald-reagan-pioneer-of-tech-monopolism">Ronald Reagan, pioneer of tech monopolism</a></span></dt><dt><span class="sect1"><a href="#steering-with-the-windshield-wipers">Steering with the windshield wipers</a></span></dt><dt><span class="sect1"><a href="#surveillance-still-matters">Surveillance still matters</a></span></dt><dt><span class="sect1"><a href="#dignity-and-sanctuary">Dignity and sanctuary</a></span></dt><dt><span class="sect1"><a href="#afflicting-the-afflicted">Afflicting the afflicted</a></span></dt><dt><span class="sect1"><a href="#any-data-you-collect-and-retain-will-eventually-leak">Any data you collect and retain will eventually leak</a></span></dt><dt><span class="sect1"><a href="#critical-tech-exceptionalism-is-still-tech-exceptionalism">Critical tech exceptionalism is still tech
exceptionalism</a></span></dt><dt><span class="sect1"><a href="#how-monopolies-not-mind-control-drive-surveillance-capitalism-the-snapchat-story">How monopolies, not mind control, drive surveillance
capitalism: The Snapchat story</a></span></dt><dt><span class="sect1"><a href="#a-monopoly-over-your-friends">A monopoly over your friends</a></span></dt><dt><span class="sect1"><a href="#fake-news-is-an-epistemological-crisis">Fake news is an epistemological crisis</a></span></dt><dt><span class="sect1"><a href="#tech-is-different">Tech is different</a></span></dt><dt><span class="sect1"><a href="#ownership-of-facts">Ownership of facts</a></span></dt><dt><span class="sect1"><a href="#persuasion-works-slowly">Persuasion works… slowly</a></span></dt><dt><span class="sect1"><a href="#paying-wont-help">Paying won’t help</a></span></dt><dt><span class="sect1"><a href="#an-ecology-moment-for-trustbusting">An <span class="quote"><span class="quote">ecology</span></span> moment for trustbusting</a></span></dt><dt><span class="sect1"><a href="#make-big-tech-small-again">Make Big Tech small again</a></span></dt><dt><span class="sect1"><a href="#goto-10">20 GOTO 10</a></span></dt><dt><span class="sect1"><a href="#up-and-through">Up and through</a></span></dt></dl></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="the-net-of-a-thousand-lies"></a>The net of a thousand lies</h2></div></div></div><p>
The most surprising thing about the rebirth of flat Earthers in the
21st century is just how widespread the evidence against them is.
You can understand how, centuries ago, people who’d never gained a
high-enough vantage point from which to see the Earth’s curvature
might come to the commonsense belief that the flat-seeming Earth
was, indeed, flat.
</p><p>
But today, when elementary schools routinely dangle GoPro cameras
from balloons and loft them high enough to photograph the Earth’s
curve — to say nothing of the unexceptional sight of the curved
Earth from an airplane window — it takes a heroic effort to maintain
the belief that the world is flat.
</p><p>
Likewise for white nationalism and eugenics: In an age where you can
become a computational genomics datapoint by swabbing your cheek and
mailing it to a gene-sequencing company along with a modest sum of
money, <span class="quote"><span class="quote">race science</span></span> has never been easier to refute.
</p><p>
We are living through a golden age of both readily available facts
and denial of those facts. Terrible ideas that have lingered on the
fringes for decades or even centuries have gone mainstream seemingly
overnight.
</p><p>
When an obscure idea gains currency, there are only two things that
can explain its ascendance: Either the person expressing that idea
has gotten a lot better at stating their case, or the proposition
has become harder to deny in the face of mounting evidence. In other
words, if we want people to take climate change seriously, we can
get a bunch of Greta Thunbergs to make eloquent, passionate
arguments from podiums, winning our hearts and minds, or we can wait
for flood, fire, broiling sun, and pandemics to make the case for
us. In practice, we’ll probably have to do some of both: The more
we’re boiling and burning and drowning and wasting away, the easier
it will be for the Greta Thunbergs of the world to convince us.
</p><p>
The arguments for ridiculous beliefs in odious conspiracies like
anti-vaccination, climate denial, a flat Earth, and eugenics are no
better than they were a generation ago. Indeed, they’re worse
because they are being pitched to people who have at least a
background awareness of the refuting facts.
</p><p>
Anti-vax has been around since the first vaccines, but the early
anti-vaxxers were pitching people who were less equipped to
understand even the most basic ideas from microbiology, and
moreover, those people had not witnessed the extermination of
mass-murdering diseases like polio, smallpox, and measles. Today’s
anti-vaxxers are no more eloquent than their forebears, and they
have a much harder job.
</p><p>
So can these far-fetched conspiracy theorists really be succeeding
on the basis of superior arguments?
</p><p>
Some people think so. Today, there is a widespread belief that
machine learning and commercial surveillance can turn even the most
fumble-tongued conspiracy theorist into a svengali who can warp your
perceptions and win your belief by locating vulnerable people and
then pitching them with A.I.-refined arguments that bypass their
rational faculties and turn everyday people into flat Earthers,
anti-vaxxers, or even Nazis. When the RAND Corporation
<a class="ulink" href="https://www.rand.org/content/dam/rand/pubs/research_reports/RR400/RR453/RAND_RR453.pdf" target="_top">blames
Facebook for <span class="quote"><span class="quote">radicalization</span></span></a> and when Facebook’s role in
spreading coronavirus misinformation is
<a class="ulink" href="https://secure.avaaz.org/campaign/en/facebook_threat_health/" target="_top">blamed
on its algorithm</a>, the implicit message is that machine
learning and surveillance are causing the changes in our consensus
about what’s true.
</p><p>
After all, in a world where sprawling and incoherent conspiracy
theories like Pizzagate and its successor, QAnon, have widespread
followings, <span class="emphasis"><em>something</em></span> must be afoot.
</p><p>
But what if there’s another explanation? What if it’s the material
circumstances, and not the arguments, that are making the difference
for these conspiracy pitchmen? What if the trauma of living through
<span class="emphasis"><em>real conspiracies</em></span> all around us — conspiracies
among wealthy people, their lobbyists, and lawmakers to bury
inconvenient facts and evidence of wrongdoing (these conspiracies
are commonly known as <span class="quote"><span class="quote">corruption</span></span>) — is making people vulnerable to
conspiracy theories?
</p><p>
If it’s trauma and not contagion — material conditions and not
ideology — that is making the difference today and enabling a rise
of repulsive misinformation in the face of easily observed facts,
that doesn’t mean our computer networks are blameless. They’re still
doing the heavy work of locating vulnerable people and guiding them
through a series of ever-more-extreme ideas and communities.
</p><p>
Belief in conspiracy is a raging fire that has done real damage and
poses real danger to our planet and species, from epidemics
<a class="ulink" href="https://www.cdc.gov/measles/cases-outbreaks.html" target="_top">kicked
off by vaccine denial</a> to genocides
<a class="ulink" href="https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html" target="_top">kicked
off by racist conspiracies</a> to planetary meltdown caused by
denial-inspired climate inaction. Our world is on fire, and so we
have to put the fires out — to figure out how to help people see the
truth of the world through the conspiracies they’ve been confused
by.
</p><p>
But firefighting is reactive. We need fire
<span class="emphasis"><em>prevention</em></span>. We need to strike at the traumatic
material conditions that make people vulnerable to the contagion of
conspiracy. Here, too, tech has a role to play.
</p><p>
There’s no shortage of proposals to address this. From the EU’s
<a class="ulink" href="https://edri.org/tag/terreg/" target="_top">Terrorist Content
Regulation</a>, which requires platforms to police and remove
<span class="quote"><span class="quote">extremist</span></span> content, to the U.S. proposals to
<a class="ulink" href="https://www.eff.org/deeplinks/2020/03/earn-it-act-violates-constitution" target="_top">force
tech companies to spy on their users</a> and hold them liable
<a class="ulink" href="https://www.natlawreview.com/article/repeal-cda-section-230" target="_top">for
their users’ bad speech</a>, there’s a lot of energy to force
tech companies to solve the problems they created.
</p><p>
There’s a critical piece missing from the debate, though. All these
solutions assume that tech companies are a fixture, that their
dominance over the internet is a permanent fact. Proposals to
replace Big Tech with a more diffused, pluralistic internet are
nowhere to be found. Worse: The <span class="quote"><span class="quote">solutions</span></span> on the table today
<span class="emphasis"><em>require</em></span> Big Tech to stay big because only the
very largest companies can afford to implement the systems these
laws demand.
</p><p>
Figuring out what we want our tech to look like is crucial if we’re
going to get out of this mess. Today, we’re at a crossroads where
we’re trying to figure out if we want to fix the Big Tech companies
that dominate our internet or if we want to fix the internet itself
by unshackling it from Big Tech’s stranglehold. We can’t do both, so
we have to choose.
</p><p>
I want us to choose wisely. Taming Big Tech is integral to fixing
the internet, and for that, we need digital rights activism.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="digital-rights-activism-a-quarter-century-on"></a>Digital rights activism, a quarter-century on</h2></div></div></div><p>
Digital rights activism is more than 30 years old now. The
Electronic Frontier Foundation turned 30 this year; the Free
Software Foundation launched in 1985. For most of the history of the
movement, the most prominent criticism leveled against it was that
it was irrelevant: The real activist causes were real-world causes
(think of the skepticism when
<a class="ulink" href="https://www.loc.gov/law/foreign-news/article/finland-legal-right-to-broadband-for-all-citizens/#:~:text=Global%20Legal%20Monitor,-Home%20%7C%20Search%20%7C%20Browse&amp;text=(July%206%2C%202010)%20On,connection%20100%20MBPS%20by%202015." target="_top">Finland
declared broadband a human right in 2010</a>), and real-world
activism was shoe-leather activism (think of Malcolm Gladwell’s
<a class="ulink" href="https://www.newyorker.com/magazine/2010/10/04/small-change-malcolm-gladwell" target="_top">contempt
for <span class="quote"><span class="quote">clicktivism</span></span></a>). But as tech has grown more central to
our daily lives, these accusations of irrelevance have given way
first to accusations of insincerity (<span class="quote"><span class="quote">You only care about tech
because you’re
<a class="ulink" href="https://www.ipwatchdog.com/2018/06/04/report-engine-eff-shills-google-patent-reform/id=98007/" target="_top">shilling
for tech companies</a></span></span>) to accusations of negligence (<span class="quote"><span class="quote">Why
didn’t you foresee that tech could be such a destructive force?</span></span>).
But digital rights activism is right where it’s always been: looking
out for the humans in a world where tech is inexorably taking over.
</p><p>
The latest version of this critique comes in the form of
<span class="quote"><span class="quote">surveillance capitalism,</span></span> a term coined by business professor
Shoshana Zuboff in her long and influential 2019 book, <span class="emphasis"><em>The
Age of Surveillance Capitalism: The Fight for a Human Future at the
New Frontier of Power</em></span>. Zuboff argues that <span class="quote"><span class="quote">surveillance
capitalism</span></span> is a unique creature of the tech industry and that it is
unlike any other abusive commercial practice in history, one that is
<span class="quote"><span class="quote">constituted by unexpected and often illegible mechanisms of
extraction, commodification, and control that effectively exile
persons from their own behavior while producing new markets of
behavioral prediction and modification. Surveillance capitalism
challenges democratic norms and departs in key ways from the
centuries-long evolution of market capitalism.</span></span> It is a new and
deadly form of capitalism, a <span class="quote"><span class="quote">rogue capitalism,</span></span> and our lack of
understanding of its unique capabilities and dangers represents an
existential, species-wide threat. She’s right that capitalism today
threatens our species, and she’s right that tech poses unique
challenges to our species and civilization, but she’s really wrong
about how tech is different and why it threatens our species.
</p><p>
What’s more, I think that her incorrect diagnosis will lead us down
a path that ends up making Big Tech stronger, not weaker. We need to
take down Big Tech, and to do that, we need to start by correctly
identifying the problem.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="tech-exceptionalism-then-and-now"></a>Tech exceptionalism, then and now</h2></div></div></div><p>
Early critics of the digital rights movement — perhaps best
represented by campaigning organizations like the Electronic
Frontier Foundation, the Free Software Foundation, Public Knowledge,
and others that focused on preserving and enhancing basic human
rights in the digital realm — damned activists for practicing <span class="quote"><span class="quote">tech
exceptionalism.</span></span> Around the turn of the millennium, serious people
ridiculed any claim that tech policy mattered in the <span class="quote"><span class="quote">real world.</span></span>
Claims that tech rules had implications for speech, association,
privacy, search and seizure, and fundamental rights and equities
were treated as ridiculous, an elevation of the concerns of sad
nerds arguing about <span class="emphasis"><em>Star Trek</em></span> on bulletin board
systems above the struggles of the Freedom Riders, Nelson Mandela,
or the Warsaw ghetto uprising.
</p><p>
In the decades since, accusations of <span class="quote"><span class="quote">tech exceptionalism</span></span> have only
sharpened as tech’s role in everyday life has expanded: Now that
tech has infiltrated every corner of our life and our online lives
have been monopolized by a handful of giants, defenders of digital
freedoms are accused of carrying water for Big Tech, providing cover
for its self-interested negligence (or worse, nefarious plots).
</p><p>
From my perspective, the digital rights movement has remained
stationary while the rest of the world has moved. From the earliest
days, the movement’s concern was users and the toolsmiths who
provided the code they needed to realize their fundamental rights.
Digital rights activists only cared about companies to the extent
that companies were acting to uphold users’ rights (or, just as
often, when companies were acting so foolishly that they threatened
to bring down new rules that would also make it harder for good
actors to help users).
</p><p>
The <span class="quote"><span class="quote">surveillance capitalism</span></span> critique recasts the digital rights
movement in a new light again: not as alarmists who overestimate the
importance of their shiny toys nor as shills for big tech but as
serene deck-chair rearrangers whose long-standing activism is a
liability because it makes them incapable of perceiving novel
threats as they continue to fight the last century’s tech battles.
</p><p>
But tech exceptionalism is a sin no matter who practices it.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="dont-believe-the-hype"></a>Don’t believe the hype</h2></div></div></div><p>
You’ve probably heard that <span class="quote"><span class="quote">if you’re not paying for the product,
you’re the product.</span></span> As we’ll see below, that’s true, if incomplete.
But what is <span class="emphasis"><em>absolutely</em></span> true is that ad-driven
Big Tech’s customers are advertisers, and what companies like Google
and Facebook sell is their ability to convince
<span class="emphasis"><em>you</em></span> to buy stuff. Big Tech’s product is
persuasion. The services — social media, search engines, maps,
messaging, and more — are delivery systems for persuasion.
</p><p>
The fear of surveillance capitalism starts from the (correct)
presumption that everything Big Tech says about itself is probably a
lie. But the surveillance capitalism critique makes an exception for
the claims Big Tech makes in its sales literature — the breathless
hype in the pitches to potential advertisers online and in ad-tech
seminars about the efficacy of its products: It assumes that Big
Tech is as good at influencing us as they claim they are when
they’re selling influencing products to credulous customers. That’s
a mistake because sales literature is not a reliable indicator of a
product’s efficacy.
</p><p>
Surveillance capitalism assumes that because advertisers buy a lot
of what Big Tech is selling, Big Tech must be selling something
real. But Big Tech’s massive sales could just as easily be the
result of a popular delusion or something even more pernicious:
monopolistic control over our communications and commerce.
</p><p>
Being watched changes your behavior, and not for the better. It
creates risks for our social progress. Zuboff’s book features
beautifully wrought explanations of these phenomena. But Zuboff also
claims that surveillance literally robs us of our free will — that
when our personal data is mixed with machine learning, it creates a
system of persuasion so devastating that we are helpless before it.
That is, Facebook uses an algorithm to analyze the data it
nonconsensually extracts from your daily life and uses it to
customize your feed in ways that get you to buy stuff. It is a
mind-control ray out of a 1950s comic book, wielded by mad
scientists whose supercomputers guarantee them perpetual and total
world domination.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="what-is-persuasion"></a>What is persuasion?</h2></div></div></div><p>
To understand why you shouldn’t worry about mind-control rays — but
why you <span class="emphasis"><em>should</em></span> worry about surveillance
<span class="emphasis"><em>and</em></span> Big Tech — we must start by unpacking what
we mean by <span class="quote"><span class="quote">persuasion.</span></span>
</p><p>
Google, Facebook, and other surveillance capitalists promise their
customers (the advertisers) that if they use machine-learning tools
trained on unimaginably large data sets of nonconsensually harvested
personal information, they will be able to uncover ways to bypass
the rational faculties of the public and direct their behavior,
creating a stream of purchases, votes, and other desired outcomes.
</p><div class="blockquote"><blockquote class="blockquote"><p>
The impact of dominance far exceeds the impact of manipulation and
should be central to our analysis and any remedies we seek.
</p></blockquote></div><p>
But there’s little evidence that this is happening. Instead, the
predictions that surveillance capitalism delivers to its customers
are much less impressive. Rather than finding ways to bypass our
rational faculties, surveillance capitalists like Mark Zuckerberg
mostly do one or more of three things:
</p><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="segmenting"></a>1. Segmenting</h3></div></div></div><p>
If you’re selling diapers, you have better luck if you pitch them
to people in maternity wards. Not everyone who enters or leaves a
maternity ward just had a baby, and not everyone who just had a
baby is in the market for diapers. But having a baby is a really
reliable correlate of being in the market for diapers, and being
in a maternity ward is highly correlated with having a baby. Hence
diaper ads around maternity wards (and even pitchmen for baby
products, who haunt maternity wards with baskets full of
freebies).
</p><p>
Surveillance capitalism is segmenting times a billion. Diaper
vendors can go way beyond people in maternity wards (though they
can do that, too, with things like location-based mobile ads).
They can target you based on whether you’re reading articles about
child-rearing, diapers, or a host of other subjects, and data
mining can suggest unobvious keywords to advertise against. They
can target you based on the articles you’ve recently read. They
can target you based on what you’ve recently purchased. They can
target you based on whether you receive emails or private messages
about these subjects — or even if you speak aloud about them
(though Facebook and the like convincingly claim that’s not
happening — yet).
</p><p>
This is seriously creepy.
</p><p>
But it’s not mind control.
</p><p>
It doesn’t deprive you of your free will. It doesn’t trick you.
</p><p>
Think of how surveillance capitalism works in politics.
Surveillance capitalist companies sell political operatives the
power to locate people who might be receptive to their pitch.
Candidates campaigning on finance industry corruption seek people
struggling with debt; candidates campaigning on xenophobia seek
out racists. Political operatives have always targeted their
message whether their intentions were honorable or not: Union
organizers set up pitches at factory gates, and white supremacists
hand out fliers at John Birch Society meetings.
</p><p>
But this is an inexact and thus wasteful practice. The union
organizer can’t know which worker to approach on the way out of
the factory gates and may waste their time on a covert John Birch
Society member; the white supremacist doesn’t know which of the
Birchers are so delusional that making it to a meeting is as much
as they can manage and which ones might be convinced to cross the
country to carry a tiki torch through the streets of
Charlottesville, Virginia.
</p><p>
Because targeting improves the yields on political pitches, it can
accelerate the pace of political upheaval by making it possible
for everyone who has secretly wished for the toppling of an
autocrat — or just an 11-term incumbent politician — to find
everyone else who feels the same way at very low cost. This has
been critical to the rapid crystallization of recent political
movements including Black Lives Matter and Occupy Wall Street as
well as less savory players like the far-right white nationalist
movements that marched in Charlottesville.
</p><p>
It’s important to differentiate this kind of political organizing
from influence campaigns; finding people who secretly agree with
you isn’t the same as convincing people to agree with you. The
rise of phenomena like nonbinary or otherwise nonconforming gender
identities is often characterized by reactionaries as the result
of online brainwashing campaigns that convince impressionable
people that they have been secretly queer all along.
</p><p>
But the personal accounts of those who have come out tell a
different story where people who long harbored a secret about
their gender were emboldened by others coming forward and where
people who knew that they were different but lacked a vocabulary
for discussing that difference learned the right words from these
low-cost means of finding people and learning about their ideas.
</p></div><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="deception"></a>2. Deception</h3></div></div></div><p>
Lies and fraud are pernicious, and surveillance capitalism
supercharges them through targeting. If you want to sell a
fraudulent payday loan or subprime mortgage, surveillance
capitalism can help you find people who are both desperate and
unsophisticated and thus receptive to your pitch. This accounts
for the rise of many phenomena, like multilevel marketing schemes,
in which deceptive claims about potential earnings and the
efficacy of sales techniques are targeted at desperate people by
advertising against search queries that indicate, for example,
someone struggling with ill-advised loans.
</p><p>
Surveillance capitalism also abets fraud by making it easy to
locate other people who have been similarly deceived, forming a
community of people who reinforce one another’s false beliefs.
Think of
<a class="ulink" href="https://www.vulture.com/2020/01/the-dream-podcast-review.html" target="_top">the
forums</a> where people who are being victimized by multilevel
marketing frauds gather to trade tips on how to improve their luck
in peddling the product.
</p><p>
Sometimes, online deception involves replacing someone’s correct
beliefs with incorrect ones, as it does in the anti-vaccination
movement, whose victims are often people who start out believing
in vaccines but are convinced by seemingly plausible evidence that
leads them into the false belief that vaccines are harmful.
</p><p>
But it’s much more common for fraud to succeed when it doesn’t
have to displace a true belief. When my daughter contracted head
lice at daycare, one of the daycare workers told me I could get
rid of them by treating her hair and scalp with olive oil. I
didn’t know anything about head lice, and I assumed that the
daycare worker did, so I tried it (it didn’t work, and it doesn’t
work). It’s easy to end up with false beliefs when you simply
don’t know any better and when those beliefs are conveyed by
someone who seems to know what they’re doing.
</p><p>
This is pernicious and difficult — and it’s also the kind of thing
the internet can help guard against by making true information
available, especially in a form that exposes the underlying
deliberations among parties with sharply divergent views, such as
Wikipedia. But it’s not brainwashing; it’s fraud. In the
<a class="ulink" href="https://datasociety.net/library/data-voids/" target="_top">majority
of cases</a>, the victims of these fraud campaigns have an
informational void filled in the customary way, by consulting a
seemingly reliable source. If I look up the length of the Brooklyn
Bridge and learn that it is 5,800 feet long, but in reality, it is
5,989 feet long, the underlying deception is a problem, but it’s a
problem with a simple remedy. It’s a very different problem from
the anti-vax issue in which someone’s true belief is displaced by
a false one by means of sophisticated persuasion.
</p></div><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="domination"></a>3. Domination</h3></div></div></div><p>
Surveillance capitalism is the result of monopoly. Monopoly is the
cause, and surveillance capitalism and its negative outcomes are
the effects of monopoly. I’ll get into this in depth later, but
for now, suffice it to say that the tech industry has grown up
with a radical theory of antitrust that has allowed companies to
grow by merging with their rivals, buying up their nascent
competitors, and expanding to control whole market verticals.
</p><p>
One example of how monopolism aids in persuasion is through
dominance: Google makes editorial decisions about its algorithms
that determine the sort order of the responses to our queries. If
445 a cabal of fraudsters have set out to trick the world into
446 thinking that the Brooklyn Bridge is 5,800 feet long, and if
447 Google gives a high search rank to this group in response to
448 queries like <span class="quote"><span class="quote">How long is the Brooklyn Bridge?</span></span> then the first
449 eight or 10 screens’ worth of Google results could be wrong. And
450 since most people don’t go beyond the first couple of results —
451 let alone the first <span class="emphasis"><em>page</em></span> of results —
452 Google’s choice means that many people will be deceived.
453 </p><p>
454 Google’s dominance over search — more than 86% of web searches are
455 performed through Google — means that the way it orders its search
456 results has an outsized effect on public beliefs. Ironically,
457 Google claims this is why it can’t afford to have any transparency
458 in its algorithm design: Google’s search dominance makes the
459 results of its sorting too important to risk telling the world how
460 it arrives at those results lest some bad actor discover a flaw in
461 the ranking system and exploit it to push its point of view to the
462 top of the search results. There’s an obvious remedy to a company
463 that is too big to audit: break it up into smaller pieces.
464 </p><p>
465 Zuboff calls surveillance capitalism a <span class="quote"><span class="quote">rogue capitalism</span></span> whose
466 data-hoarding and machine-learning techniques rob us of our free
467 will. But influence campaigns that seek to displace existing,
468 correct beliefs with false ones have an effect that is small and
469 temporary while monopolistic dominance over informational systems
470 has massive, enduring effects. Controlling the results to the
471 world’s search queries means controlling access both to arguments
472 and their rebuttals and, thus, control over much of the world’s
473 beliefs. If our concern is how corporations are foreclosing on our
474 ability to make up our own minds and determine our own futures,
475 the impact of dominance far exceeds the impact of manipulation and
476 should be central to our analysis and any remedies we seek.
477 </p></div><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="bypassing-our-rational-faculties"></a>4. Bypassing our rational faculties</h3></div></div></div><p>
<span class="emphasis"><em>This</em></span> is the good stuff: using machine
learning, <span class="quote"><span class="quote">dark patterns,</span></span> engagement hacking, and other
techniques to get us to do things that run counter to our better
judgment. This is mind control.
</p><p>
Some of these techniques have proven devastatingly effective (if
only in the short term). The use of countdown timers on a purchase
completion page can create a sense of urgency that causes you to
ignore the nagging internal voice suggesting that you should shop
around or sleep on your decision. The use of people from your
social graph in ads can provide <span class="quote"><span class="quote">social proof</span></span> that a purchase is
worth making. Even the auction system pioneered by eBay is
calculated to play on our cognitive blind spots, letting us feel
like we <span class="quote"><span class="quote">own</span></span> something because we bid on it, thus encouraging us
to bid again when we are outbid to ensure that <span class="quote"><span class="quote">our</span></span> things stay
ours.
</p><p>
Games are extraordinarily good at this. <span class="quote"><span class="quote">Free to play</span></span> games
manipulate us through many techniques, such as presenting players
with a series of smoothly escalating challenges that create a
sense of mastery and accomplishment but which sharply transition
into a set of challenges that are impossible to overcome without
paid upgrades. Add some social proof to the mix — a stream of
notifications about how well your friends are faring — and before
you know it, you’re buying virtual power-ups to get to the next
level.
</p><p>
Companies have risen and fallen on these techniques, and the
<span class="quote"><span class="quote">fallen</span></span> part is worth paying attention to. In general, living
things adapt to stimulus: Something that is very compelling or
noteworthy when you first encounter it fades with repetition until
you stop noticing it altogether. Consider the refrigerator hum
that irritates you when it starts up but disappears into the
background so thoroughly that you only notice it when it stops
again.
</p><p>
That’s why behavioral conditioning uses <span class="quote"><span class="quote">intermittent
reinforcement schedules.</span></span> Instead of giving you a steady drip of
encouragement or setbacks, games and gamified services scatter
rewards on a randomized schedule — often enough to keep you
interested and random enough that you can never quite find the
pattern that would make it boring.
</p><p>
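The randomized schedule just described is, at bottom, a Bernoulli draw per action; a minimal sketch, with a reward probability invented purely for illustration (no real game is modeled here):

```python
import random

# Toy variable-ratio ("intermittent") reward schedule: each action has a
# small, fixed chance of paying off, so rewards arrive at unpredictable
# intervals -- streaks and droughts both occur, with no pattern to learn.
rng = random.Random(42)  # fixed seed so the run is repeatable

def rewarded(p=0.25):
    """Return True if this action happens to earn a reward."""
    return rng.random() < p

pulls = [rewarded() for _ in range(1000)]
# Roughly a quarter of actions pay out, but never predictably.
```
</p><p>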
Intermittent reinforcement is a powerful behavioral tool, but it
also represents a collective action problem for surveillance
capitalism. The <span class="quote"><span class="quote">engagement techniques</span></span> invented by the
behaviorists of surveillance capitalist companies are quickly
copied across the whole sector so that what starts as a
mysteriously compelling fillip in the design of a service — like
<span class="quote"><span class="quote">pull to refresh</span></span> or alerts when someone likes your posts or side
quests that your characters get invited to while in the midst of
main quests — quickly becomes dully ubiquitous. The
impossible-to-nail-down nonpattern of randomized drips from your
phone becomes a grey-noise wall of sound as every single app and
site starts to make use of whatever seems to be working at the
time.
</p><p>
From the surveillance capitalist’s point of view, our adaptive
capacity is like a harmful bacterium that deprives it of its food
source — our attention — and novel techniques for snagging that
attention are like new antibiotics that can be used to breach our
defenses and destroy our self-determination. And there
<span class="emphasis"><em>are</em></span> techniques like that. Who can forget the
Great Zynga Epidemic, when all of our friends were caught in
<span class="emphasis"><em>FarmVille</em></span>’s endless, mindless dopamine loops?
But every new attention-commanding technique is jumped on by the
whole industry and used so indiscriminately that antibiotic
resistance sets in. Given enough repetition, almost all of us
develop immunity to even the most powerful techniques — by 2013,
two years after Zynga’s peak, its user base had halved.
</p><p>
Not everyone, of course. Some people never adapt to stimulus, just
as some people never stop hearing the hum of the refrigerator.
This is why most people who are exposed to slot machines play them
for a while and then move on while a small and tragic minority
liquidate their kids’ college funds, buy adult diapers, and
position themselves in front of a machine until they collapse.
</p><p>
But surveillance capitalism’s margins on behavioral modification
suck. Tripling the rate at which someone buys a widget sounds
great
<a class="ulink" href="https://www.forbes.com/sites/priceonomics/2018/03/09/the-advertising-conversion-rates-for-every-major-tech-platform/#2f6a67485957" target="_top">unless
the base rate is way less than 1%</a> with an improved rate
of… still less than 1%. Even penny slot machines pull down pennies
for every spin while surveillance capitalism rakes in
infinitesimal penny fractions.
</p><p>
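To make that arithmetic concrete, here is a sketch with a hypothetical base rate; real conversion rates vary by platform and campaign, and these figures are invented only to illustrate the point:

```python
# Hypothetical figures, chosen only to illustrate the point above.
base_rate = 0.002          # 0.2% of people who see the ad buy the widget
tripled = base_rate * 3    # a "3x improvement" touted to advertisers

# Across 10,000 impressions, the improvement is 60 buyers instead of
# 20 -- a conversion rate that is still well below 1%.
buyers_before = base_rate * 10_000
buyers_after = tripled * 10_000
```
</p><p>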
Slot machines’ high returns mean that they can be profitable just
by draining the fortunes of the small rump of people who are
pathologically vulnerable to them and unable to adapt to their
tricks. But surveillance capitalism can’t survive on the
fractional pennies it brings down from that vulnerable sliver —
that’s why, after the Great Zynga Epidemic had finally burned
itself out, the small number of still-addicted players left behind
couldn’t sustain it as a global phenomenon. And new powerful
attention weapons aren’t easy to find, as is evidenced by the long
years since the last time Zynga had a hit. Despite the hundreds of
millions of dollars that Zynga has to spend on developing new
tools to blast through our adaptation, it has never managed to
repeat the lucky accident that let it snag so much of our
attention for a brief moment in 2009. Powerhouses like Supercell
have fared a little better, but they are rare and throw away many
failures for every success.
</p><p>
The vulnerability of small segments of the population to dramatic,
efficient corporate manipulation is a real concern that’s worthy
of our attention and energy. But it’s not an existential threat to
society.
</p></div></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="if-data-is-the-new-oil-then-surveillance-capitalisms-engine-has-a-leak"></a>If data is the new oil, then surveillance capitalism’s engine
has a leak</h2></div></div></div><p>
This adaptation problem offers an explanation for one of
surveillance capitalism’s most alarming traits: its relentless
hunger for data and its endless expansion of data-gathering
capabilities through the spread of sensors, online surveillance, and
acquisition of data streams from third parties.
</p><p>
Zuboff observes this phenomenon and concludes that data must be very
valuable if surveillance capitalism is so hungry for it. (In her
words: <span class="quote"><span class="quote">Just as industrial capitalism was driven to the continuous
intensification of the means of production, so surveillance
capitalists and their market players are now locked into the
continuous intensification of the means of behavioral modification
and the gathering might of instrumentarian power.</span></span>) But what if the
voracious appetite is because data has such a short half-life —
because people become inured so quickly to new, data-driven
persuasion techniques — that the companies are locked in an arms
race with our limbic system? What if it’s all a Red Queen’s race
where they have to run ever faster — collect ever-more data — just
to stay in the same spot?
</p><p>
Of course, all of Big Tech’s persuasion techniques work in concert
with one another, and collecting data is useful beyond mere
behavioral trickery.
</p><p>
If someone wants to recruit you to buy a refrigerator or join a
pogrom, they might use profiling and targeting to send messages to
people they judge to be good sales prospects. The messages
themselves may be deceptive, making claims about things you’re not
very knowledgeable about (food safety and energy efficiency or
eugenics and historical claims about racial superiority). They might
use search engine optimization and/or armies of fake reviewers and
commenters and/or paid placement to dominate the discourse so that
any search for further information takes you back to their messages.
And finally, they may refine the different pitches using machine
learning and other techniques to figure out what kind of pitch works
best on someone like you.
</p><p>
Each phase of this process benefits from surveillance: The more data
they have, the more precisely they can profile you and target you
with specific messages. Think of how you’d sell a fridge if you knew
that the warranty on your prospect’s fridge just expired and that
they were expecting a tax rebate in April.
</p><p>
Also, the more data they have, the better they can craft deceptive
messages — if I know that you’re into genealogy, I might not try to
feed you pseudoscience about genetic differences between <span class="quote"><span class="quote">races,</span></span>
sticking instead to conspiratorial secret histories of <span class="quote"><span class="quote">demographic
replacement</span></span> and the like.
</p><p>
Facebook also helps you locate people who have the same odious or
antisocial views as you. It makes it possible to find other people
who want to carry tiki torches through the streets of
Charlottesville in Confederate cosplay. It can help you find other
people who want to join your militia and go to the border to look
for undocumented migrants to terrorize. It can help you find people
who share your belief that vaccines are poison and that the Earth is
flat.
</p><p>
There is one way in which targeted advertising uniquely benefits
those advocating for socially unacceptable causes: It is invisible.
Racism is widely geographically dispersed, and there are few places
where racists — and only racists — gather. This is similar to the
problem of selling refrigerators in that potential refrigerator
purchasers are geographically dispersed and there are few places
where you can buy an ad that will be primarily seen by refrigerator
customers. But buying a refrigerator is socially acceptable while
being a Nazi is not, so you can buy a billboard or advertise in the
newspaper sports section for your refrigerator business, and the
only potential downside is that your ad will be seen by a lot of
people who don’t want refrigerators, resulting in a lot of wasted
expense.
</p><p>
But even if you wanted to advertise your Nazi movement on a
billboard or prime-time TV or the sports section, you would struggle
to find anyone willing to sell you the space for your ad partly
because they disagree with your views and partly because they fear
censure (boycott, reputational damage, etc.) from other people who
disagree with your views.
</p><p>
Targeted ads solve this problem: On the internet, every ad unit can
be different for every person, meaning that you can buy ads that are
only shown to people who appear to be Nazis and not to people who
hate Nazis. When there’s spillover — when someone who hates racism
is shown a racist recruiting ad — there is some fallout; the
platform or publication might get an angry public or private
denunciation. But the nature of the risk assumed by an online ad
buyer is different than the risks to a traditional publisher or
billboard owner who might want to run a Nazi ad.
</p><p>
Online ads are placed by algorithms that broker between a diverse
ecosystem of self-serve ad platforms that anyone can buy an ad
through, so the Nazi ad that slips onto your favorite online
publication isn’t seen as their moral failing but rather as a
failure in some distant, upstream ad supplier. When a publication
gets a complaint about an offensive ad that’s appearing in one of
its units, it can take some steps to block that ad, but the Nazi
might buy a slightly different ad from a different broker serving
the same unit. And in any event, internet users increasingly
understand that when they see an ad, it’s likely that the advertiser
did not choose that publication and that the publication has no idea
who its advertisers are.
</p><p>
These layers of indirection between advertisers and publishers serve
as moral buffers: Today’s moral consensus is largely that publishers
shouldn’t be held responsible for the ads that appear on their pages
because they’re not actively choosing to put those ads there.
Because of this, Nazis are able to overcome significant barriers to
organizing their movement.
</p><p>
Data has a complex relationship with domination. Being able to spy
on your customers can alert you to their preferences for your rivals
and allow you to head off your rivals at the pass.
</p><p>
More importantly, if you can dominate the information space while
also gathering data, then you make other deceptive tactics stronger
because it’s harder to break out of the web of deceit you’re
spinning. Domination — that is, ultimately becoming a monopoly — and
not the data itself is the supercharger that makes every tactic
worth pursuing because monopolistic domination deprives your target
of an escape route.
</p><p>
If you’re a Nazi who wants to ensure that your prospects primarily
see deceptive, confirming information when they search for more, you
can improve your odds by seeding the search terms they use through
your initial communications. You don’t need to own the top 10
results for <span class="quote"><span class="quote">voter suppression</span></span> if you can convince your marks to
confine their search terms to <span class="quote"><span class="quote">voter fraud,</span></span> which throws up a very
different set of search results.
</p><p>
Surveillance capitalists are like stage mentalists who claim that
their extraordinary insights into human behavior let them guess the
word that you wrote down and folded up in your pocket but who really
use shills, hidden cameras, sleight of hand, and brute-force
memorization to amaze you.
</p><p>
Or perhaps they’re more like pick-up artists, the misogynistic cult
that promises to help awkward men have sex with women by teaching
them <span class="quote"><span class="quote">neurolinguistic programming</span></span> phrases, body language
techniques, and psychological manipulation tactics like <span class="quote"><span class="quote">negging</span></span>:
offering unsolicited negative feedback to women to lower their
self-esteem and prick their interest.
</p><p>
Some pick-up artists eventually manage to convince women to go home
with them, but it’s not because these men have figured out how to
bypass women’s critical faculties. Rather, pick-up artists’
<span class="quote"><span class="quote">success</span></span> stories are a mix of women who were incapable of giving
consent, women who were coerced, women who were intoxicated,
self-destructive women, and a few women who were sober and in
command of their faculties but who didn’t realize straightaway that
they were with terrible men but rectified the error as soon as they
could.
</p><p>
Pick-up artists <span class="emphasis"><em>believe</em></span> they have figured out a
secret back door that bypasses women’s critical faculties, but they
haven’t. Many of the tactics they deploy, like negging, became the
butt of jokes (just like people joke about bad ad targeting), and
there’s a good chance that anyone they try these tactics on will
immediately recognize them and dismiss the men who use them as
irredeemable losers.
</p><p>
Pick-up artists are proof that people can believe they have
developed a system of mind control <span class="emphasis"><em>even when it doesn’t
work</em></span>. Pick-up artists simply exploit the fact that
one-in-a-million chances can come through for you if you make a
million attempts, and then they assume that the other 999,999 times,
they simply performed the technique incorrectly and commit
themselves to doing better next time. There’s only one group of
people who find pick-up artist lore reliably convincing: other
would-be pick-up artists whose anxiety and insecurity make them
vulnerable to scammers and delusional men who convince them that if
they pay for tutelage and follow instructions, then they will
someday succeed. Pick-up artists assume they fail to entice women
because they are bad at being pick-up artists, not because pick-up
artistry is bullshit. Pick-up artists are bad at selling themselves
to women, but they’re much better at selling themselves to men who
pay to learn the secrets of pick-up artistry.
</p><p>
Department store pioneer John Wanamaker is said to have lamented,
<span class="quote"><span class="quote">Half the money I spend on advertising is wasted; the trouble is I
don’t know which half.</span></span> The fact that Wanamaker thought that only
half of his advertising spending was wasted is a tribute to the
persuasiveness of advertising executives, who are
<span class="emphasis"><em>much</em></span> better at convincing potential clients to
buy their services than they are at convincing the general public to
buy their clients’ wares.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="what-is-facebook"></a>What is Facebook?</h2></div></div></div><p>
Facebook is heralded as the origin of all of our modern plagues, and
it’s not hard to see why. Some tech companies want to lock their
users in but make their money by monopolizing access to the market
for apps for their devices and gouging them on prices rather than by
spying on them (like Apple). Some companies don’t care about locking
in users because they’ve figured out how to spy on them no matter
where they are and what they’re doing and can turn that surveillance
into money (Google). Facebook alone among the Western tech giants
has built a business based on locking in its users
<span class="emphasis"><em>and</em></span> spying on them all the time.
</p><p>
Facebook’s surveillance regime is really without parallel in the
Western world. Though Facebook tries to prevent itself from being
visible on the public web, hiding most of what goes on there from
people unless they’re logged into Facebook, the company has
nevertheless booby-trapped the entire web with surveillance tools in
the form of Facebook <span class="quote"><span class="quote">Like</span></span> buttons that web publishers include on
their sites to boost their Facebook profiles. Facebook also makes
various libraries and other useful code snippets available to web
publishers that act as surveillance tendrils on the sites where
they’re used, funneling information about visitors to the site —
newspapers, dating sites, message boards — to Facebook.
</p><div class="blockquote"><blockquote class="blockquote"><p>
Big Tech is able to practice surveillance not just because it is
tech but because it is <span class="emphasis"><em>big</em></span>.
</p></blockquote></div><p>
Facebook offers similar tools to app developers, so the apps —
games, fart machines, business review services, apps for keeping
abreast of your kid’s schooling — you use will send information
about your activities to Facebook even if you don’t have a Facebook
account and even if you don’t download or use Facebook apps. On top
of all that, Facebook buys data from third-party brokers on shopping
habits, physical location, use of <span class="quote"><span class="quote">loyalty</span></span> programs, financial
transactions, etc., and cross-references that with the dossiers it
develops on activity on Facebook and with apps and the public web.
</p><p>
Though it’s easy to integrate the web with Facebook — linking to
news stories and such — Facebook products are generally not
available to be integrated back into the web itself. You can embed a
tweet in a Facebook post, but if you embed a Facebook post in a
tweet, you just get a link back to Facebook and must log in before
you can see it. Facebook has used extreme technological and legal
countermeasures to prevent rivals from allowing their users to embed
Facebook snippets in competing services or to create alternative
interfaces to Facebook that merge your Facebook inbox with those of
other services that you use.
</p><p>
And Facebook is incredibly popular, with 2.3 billion claimed users
(though many believe this figure to be inflated). Facebook has been
used to organize genocidal pogroms, racist riots, anti-vaccination
movements, flat Earth cults, and the political lives of some of the
world’s ugliest, most brutal autocrats. There are some really
alarming things going on in the world, and Facebook is implicated in
many of them, so it’s easy to conclude that these bad things are the
result of Facebook’s mind-control system, which it rents out to
anyone with a few bucks to spend.
</p><p>
To understand what role Facebook plays in the formulation and
mobilization of antisocial movements, we need to understand the dual
nature of Facebook.
</p><p>
Because it has a lot of users and a lot of data about those users,
Facebook is a very efficient tool for locating people with
hard-to-find traits, the kinds of traits that are widely diffused in
the population such that advertisers have historically struggled to
find a cost-effective way to reach them. Think back to
refrigerators: Most of us only replace our major appliances a few
times in our entire lives. If you’re a refrigerator manufacturer or
retailer, you have these brief windows in the life of a consumer
during which they are pondering a purchase, and you have to somehow
reach them. Anyone who’s ever registered a title change after buying
a house can attest that appliance manufacturers are incredibly
desperate to reach anyone who has even the slenderest chance of
being in the market for a new fridge.
</p><p>
Facebook makes finding people shopping for refrigerators a
<span class="emphasis"><em>lot</em></span> easier. It can target ads to people who’ve
registered a new home purchase, to people who’ve searched for
refrigerator buying advice, to people who have complained about
their fridge dying, or any combination thereof. It can even target
people who’ve recently bought <span class="emphasis"><em>other</em></span> kitchen
appliances on the theory that someone who’s just replaced their
stove and dishwasher might be in a fridge-buying kind of mood. The
vast majority of people who are reached by these ads will not be in
the market for a new fridge, but — crucially — the percentage of
people who <span class="emphasis"><em>are</em></span> looking for fridges that these
ads reach is <span class="emphasis"><em>much</em></span> larger than it is for
any group that might be subjected to traditional, offline targeted
refrigerator marketing.
</p><p>
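The targeting options just described amount to set operations over audience segments; a toy sketch with invented user IDs, segments, and in-market figures (nothing here reflects Facebook’s actual systems or rates):

```python
# Hypothetical audience segments, each a set of user IDs.
new_homeowners = {1, 2, 3, 4, 5}
fridge_searchers = {4, 5, 6, 7}
fridge_complainers = {5, 8}

# "Any combination thereof": union the segments into one ad audience.
audience = new_homeowners | fridge_searchers | fridge_complainers

# Suppose (again hypothetically) only users 4, 5, and 6 are really
# in the market for a fridge.
in_market = {4, 5, 6}

# The targeted audience still mostly misses, but the hit rate is far
# higher than showing the ad to every user on the platform.
hit_rate = len(audience & in_market) / len(audience)
```
</p><p>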
Facebook also makes it a lot easier to find people who have the same
rare disease as you, which might have been impossible in earlier
eras — the closest fellow sufferer might otherwise be hundreds of
miles away. It makes it easier to find people who went to the same
high school as you even though decades have passed and your former
classmates have all been scattered to the four corners of the Earth.
</p><p>
Facebook also makes it much easier to find people who hold the same
rare political beliefs as you. If you’ve always harbored a secret
affinity for socialism but never dared utter this aloud lest you be
demonized by your neighbors, Facebook can help you discover other
people who feel the same way (and it might just demonstrate to you
that your affinity is more widespread than you ever suspected). It
can make it easier to find people who share your sexual identity.
And again, it can help you to understand that what you thought was a
shameful secret that affected only you was really a widely shared
trait, giving you both comfort and the courage to come out to the
people in your life.
</p><p>
All of this presents a dilemma for Facebook: Targeting makes the
company’s ads more effective than traditional ads, but it also lets
advertisers see just how effective their ads are. While advertisers
are pleased to learn that Facebook ads are more effective than ads
on systems with less sophisticated targeting, advertisers can also
see that in nearly every case, the people who see their ads ignore
them. Or, at best, the ads work on a subconscious level, creating
nebulous unmeasurables like <span class="quote"><span class="quote">brand recognition.</span></span> This means that the
price per ad is very low in nearly every case.
</p><p>
To make things worse, many Facebook groups spark precious little
discussion. Your little-league soccer team, the people with the same
rare disease as you, and the people you share a political affinity
with may exchange the odd flurry of messages at critical junctures,
but on a daily basis, there’s not much to say to your old high
school chums or other hockey-card collectors.
</p><p>
With nothing but <span class="quote"><span class="quote">organic</span></span> discussion, Facebook would not generate
enough traffic to sell enough ads to make the money it needs to
continually expand by buying up its competitors while returning
handsome sums to its investors.
</p><p>
So Facebook has to gin up traffic by sidetracking its own forums:
Every time Facebook’s algorithm injects controversial materials —
inflammatory political articles, conspiracy theories, outrage
stories — into a group, it can hijack that group’s nominal purpose
with its desultory discussions and supercharge those discussions by
turning them into bitter, unproductive arguments that drag on and
on. Facebook is optimized for engagement, not happiness, and it
turns out that automated systems are pretty good at figuring out
things that people will get angry about.
</p><p>
Facebook <span class="emphasis"><em>can</em></span> modify our behavior but only in a
couple of trivial ways. First, it can lock in all your friends and
family members so that you check and check and check with Facebook
to find out what they are up to; and second, it can make you angry
and anxious. It can force you to choose between being interrupted
constantly by updates — a process that breaks your concentration and
makes it hard to be introspective — and staying in touch with your
friends. This is a very limited form of mind control, and it can
only really make us miserable, angry, and anxious.
</p><p>
This is why Facebook’s targeting systems — both the ones it shows to
advertisers and the ones that let users find people who share their
interests — are so next-gen and smooth and easy to use as well as
why its message boards have a toolset that seems like it hasn’t
changed since the mid-2000s. If Facebook delivered an equally
flexible, sophisticated message-reading system to its users, those
users could defend themselves against being nonconsensually
eyeball-fucked with Donald Trump headlines.
</p><p>
The more time you spend on Facebook, the more ads it gets to show
you. The solution to Facebook’s ads only working one in a thousand
times is for the company to try to increase how much time you spend
on Facebook by a factor of a thousand. Rather than thinking of
Facebook as a company that has figured out how to show you exactly
the right ad in exactly the right way to get you to do what its
advertisers want, think of it as a company that has figured out how
to make you slog through an endless torrent of arguments even though
they make you miserable, spending so much time on the site that it
eventually shows you at least one ad that you respond to.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="monopoly-and-the-right-to-the-future-tense"></a>Monopoly and the right to the future tense</h2></div></div></div><p>
Zuboff and her cohort are particularly alarmed at the extent to
which surveillance allows corporations to influence our decisions,
taking away something she poetically calls <span class="quote"><span class="quote">the right to the future
948 tense</span></span> — that is, the right to decide for yourself what you will do
949 in the future.
950 </p><p>
951 It’s true that advertising can tip the scales one way or another:
952 When you’re thinking of buying a fridge, a timely fridge ad might
953 end the search on the spot. But Zuboff puts enormous and undue
954 weight on the persuasive power of surveillance-based influence
955 techniques. Most of these don’t work very well, and the ones that do
956 won’t work for very long. The makers of these influence tools are
957 confident they will someday refine them into systems of total
958 control, but they are hardly unbiased observers, and the risks from
959 their dreams coming true are very speculative.
960 </p><p>
961 By contrast, Zuboff is rather sanguine about 40 years of lax
962 antitrust practice that has allowed a handful of companies to
963 dominate the internet, ushering in an information age with,
964 <a class="ulink" href="https://twitter.com/tveastman/status/1069674780826071040" target="_top">as
965 one person on Twitter noted</a>, five giant websites each filled
966 with screenshots of the other four.
967 </p><p>
968 However, if we are to be alarmed that we might lose the right to
969 choose for ourselves what our future will hold, then monopoly’s
970 nonspeculative, concrete, here-and-now harms should be front and
971 center in our debate over tech policy.
972 </p><p>
973 Start with <span class="quote"><span class="quote">digital rights management.</span></span> In 1998, Bill Clinton signed
974 the Digital Millennium Copyright Act (DMCA) into law. It’s a complex
975 piece of legislation with many controversial clauses but none more
976 so than Section 1201, the <span class="quote"><span class="quote">anti-circumvention</span></span> rule.
977 </p><p>
978 This is a blanket ban on tampering with systems that restrict access
979 to copyrighted works. The ban is so thoroughgoing that it prohibits
980 removing a copyright lock even when no copyright infringement takes
981 place. This is by design: The activities that the DMCA’s Section
982 1201 sets out to ban are not copyright infringements; rather, they
983 are legal activities that frustrate manufacturers’ commercial plans.
984 </p><p>
985 For example, Section 1201’s first major application was on DVD
986 players as a means of enforcing the region coding built into those
987 devices. DVD-CCA, the body that standardized DVDs and DVD players,
988 divided the world into six regions and specified that DVD players
989 must check each disc to determine which regions it was authorized to
990 be played in. DVD players would have their own corresponding region
991 (a DVD player bought in the U.S. would be region 1 while one bought
992 in India would be region 5). If the player and the disc’s region
993 matched, the player would play the disc; otherwise, it would reject
994 it.
995 </p><p>
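The check that compliant players perform is nothing more than a
set-membership test. A minimal sketch in Python (the six regions are
the real DVD regions, with approximate labels; the function and data
names are illustrative, not from any DVD specification):
</p>

```python
# Sketch of the region check described above. A compliant player refuses
# any disc not authorized for its own region, regardless of whether
# playing it would infringe copyright.
REGIONS = {
    1: "U.S. and Canada",
    2: "Europe, Japan, Middle East",
    3: "Southeast Asia",
    4: "Latin America, Australia",
    5: "Russia, India, Africa",
    6: "China",
}

def can_play(player_region: int, disc_regions: set) -> bool:
    """Return True only if the disc is authorized for the player's region."""
    return player_region in disc_regions

# A region-1 disc plays on a U.S. (region 1) player...
assert can_play(1, {1})
# ...but a lawfully purchased region-5 disc is rejected by that same player.
assert not can_play(1, {5})
```

<p>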
996 However, watching a lawfully produced disc in a country other than
997 the one where you purchased it is not copyright infringement — it’s
998 the opposite. Copyright law imposes this duty on customers for a
999 movie: You must go into a store, find a licensed disc, and pay the
1000 asking price. Do that — and <span class="emphasis"><em>nothing else</em></span> — and
1001 you and copyright are square with one another.
1002 </p><p>
1003 The fact that a movie studio wants to charge Indians less than
1004 Americans or release in Australia later than it releases in the U.K.
1005 has no bearing on copyright law. Once you lawfully acquire a DVD, it
1006 is no copyright infringement to watch it no matter where you happen
1007 to be.
1008 </p><p>
1009 So DVD and DVD player manufacturers would not be able to use
1010 accusations of abetting copyright infringement to punish
1011 manufacturers who made noncompliant players that would play discs
1012 from any region or repair shops that modified players to let you
1013 watch out-of-region discs or software programmers who created
1014 programs to let you do this.
1015 </p><p>
1016 That’s where Section 1201 of the DMCA comes in: By banning tampering
1017 with an <span class="quote"><span class="quote">access control,</span></span> the rule gave manufacturers and rights
1018 holders standing to sue competitors who released superior products
1019 with lawful features that the market demanded (in this case,
1020 region-free players).
1021 </p><p>
1022 This is an odious scam against consumers, but as time went by,
1023 Section 1201 grew to encompass a rapidly expanding constellation of
1024 devices and services as canny manufacturers have realized certain
1025 things:
1026 </p><div class="itemizedlist"><ul class="itemizedlist compact" style="list-style-type: disc; "><li class="listitem"><p>
Any device with software in it contains a <span class="quote"><span class="quote">copyrighted work</span></span>,
i.e., the software.
1029 </p></li><li class="listitem"><p>
1030 A device can be designed so that reconfiguring the software
1031 requires bypassing an <span class="quote"><span class="quote">access control for copyrighted works,</span></span>
1032 which is a potential felony under Section 1201.
1033 </p></li><li class="listitem"><p>
1034 Thus, companies can control their customers’ behavior after they
1035 take home their purchases by designing products so that all
1036 unpermitted uses require modifications that fall afoul of
1037 Section 1201.
1038 </p></li></ul></div><p>
1039 Section 1201 then becomes a means for manufacturers of all
1040 descriptions to force their customers to arrange their affairs to
1041 benefit the manufacturers’ shareholders instead of themselves.
1042 </p><p>
This manifests in many ways: from a new generation of inkjet
printers that use countermeasures, which cannot be bypassed without
legal risk, to block third-party ink, to similar systems in tractors
that prevent third-party technicians from swapping in even the
manufacturer's own parts, which the tractor's control system refuses
to recognize until it is supplied with a manufacturer's unlock code.
1050 </p><p>
1051 Closer to home, Apple’s iPhones use these measures to prevent both
third-party service and third-party software installation. This
allows Apple (not the iPhone's purchaser) to decide when an iPhone
is beyond repair and must be shredded and landfilled. (Apple
1055 is notorious for its environmentally catastrophic policy of
1056 destroying old electronics rather than permitting them to be
1057 cannibalized for parts.) This is a very useful power to wield,
1058 especially in light of CEO Tim Cook’s January 2019 warning to
1059 investors that the company’s profits are endangered by customers
1060 choosing to hold onto their phones for longer rather than replacing
1061 them.
1062 </p><p>
1063 Apple’s use of copyright locks also allows it to establish a
1064 monopoly over how its customers acquire software for their mobile
1065 devices. The App Store’s commercial terms guarantee Apple a share of
1066 all revenues generated by the apps sold there, meaning that Apple
1067 gets paid when you buy an app from its store and then continues to
1068 get paid every time you buy something using that app. This comes out
1069 of the bottom line of software developers, who must either charge
1070 more or accept lower profits for their products.
1071 </p><p>
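The <span class="quote"><span class="quote">charge more or accept lower
profits</span></span> trade-off is simple arithmetic. A sketch assuming
an illustrative 30% commission (the rate and function names here are
assumptions for illustration, not Apple's published terms):
</p>

```python
# Hypothetical arithmetic: the effect of an assumed 30% platform
# commission on a developer's per-sale revenue. The developer either
# absorbs the cut or raises the sticker price to compensate.
COMMISSION = 0.30  # assumed rate, for illustration only

def developer_take(price: float) -> float:
    """Revenue the developer keeps from one sale after the platform's cut."""
    return round(price * (1 - COMMISSION), 2)

def price_to_net(target: float) -> float:
    """Sticker price needed for the developer to still net `target` per sale."""
    return round(target / (1 - COMMISSION), 2)

assert developer_take(9.99) == 6.99   # the developer's lower profit, or...
assert price_to_net(9.99) == 14.27    # ...the customer's higher price
```

<p>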
1072 Crucially, Apple’s use of copyright locks gives it the power to make
1073 editorial decisions about which apps you may and may not install on
1074 your own device. Apple has used this power to
1075 <a class="ulink" href="https://www.telegraph.co.uk/technology/apple/5982243/Apple-bans-dictionary-from-App-Store-over-swear-words.html" target="_top">reject
1076 dictionaries</a> for containing obscene words; to
1077 <a class="ulink" href="https://www.vice.com/en_us/article/538kan/apple-just-banned-the-app-that-tracks-us-drone-strikes-again" target="_top">limit
1078 political speech</a>, especially from apps that make sensitive
1079 political commentary such as an app that notifies you every time a
1080 U.S. drone kills someone somewhere in the world; and to
1081 <a class="ulink" href="https://www.eurogamer.net/articles/2016-05-19-palestinian-indie-game-must-not-be-called-a-game-apple-says" target="_top">object
1082 to a game</a> that commented on the Israel-Palestine conflict.
1083 </p><p>
1084 Apple often justifies monopoly power over software installation in
1085 the name of security, arguing that its vetting of apps for its store
1086 means that it can guard its users against apps that contain
1087 surveillance code. But this cuts both ways. In China, the government
1088 <a class="ulink" href="https://www.ft.com/content/ad42e536-cf36-11e7-b781-794ce08b24dc" target="_top">ordered
1089 Apple to prohibit the sale of privacy tools</a> like VPNs with
1090 the exception of VPNs that had deliberately introduced flaws
1091 designed to let the Chinese state eavesdrop on users. Because Apple
1092 uses technological countermeasures — with legal backstops — to block
1093 customers from installing unauthorized apps, Chinese iPhone owners
1094 cannot readily (or legally) acquire VPNs that would protect them
1095 from Chinese state snooping.
1096 </p><p>
1097 Zuboff calls surveillance capitalism a <span class="quote"><span class="quote">rogue capitalism.</span></span>
1098 Theoreticians of capitalism claim that its virtue is that it
1099 <a class="ulink" href="https://en.wikipedia.org/wiki/Price_signal" target="_top">aggregates
1100 information in the form of consumers’ decisions</a>, producing
1101 efficient markets. Surveillance capitalism’s supposed power to rob
1102 its victims of their free will through computationally supercharged
1103 influence campaigns means that our markets no longer aggregate
1104 customers’ decisions because we customers no longer decide — we are
1105 given orders by surveillance capitalism’s mind-control rays.
1106 </p><p>
1107 If our concern is that markets cease to function when consumers can
1108 no longer make choices, then copyright locks should concern us at
1109 <span class="emphasis"><em>least</em></span> as much as influence campaigns. An
1110 influence campaign might nudge you to buy a certain brand of phone;
1111 but the copyright locks on that phone absolutely determine where you
1112 get it serviced, which apps can run on it, and when you have to
1113 throw it away rather than fixing it.
1114 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="search-order-and-the-right-to-the-future-tense"></a>Search order and the right to the future tense</h2></div></div></div><p>
1115 Markets are posed as a kind of magic: By discovering otherwise
1116 hidden information conveyed by the free choices of consumers, those
1117 consumers’ local knowledge is integrated into a self-correcting
system that makes efficient allocations — more efficient than any
1119 computer could calculate. But monopolies are incompatible with that
1120 notion. When you only have one app store, the owner of the store —
1121 not the consumer — decides on the range of choices. As Boss Tweed
1122 once said, <span class="quote"><span class="quote">I don’t care who does the electing, so long as I get to
1123 do the nominating.</span></span> A monopolized market is an election whose
1124 candidates are chosen by the monopolist.
1125 </p><p>
1126 This ballot rigging is made more pernicious by the existence of
1127 monopolies over search order. Google’s search market share is about
1128 90%. When Google’s ranking algorithm puts a result for a popular
1129 search term in its top 10, that helps determine the behavior of
1130 millions of people. If Google’s answer to <span class="quote"><span class="quote">Are vaccines dangerous?</span></span>
1131 is a page that rebuts anti-vax conspiracy theories, then a sizable
1132 portion of the public will learn that vaccines are safe. If, on the
1133 other hand, Google sends those people to a site affirming the
1134 anti-vax conspiracies, a sizable portion of those millions will come
1135 away convinced that vaccines are dangerous.
1136 </p><p>
1137 Google’s algorithm is often tricked into serving disinformation as a
1138 prominent search result. But in these cases, Google isn’t persuading
1139 people to change their minds; it’s just presenting something untrue
1140 as fact when the user has no cause to doubt it.
1141 </p><p>
1142 This is true whether the search is for <span class="quote"><span class="quote">Are vaccines dangerous?</span></span> or
1143 <span class="quote"><span class="quote">best restaurants near me.</span></span> Most users will never look past the
1144 first page of search results, and when the overwhelming majority of
1145 people all use the same search engine, the ranking algorithm
1146 deployed by that search engine will determine myriad outcomes
1147 (whether to adopt a child, whether to have cancer surgery, where to
1148 eat dinner, where to move, where to apply for a job) to a degree
1149 that vastly outstrips any behavioral outcomes dictated by
1150 algorithmic persuasion techniques.
1151 </p><p>
1152 Many of the questions we ask search engines have no empirically
1153 correct answers: <span class="quote"><span class="quote">Where should I eat dinner?</span></span> is not an objective
1154 question. Even questions that do have correct answers (<span class="quote"><span class="quote">Are vaccines
1155 dangerous?</span></span>) don’t have one empirically superior source for that
1156 answer. Many pages affirm the safety of vaccines, so which one goes
1157 first? Under conditions of competition, consumers can choose from
1158 many search engines and stick with the one whose algorithmic
1159 judgment suits them best, but under conditions of monopoly, we all
1160 get our answers from the same place.
1161 </p><p>
1162 Google’s search dominance isn’t a matter of pure merit: The company
1163 has leveraged many tactics that would have been prohibited under
1164 classical, pre-Ronald-Reagan antitrust enforcement standards to
1165 attain its dominance. After all, this is a company that has
1166 developed two major products: a really good search engine and a
1167 pretty good Hotmail clone. Every other major success it’s had —
1168 Android, YouTube, Google Maps, etc. — has come through an
1169 acquisition of a nascent competitor. Many of the company’s key
1170 divisions, such as the advertising technology of DoubleClick,
1171 violate the historical antitrust principle of structural separation,
1172 which forbade firms from owning subsidiaries that competed with
1173 their customers. Railroads, for example, were barred from owning
1174 freight companies that competed with the shippers whose freight they
1175 carried.
1176 </p><p>
1177 If we’re worried about giant companies subverting markets by
1178 stripping consumers of their ability to make free choices, then
1179 vigorous antitrust enforcement seems like an excellent remedy. If
1180 we’d denied Google the right to effect its many mergers, we would
1181 also have probably denied it its total search dominance. Without
1182 that dominance, the pet theories, biases, errors (and good judgment,
1183 too) of Google search engineers and product managers would not have
1184 such an outsized effect on consumer choice.
1185 </p><p>
1186 This goes for many other companies. Amazon, a classic surveillance
1187 capitalist, is obviously the dominant tool for searching Amazon —
1188 though many people find their way to Amazon through Google searches
1189 and Facebook posts — and obviously, Amazon controls Amazon search.
That means that Amazon's own self-serving editorial choices — like
promoting its own house brands over rival goods from its sellers as
well as its own pet theories, biases, and errors — determine much of
1193 what we buy on Amazon. And since Amazon is the dominant e-commerce
1194 retailer outside of China and since it attained that dominance by
1195 buying up both large rivals and nascent competitors in defiance of
1196 historical antitrust rules, we can blame the monopoly for stripping
1197 consumers of their right to the future tense and the ability to
1198 shape markets by making informed choices.
1199 </p><p>
1200 Not every monopolist is a surveillance capitalist, but that doesn’t
1201 mean they’re not able to shape consumer choices in wide-ranging
1202 ways. Zuboff lauds Apple for its App Store and iTunes Store,
1203 insisting that adding price tags to the features on its platforms
1204 has been the secret to resisting surveillance and thus creating
1205 markets. But Apple is the only retailer allowed to sell on its
1206 platforms, and it’s the second-largest mobile device vendor in the
1207 world. The independent software vendors that sell through Apple’s
1208 marketplace accuse the company of the same surveillance sins as
1209 Amazon and other big retailers: spying on its customers to find
1210 lucrative new products to launch, effectively using independent
1211 software vendors as free-market researchers, then forcing them out
1212 of any markets they discover.
1213 </p><p>
1214 Because of its use of copyright locks, Apple’s mobile customers are
1215 not legally allowed to switch to a rival retailer for its apps if
1216 they want to do so on an iPhone. Apple, obviously, is the only
1217 entity that gets to decide how it ranks the results of search
1218 queries in its stores. These decisions ensure that some apps are
1219 often installed (because they appear on page one) and others are
1220 never installed (because they appear on page one million). Apple’s
1221 search-ranking design decisions have a vastly more significant
1222 effect on consumer behaviors than influence campaigns delivered by
1223 surveillance capitalism’s ad-serving bots.
1224 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="monopolists-can-afford-sleeping-pills-for-watchdogs"></a>Monopolists can afford sleeping pills for watchdogs</h2></div></div></div><p>
1225 Only the most extreme market ideologues think that markets can
1226 self-regulate without state oversight. Markets need watchdogs —
1227 regulators, lawmakers, and other elements of democratic control — to
1228 keep them honest. When these watchdogs sleep on the job, then
1229 markets cease to aggregate consumer choices because those choices
1230 are constrained by illegitimate and deceptive activities that
1231 companies are able to get away with because no one is holding them
1232 to account.
1233 </p><p>
1234 But this kind of regulatory capture doesn’t come cheap. In
1235 competitive sectors, where rivals are constantly eroding one
1236 another’s margins, individual firms lack the surplus capital to
1237 effectively lobby for laws and regulations that serve their ends.
1238 </p><p>
1239 Many of the harms of surveillance capitalism are the result of weak
1240 or nonexistent regulation. Those regulatory vacuums spring from the
1241 power of monopolists to resist stronger regulation and to tailor
1242 what regulation exists to permit their existing businesses.
1243 </p><p>
1244 Here’s an example: When firms over-collect and over-retain our data,
1245 they are at increased risk of suffering a breach — you can’t leak
1246 data you never collected, and once you delete all copies of that
1247 data, you can no longer leak it. For more than a decade, we’ve lived
1248 through an endless parade of ever-worsening data breaches, each one
1249 uniquely horrible in the scale of data breached and the sensitivity
1250 of that data.
1251 </p><p>
1252 But still, firms continue to over-collect and over-retain our data
1253 for three reasons:
1254 </p><p>
1255 <span class="strong"><strong>1. They are locked in the aforementioned
1256 limbic arms race with our capacity to shore up our attentional
1257 defense systems to resist their new persuasion
1258 techniques.</strong></span> They’re also locked in an arms race with
1259 their competitors to find new ways to target people for sales
1260 pitches. As soon as they discover a soft spot in our attentional
1261 defenses (a counterintuitive, unobvious way to target potential
1262 refrigerator buyers), the public begins to wise up to the tactic,
1263 and their competitors leap on it, hastening the day in which all
1264 potential refrigerator buyers have been inured to the pitch.
1265 </p><p>
1266 <span class="strong"><strong>2. They believe the surveillance capitalism
1267 story.</strong></span> Data is cheap to aggregate and store, and both
1268 proponents and opponents of surveillance capitalism have assured
1269 managers and product designers that if you collect enough data, you
1270 will be able to perform sorcerous acts of mind control, thus
1271 supercharging your sales. Even if you never figure out how to profit
1272 from the data, someone else will eventually offer to buy it from you
1273 to give it a try. This is the hallmark of all economic bubbles:
1274 acquiring an asset on the assumption that someone else will buy it
1275 from you for more than you paid for it, often to sell to someone
1276 else at an even greater price.
1277 </p><p>
1278 <span class="strong"><strong>3. The penalties for leaking data are
1279 negligible.</strong></span> Most countries limit these penalties to
1280 actual damages, meaning that consumers who’ve had their data
1281 breached have to show actual monetary harms to get a reward. In
1282 2014, Home Depot disclosed that it had lost credit-card data for 53
1283 million of its customers, but it settled the matter by paying those
1284 customers about $0.34 each — and a third of that $0.34 wasn’t even
1285 paid in cash. It took the form of a credit to procure a largely
1286 ineffectual credit-monitoring service.
1287 </p><p>
1288 But the harms from breaches are much more extensive than these
1289 actual-damages rules capture. Identity thieves and fraudsters are
1290 wily and endlessly inventive. All the vast breaches of our century
1291 are being continuously recombined, the data sets merged and mined
1292 for new ways to victimize the people whose data was present in them.
1293 Any reasonable, evidence-based theory of deterrence and compensation
1294 for breaches would not confine damages to actual damages but rather
1295 would allow users to claim these future harms.
1296 </p><p>
1297 However, even the most ambitious privacy rules, such as the EU
1298 General Data Protection Regulation, fall far short of capturing the
1299 negative externalities of the platforms’ negligent over-collection
1300 and over-retention, and what penalties they do provide are not
1301 aggressively pursued by regulators.
1302 </p><p>
1303 This tolerance of — or indifference to — data over-collection and
1304 over-retention can be ascribed in part to the sheer lobbying muscle
1305 of the platforms. They are so profitable that they can handily
1306 afford to divert gigantic sums to fight any real change — that is,
1307 change that would force them to internalize the costs of their
1308 surveillance activities.
1309 </p><p>
1310 And then there’s state surveillance, which the surveillance
1311 capitalism story dismisses as a relic of another era when the big
1312 worry was being jailed for your dissident speech, not having your
1313 free will stripped away with machine learning.
1314 </p><p>
1315 But state surveillance and private surveillance are intimately
1316 related. As we saw when Apple was conscripted by the Chinese
1317 government as a vital collaborator in state surveillance, the only
1318 really affordable and tractable way to conduct mass surveillance on
1319 the scale practiced by modern states — both <span class="quote"><span class="quote">free</span></span> and autocratic
1320 states — is to suborn commercial services.
1321 </p><p>
1322 Whether it’s Google being used as a location tracking tool by local
1323 law enforcement across the U.S. or the use of social media tracking
1324 by the Department of Homeland Security to build dossiers on
1325 participants in protests against Immigration and Customs
1326 Enforcement’s family separation practices, any hard limits on
1327 surveillance capitalism would hamstring the state’s own surveillance
1328 capability. Without Palantir, Amazon, Google, and other major tech
1329 contractors, U.S. cops would not be able to spy on Black people, ICE
1330 would not be able to manage the caging of children at the U.S.
1331 border, and state welfare systems would not be able to purge their
1332 rolls by dressing up cruelty as empiricism and claiming that poor
1333 and vulnerable people are ineligible for assistance. At least some
1334 of the states’ unwillingness to take meaningful action to curb
1335 surveillance should be attributed to this symbiotic relationship.
1336 There is no mass state surveillance without mass commercial
1337 surveillance.
1338 </p><p>
1339 Monopolism is key to the project of mass state surveillance. It’s
1340 true that smaller tech firms are apt to be less well-defended than
1341 Big Tech, whose security experts are drawn from the tops of their
1342 field and who are given enormous resources to secure and monitor
their systems against intruders. But smaller firms also have less to
protect: fewer users, whose data is fragmented across more systems
that have to be suborned one at a time by state actors.
1346 </p><p>
1347 A concentrated tech sector that works with authorities is a much
1348 more powerful ally in the project of mass state surveillance than a
1349 fragmented one composed of smaller actors. The U.S. tech sector is
1350 small enough that all of its top executives fit around a single
1351 boardroom table in Trump Tower in 2017, shortly after Trump’s
1352 inauguration. Most of its biggest players bid to win JEDI, the
1353 Pentagon’s $10 billion Joint Enterprise Defense Infrastructure cloud
1354 contract. Like other highly concentrated industries, Big Tech
1355 rotates its key employees in and out of government service, sending
1356 them to serve in the Department of Defense and the White House, then
1357 hiring ex-Pentagon and ex-DOD top staffers and officers to work in
1358 their own government relations departments.
1359 </p><p>
1360 They can even make a good case for doing this: After all, when there
1361 are only four or five big companies in an industry, everyone
1362 qualified to regulate those companies has served as an executive in
1363 at least a couple of them — because, likewise, when there are only
1364 five companies in an industry, everyone qualified for a senior role
1365 at any of them is by definition working at one of the other ones.
1366 </p><div class="blockquote"><blockquote class="blockquote"><p>
1367 While surveillance doesn’t cause monopolies, monopolies certainly
1368 abet surveillance.
1369 </p></blockquote></div><p>
1370 Industries that are competitive are fragmented — composed of
1371 companies that are at each other’s throats all the time and eroding
1372 one another’s margins in bids to steal their best customers. This
1373 leaves them with much more limited capital to use to lobby for
1374 favorable rules and a much harder job of getting everyone to agree
1375 to pool their resources to benefit the industry as a whole.
1376 </p><p>
1377 Surveillance combined with machine learning is supposed to be an
1378 existential crisis, a species-defining moment at which our free will
1379 is just a few more advances in the field from being stripped away. I
1380 am skeptical of this claim, but I <span class="emphasis"><em>do</em></span> think that
1381 tech poses an existential threat to our society and possibly our
1382 species.
1383 </p><p>
1384 But that threat grows out of monopoly.
1385 </p><p>
1386 One of the consequences of tech’s regulatory capture is that it can
1387 shift liability for poor security decisions onto its customers and
1388 the wider society. It is absolutely normal in tech for companies to
1389 obfuscate the workings of their products, to make them deliberately
1390 hard to understand, and to threaten security researchers who seek to
1391 independently audit those products.
1392 </p><p>
1393 IT is the only field in which this is practiced: No one builds a
1394 bridge or a hospital and keeps the composition of the steel or the
1395 equations used to calculate load stresses a secret. It is a frankly
1396 bizarre practice that leads, time and again, to grotesque security
1397 defects on farcical scales, with whole classes of devices being
1398 revealed as vulnerable long after they are deployed in the field and
1399 put into sensitive places.
1400 </p><p>
1401 The monopoly power that keeps any meaningful consequences for
1402 breaches at bay means that tech companies continue to build terrible
1403 products that are insecure by design and that end up integrated into
1404 our lives, in possession of our data, and connected to our physical
1405 world. For years, Boeing has struggled with the aftermath of a
1406 series of bad technology decisions that made its 737 fleet a global
1407 pariah, a rare instance in which bad tech decisions have been
1408 seriously punished in the market.
1409 </p><p>
1410 These bad security decisions are compounded yet again by the use of
1411 copyright locks to enforce business-model decisions against
1412 consumers. Recall that these locks have become the go-to means for
1413 shaping consumer behavior, making it technically impossible to use
1414 third-party ink, insulin, apps, or service depots in connection with
1415 your lawfully acquired property.
1416 </p><p>
1417 Recall also that these copyright locks are backstopped by
1418 legislation (such as Section 1201 of the DMCA or Article 6 of the
2001 EU Copyright Directive) that bans tampering with
1420 (<span class="quote"><span class="quote">circumventing</span></span>) them, and these statutes have been used to
1421 threaten security researchers who make disclosures about
1422 vulnerabilities without permission from manufacturers.
1423 </p><p>
1424 This amounts to a manufacturer’s veto over safety warnings and
1425 criticism. While this is far from the legislative intent of the DMCA
1426 and its sister statutes around the world, Congress has not
1427 intervened to clarify the statute nor will it because to do so would
1428 run counter to the interests of powerful, large firms whose lobbying
1429 muscle is unstoppable.
</p><p>
Copyright locks are a double whammy: They create bad security
decisions that can’t be freely investigated or discussed. If markets
are supposed to be machines for aggregating information (and if
surveillance capitalism’s notional mind-control rays are what make
it a <span class="quote"><span class="quote">rogue capitalism</span></span> because it denies consumers the power to
make decisions), then a program of legally enforced ignorance of the
risks of products makes monopolism even more of a <span class="quote"><span class="quote">rogue capitalism</span></span>
than surveillance capitalism’s influence campaigns.
</p><p>
And unlike mind-control rays, enforced silence over security is an
immediate, documented problem, and it <span class="emphasis"><em>does</em></span>
constitute an existential threat to our civilization and possibly
our species. The proliferation of insecure devices — especially
devices that spy on us and especially when those devices also can
manipulate the physical world by, say, steering your car or flipping
a breaker at a power station — is a kind of technology debt.
</p><p>
In software design, <span class="quote"><span class="quote">technology debt</span></span> refers to old, baked-in
decisions that turn out to be bad ones in hindsight. Perhaps a
long-ago developer decided to incorporate a networking protocol made
by a vendor that has since stopped supporting it. But everything in
the product still relies on that superannuated protocol, and so,
with each revision, the product team has to work around this
obsolete core, adding compatibility layers, surrounding it with
security checks that try to shore up its defenses, and so on. These
Band-Aid measures compound the debt because every subsequent
revision has to make allowances for <span class="emphasis"><em>them</em></span>, too,
like interest mounting on a predatory subprime loan. And like a
subprime loan, the interest mounts faster than you can hope to pay
it off: The product team has to put so much energy into maintaining
this complex, brittle system that they don’t have any time left over
to refactor the product from the ground up and <span class="quote"><span class="quote">pay off the debt</span></span>
once and for all.
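</p><p>
The compounding described above can be sketched in a few lines of code. This is purely illustrative: the protocol and the layer names are hypothetical stand-ins, not any real system.
</p>
```python
# Illustrative sketch of compounding technology debt: an obsolete
# vendor protocol that everything still depends on, wrapped in
# successive workaround layers. All names are hypothetical.

def legacy_vendor_protocol(payload: bytes) -> bytes:
    """The superannuated core; the vendor stopped supporting it."""
    return payload[::-1]  # stand-in for the obsolete wire format

def compat_layer(payload: bytes) -> bytes:
    """Revision 2: papers over an encoding quirk in the core."""
    return legacy_vendor_protocol(payload.replace(b"\x00", b""))

def security_shim(payload: bytes) -> bytes:
    """Revision 3: bolts on a length check the core never had,
    while also preserving the compat layer's quirks."""
    if len(payload) > 1024:
        raise ValueError("oversized frame")
    return compat_layer(payload)

# Every new revision must call through the whole stack, so each
# workaround accrues "interest": nothing can be removed without
# auditing everything built on top of it.
def send(payload: bytes) -> bytes:
    return security_shim(payload)
```
<p>
Each layer exists only to accommodate the layers beneath it, which is why the debt compounds instead of staying constant.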
</p><p>
Typically, technology debt results in a technological bankruptcy:
The product gets so brittle and unsustainable that it fails
catastrophically. Think of the antiquated COBOL-based banking and
accounting systems that fell over at the start of the pandemic
emergency when confronted with surges of unemployment claims.
Sometimes that ends the product; sometimes it takes the company down
with it. Being caught in the default of a technology debt is scary
and traumatic, just like losing your house due to bankruptcy is
scary and traumatic.
</p><p>
But the technology debt created by copyright locks isn’t individual
debt; it’s systemic. Everyone in the world is exposed to this
over-leverage, as was the case with the 2008 financial crisis. When
that debt comes due — when we face a cascade of security breaches
that threaten global shipping and logistics, the food supply,
pharmaceutical production pipelines, emergency communications, and
other critical systems that are accumulating technology debt in part
due to the presence of deliberately insecure and deliberately
unauditable copyright locks — it will indeed pose an existential
risk.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="privacy-and-monopoly"></a>Privacy and monopoly</h2></div></div></div><p>
Many tech companies are gripped by an orthodoxy that holds that if
they just gather enough data on enough of our activities, everything
else is possible — the mind control and endless profits. This is an
unfalsifiable hypothesis: If data gives a tech company even a tiny
improvement in behavior prediction and modification, the company
declares that it has taken the first step toward global domination
with no end in sight. If a company <span class="emphasis"><em>fails</em></span> to
attain any improvements from gathering and analyzing data, it
declares success to be just around the corner, attainable once more
data is in hand.
</p><p>
Surveillance tech is far from the first industry to embrace a
nonsensical, self-serving belief that harms the rest of the world,
and it is not the first industry to profit handsomely from such a
delusion. Long before hedge-fund managers were claiming (falsely)
that they could beat the S&amp;P 500, there were plenty of other
<span class="quote"><span class="quote">respectable</span></span> industries that were later revealed as quacks.
From the makers of radium suppositories (a real thing!)
to the cruel sociopaths who claimed they could <span class="quote"><span class="quote">cure</span></span> gay people,
history is littered with the formerly respectable titans of
discredited industries.
</p><p>
This is not to say that there’s nothing wrong with Big Tech and its
ideological addiction to data. While surveillance’s benefits are
mostly overstated, its harms are, if anything,
<span class="emphasis"><em>understated</em></span>.
</p><p>
There’s real irony here. The belief in surveillance capitalism as a
<span class="quote"><span class="quote">rogue capitalism</span></span> is driven by the belief that markets wouldn’t
tolerate firms that are gripped by false beliefs. An oil company
that has false beliefs about where the oil is will eventually go
broke digging dry wells, after all.
</p><p>
But monopolists get to do terrible things for a long time before
they pay the price. Think of how concentration in the finance sector
allowed the subprime crisis to fester as bond-rating agencies,
regulators, investors, and critics all fell under the sway of a
false belief that complex mathematics could construct <span class="quote"><span class="quote">fully hedged</span></span>
debt instruments that could not possibly default. A small bank that
engaged in this kind of malfeasance would simply have gone broke,
unable to outrun the inevitable crisis, let alone grow so big that
it averted the crisis altogether. But large banks were able to continue to
attract investors, and when they finally <span class="emphasis"><em>did</em></span>
come a-cropper, the world’s governments bailed them out. The worst
offenders of the subprime crisis are bigger than they were in 2008,
bringing home more profits and paying their execs even larger sums.
</p><p>
Big Tech is able to practice surveillance not just because it is
tech but because it is <span class="emphasis"><em>big</em></span>. The reason every
web publisher embeds a Facebook <span class="quote"><span class="quote">Like</span></span> button is that Facebook
dominates the internet’s social media referrals — and every one of
those <span class="quote"><span class="quote">Like</span></span> buttons spies on everyone who lands on a page that
contains them (see also: Google Analytics embeds, Twitter buttons,
etc.).
</p><p>
The reason the world’s governments have been slow to create
meaningful penalties for privacy breaches is that Big Tech’s
concentration produces huge profits that can be used to lobby
against those penalties — and Big Tech’s concentration means that
the companies involved are able to arrive at a unified negotiating
position that supercharges the lobbying.
</p><p>
The reason that the smartest engineers in the world want to work for
Big Tech is that Big Tech commands the lion’s share of tech industry
jobs.
</p><p>
The reason people who are aghast at Facebook’s and Google’s and
Amazon’s data-handling practices continue to use these services is
that all their friends are on Facebook; Google dominates search; and
Amazon has put all the local merchants out of business.
</p><p>
Competitive markets would weaken the companies’ lobbying muscle by
reducing their profits and pitting them against each other in
regulatory forums. They would give customers other places to go to get
their online services. They would make the companies small enough to
regulate and pave the way to meaningful penalties for breaches. They
would let engineers with ideas that challenged the surveillance
orthodoxy raise capital to compete with the incumbents. They would
give web publishers multiple ways to reach audiences and make the
case against Facebook and Google and Twitter embeds.
</p><p>
In other words, while surveillance doesn’t cause monopolies,
monopolies certainly abet surveillance.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="ronald-reagan-pioneer-of-tech-monopolism"></a>Ronald Reagan, pioneer of tech monopolism</h2></div></div></div><p>
Technology exceptionalism is a sin, whether it’s practiced by
technology’s blind proponents or by its critics. Both of these camps
are prone to explaining away monopolistic concentration by citing
some special characteristic of the tech industry, like network
effects or first-mover advantage. The only real difference between
these two groups is that the tech apologists say monopoly is
inevitable so we should just let tech get away with its abuses while
competition regulators in the U.S. and the EU say monopoly is
inevitable so we should punish tech for its abuses but not try to
break up the monopolies.
</p><p>
To understand how tech became so monopolistic, it’s useful to look
at the dawn of the consumer tech industry: 1979, the year the Apple
II Plus launched and became the first successful home computer. That
also happens to be the year that Ronald Reagan hit the campaign
trail for the 1980 presidential race — a race he won, leading to a
radical shift in the way that antitrust concerns are handled in
America. Reagan’s cohort of politicians — including Margaret
Thatcher in the U.K., Brian Mulroney in Canada, Helmut Kohl in
Germany, and Augusto Pinochet in Chile — went on to enact similar
reforms that eventually spread around the world.
</p><p>
Antitrust’s story began nearly a century before all that with laws
like the Sherman Act, which took aim at monopolists on the grounds
that monopolies were bad in and of themselves — squeezing out
competitors, creating <span class="quote"><span class="quote">diseconomies of scale</span></span> (when a company is so
big that its constituent parts go awry and it is seemingly helpless
to address the problems), and capturing their regulators to such a
degree that they can get away with a host of evils.
</p><p>
Then came a fabulist named Robert Bork, a former solicitor general
whom Reagan appointed to the powerful U.S. Court of Appeals for the
D.C. Circuit and who had created an alternate legislative history of
the Sherman Act and its successors out of whole cloth. Bork insisted
that these statutes were never targeted at monopolies (despite a
wealth of evidence to the contrary, including the transcribed
speeches of the acts’ authors) but, rather, that they were intended
to prevent <span class="quote"><span class="quote">consumer harm</span></span> — in the form of higher prices.
</p><p>
Bork was a crank, but he was a crank with a theory that rich people
really liked. Monopolies are a great way to make rich people richer
by allowing them to receive <span class="quote"><span class="quote">monopoly rents</span></span> (that is, bigger
profits) and capture regulators, leading to a weaker, more favorable
regulatory environment with fewer protections for customers,
suppliers, the environment, and workers.
</p><p>
Bork’s theories were especially palatable to the same power brokers
who backed Reagan, and Reagan’s Department of Justice and other
agencies began to incorporate Bork’s antitrust doctrine into their
enforcement decisions (Reagan even put Bork up for a Supreme Court
seat, but Bork flunked the Senate confirmation hearing so badly
that, 40 years later, D.C. insiders use the term <span class="quote"><span class="quote">borked</span></span> to refer
to any catastrophically bad political performance).
</p><p>
Little by little, Bork’s theories entered the mainstream, and their
backers began to infiltrate the legal education field, even putting
on junkets where members of the judiciary were treated to lavish
meals, fun outdoor activities, and seminars where they were
indoctrinated into the consumer harm theory of antitrust. The more
Bork’s theories took hold, the more money the monopolists were
making — and the more surplus capital they had at their disposal to
lobby for even more Borkian antitrust influence campaigns.
</p><p>
The history of Bork’s antitrust theories is a really good example of
the kind of covertly engineered shifts in public opinion that Zuboff
warns us against, where fringe ideas become mainstream orthodoxy.
But Bork didn’t change the world overnight. He played a very long
game, for over a generation, and he had a tailwind because the same
forces that backed oligarchic antitrust theories also backed many
other oligarchic shifts in public opinion. For example, the idea
that taxation is theft, that wealth is a sign of virtue, and so on —
all of these theories meshed to form a coherent ideology that
elevated inequality to a virtue.
</p><p>
Today, many fear that machine learning allows surveillance
capitalism to sell <span class="quote"><span class="quote">Bork-as-a-Service,</span></span> at internet speeds, so that
you can contract a machine-learning company to engineer
<span class="emphasis"><em>rapid</em></span> shifts in public sentiment without
needing the capital to sustain a multipronged, multigenerational
project working at the local, state, national, and global levels in
business, law, and philosophy. I do not believe that such a project
is plausible, though I agree that this is basically what the
platforms claim to be selling. They’re just lying about it. Big Tech
lies all the time, <span class="emphasis"><em>including</em></span> in their sales
literature.
</p><p>
The idea that tech forms <span class="quote"><span class="quote">natural monopolies</span></span> (monopolies that are
the inevitable result of the realities of an industry, such as the
monopolies that accrue to the first company to run long-haul phone
lines or rail lines) is belied by tech’s own history: In the absence
of anti-competitive tactics, Google was able to unseat AltaVista and
Yahoo; Facebook was able to head off Myspace. There are some
advantages to gathering mountains of data, but those mountains of
data also have disadvantages: liability (from leaking), diminishing
returns (from old data), and institutional inertia (big companies,
like science, progress one funeral at a time).
</p><p>
Indeed, the birth of the web saw a mass-extinction event for the
existing giant, wildly profitable proprietary technologies that had
capital, network effects, and walls and moats surrounding their
businesses. The web showed that when a new industry is built around
a protocol, rather than a product, the combined might of everyone
who uses the protocol to reach their customers or users or
communities outweighs even the most massive products. CompuServe,
AOL, MSN, and a host of other proprietary walled gardens learned
this lesson the hard way: Each believed it could stay separate from
the web, offering <span class="quote"><span class="quote">curation</span></span> and a guarantee of consistency and
quality instead of the chaos of an open system. Each was wrong and
ended up being absorbed into the public web.
</p><p>
Yes, tech is heavily monopolized and is now closely associated with
industry concentration, but this has more to do with timing than
with tech’s intrinsically monopolistic tendencies. Tech was born
at the moment that antitrust enforcement was being dismantled, and
tech fell into exactly the same pathologies that antitrust was
supposed to guard against. To a first approximation, it is
reasonable to assume that tech’s monopolies are the result of a lack
of anti-monopoly action and not the much-touted unique
characteristics of tech, such as network effects, first-mover
advantage, and so on.
</p><p>
In support of this thesis, I offer the concentration that every
<span class="emphasis"><em>other</em></span> industry has undergone over the same
period. From professional wrestling to consumer packaged goods to
commercial property leasing to banking to sea freight to oil to
record labels to newspaper ownership to theme parks,
<span class="emphasis"><em>every</em></span> industry has undergone a massive shift
toward concentration. There are no obvious network effects or
first-mover advantages at play in these industries. However, in every
case, these industries attained their concentrated status through
tactics that were prohibited before Bork’s triumph: merging with
major competitors, buying out innovative new market entrants,
horizontal and vertical integration, and a suite of anti-competitive
tactics that were once illegal but are not any longer.
</p><p>
Again: When you change the laws intended to prevent monopolies and
then monopolies form in exactly the way the law was supposed to
prevent, it is reasonable to suppose that these facts are related.
Tech’s concentration can be readily explained without recourse to
radical theories of network effects — but only if you’re willing to
indict unregulated markets as tending toward monopoly. Just as a
lifelong smoker can give you a hundred reasons why their smoking
didn’t cause their cancer (<span class="quote"><span class="quote">It was the environmental toxins</span></span>), true
believers in unregulated markets have a whole suite of unconvincing
explanations for monopoly in tech that leave capitalism intact.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="steering-with-the-windshield-wipers"></a>Steering with the windshield wipers</h2></div></div></div><p>
It’s been 40 years since Bork’s project to rehabilitate monopolies
achieved liftoff, and that is a generation and a half, which is
plenty of time to take a common idea and make it seem outlandish and
vice versa. Before the 1940s, affluent Americans dressed their baby
boys in pink while baby girls wore blue (a <span class="quote"><span class="quote">delicate and dainty</span></span>
color). While gendered colors are obviously totally arbitrary, many
still greet this news with amazement and find it hard to imagine a
time when pink connoted masculinity.
</p><p>
After 40 years of studiously ignoring antitrust analysis and
enforcement, it’s not surprising that we’ve all but forgotten that
antitrust exists, that in living memory, growth through mergers and
acquisitions was largely prohibited under law, that
market-cornering strategies like vertical integration could land a
company in court.
</p><p>
Antitrust is a market society’s steering wheel, the control of first
resort to keep would-be masters of the universe in their lanes. But
Bork and his cohort ripped out our steering wheel 40 years ago. The
car is still barreling along, and so we’re yanking as hard as we can
on all the <span class="emphasis"><em>other</em></span> controls in the car as well as
desperately flapping the doors and rolling the windows up and down
in the hopes that one of these other controls can be repurposed to
let us choose where we’re heading before we careen off a cliff.
</p><p>
It’s like a 1960s science-fiction plot come to life: People stuck in
a <span class="quote"><span class="quote">generation ship,</span></span> plying its way across the stars, a ship once
piloted by their ancestors; and now, after a great cataclysm, the
ship’s crew have forgotten that they’re in a ship at all and no
longer remember where the control room is. Adrift, the ship is
racing toward its extinction, and unless we can seize the controls
and execute an emergency course correction, we’re all headed for a
fiery death in the heart of a sun.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="surveillance-still-matters"></a>Surveillance still matters</h2></div></div></div><p>
None of this is to minimize the problems with surveillance.
Surveillance matters, and Big Tech’s use of surveillance
<span class="emphasis"><em>is</em></span> an existential risk to our species, but
that’s not because surveillance and machine learning rob us of our
free will.
</p><p>
Surveillance has become <span class="emphasis"><em>much</em></span> more efficient
thanks to Big Tech. In 1989, the Stasi — the East German secret
police — had the whole country under surveillance, a massive
undertaking that recruited one out of every 60 people to serve as an
informant or intelligence operative.
</p><p>
Today, we know that the NSA is spying on a significant fraction of
the entire world’s population, and its ratio of surveillance
operatives to the surveilled is more like 1:10,000 (that’s probably
on the low side since it assumes that every American with top-secret
clearance is working for the NSA on this project — we don’t know how
many of those cleared people are involved in NSA spying, but it’s
definitely not all of them).
</p><p>
How did the ratio of operatives to surveilled citizens stretch from
1:60 to 1:10,000 in less than 30 years? It’s thanks to Big Tech. Our devices
and services gather most of the data that the NSA mines for its
surveillance project. We pay for these devices and the services they
connect to, and then we painstakingly perform the data-entry tasks
associated with logging facts about our lives, opinions, and
preferences. This mass surveillance project has been largely useless
for fighting terrorism: The NSA can
<a class="ulink" href="https://www.washingtonpost.com/world/national-security/nsa-cites-case-as-success-of-phone-data-collection-program/2013/08/08/fc915e5a-feda-11e2-96a8-d3b921c0924a_story.html" target="_top">only
point to a single minor success story</a> in which it used its
data collection program to foil an attempt by a U.S. resident to
wire a few thousand dollars to an overseas terror group. It’s
ineffective for much the same reason that commercial surveillance
projects are largely ineffective at targeting advertising: The
people who want to commit acts of terror, like people who want to
buy a refrigerator, are extremely rare. If you’re trying to detect a
phenomenon whose base rate is one in a million with an instrument
whose accuracy is only 99%, then every true positive will come at
the cost of 9,999 false positives.
</p><p>
Let me explain that again: If one in a million people is a
terrorist, then there will only be about one terrorist in a random
sample of one million people. If your test for detecting terrorists
is 99% accurate, it will identify 10,000 terrorists in your
million-person sample (1% of one million is 10,000). For every true
positive, you’ll get 9,999 false positives.
</p><p>
In reality, the accuracy of algorithmic terrorism detection falls
far short of the 99% mark, as does refrigerator ad targeting. The
difference is that being falsely accused of wanting to buy a fridge
is a minor nuisance while being falsely accused of planning a terror
attack can destroy your life and the lives of everyone you love.
</p><p>
Mass state surveillance is only feasible because of surveillance
capitalism and its extremely low-yield ad-targeting systems, which
require a constant feed of personal data to remain barely viable.
Surveillance capitalism’s primary failure mode is mistargeted ads
while mass state surveillance’s primary failure mode is grotesque
human rights abuses, tending toward totalitarianism.
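</p><p>
The arithmetic can be checked directly. The one-in-a-million base rate and the 99% accuracy figure are the illustrative numbers used here, not measured values:
</p>
```python
# Base-rate arithmetic: a 99%-accurate test applied to a
# one-in-a-million phenomenon drowns the signal in false alarms.

population = 1_000_000
true_cases = 1                  # base rate: one in a million
accuracy = 0.99                 # the instrument's claimed accuracy

flagged = round(population * (1 - accuracy))  # people the test flags
true_positives = true_cases                   # at best, the one real case
false_positives = flagged - true_positives

print(flagged)          # 10000
print(false_positives)  # 9999
```
<p>
At these rates, every genuine hit comes at the cost of thousands of innocent people being flagged, which is tolerable for fridge ads and catastrophic for terrorism policing.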
</p><p>
State surveillance is no mere parasite on Big Tech, sucking up its
data and giving nothing in return. In truth, the two are symbiotes:
Big Tech sucks up our data for spy agencies, and spy agencies ensure
that governments don’t limit Big Tech’s activities so severely that
it would no longer serve the spy agencies’ needs. There is no firm
distinction between state surveillance and surveillance capitalism;
they are dependent on one another.
</p><p>
To see this at work today, look no further than Amazon’s home
surveillance device, the Ring doorbell, and its associated app,
Neighbors. Ring — a product that Amazon acquired and did not develop
in house — makes a camera-enabled doorbell that streams footage from
your front door to your mobile device. The Neighbors app allows you
to form a neighborhood-wide surveillance grid with your fellow Ring
owners through which you can share clips of <span class="quote"><span class="quote">suspicious characters.</span></span>
If you’re thinking that this sounds like a recipe for letting
curtain-twitching racists supercharge their suspicions of people
with brown skin who walk down their blocks,
<a class="ulink" href="https://www.eff.org/deeplinks/2020/07/amazons-ring-enables-over-policing-efforts-some-americas-deadliest-law-enforcement" target="_top">you’re
right</a>. Ring has become a <span class="emphasis"><em>de facto,</em></span>
off-the-books arm of the police without any of the pesky oversight
or rules.
</p><p>
In mid-2019, a series of public records requests revealed that
Amazon had struck confidential deals with more than 400 local law
enforcement agencies through which the agencies would promote Ring
and Neighbors and in exchange get access to footage from Ring
cameras. In theory, cops would need to request this footage through
Amazon (and internal documents reveal that Amazon devotes
substantial resources to coaching cops on how to spin a convincing
story when doing so), but in practice, when a Ring customer turns
down a police request, Amazon only requires the agency to formally
request the footage from the company, which it will then produce.
</p><p>
Ring and law enforcement have found many ways to intertwine their
activities. Ring strikes secret deals to acquire real-time access to
911 dispatch and then streams alarming crime reports to Neighbors
users, which serve as convincers for anyone who’s contemplating a
surveillance doorbell but isn’t sure whether their neighborhood is
dangerous enough to warrant it.
</p><p>
The more the cops buzz-market the surveillance capitalist Ring, the
more surveillance capability the state gets. Cops who rely on
private entities for law-enforcement roles then brief against any
controls on the deployment of that technology while the companies
return the favor by lobbying against rules requiring public
oversight of police surveillance technology. The more the cops rely
on Ring and Neighbors, the harder it will be to pass laws to curb
them. The fewer laws there are against them, the more the cops will
rely on them.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="dignity-and-sanctuary"></a>Dignity and sanctuary</h2></div></div></div><p>
But even if we could exercise democratic control over our states and
force them to stop raiding surveillance capitalism’s reservoirs of
behavioral data, surveillance capitalism would still harm us.
</p><p>
This is an area where Zuboff shines. Her chapter on <span class="quote"><span class="quote">sanctuary</span></span> —
the feeling of being unobserved — is a beautiful hymn to
introspection, calmness, mindfulness, and tranquility.
</p><p>
When you are watched, something changes. Anyone who has ever raised
a child knows this. You might look up from your book (or more
realistically, from your phone) and catch your child in a moment of
profound realization and growth, a moment where they are learning
something that is right at the edge of their abilities, requiring
their entire ferocious concentration. For a moment, you’re
transfixed, watching that rare and beautiful moment of focus playing
out before your eyes, and then your child looks up and sees you
seeing them, and the moment collapses. To grow, you need to be and
expose your authentic self, and in that moment, you are vulnerable
like a hermit crab scuttling from one shell to the next. The tender,
unprotected tissues you expose in that moment are too delicate to
reveal in the presence of another, even someone you trust as
implicitly as a child trusts their parent.
</p><p>
In the digital age, our authentic selves are inextricably tied to
our digital lives. Your search history is a running ledger of the
questions you’ve pondered. Your location history is a record of the
places you’ve sought out and the experiences you’ve had there. Your
social graph reveals the different facets of your identity, the
people you’ve connected with.
</p><p>
To be observed in these activities is to lose the sanctuary of your
authentic self.
</p><p>
There’s another way in which surveillance capitalism robs us of our
capacity to be our authentic selves: by making us anxious.
Surveillance capitalism isn’t really a mind-control ray, but you
don’t need a mind-control ray to make someone anxious. After all,
another word for anxiety is agitation, and to make someone
experience agitation, you need merely to agitate them. To poke them
and prod them and beep at them and buzz at them and bombard them on
an intermittent schedule that is just random enough that our limbic
systems never quite become inured to it.
</p><p>
Our devices and services are <span class="quote"><span class="quote">general purpose</span></span> in that they can
connect anything or anyone to anything or anyone else and that they
can run any program that can be written. This means that the
distraction rectangles in our pockets hold our most precious moments
with our most beloved people and their most urgent or time-sensitive
communications (from <span class="quote"><span class="quote">running late can you get the kid?</span></span> to <span class="quote"><span class="quote">doctor
gave me bad news and I need to talk to you RIGHT NOW</span></span>) as well as
ads for refrigerators and recruiting messages from Nazis.
</p><p>
All day and all night, our pockets buzz, shattering our
concentration and tearing apart the fragile webs of connection we
spin as we think through difficult ideas. If you locked someone in a
cell and agitated them like this, we’d call it <span class="quote"><span class="quote">sleep deprivation
torture,</span></span> and it would be
<a class="ulink" href="https://www.youtube.com/watch?v=1SKpRbvnx6g" target="_top">a war crime
under the Geneva Conventions</a>.
1920 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="afflicting-the-afflicted"></a>Afflicting the afflicted</h2></div></div></div><p>
1921 The effects of surveillance on our ability to be our authentic
1922 selves are not equal for all people. Some of us are lucky enough to
1923 live in a time and place in which all the most important facts of
1924 our lives are widely and roundly socially acceptable and can be
1925 publicly displayed without the risk of social consequence.
1926 </p><p>
1927 But for many of us, this is not true. Recall that in living memory,
1928 many of the ways of being that we think of as socially acceptable
1929 today were once cause for dire social sanction or even imprisonment.
1930 If you are 65 years old, you have lived through a time in which
1931 people living in <span class="quote"><span class="quote">free societies</span></span> could be imprisoned or sanctioned
1932 for engaging in homosexual activity, for falling in love with a
1933 person whose skin was a different color than their own, or for
1934 smoking weed.
1935 </p><p>
1936 Today, these activities aren’t just decriminalized in much of the
1937 world, they’re considered normal, and the fallen prohibitions are
1938 viewed as shameful, regrettable relics of the past.
1939 </p><p>
1940 How did we get from prohibition to normalization? Through private,
1941 personal activity: People who were secretly gay or secret
1942 pot-smokers or who secretly loved someone with a different skin
1943 color were vulnerable to retaliation if they made their true selves
1944 known and were limited in how much they could advocate for their own
1945 right to exist in the world and be true to themselves. But because
1946 there was a private sphere, these people could form alliances with
1947 their friends and loved ones who did not share their disfavored
1948 traits by having private conversations in which they came out,
1949 disclosing their true selves to the people around them and bringing
1950 them to their cause one conversation at a time.
1951 </p><p>
1952 The right to choose the time and manner of these conversations was
1953 key to their success. It’s one thing to come out to your dad while
1954 you’re on a fishing trip away from the world and another thing
1955 entirely to blurt it out over the Christmas dinner table while your
1956 racist Facebook uncle is there to make a scene.
1957 </p><p>
1958 Without a private sphere, there’s a chance that none of these
1959 changes would have come to pass and that the people who benefited
1960 from these changes would have either faced social sanction for
1961 coming out to a hostile world or would have never been able to
1962 reveal their true selves to the people they love.
1963 </p><p>
1964 The corollary is that, unless you think that our society has
1965 attained social perfection — that your grandchildren in 50 years
1966 will ask you to tell them the story of how, in 2020, every injustice
1967 had been righted and no further change had to be made — then you
1968 should expect that right now, at this minute, there are people you
1969 love, whose happiness is key to your own, who have a secret in their
1970 hearts that stops them from ever being their authentic selves with
1971 you. These people are sorrowing and will go to their graves with
1972 that secret sorrow in their hearts, and the source of that sorrow
1973 will be the falsity of their relationship to you.
1974 </p><p>
1975 A private realm is necessary for human progress.
1976 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="any-data-you-collect-and-retain-will-eventually-leak"></a>Any data you collect and retain will eventually leak</h2></div></div></div><p>
1977 The lack of a private life can rob vulnerable people of the chance
1978 to be their authentic selves and constrain our actions by depriving
1979 us of sanctuary, but there is another risk that is borne by
1980 everyone, not just people with a secret: crime.
1981 </p><p>
1982 Personally identifying information is of very limited use for the
1983 purpose of controlling people’s minds, but identity theft — really a
1984 catchall term for a whole constellation of terrible criminal
1985 activities that can destroy your finances, compromise your personal
1986 integrity, ruin your reputation, or even expose you to physical
1987 danger — thrives on it.
1988 </p><p>
1989 Attackers are not limited to using data from one breached source,
1990 either. Multiple services have suffered breaches that exposed names,
1991 addresses, phone numbers, passwords, sexual tastes, school grades,
1992 work performance, brushes with the criminal justice system, family
1993 details, genetic information, fingerprints and other biometrics,
1994 reading habits, search histories, literary tastes, pseudonymous
1995 identities, and other sensitive information. Attackers can merge
1996 data from these different breaches to build up extremely detailed
1997 dossiers on random subjects and then use different parts of the data
1998 for different criminal purposes.
1999 </p><p>
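A hypothetical sketch of that merging step (every name, address, and password below is invented — the merge pattern, not the records, is the point): entries from two separate dumps are keyed on a shared identifier such as an email address and folded into one dossier.
</p><p>

```python
# Invented example data illustrating how breach dumps can be combined.
breach_a = {
    "alice@example.com": {"password": "hunter2", "phone": "555-0100"},
}
breach_b = {
    "alice@example.com": {"address": "12 Oak Lane", "dob": "1970-01-01"},
}

dossiers = {}
for dump in (breach_a, breach_b):
    for email, fields in dump.items():
        # Records sharing an email address collapse into one profile.
        dossiers.setdefault(email, {}).update(fields)

print(dossiers["alice@example.com"])
```

</p><p>
A password from one breach and a street address from another are each of limited use alone; collapsed under one identity, they support the attacks described below.
</p><p>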
2000 For example, attackers can use leaked username and password
2001 combinations to hijack whole fleets of commercial vehicles that
2002 <a class="ulink" href="https://www.vice.com/en_us/article/zmpx4x/hacker-monitor-cars-kill-engine-gps-tracking-apps" target="_top">have
2003 been fitted with anti-theft GPS trackers and immobilizers</a> or
2004 to hijack baby monitors in order to
2005 <a class="ulink" href="https://www.washingtonpost.com/technology/2019/04/23/how-nest-designed-keep-intruders-out-peoples-homes-effectively-allowed-hackers-get/?utm_term=.15220e98c550" target="_top">terrorize
2006 toddlers with the audio tracks from pornography</a>. Attackers
2007 use leaked data to trick phone companies into giving them your phone
2008 number, then they intercept SMS-based two-factor authentication
2009 codes in order to take over your email, bank account, and/or
2010 cryptocurrency wallets.
2011 </p><p>
2012 Attackers are endlessly inventive in the pursuit of creative ways to
2013 weaponize leaked data. One common use of leaked data is to penetrate
2014 companies in order to access <span class="emphasis"><em>more</em></span> data.
2015 </p><p>
2016 Like spies, online fraudsters are totally dependent on companies
2017 over-collecting and over-retaining our data. Spy agencies sometimes
2018 pay companies for access to their data or intimidate them into
2019 giving it up, but sometimes they work just like criminals do — by
2020 <a class="ulink" href="https://www.bbc.com/news/world-us-canada-24751821" target="_top">sneaking
2021 data out of companies’ databases</a>.
2022 </p><p>
2023 The over-collection of data has a host of terrible social
2024 consequences, from the erosion of our authentic selves to the
2025 undermining of social progress, from state surveillance to an
2026 epidemic of online crime. Commercial surveillance is also a boon to
2027 people running influence campaigns, but that’s the least of our
2028 troubles.
2029 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="critical-tech-exceptionalism-is-still-tech-exceptionalism"></a>Critical tech exceptionalism is still tech
2030 exceptionalism</h2></div></div></div><p>
2031 Big Tech has long practiced technology exceptionalism: the idea that
2032 it should not be subject to the mundane laws and norms of
2033 <span class="quote"><span class="quote">meatspace.</span></span> Mottoes like Facebook’s <span class="quote"><span class="quote">move fast and break things</span></span>
2034 attracted justifiable scorn for the companies’ self-serving rhetoric.
2035 </p><p>
2036 Tech exceptionalism got us all into a lot of trouble, so it’s ironic
2037 and distressing to see Big Tech’s critics committing the same sin.
2038 </p><p>
2039 Big Tech is not a <span class="quote"><span class="quote">rogue capitalism</span></span> that cannot be cured through
2040 the traditional anti-monopoly remedies of trustbusting (forcing
2041 companies to divest of competitors they have acquired) and bans on
2042 mergers to monopoly and other anti-competitive tactics. Big Tech
2043 does not have the power to use machine learning to influence our
2044 behavior so thoroughly that markets lose the ability to punish bad
2045 actors and reward superior competitors. Big Tech has no rule-writing
2046 mind-control ray that necessitates ditching our old toolbox.
2047 </p><p>
2048 The thing is, people have been claiming to have perfected
2049 mind-control rays for centuries, and every time, it turned out to be
2050 a con — though sometimes the con artists were also conning
2051 themselves.
2052 </p><p>
2053 For generations, the advertising industry has been steadily
2054 improving its ability to sell advertising services to businesses
2055 while only making marginal gains in selling those businesses’
2056 products to prospective customers. John Wanamaker’s lament that <span class="quote"><span class="quote">50%
2057 of my advertising budget is wasted, I just don’t know which 50%</span></span> is
2058 a testament to the triumph of <span class="emphasis"><em>ad executives</em></span>,
2059 who successfully convinced Wanamaker that only half of the money he
2060 spent went to waste.
2061 </p><p>
2062 The tech industry has made enormous improvements in the science of
2063 convincing businesses that they’re good at advertising while their
2064 actual improvements to advertising — as opposed to targeting — have
2065 been pretty ho-hum. The vogue for machine learning — and the
2066 mystical invocation of <span class="quote"><span class="quote">artificial intelligence</span></span> as a synonym for
2067 straightforward statistical inference techniques — has greatly
2068 boosted the efficacy of Big Tech’s sales pitch as marketers have
2069 exploited potential customers’ lack of technical sophistication to
2070 get away with breathtaking acts of overpromising and
2071 underdelivering.
2072 </p><p>
2073 It’s tempting to think that if businesses are willing to pour
2074 billions into a venture, then the venture must be a good one. Yet
2075 there are plenty of times when this rule of thumb has led us astray.
2076 For example, it’s virtually unheard of for managed investment funds
2077 to outperform simple index funds, and investors who put their money
2078 into the hands of expert money managers overwhelmingly fare worse
2079 than those who entrust their savings to index funds. But managed
2080 funds still account for the majority of the money invested in the
2081 markets, and they are patronized by some of the richest, most
2082 sophisticated investors in the world. Their vote of confidence in an
2083 underperforming sector is a parable about the role of luck in wealth
2084 accumulation, not a sign that managed funds are a good buy.
2085 </p><p>
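The gap is easy to quantify with assumed numbers (the 7% gross return and the fee levels below are illustrative, not market data): an identical gross return minus a typical active-management fee compounds into a dramatically smaller balance.
</p><p>

```python
def grow(principal, gross_return, annual_fee, years):
    """Compound a balance at (gross_return - annual_fee) per year."""
    for _ in range(years):
        principal *= 1 + gross_return - annual_fee
    return principal

# Same assumed gross return, different fees, 30 years.
index_fund = grow(10_000, 0.07, 0.001, 30)    # cheap index fund
managed_fund = grow(10_000, 0.07, 0.015, 30)  # actively managed fund

print(round(index_fund), round(managed_fund))
```

</p><p>
Under these assumptions the managed fund ends roughly a third smaller — before even considering that most managed funds also trail the index’s gross return.
</p><p>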
2086 The claims of Big Tech’s mind-control system are full of tells that
2087 the enterprise is a con. For example,
2088 <a class="ulink" href="https://www.frontiersin.org/articles/10.3389/fpsyg.2020.01415/full" target="_top">the
2089 reliance on the <span class="quote"><span class="quote">Big Five</span></span> personality traits</a> as a primary
2090 means of influencing people even though the <span class="quote"><span class="quote">Big Five</span></span> theory is
2091 unsupported by any large-scale, peer-reviewed studies and is
2092 <a class="ulink" href="https://www.wired.com/story/the-noisy-fallacies-of-psychographic-targeting/" target="_top">mostly
2093 the realm of marketing hucksters and pop psych</a>.
2094 </p><p>
2095 Big Tech’s promotional materials also claim that their algorithms
2096 can accurately perform <span class="quote"><span class="quote">sentiment analysis</span></span> or detect people’s moods
2097 based on their <span class="quote"><span class="quote">microexpressions,</span></span> but
2098 <a class="ulink" href="https://www.npr.org/2018/09/12/647040758/advertising-on-facebook-is-it-worth-it" target="_top">these
2099 are marketing claims, not scientific ones</a>. These methods are
2100 largely untested by independent scientific experts, and where they
2101 have been tested, they’ve been found sorely wanting.
2102 Microexpressions are particularly suspect as the companies that
2103 specialize in training people to detect them
2104 <a class="ulink" href="https://theintercept.com/2017/02/08/tsas-own-files-show-doubtful-science-behind-its-behavior-screening-program/" target="_top">have
2105 been shown</a> to underperform relative to random chance.
2106 </p><p>
2107 Big Tech has been so good at marketing its own supposed superpowers
2108 that it’s easy to believe that they can market everything else with
2109 similar acumen, but it’s a mistake to believe the hype. Any
2110 statement a company makes about the quality of its products is
2111 clearly not impartial. The fact that we distrust all the things that
2112 Big Tech says about its data handling, compliance with privacy laws,
2113 etc., is only reasonable — but why on Earth would we treat Big
2114 Tech’s marketing literature as the gospel truth? Big Tech lies about
2115 just about <span class="emphasis"><em>everything</em></span>, including how well its
2116 machine-learning fueled persuasion systems work.
2117 </p><p>
2118 That skepticism should infuse all of our evaluations of Big Tech and
2119 its supposed abilities, including our perusal of its patents. Zuboff
2120 vests these patents with enormous significance, pointing out that
2121 Google claimed extensive new persuasion capabilities in
2122 <a class="ulink" href="https://patents.google.com/patent/US20050131762A1/en" target="_top">its
2123 patent filings</a>. These claims are doubly suspect: first,
2124 because they are so self-serving, and second, because the patent
2125 itself is so notoriously an invitation to exaggeration.
2126 </p><p>
2127 Patent applications take the form of a series of claims and range
2128 from broad to narrow. A typical patent starts out by claiming that
2129 its authors have invented a method or system for doing every
2130 conceivable thing that anyone might do, ever, with any tool or
2131 device. Then it narrows that claim in successive stages until we get
2132 to the actual <span class="quote"><span class="quote">invention</span></span> that is the true subject of the patent.
2133 The hope is that the patent examiner — who is almost certainly
2134 overworked and underinformed — will miss the fact that some or all
2135 of these claims are ridiculous, or at least suspect, and grant the
2136 patent’s broader claims. Patents for unpatentable things are still
2137 incredibly useful because they can be wielded against competitors
2138 who might license that patent or steer clear of its claims rather
2139 than endure the lengthy, expensive process of contesting it.
2140 </p><p>
2141 What’s more, software patents are routinely granted even though the
2142 filer doesn’t have any evidence that they can do the thing claimed
2143 by the patent. That is, you can patent an <span class="quote"><span class="quote">invention</span></span> that you
2144 haven’t actually made and that you don’t know how to make.
2145 </p><p>
2146 With these considerations in hand, it becomes obvious that the fact
2147 that a Big Tech company has patented what it
2148 <span class="emphasis"><em>says</em></span> is an effective mind-control ray is
2149 largely irrelevant to whether Big Tech can in fact control our
2150 minds.
2151 </p><p>
2152 Big Tech collects our data for many reasons, including the
2153 diminishing returns on existing stores of data. But many tech
2154 companies also collect data out of a mistaken tech exceptionalist
2155 belief in the network effects of data. Network effects occur when
2156 each new user in a system increases its value. The classic example
2157 is fax machines: A single fax machine is of no use, two fax machines
2158 are of limited use, but each new fax machine that’s put to use adds a
2159 link to every machine already in service, so the number of possible fax-to-fax links grows quadratically.
2160 </p><p>
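The arithmetic behind this classic network effect is worth making explicit: n machines can form n(n-1)/2 distinct point-to-point links, so each newcomer adds more links than the last. A minimal sketch:
</p><p>

```python
def fax_links(n):
    """Possible point-to-point links among n fax machines: n choose 2."""
    return n * (n - 1) // 2

# The first machine is useless; each newcomer adds a link to every
# machine already in service.
for n in (1, 2, 3, 4, 10, 100):
    print(n, fax_links(n))
```

</p><p>
Going from three machines to four doubles the link count (3 to 6), and although later additions grow the total by ever-smaller ratios, every new machine still makes the whole network more valuable.
</p><p>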
2161 Data mined for predictive systems doesn’t necessarily produce these
2162 dividends. Think of Netflix: The predictive value of the data mined
2163 from a million English-speaking Netflix viewers is hardly improved
2164 by the addition of one more user’s viewing data. Most of the data
2165 Netflix acquires after that first minimum viable sample duplicates
2166 existing data and produces only minimal gains. Meanwhile, retraining
2167 models with new data gets progressively more expensive as the number
2168 of data points increases, and manual tasks like labeling and
2169 validating data do not get cheaper at scale.
2170 </p><p>
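A toy statistical model — not Netflix’s actual pipeline, which is not public — shows the shape of these diminishing returns: for a simple estimate whose standard error shrinks as 1/sqrt(n), the millionth data point is worth a tiny fraction of the thousandth.
</p><p>

```python
import math

def stderr(n):
    # Standard error of a sample mean falls off as 1 / sqrt(n)
    # (unit population variance assumed, purely for illustration).
    return 1.0 / math.sqrt(n)

def marginal_gain(n):
    # How much one additional data point reduces the standard error.
    return stderr(n) - stderr(n + 1)

print(f"{marginal_gain(1_000):.2e}")      # noticeable
print(f"{marginal_gain(1_000_000):.2e}")  # vanishingly small
```

</p><p>
Meanwhile the costs named above — retraining, labeling, validating — scale at least linearly with the number of data points, so the marginal data point eventually costs more than it is worth.
</p><p>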
2171 Businesses pursue fads to the detriment of their profits all the
2172 time, especially when the businesses and their investors are not
2173 motivated by the prospect of becoming profitable but rather by the
2174 prospect of being acquired by a Big Tech giant or by having an IPO.
2175 For these firms, ticking faddish boxes like <span class="quote"><span class="quote">collects as much data
2176 as possible</span></span> might realize a bigger return on investment than
2177 <span class="quote"><span class="quote">collects a business-appropriate quantity of data.</span></span>
2178 </p><p>
2179 This is another harm of tech exceptionalism: The belief that more
2180 data always produces more profits in the form of more insights that
2181 can be translated into better mind-control rays drives firms to
2182 over-collect and over-retain data beyond all rationality. And since
2183 the firms are behaving irrationally, a good number of them will go
2184 out of business and become ghost ships whose cargo holds are stuffed
2185 full of data that can harm people in myriad ways — but which no one
2186 is responsible for any longer. Even if the companies don’t go
2187 under, the data they collect is maintained behind the minimum viable
2188 security — just enough security to keep the company viable while it
2189 waits to get bought out by a tech giant, an amount calculated to
2190 spend not one penny more than is necessary on protecting data.
2191 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="how-monopolies-not-mind-control-drive-surveillance-capitalism-the-snapchat-story"></a>How monopolies, not mind control, drive surveillance
2192 capitalism: The Snapchat story</h2></div></div></div><p>
2193 For the first decade of its existence, Facebook competed with the
2194 social media giants of the day (Myspace, Orkut, etc.) by presenting
2195 itself as the pro-privacy alternative. Indeed, Facebook justified
2196 its walled garden — which let users bring in data from the web but
2197 blocked web services like Google Search from indexing and caching
2198 Facebook pages — as a pro-privacy measure that protected users from
2199 the surveillance-happy winners of the social media wars like
2200 Myspace.
2201 </p><p>
2202 Despite frequent promises that it would never collect or analyze its
2203 users’ data, Facebook periodically created initiatives that did just
2204 that, like the creepy, ham-fisted Beacon tool, which spied on you as
2205 you moved around the web and then added your online activities to
2206 your public timeline, allowing your friends to monitor your browsing
2207 habits. Beacon sparked a user revolt. Every time, Facebook backed
2208 off from its surveillance initiative, but not all the way;
2209 inevitably, the new Facebook would be more surveilling than the old
2210 Facebook, though not quite as surveilling as the intermediate
2211 Facebook following the launch of the new product or service.
2212 </p><p>
2213 The pace at which Facebook ramped up its surveillance efforts seems
2214 to have been set by Facebook’s competitive landscape. The more
2215 competitors Facebook had, the better it behaved. Every time a major
2216 competitor foundered, Facebook’s behavior
2217 <a class="ulink" href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247362" target="_top">got
2218 markedly worse</a>.
2219 </p><p>
2220 All the while, Facebook was prodigiously acquiring companies,
2221 including a company called Onavo. Nominally, Onavo made a
2222 battery-monitoring mobile app. But the permissions that Onavo
2223 required were so expansive that the app was able to gather
2224 fine-grained telemetry on everything users did with their phones,
2225 including which apps they used and how they were using them.
2226 </p><p>
2227 Through Onavo, Facebook discovered that it was losing market share
2228 to Snapchat, an app that — like Facebook a decade before — billed
2229 itself as the pro-privacy alternative to the status quo. Through
2230 Onavo, Facebook was able to mine data from the devices of Snapchat
2231 users, including both current and former Snapchat users. This
2232 spurred Facebook to acquire Instagram — some features of which
2233 competed with Snapchat — and then allowed Facebook to fine-tune
2234 Instagram’s features and sales pitch to erode Snapchat’s gains and
2235 ensure that Facebook would not have to face the kinds of competitive
2236 pressures it had earlier inflicted on Myspace and Orkut.
2237 </p><p>
2238 The story of how Facebook crushed Snapchat reveals the relationship
2239 between monopoly and surveillance capitalism. Facebook combined
2240 surveillance with lax antitrust enforcement to spot the competitive
2241 threat of Snapchat on its horizon and then take decisive action
2242 against it. Facebook’s surveillance capitalism let it avert
2243 competitive pressure with anti-competitive tactics. Facebook users
2244 still want privacy — Facebook hasn’t used surveillance to brainwash
2245 them out of it — but they can’t get it because Facebook’s
2246 surveillance lets it destroy any hope of a rival service emerging
2247 that competes on privacy features.
2248 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="a-monopoly-over-your-friends"></a>A monopoly over your friends</h2></div></div></div><p>
2249 A decentralization movement has tried to erode the dominance of
2250 Facebook and other Big Tech companies by fielding <span class="quote"><span class="quote">indieweb</span></span>
2251 alternatives — Mastodon as a Twitter alternative, Diaspora as a
2252 Facebook alternative, etc. — but these efforts have failed to attain
2253 any kind of liftoff.
2254 </p><p>
2255 Fundamentally, each of these services is hamstrung by the same
2256 problem: Every potential user for a Facebook or Twitter alternative
2257 has to convince all their friends to follow them to a decentralized
2258 web alternative in order to continue to realize the benefit of
2259 social media. For many of us, the only reason to have a Facebook
2260 account is that our friends have Facebook accounts, and the reason
2261 they have Facebook accounts is that <span class="emphasis"><em>we</em></span> have
2262 Facebook accounts.
2263 </p><p>
2264 All of this has conspired to make Facebook — and other dominant
2265 platforms — into <span class="quote"><span class="quote">kill zones</span></span> that investors will not fund new
2266 entrants for.
2267 </p><p>
2268 And yet, all of today’s tech giants came into existence despite the
2269 entrenched advantage of the companies that came before them. To
2270 understand how that happened, you have to understand both
2271 interoperability and adversarial interoperability.
2272 </p><div class="blockquote"><blockquote class="blockquote"><p>
2273 The hard problem of our species is coordination.
2274 </p></blockquote></div><p>
2275 <span class="quote"><span class="quote">Interoperability</span></span> is the ability of two technologies to work with
2276 one another: Anyone can make an LP that will play on any record
2277 player, anyone can make a filter you can install in your stove’s
2278 extractor fan, anyone can make gasoline for your car, anyone can
2279 make a USB phone charger that fits in your car’s cigarette lighter
2280 receptacle, anyone can make a light bulb that works in your light
2281 socket, anyone can make bread that will toast in your toaster.
2282 </p><p>
2283 Interoperability is often a source of innovation and consumer
2284 benefit: Apple made the first commercially successful PC, but
2285 millions of independent software vendors made interoperable programs
2286 that ran on the Apple II Plus. The simple analog antenna inputs on
2287 the back of TVs first allowed cable operators to connect directly to
2288 TVs, then they allowed game console companies and then personal
2289 computer companies to use standard televisions as displays. Standard
2290 RJ-11 telephone jacks allowed for the production of phones from a
2291 variety of vendors in a variety of forms, from the free
2292 football-shaped phone that came with a <span class="emphasis"><em>Sports
2293 Illustrated</em></span> subscription to business phones with
2294 speakers, hold functions, and so on and then answering machines and
2295 finally modems, paving the way for the internet revolution.
2296 </p><p>
2297 <span class="quote"><span class="quote">Interoperability</span></span> is often used interchangeably with
2298 <span class="quote"><span class="quote">standardization,</span></span> which is the process when manufacturers and other
2299 stakeholders hammer out a set of agreed-upon rules for implementing
2300 a technology, such as the electrical plug on your wall, the CAN bus
2301 used by your car’s computer systems, or the HTML instructions that
2302 your browser interprets.
2303 </p><p>
2304 But interoperability doesn’t require standardization — indeed,
2305 standardization often proceeds from the chaos of ad hoc
2306 interoperability measures. The inventor of the cigarette-lighter USB
2307 charger didn’t need to get permission from car manufacturers or even
2308 the manufacturers of the dashboard lighter subcomponent. The
2309 automakers didn’t take any countermeasures to prevent the use of
2310 these aftermarket accessories by their customers, but they also
2311 didn’t do anything to make life easier for the chargers’
2312 manufacturers. This is a kind of <span class="quote"><span class="quote">neutral interoperability.</span></span>
2313 </p><p>
2314 Beyond neutral interoperability, there is <span class="quote"><span class="quote">adversarial
2315 interoperability.</span></span> That’s when a manufacturer makes a product that
2316 interoperates with another manufacturer’s product <span class="emphasis"><em>despite
2317 the second manufacturer’s objections</em></span> and <span class="emphasis"><em>even
2318 if that means bypassing a security system designed to prevent
2319 interoperability</em></span>.
2320 </p><p>
2321 Probably the most familiar form of adversarial interoperability is
2322 third-party printer ink. Printer manufacturers claim that they sell
2323 printers below cost and that the only way they can recoup the losses
2324 they incur is by charging high markups on ink. To prevent the owners
2325 of printers from buying ink elsewhere, the printer companies deploy
2326 a suite of anti-customer security systems that detect and reject
2327 both refilled and third-party cartridges.
2328 </p><p>
2329 Owners of printers take the position that HP and Epson and Brother
2330 are not charities and that customers for their wares have no
2331 obligation to help them survive, and so if the companies choose to
2332 sell their products at a loss, that’s their foolish choice and their
2333 consequences to live with. Likewise, competitors who make ink or
2334 refill kits observe that they don’t owe printer companies anything,
2335 and their erosion of printer companies’ margins is the printer
2336 companies’ problems, not their competitors’. After all, the printer
2337 companies shed no tears when they drive a refiller out of business,
2338 so why should the refillers concern themselves with the economic
2339 fortunes of the printer companies?
2340 </p><p>
2341 Adversarial interoperability has played an outsized role in the
2342 history of the tech industry: from the founding of the <span class="quote"><span class="quote">alt.*</span></span>
2343 Usenet hierarchy (which was started against the wishes of Usenet’s
2344 maintainers and which grew to be bigger than all of Usenet combined)
2345 to the browser wars (when Netscape and Microsoft devoted massive
2346 engineering efforts to making their browsers incompatible with the
2347 other’s special commands and peccadilloes) to Facebook (whose
2348 success was built in part by helping its new users stay in touch
2349 with friends they’d left behind on Myspace because Facebook supplied
2350 them with a tool that scraped waiting messages from Myspace and
2351 imported them into Facebook, effectively creating a Facebook-based
2352 Myspace reader).
2353 </p><p>
2354 Today, incumbency is seen as an unassailable advantage. Facebook is
2355 where all of your friends are, so no one can start a Facebook
2356 competitor. But adversarial compatibility reverses the competitive
2357 advantage: If you were allowed to compete with Facebook by providing
2358 a tool that imported all your users’ waiting Facebook messages into
2359 an environment that competed on lines that Facebook couldn’t cross,
2360 like eliminating surveillance and ads, then Facebook would be at a
2361 huge disadvantage. It would have assembled all possible ex-Facebook
2362 users into a single, easy-to-find service; it would have educated
2363 them on how a Facebook-like service worked and what its potential
2364 benefits were; and it would have provided an easy means for
2365 disgruntled Facebook users to tell their friends where they might
2366 expect better treatment.
2367 </p><p>
2368 Adversarial interoperability was once the norm and a key contributor
2369 to the dynamic, vibrant tech scene, but now it is stuck behind a
2370 thicket of laws and regulations that add legal risks to the
2371 tried-and-true tactics of adversarial interoperability. New rules
2372 and new interpretations of existing rules mean that a would-be
2373 adversarial interoperator needs to steer clear of claims under
2374 copyright, terms of service, trade secrecy, tortious interference,
2375 and patent.
2376 </p><p>
2377 In the absence of a competitive market, lawmakers have resorted to
2378 assigning expensive, state-like duties to Big Tech firms, such as
2379 automatically filtering user contributions for copyright
2380 infringement or terrorist and extremist content or detecting and
2381 preventing harassment in real time or controlling access to sexual
2382 material.
2383 </p><p>
2384 These measures put a floor under how small we can make Big Tech
2385 because only the very largest companies can afford the humans and
2386 automated filters needed to perform these duties.
2387 </p><p>
2388 But that’s not the only way in which making platforms responsible
2389 for policing their users undermines competition. A platform that is
2390 expected to police its users’ conduct must prevent many vital
2391 adversarial interoperability techniques lest these subvert its
2392 policing measures. For example, if someone using a Twitter
2393 replacement like Mastodon is able to push messages into Twitter and
2394 read messages out of Twitter, they could avoid being caught by
2395 automated systems that detect and prevent harassment (such as
2396 systems that use the timing of messages or IP-based rules to make
2397 guesses about whether someone is a harasser).
2398 </p><p>
2399 To the extent that we are willing to let Big Tech police itself —
2400 rather than making Big Tech small enough that users can leave bad
2401 platforms for better ones and small enough that a regulation that
2402 simply puts a platform out of business will not destroy billions of
2403 users’ access to their communities and data — we build the case that
2404 Big Tech should be able to block its competitors and make it easier
2405 for Big Tech to demand legal enforcement tools to ban and punish
2406 attempts at adversarial interoperability.
2407 </p><p>
2408 Ultimately, we can try to fix Big Tech by making it responsible for
2409 bad acts by its users, or we can try to fix the internet by cutting
2410 Big Tech down to size. But we can’t do both. To replace today’s
2411 giant products with pluralistic protocols, we need to clear the
2412 legal thicket that prevents adversarial interoperability so that
2413 tomorrow’s nimble, personal, small-scale products can federate
2414 themselves with giants like Facebook, allowing the users who’ve left
2415 to continue to communicate with users who haven’t left yet, reaching
2416 tendrils over Facebook’s garden wall that Facebook’s trapped users
2417 can use to scale the walls and escape to the global, open web.
2418 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="fake-news-is-an-epistemological-crisis"></a>Fake news is an epistemological crisis</h2></div></div></div><p>
2419 Tech is not the only industry that has undergone massive
2420 concentration since the Reagan era. Virtually every major industry —
2421 from oil to newspapers to meatpacking to sea freight to eyewear to
2422 online pornography — has become a clubby oligarchy that just a few
2423 players dominate.
2424 </p><p>
2425 At the same time, every industry has become something of a tech
2426 industry as general-purpose computers and general-purpose networks
2427 and the promise of efficiencies through data-driven analysis infuse
2428 every device, process, and firm with tech.
2429 </p><p>
2430 This phenomenon of industrial concentration is part of a wider story
2431 about wealth concentration overall as a smaller and smaller number
2432 of people own more and more of our world. This concentration of both
2433 wealth and industries means that our political outcomes are
2434 increasingly beholden to the parochial interests of the people and
2435 companies with all the money.
2436 </p><p>
2437 That means that whenever a regulator asks a question with an
2438 obvious, empirical answer (<span class="quote"><span class="quote">Are humans causing climate change?</span></span> or
2439 <span class="quote"><span class="quote">Should we let companies conduct commercial mass surveillance?</span></span> or
2440 <span class="quote"><span class="quote">Does society benefit from allowing network neutrality
2441 violations?</span></span>), the answer that comes out is only correct if that
2442 correctness meets with the approval of rich people and the
2443 industries that made them so wealthy.
2444 </p><p>
2445 Rich people have always played an outsized role in politics, even more
2446 so since the Supreme Court’s <span class="emphasis"><em>Citizens United</em></span>
2447 decision eliminated key controls over political spending. Widening
2448 inequality and wealth concentration means that the very richest
2449 people are now a lot richer and can afford to spend a lot more money
2450 on political projects than ever before. Think of the Koch brothers
2451 or George Soros or Bill Gates.
2452 </p><p>
2453 But the policy distortions of rich individuals pale in comparison to
2454 the policy distortions that concentrated industries are capable of.
2455 The companies in highly concentrated industries are much more
2456 profitable than companies in competitive industries — no competition
2457 means not having to reduce prices or improve quality to win
2458 customers — leaving them with bigger capital surpluses to spend on
2459 lobbying.
2460 </p><p>
2461 Concentrated industries also find it easier to collaborate on policy
2462 objectives than competitive ones. When all the top execs from your
2463 industry can fit around a single boardroom table, they often do. And
2464 <span class="emphasis"><em>when</em></span> they do, they can forge a consensus
2465 position on regulation.
2466 </p><p>
2467 Rising through the ranks in a concentrated industry generally means
2468 working at two or three of the big companies. When there are only
2469 relatively few companies in a given industry, each company has a
2470 more ossified executive rank, leaving ambitious execs with fewer
2471 paths to higher positions unless they are recruited to a rival. This
2472 means that the top execs in concentrated industries are likely to
2473 have been colleagues at some point and socialize in the same circles
2474 — connected through social ties or, say, serving as trustees for
2475 each other’s estates. These tight social bonds foster a collegial,
2476 rather than competitive, attitude.
2477 </p><p>
2478 Highly concentrated industries also present a regulatory conundrum.
2479 When an industry is dominated by just four or five companies, the
2480 only people who are likely to truly understand the industry’s
2481 practices are its veteran executives. This means that top regulators
2482 are often former execs of the companies they are supposed to be
2483 regulating. These turns in government are often tacitly understood
2484 to be leaves of absence from industry, with former employers
2485 welcoming their erstwhile watchdogs back into their executive ranks
2486 once their terms have expired.
2487 </p><p>
2488 All this is to say that the tight social bonds, small number of
2489 firms, and regulatory capture of concentrated industries give the
2490 companies that comprise them the power to dictate many, if not all,
2491 of the regulations that bind them.
2492 </p><p>
2493 This is increasingly obvious. Whether it’s payday lenders
2494 <a class="ulink" href="https://www.washingtonpost.com/business/2019/02/25/how-payday-lending-industry-insider-tilted-academic-research-its-favor/" target="_top">winning
2495 the right to practice predatory lending</a> or Apple
2496 <a class="ulink" href="https://www.vice.com/en_us/article/mgxayp/source-apple-will-fight-right-to-repair-legislation" target="_top">winning
2497 the right to decide who can fix your phone</a> or Google and
2498 Facebook winning the right to breach your private data without
2499 suffering meaningful consequences or victories for pipeline
2500 companies or impunity for opioid manufacturers or massive tax
2501 subsidies for incredibly profitable dominant businesses, it’s
2502 increasingly apparent that many of our official, evidence-based
2503 truth-seeking processes are, in fact, auctions for sale to the
2504 highest bidder.
2505 </p><p>
2506 It’s really impossible to overstate what a terrifying prospect this
2507 is. We live in an incredibly high-tech society, and none of us could
2508 acquire the expertise to evaluate every technological proposition
2509 that stands between us and our untimely, horrible deaths. You might
2510 devote your life to acquiring the media literacy to distinguish good
2511 scientific journals from corrupt pay-for-play lookalikes and the
2512 statistical literacy to evaluate the quality of the analysis in the
2513 journals as well as the microbiology and epidemiology knowledge to
2514 determine whether you can trust claims about the safety of vaccines
2515 — but that would still leave you unqualified to judge whether the
2516 wiring in your home will give you a lethal shock
2517 <span class="emphasis"><em>and</em></span> whether your car’s brakes’ software will
2518 cause them to fail unpredictably <span class="emphasis"><em>and</em></span> whether
2519 the hygiene standards at your butcher are sufficient to keep you
2520 from dying after you finish your dinner.
2521 </p><p>
2522 In a world as complex as this one, we have to defer to authorities,
2523 and we keep them honest by making those authorities accountable to
2524 us and binding them with rules to prevent conflicts of interest. We
2525 can’t possibly acquire the expertise to adjudicate conflicting
2526 claims about the best way to make the world safe and prosperous, but
2527 we <span class="emphasis"><em>can</em></span> determine whether the adjudication
2528 process itself is trustworthy.
2529 </p><p>
2530 Right now, it’s obviously not.
2531 </p><p>
2532 The past 40 years of rising inequality and industry concentration,
2533 together with increasingly weak accountability and transparency for
2534 expert agencies, have created an increasingly urgent sense of
2535 impending doom, the sense that there are vast conspiracies afoot
2536 that operate with tacit official approval despite the likelihood
2537 they are working to better themselves by ruining the rest of us.
2538 </p><p>
2539 For example, it’s been decades since Exxon’s own scientists
2540 concluded that its products would render the Earth uninhabitable by
2541 humans. And yet those decades were lost to us, in large part because
2542 Exxon lobbied governments and sowed doubt about the dangers of its
2543 products and did so with the cooperation of many public officials.
2544 When the survival of you and everyone you love is threatened by
2545 conspiracies, it’s not unreasonable to start questioning the things
2546 you think you know in an attempt to determine whether they, too, are
2547 the outcome of another conspiracy.
2548 </p><p>
2549 The collapse of the credibility of our systems for divining and
2550 upholding truths has left us in a state of epistemological chaos.
2551 Once, most of us might have assumed that the system was working and
2552 that our regulations reflected our best understanding of the
2553 empirical truths of the world as they were best understood — now we
2554 have to find our own experts to help us sort the true from the
2555 false.
2556 </p><p>
2557 If you’re like me, you probably believe that vaccines are safe, but
2558 you (like me) probably also can’t explain the microbiology or
2559 statistics. Few of us have the math skills to review the literature
2560 on vaccine safety and explain why its statistical reasoning is
2561 sound. Likewise, few of us can review the stats in the (now
2562 discredited) literature on opioid safety and explain how those stats
2563 were manipulated. Both vaccines and opioids were embraced by medical
2564 authorities, after all, and one is safe while the other could ruin
2565 your life. You’re left with a kind of inchoate constellation of
2566 rules of thumb about which experts you trust to fact-check
2567 controversial claims and then to explain how all those respectable
2568 doctors with their peer-reviewed research on opioid safety
2569 <span class="emphasis"><em>were</em></span> an aberration and then how you know that
2570 the doctors writing about vaccine safety are
2571 <span class="emphasis"><em>not</em></span> an aberration.
2572 </p><p>
2573 I’m 100% certain that vaccinating is safe and effective, but I’m
2574 also at something of a loss to explain exactly,
2575 <span class="emphasis"><em>precisely,</em></span> why I believe this, given all the
2576 corruption I know about and the many times the stamp of certainty
2577 has turned out to be a parochial lie told to further enrich the
2578 super rich.
2579 </p><p>
2580 Fake news — conspiracy theories, racist ideologies, scientific
2581 denialism — has always been with us. What’s changed today is not the
2582 mix of ideas in the public discourse but the popularity of the worst
2583 ideas in that mix. Conspiracy and denial have skyrocketed in
2584 lockstep with the growth of Big Inequality, which has also tracked
2585 the rise of Big Tech and Big Pharma and Big Wrestling and Big Car
2586 and Big Movie Theater and Big Everything Else.
2587 </p><p>
2588 No one can say for certain why this has happened, but the two
2589 dominant camps are idealism (the belief that the people who argue
2590 for these conspiracies have gotten better at explaining them, maybe
2591 with the help of machine-learning tools) and materialism (the ideas
2592 have become more attractive because of material conditions in the
2593 world).
2594 </p><p>
2595 I’m a materialist. I’ve been exposed to the arguments of conspiracy
2596 theorists all my life, and I have not experienced any qualitative
2597 leap in those arguments.
2598 </p><p>
2599 The major difference is in the world, not the arguments. In a time
2600 where actual conspiracies are commonplace, conspiracy theories
2601 acquire a ring of plausibility.
2602 </p><p>
2603 We have always had disagreements about what’s true, but today, we
2604 have a disagreement over how we know whether something is true. This
2605 is an epistemological crisis, not a crisis over belief. It’s a
2606 crisis over the credibility of our truth-seeking exercises, from
2607 scientific journals (in an era where the biggest journal publishers
2608 have been caught producing pay-to-play journals for junk science) to
2609 regulations (in an era where regulators are routinely cycling in and
2610 out of business) to education (in an era where universities are
2611 dependent on corporate donations to keep their lights on).
2612 </p><p>
2613 Targeting — surveillance capitalism — makes it easier to find people
2614 who are undergoing this epistemological crisis, but it doesn’t
2615 create the crisis. For that, you need to look to corruption.
2616 </p><p>
2617 And, conveniently enough, it’s corruption that allows surveillance
2618 capitalism to grow by dismantling monopoly protections, by
2619 permitting reckless collection and retention of personal data, by
2620 allowing ads to be targeted in secret, and by foreclosing on the
2621 possibility of going somewhere else where you might continue to
2622 enjoy your friends without subjecting yourself to commercial
2623 surveillance.
2624 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="tech-is-different"></a>Tech is different</h2></div></div></div><p>
2625 I reject both iterations of technological exceptionalism. I reject
2626 the idea that tech is uniquely terrible and led by people who are
2627 greedier or worse than the leaders of other industries, and I reject
2628 the idea that tech is so good — or so intrinsically prone to
2629 concentration — that it can’t be blamed for its present-day
2630 monopolistic status.
2631 </p><p>
2632 I think tech is just another industry, albeit one that grew up in
2633 the absence of real monopoly constraints. It may have been first,
2634 but it isn’t the worst nor will it be the last.
2635 </p><p>
2636 But there’s one way in which I <span class="emphasis"><em>am</em></span> a tech
2637 exceptionalist. I believe that online tools are the key to
2638 overcoming problems that are much more urgent than tech
2639 monopolization: climate change, inequality, misogyny, and
2640 discrimination on the basis of race, gender identity, and other
2641 factors. The internet is how we will recruit people to fight those
2642 fights, and how we will coordinate their labor. Tech is not a
2643 substitute for democratic accountability, the rule of law, fairness,
2644 or stability — but it’s a means to achieve these things.
2645 </p><p>
2646 The hard problem of our species is coordination. Everything from
2647 climate change to social change to running a business to making a
2648 family work can be viewed as a collective action problem.
2649 </p><p>
2650 The internet makes it easier than at any time before to find people
2651 who want to work on a project with you — hence the success of free
2652 and open-source software, crowdfunding, and racist terror groups —
2653 and easier than ever to coordinate the work you do.
2654 </p><p>
2655 The internet and the computers we connect to it also possess an
2656 exceptional quality: general-purposeness. The internet is designed
2657 to allow any two parties to communicate any data, using any
2658 protocol, without permission from anyone else. The only production
2659 design we have for computers is the general-purpose, <span class="quote"><span class="quote">Turing
2660 complete</span></span> computer that can run every program we can express in
2661 symbolic logic.
2662 </p><p>
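The general-purposeness claim can be seen in miniature in any interpreter: programs are just data handed to one universal evaluator. The toy below is only a sketch — a fixed-opcode stack machine, far short of actual Turing completeness — but it shows the shape of the idea.

```python
# General-purposeness in miniature: one evaluator, programs as data.
# This toy stack machine is only a sketch (a fixed opcode set, not a
# Turing-complete computer), but the shape is the point: the same
# machine runs whatever program it is handed.
def run(program):
    """Execute a list of (opcode, argument) pairs on a stack machine."""
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack

# (2 + 3) * 4, expressed as data rather than wired-in behavior
program = [("push", 2), ("push", 3), ("add", None),
           ("push", 4), ("mul", None)]
```

The same run function, unchanged, evaluates whichever program it is handed — which is why an investment that makes the evaluator faster or cheaper benefits every program at once.
</p><p>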
2663 This means that every time someone with a special communications
2664 need invests in infrastructure and techniques to make the internet
2665 faster, cheaper, and more robust, this benefit redounds to everyone
2666 else who is using the internet to communicate. And this also means
2667 that every time someone with a special computing need invests to
2668 make computers faster, cheaper, and more robust, every other
2669 computing application is a potential beneficiary of this work.
2670 </p><p>
2671 For these reasons, every type of communication is gradually absorbed
2672 into the internet, and every type of device — from airplanes to
2673 pacemakers — eventually becomes a computer in a fancy case.
2674 </p><p>
2675 While these considerations don’t preclude regulating networks and
2676 computers, they do call for gravitas and caution when doing so
2677 because changes to regulatory frameworks could ripple out to have
2678 unintended consequences in many, many other domains.
2679 </p><p>
2680 The upshot of this is that our best hope of solving the big
2681 coordination problems — climate change, inequality, etc. — is with
2682 free, fair, and open tech. Our best hope of keeping tech free, fair,
2683 and open is to exercise caution in how we regulate tech and to
2684 attend closely to the ways in which interventions to solve one
2685 problem might create problems in other domains.
2686 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="ownership-of-facts"></a>Ownership of facts</h2></div></div></div><p>
2687 Big Tech has a funny relationship with information. When you’re
2688 generating information — anything from the location data streaming
2689 off your mobile device to the private messages you send to friends
2690 on a social network — it claims the rights to make unlimited use of
2691 that data.
2692 </p><p>
2693 But when you have the audacity to turn the tables — to use a tool
2694 that blocks ads or slurps your waiting updates out of a social
2695 network and puts them in another app that lets you set your own
2696 priorities and suggestions or crawls their system to allow you to
2697 start a rival business — they claim that you’re stealing from them.
2698 </p><p>
2699 The thing is, information is a very bad fit for any kind of private
2700 property regime. Property rights are useful for establishing markets
2701 that can lead to the effective development of fallow assets. These
2702 markets depend on clear titles to ensure that the things being
2703 bought and sold in them can, in fact, be bought and sold.
2704 </p><p>
2705 Information rarely has such a clear title. Take phone numbers:
2706 There’s clearly something going wrong when Facebook slurps up
2707 millions of users’ address books and uses the phone numbers it finds
2708 in them to plot out social graphs and fill in missing information
2709 about other users.
2710 </p><p>
2711 But the phone numbers Facebook nonconsensually acquires in this
2712 transaction are not the <span class="quote"><span class="quote">property</span></span> of the users they’re taken from
2713 nor do they belong to the people whose phones ring when you dial
2714 those numbers. The numbers are mere integers, 10 digits in the U.S.
2715 and Canada, and they appear in millions of places, including
2716 somewhere deep in pi as well as numerous other contexts. Giving
2717 people ownership titles to integers is an obviously terrible idea.
2718 </p><p>
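Whether a given 10-digit number occurs in pi is, strictly speaking, open (it would follow from pi’s conjectured but unproven normality), but short digit strings do show up almost immediately. A sketch, streaming pi’s decimal digits with Gibbons’s unbounded spigot algorithm:

```python
# The claim that a phone number appears "somewhere deep in pi" rests on
# pi's conjectured (unproven) normality, but short digit strings really
# do turn up early. Gibbons's unbounded spigot algorithm streams the
# decimal digits of pi using only integer arithmetic.
def pi_digits():
    """Yield the decimal digits of pi indefinitely (Gibbons's spigot)."""
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n  # the next digit is settled; shift the state left
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
        else:
            q, r, t, n, k, l = (q * k, (2 * q + r) * l, t * l,
                                (q * (7 * k + 2) + r * l) // (t * l),
                                k + 1, l + 2)

def find_in_pi(target, limit=1000):
    """0-based offset of `target` in the first `limit` digits of pi, or -1."""
    gen = pi_digits()
    digits = "".join(str(next(gen)) for _ in range(limit))
    return digits.find(target)
```

find_in_pi("535897") returns 8, for instance, since pi begins 3.14159265358979…; a full 10-digit phone number would typically need on the order of 10^10 digits before its first appearance.
</p><p>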
2719 Likewise for the facts that Facebook and other commercial
2720 surveillance operators acquire about us, like that we are the
2721 children of our parents or the parents to our children or that we
2722 had a conversation with someone else or went to a public place.
2723 These data points can’t be property in the sense that your house or
2724 your shirt is your property because the title to them is
2725 intrinsically muddy: Does your mom own the fact that she is your
2726 mother? Do you? Do both of you? What about your dad — does he own
2727 this fact too, or does he have to license the fact from you (or your
2728 mom or both of you) in order to use this fact? What about the
2729 hundreds or thousands of other people who know these facts?
2730 </p><p>
2731 If you go to a Black Lives Matter demonstration, do the other
2732 demonstrators need your permission to post their photos from the
2733 event? The online fights over
2734 <a class="ulink" href="https://www.wired.com/story/how-to-take-photos-at-protests/" target="_top">when
2735 and how to post photos from demonstrations</a> reveal a nuanced,
2736 complex issue that cannot be easily hand-waved away by giving one
2737 party a property right that everyone else in the mix has to respect.
2738 </p><p>
2739 The fact that information isn’t a good fit with property and markets
2740 doesn’t mean that it’s not valuable. Babies aren’t property, but
2741 they’re inarguably valuable. In fact, we have a whole set of rules
2742 just for babies as well as a subset of those rules that apply to
2743 humans more generally. Someone who argues that babies won’t be truly
2744 valuable until they can be bought and sold like loaves of bread
2745 would be instantly and rightfully condemned as a monster.
2746 </p><p>
2747 It’s tempting to reach for the property hammer when Big Tech treats
2748 your information like a nail — not least because Big Tech is such a
2749 prolific abuser of property hammers when it comes to
2750 <span class="emphasis"><em>their</em></span> information. But this is a mistake. If we
2751 allow markets to dictate the use of our information, then we’ll find
2752 that we’re sellers in a buyers’ market where the Big Tech monopolies
2753 set a price for our data that is so low as to be insignificant or,
2754 more likely, set at a nonnegotiable price of zero in a click-through
2755 agreement that you don’t have the opportunity to modify.
2756 </p><p>
2757 Meanwhile, establishing property rights over information will create
2758 insurmountable barriers to independent data processing. Imagine that
2759 we require a license to be negotiated when a translated document is
2760 compared with its original, something Google has done and continues
2761 to do billions of times to train its automated language translation
2762 tools. Google can afford this, but independent third parties cannot.
2763 Google can staff a clearances department to negotiate one-time
2764 payments to the likes of the EU (one of the major repositories of
2765 translated documents) while independent watchdogs wanting to verify
2766 that the translations are well-prepared, or to root out bias in
2767 translations, will find themselves needing a staffed-up legal
2768 department and millions for licenses before they can even get
2769 started.
2770 </p><p>
2771 The same goes for things like search indexes of the web or photos of
2772 peoples’ houses, which have become contentious thanks to Google’s
2773 Street View project. Whatever problems may exist with Google’s
2774 photographing of street scenes, resolving them by letting people
2775 decide who can take pictures of the facades of their homes from a
2776 public street will surely create even worse ones. Think of how
2777 street photography is important for newsgathering — including
2778 informal newsgathering, like photographing abuses of authority — and
2779 how being able to document housing and street life is important for
2780 contesting eminent domain, advocating for social aid, reporting
2781 planning and zoning violations, documenting discriminatory and
2782 unequal living conditions, and more.
2783 </p><p>
2784 The ownership of facts is antithetical to many kinds of human
2785 progress. It’s hard to imagine a rule that limits Big Tech’s
2786 exploitation of our collective labors without inadvertently banning
2787 people from gathering data on online harassment or compiling indexes
2788 of changes in language or simply investigating how the platforms are
2789 shaping our discourse — all of which require scraping data that
2790 other people have created and subjecting it to scrutiny and
2791 analysis.
2792 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="persuasion-works-slowly"></a>Persuasion works… slowly</h2></div></div></div><p>
2793 The platforms may oversell their ability to persuade people, but
2794 obviously, persuasion works sometimes. Whether it’s the private
2795 realm that LGBTQ people used to recruit allies and normalize sexual
2796 diversity or the decadeslong project to convince people that markets
2797 are the only efficient way to solve complicated resource allocation
2798 problems, it’s clear that our societal attitudes
2799 <span class="emphasis"><em>can</em></span> change.
2800 </p><p>
2801 The project of shifting societal attitudes is a game of inches and
2802 years. For centuries, svengalis have purported to be able to
2803 accelerate this process, but even the most brutal forms of
2804 propaganda have struggled to make permanent changes. Joseph Goebbels
2805 was able to subject Germans to daily, mandatory, hourslong radio
2806 broadcasts, to round up and torture and murder dissidents, and to
2807 seize full control over their children’s education while banning any
2808 literature, broadcasts, or films that did not comport with his
2809 worldview.
2810 </p><p>
2811 Yet, after 12 years of terror, once the war ended, Nazi ideology was
2812 largely discredited in both East and West Germany, and a program of
2813 national truth and reconciliation was put in its place. Racism and
2814 authoritarianism were never fully abolished in Germany, but neither
2815 were the majority of Germans irrevocably convinced of Nazism — and
2816 the rise of racist authoritarianism in Germany today tells us that
2817 the liberal attitudes that replaced Nazism were no more permanent
2818 than Nazism itself.
2819 </p><p>
2820 Racism and authoritarianism have also always been with us. Anyone
2821 who’s reviewed the kind of messages and arguments that racists put
2822 forward today would be hard-pressed to say that they have gotten
2823 better at presenting their ideas. The same pseudoscience, appeals to
2824 fear, and circular logic that racists presented in the 1980s, when
2825 the cause of white supremacy was on the wane, are to be found in the
2826 communications of leading white nationalists today.
2827 </p><p>
2828 If racists haven’t gotten more convincing in the past decade, then
2829 how is it that more people were convinced to be openly racist at
2830 that time? I believe that the answer lies in the material world, not
2831 the world of ideas. The ideas haven’t gotten more convincing, but
2832 people have become more afraid. Afraid that the state can’t be
2833 trusted to act as an honest broker in life-or-death decisions, from
2834 those regarding the management of the economy to the regulation of
2835 painkillers to the rules for handling private information. Afraid
2836 that the world has become a game of musical chairs in which the
2837 chairs are being taken away at a never-before-seen rate. Afraid that
2838 justice for others will come at their expense. Monopolism isn’t the
2839 cause of these fears, but the inequality, material desperation, and
2840 policy malpractice that monopolism contributes to are significant
2841 drivers of these conditions. Inequality creates the
2842 conditions for both conspiracies and violent racist ideologies, and
2843 then surveillance capitalism lets opportunists target the fearful
2844 and the conspiracy-minded.
2845 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="paying-wont-help"></a>Paying won’t help</h2></div></div></div><p>
2846 As the old saw goes, <span class="quote"><span class="quote">If you’re not paying for the product, you’re
2847 the product.</span></span>
2848 </p><p>
2849 It’s a commonplace belief today that the advent of free,
2850 ad-supported media was the original sin of surveillance capitalism.
2851 The reasoning is that the companies that charged for access couldn’t
2852 <span class="quote"><span class="quote">compete with free</span></span> and so they were driven out of business. Their
2853 ad-supported competitors, meanwhile, declared open season on their
2854 users’ data in a bid to improve their ad targeting and make more
2855 money and then resorted to the most sensationalist tactics to
generate clicks on those ads. If only we’d pay for media again, we’d
have a better, more responsible, more sober discourse that would be
better for democracy.
</p><p>
But the degradation of news products long precedes the advent of
ad-supported online news. Long before newspapers were online, lax
antitrust enforcement had opened the door for unprecedented waves of
consolidation and roll-ups in newsrooms. Rival newspapers were
merged, reporters and ad sales staff were laid off, physical plants
were sold and leased back, leaving the companies loaded up with debt
through leveraged buyouts and subsequent profit-taking by the new
owners. In other words, it wasn’t merely shifts in the classified
advertising market, which was long held to be the primary driver in
the decline of the traditional newsroom, that made news companies
unable to adapt to the internet — it was monopolism.
</p><p>
Then, as news companies <span class="emphasis"><em>did</em></span> come online, the ad
revenues they commanded dropped even as the number of internet users
(and thus potential online readers) increased. That shift was a
function of consolidation in the ad sales market, with Google and
Facebook emerging as duopolists who made more money every year from
advertising while paying less and less of it to the publishers whose
work the ads appeared alongside. Monopolism created a buyer’s market
for ad inventory with Facebook and Google acting as gatekeepers.
</p><p>
Paid services continue to exist alongside free ones, and often it is
these paid services — anxious to prevent people from bypassing their
paywalls or sharing paid media with freeloaders — that exert the
most control over their customers. Apple’s iTunes and App Stores are
paid services, but to maximize their profitability, Apple has to
lock its platforms so that third parties can’t make compatible
software without permission. These locks allow the company to
exercise both editorial control (enabling it to exclude
<a class="ulink" href="https://ncac.org/news/blog/does-apples-strict-app-store-content-policy-limit-freedom-of-expression" target="_top">controversial
political material</a>) and technological control, including
control over who can repair the devices it makes. If we’re worried
that ad-supported products deprive people of their right to
self-determination by using persuasion techniques to nudge their
purchase decisions a few degrees in one direction or the other, then
the near-total control a single company holds over the decision of
who gets to sell you software, parts, and service for your iPhone
should have us very worried indeed.
</p><p>
We shouldn’t just be concerned about payment and control: The idea
that paying will improve discourse is also dangerously wrong. The
poor success rate of targeted advertising means that the platforms
have to incentivize you to <span class="quote"><span class="quote">engage</span></span> with posts at extremely high
levels to generate enough pageviews to safeguard their profits. As
discussed earlier, to increase engagement, platforms like Facebook
use machine learning to guess which messages will be most
inflammatory and make a point of shoving those into your eyeballs at
every turn so that you will hate-click and argue with people.
</p><p>
Perhaps paying would fix this, the reasoning goes. If platforms
could be economically viable even if you stopped clicking on them
once your intellectual and social curiosity had been slaked, then
they would have no reason to algorithmically enrage you to get more
clicks out of you, right?
</p><p>
There may be something to that argument, but it still ignores the
wider economic and political context of the platforms and the world
that allowed them to grow so dominant.
</p><p>
Platforms are world-spanning and all-encompassing because they are
monopolies, and they are monopolies because we have gutted our most
important and reliable anti-monopoly rules. Antitrust was neutered
as a key part of the project to make the wealthy wealthier, and that
project has worked. The vast majority of people on Earth have a
negative net worth, and even the dwindling middle class is in a
precarious state, undersaved for retirement, underinsured for
medical disasters, and undersecured against climate and technology
shocks.
</p><p>
In this wildly unequal world, paying doesn’t improve the discourse;
it simply prices discourse out of the range of the majority of
people. Paying for the product is dandy, if you can afford it.
</p><p>
If you think today’s filter bubbles are a problem for our discourse,
imagine what they’d be like if rich people inhabited free-flowing
Athenian marketplaces of ideas where admission must be paid
while everyone else lives in online spaces that are subsidized by
wealthy benefactors who relish the chance to establish
conversational spaces where the <span class="quote"><span class="quote">house rules</span></span> forbid questioning the
status quo. That is, imagine if the rich seceded from Facebook, and
then, instead of running ads that made money for shareholders,
Facebook became a billionaire’s vanity project that also happened to
ensure that nobody talked about whether it was fair that only
billionaires could afford to hang out in the rarefied corners of the
internet.
</p><p>
Behind the idea of paying for access is a belief that free markets
will address Big Tech’s dysfunction. After all, to the extent that
people have a view of surveillance at all, it is generally an
unfavorable one, and the longer and more thoroughly one is
surveilled, the less one tends to like it. Same goes for lock-in: If
HP’s ink or Apple’s App Store were really obviously fantastic, they
wouldn’t need technical measures to prevent users from choosing a
rival’s product. The only reason these technical countermeasures
exist is that the companies don’t believe their customers would
<span class="emphasis"><em>voluntarily</em></span> submit to their terms, and they
want to deprive them of the choice to take their business elsewhere.
</p><p>
Advocates for markets laud their ability to aggregate the diffused
knowledge of buyers and sellers across a whole society through
demand signals, price signals, and so on. The argument for
surveillance capitalism being a <span class="quote"><span class="quote">rogue capitalism</span></span> is that
machine-learning-driven persuasion techniques distort
decision-making by consumers, leading to incorrect signals —
consumers don’t buy what they prefer, they buy what they’re tricked
into preferring. It follows that the monopolistic practices of
lock-in, which do far more to constrain consumers’ free choices, are
even more of a <span class="quote"><span class="quote">rogue capitalism.</span></span>
</p><p>
The profitability of any business is constrained by the possibility
that its customers will take their business elsewhere. Both
surveillance and lock-in are anti-features that no customer wants.
But monopolies can capture their regulators, crush their
competitors, insert themselves into their customers’ lives, and
corral people into <span class="quote"><span class="quote">choosing</span></span> their services regardless of whether
they want them — it’s fine to be terrible when there is no
alternative.
</p><p>
Ultimately, surveillance and lock-in are both simply business
strategies that monopolists can choose. Surveillance companies like
Google are perfectly capable of deploying lock-in technologies —
just look at the onerous Android licensing terms that require
device-makers to bundle in Google’s suite of applications. And
lock-in companies like Apple are perfectly capable of subjecting
their users to surveillance if it means keeping the Chinese
government happy and preserving ongoing access to Chinese markets.
Monopolies may be made up of good, ethical people, but as
institutions, they are not your friend — they will do whatever they
can get away with to maximize their profits, and the more
monopolistic they are, the more they <span class="emphasis"><em>can</em></span> get
away with.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="an-ecology-moment-for-trustbusting"></a>An <span class="quote"><span class="quote">ecology</span></span> moment for trustbusting</h2></div></div></div><p>
If we’re going to break Big Tech’s death grip on our digital lives,
we’re going to have to fight monopolies. That may sound pretty
mundane and old-fashioned, something out of the New Deal era, while
ending the use of automated behavioral modification feels like the
plotline of a really cool cyberpunk novel.
</p><p>
Meanwhile, breaking up monopolies is something we seem to have
forgotten how to do. There is a bipartisan, trans-Atlantic consensus
that breaking up companies is a fool’s errand at best — liable to
mire your federal prosecutors in decades of litigation — and
counterproductive at worst, eroding the <span class="quote"><span class="quote">consumer benefits</span></span> of large
companies with massive efficiencies of scale.
</p><p>
But trustbusters once strode the nation, brandishing law books,
terrorizing robber barons, and shattering the illusion of
monopolies’ all-powerful grip on our society. The trustbusting era
could not begin until we found the political will — until the people
convinced politicians they’d have their backs when they went up
against the richest, most powerful men in the world.
</p><p>
Could we find that political will again?
</p><p>
Copyright scholar James Boyle has described how the term <span class="quote"><span class="quote">ecology</span></span>
marked a turning point in environmental activism. Prior to the
adoption of this term, people who wanted to preserve whale
populations didn’t necessarily see themselves as fighting the same
battle as people who wanted to protect the ozone layer or fight
freshwater pollution or beat back smog or acid rain.
</p><p>
But the term <span class="quote"><span class="quote">ecology</span></span> welded these disparate causes together into a
single movement, and the members of this movement found solidarity
with one another. The people who cared about smog signed petitions
circulated by the people who wanted to end whaling, and the
anti-whalers marched alongside the people demanding action on acid
rain. This uniting behind a common cause completely changed the
dynamics of environmentalism, setting the stage for today’s climate
activism and the sense that preserving the habitability of the
planet Earth is a shared duty among all people.
</p><p>
I believe we are on the verge of a new <span class="quote"><span class="quote">ecology</span></span> moment dedicated to
combating monopolies. After all, tech isn’t the only concentrated
industry, nor is it even the <span class="emphasis"><em>most</em></span> concentrated
of industries.
</p><p>
You can find partisans for trustbusting in every sector of the
economy. Everywhere you look, you can find people who’ve been
wronged by monopolists who’ve trashed their finances, their health,
their privacy, their educations, and the lives of people they love.
Those people have the same cause as the people who want to break up
Big Tech and the same enemies. When most of the world’s wealth is in
the hands of a very few, it follows that nearly every large company
will have overlapping shareholders.
</p><p>
That’s the good news: With a little bit of work and a little bit of
coalition building, we have more than enough political will to break
up Big Tech and every other concentrated industry besides. First we
take Facebook, then we take AT&amp;T/WarnerMedia.
</p><p>
But here’s the bad news: Much of what we’re doing to tame Big Tech
<span class="emphasis"><em>instead</em></span> of breaking up the big companies also
forecloses on the possibility of breaking them up later.
</p><p>
Big Tech’s concentration currently means that their inaction on
harassment, for example, leaves users with an impossible choice:
absent themselves from public discourse by, say, quitting Twitter, or
endure vile, constant abuse. Big Tech’s over-collection and
over-retention of data results in horrific identity theft. And their
inaction on extremist recruitment means that white supremacists who
livestream their shooting rampages can reach an audience of
billions. The combination of tech concentration and media
concentration means that artists’ incomes are falling even as the
revenue generated by their creations is increasing.
</p><p>
Yet governments confronting all of these problems inevitably
converge on the same solution: deputize the Big Tech giants to
police their users and render them liable for their users’ bad
actions. The drive to force Big Tech to use automated filters to
block everything from copyright infringement to sex-trafficking to
violent extremism means that tech companies will have to allocate
hundreds of millions of dollars to run these compliance systems.
</p><p>
These rules — the EU’s new Directive on Copyright, Australia’s new
terror regulation, America’s FOSTA/SESTA sex-trafficking law, and
more — are not just death warrants for small, upstart competitors
that might challenge Big Tech’s dominance but lack the deep
pockets of established incumbents to pay for all these automated
systems. Worse still, these rules put a floor under how small we can
hope to make Big Tech.
</p><p>
That’s because any move to break up Big Tech and cut it down to size
will have to cope with the hard limit of not making these companies
so small that they can no longer afford to perform these duties —
and it’s <span class="emphasis"><em>expensive</em></span> to invest in those automated
filters and outsource content moderation. It’s already going to be
hard to unwind these deeply concentrated, chimeric behemoths that
have been welded together in the pursuit of monopoly profits. Doing
so while simultaneously finding some way to fill the regulatory void
that would be left behind if these self-policing rulers were forced
to suddenly abdicate will be much, much harder.
</p><p>
Allowing the platforms to grow to their present size has given them
a dominance that is nearly insurmountable — deputizing them with
public duties to redress the pathologies created by their size makes
it virtually impossible to reduce that size. Lather, rinse, repeat:
If the platforms don’t get smaller, they will get larger, and as
they get larger, they will create more problems, which will give
rise to more public duties for the companies, which will make them
bigger still.
</p><p>
We can work to fix the internet by breaking up Big Tech and
depriving them of monopoly profits, or we can work to fix Big Tech
by making them spend their monopoly profits on governance. But we
can’t do both. We have to choose between a vibrant, open internet and
a dominated, monopolized internet commanded by Big Tech giants
that we must constantly struggle with to get them to behave.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="make-big-tech-small-again"></a>Make Big Tech small again</h2></div></div></div><p>
Trustbusting is hard. Breaking big companies into smaller ones is
expensive and time-consuming. So time-consuming that by the time
you’re done, the world has often moved on and rendered years of
litigation irrelevant. From 1969 to 1982, the U.S. government
pursued an antitrust case against IBM over its dominance of
mainframe computing — but the case collapsed in 1982 because
mainframes were being speedily replaced by PCs.
</p><div class="blockquote"><blockquote class="blockquote"><p>
A future U.S. president could simply direct their attorney general
to enforce the law as it was written.
</p></blockquote></div><p>
It’s far easier to prevent concentration than to fix it, and
reinstating the traditional contours of U.S. antitrust enforcement
will, at the very least, prevent further concentration. That means
bans on mergers between large companies, on big companies acquiring
nascent competitors, and on platform companies competing directly
with the companies that rely on the platforms.
</p><p>
These powers are all in the plain language of U.S. antitrust laws,
so in theory, a future U.S. president could simply direct their
attorney general to enforce the law as it was written. But after
decades of judicial <span class="quote"><span class="quote">education</span></span> in the benefits of monopolies, after
multiple administrations that have packed the federal courts with
lifetime-appointed monopoly cheerleaders, it’s not clear that mere
administrative action would do the trick.
</p><p>
If the courts frustrate the Justice Department and the president,
the next stop would be Congress, which could eliminate any doubt
about how antitrust law should be enforced in the U.S. by passing
new laws that boil down to saying, <span class="quote"><span class="quote">Knock it off. We all know what
the Sherman Act says. Robert Bork was a deranged fantasist. For
avoidance of doubt, <span class="emphasis"><em>fuck that guy</em></span>.</span></span> In other
words, the problem with monopolies is
<span class="emphasis"><em>monopolism</em></span> — the concentration of power into
too few hands, which erodes our right to self-determination. If
there is a monopoly, the law wants it gone, period. Sure, get rid of
monopolies that create <span class="quote"><span class="quote">consumer harm</span></span> in the form of higher prices,
but also, <span class="emphasis"><em>get rid of other monopolies, too.</em></span>
</p><p>
But this only prevents things from getting worse. To help them get
better, we will have to build coalitions with other activists in the
anti-monopoly ecology movement — a pluralism movement or a
self-determination movement — and target existing monopolies in
every industry for breakup and structural separation rules that
prevent, for example, the giant eyewear monopolist Luxottica from
dominating both the sale and the manufacture of spectacles.
</p><p>
In an important sense, it doesn’t matter which industry the breakups
begin in. Once they start, shareholders in
<span class="emphasis"><em>every</em></span> industry will start to eye their
investments in monopolists skeptically. As trustbusters ride into
town and start making lives miserable for monopolists, the debate
around every corporate boardroom’s table will shift. People within
corporations who’ve always felt uneasy about monopolism will gain a
powerful new argument to fend off their evil rivals in the corporate
hierarchy: <span class="quote"><span class="quote">If we do it my way, we make less money; if we do it your
way, a judge will fine us billions and expose us to ridicule and
public disapprobation. So even though I get that it would be really
cool to do that merger, lock out that competitor, or buy that little
company and kill it before it can threaten us, we really shouldn’t —
not if we don’t want to get tied to the DOJ’s bumper and get dragged
up and down Trustbuster Road for the next 10 years.</span></span>
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="goto-10"></a>20 GOTO 10</h2></div></div></div><p>
Fixing Big Tech will require a lot of iteration. As cyber lawyer
Lawrence Lessig wrote in his 1999 book, <span class="emphasis"><em>Code and Other
Laws of Cyberspace</em></span>, our lives are regulated by four
forces: law (what’s legal), code (what’s technologically possible),
norms (what’s socially acceptable), and markets (what’s profitable).
</p><p>
If you could wave a wand and get Congress to pass a law that
re-fanged the Sherman Act tomorrow, you could use the impending
breakups to convince venture capitalists to fund competitors to
Facebook, Google, Twitter, and Apple that would be waiting in the
wings after they were cut down to size.
</p><p>
But getting Congress to act will require a massive normative shift,
a mass movement of people who care about monopolies — and pulling
them apart.
</p><p>
Getting people to care about monopolies will take technological
interventions that help them to see what a world free from Big Tech
might look like. Imagine if someone could make a beloved (but
unauthorized) third-party Facebook or Twitter client that dampens
the anxiety-producing algorithmic drumbeat and still lets you talk
to your friends without being spied upon — something that made
social media more sociable and less toxic. Now imagine that it gets
shut down in a brutal legal battle. It’s always easier to convince
people that something must be done to save a thing they love than it
is to excite them about something that doesn’t even exist yet.
</p><p>
Neither tech nor law nor code nor markets are sufficient to reform
Big Tech. But a profitable competitor to Big Tech could bankroll a
legislative push; legal reform can embolden a toolsmith to make a
better tool; the tool can create customers for a potential business
who value the benefits of the internet but want them delivered
without Big Tech; and that business can get funded and divert some
of its profits to legal reform. 20 GOTO 10 (or lather, rinse,
repeat). Do it again, but this time, get farther! After all, this
time you’re starting with weaker Big Tech adversaries, a
constituency that understands things can be better, Big Tech rivals
who’ll help ensure their own future by bankrolling reform, and code
that other programmers can build on to weaken Big Tech even further.
</p><p>
The surveillance capitalism hypothesis — that Big Tech’s products
really work as well as they say they do and that’s why everything is
so screwed up — is way too easy on surveillance and even easier on
capitalism. Companies spy because they believe their own BS, and
companies spy because governments let them, and companies spy
because any advantage from spying is so short-lived and minor that
they have to do more and more of it just to stay in place.
</p><p>
As to why things are so screwed up? Capitalism. Specifically, the
monopolism that creates inequality and the inequality that creates
monopolism. It’s a form of capitalism that rewards sociopaths who
destroy the real economy to inflate the bottom line, and they get
away with it for the same reason companies get away with spying:
because our governments are in thrall both to the ideology that says
monopolies are actually just fine and to the ideology that
says that in a monopolistic world, you’d better not piss off the
monopolists.
</p><p>
Surveillance doesn’t make capitalism rogue. Capitalism’s unchecked
rule begets surveillance. Surveillance isn’t bad because it lets
people manipulate us. It’s bad because it crushes our ability to be
our authentic selves — and because it lets the rich and powerful
figure out who might be thinking of building guillotines and what
dirt they can use to discredit those embryonic guillotine-builders
before they can even get to the lumberyard.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="up-and-through"></a>Up and through</h2></div></div></div><p>
With all the problems of Big Tech, it’s tempting to imagine solving
the problem by returning to a world without tech at all. Resist that
temptation.
</p><p>
The only way out of our Big Tech problem is up and through. If our
future is not reliant upon high tech, it will be because
civilization has fallen. Big Tech wired together a planetary,
species-wide nervous system that, with the proper reforms and course
corrections, is capable of seeing us through the existential
challenge of our species and planet. Now it’s up to us to seize the
means of computation, putting that electronic nervous system under
democratic, accountable control.
</p><p>
I am, secretly, despite what I have said earlier, a tech
exceptionalist. Not in the sense of thinking that tech should be
given a free pass to monopolize because it has <span class="quote"><span class="quote">economies of scale</span></span>
or some other nebulous feature. I’m a tech exceptionalist because I
believe that getting tech right matters and that getting it wrong
will be an unmitigated catastrophe — and doing it right can give us
the power to work together to save our civilization, our species,
and our planet.
</p></div></div></body></html>