How to Destroy Surveillance Capitalism
======================================

:author: Cory Doctorow
:copyright: © 2020 Cory Doctorow, CC BY-ND

The net of a thousand lies
--------------------------

The most surprising thing about the rebirth of flat Earthers in the 21st
century is just how widespread the evidence against them is. You can
understand how, centuries ago, people who’d never gained a high-enough
vantage point from which to see the Earth’s curvature might come to the
commonsense belief that the flat-seeming Earth was, indeed, flat.

But today, when elementary schools routinely dangle GoPro cameras from
balloons and loft them high enough to photograph the Earth’s curve — to
say nothing of the unexceptional sight of the curved Earth from an
airplane window — it takes a heroic effort to maintain the belief that
the world is flat.

Likewise for white nationalism and eugenics: In an age where you can
become a computational genomics datapoint by swabbing your cheek and
mailing it to a gene-sequencing company along with a modest sum of
money, “race science” has never been easier to refute.

We are living through a golden age of both readily available facts and
denial of those facts. Terrible ideas that have lingered on the fringes
for decades or even centuries have gone mainstream seemingly overnight.

When an obscure idea gains currency, there are only two things that can
explain its ascendance: Either the person expressing that idea has
gotten a lot better at stating their case, or the proposition has become
harder to deny in the face of mounting evidence. In other words, if we
want people to take climate change seriously, we can get a bunch of
Greta Thunbergs to make eloquent, passionate arguments from podiums,
winning our hearts and minds, or we can wait for flood, fire, broiling
sun, and pandemics to make the case for us. In practice, we’ll probably
have to do some of both: The more we’re boiling and burning and drowning
and wasting away, the easier it will be for the Greta Thunbergs of the
world to convince us.

The arguments for ridiculous beliefs in odious conspiracies like
anti-vaccination, climate denial, a flat Earth, and eugenics are no
better than they were a generation ago. Indeed, they’re worse because
they are being pitched to people who have at least a background
awareness of the refuting facts.

Anti-vax has been around since the first vaccines, but the early
anti-vaxxers were pitching people who were less equipped to understand
even the most basic ideas from microbiology, and moreover, those people
had not witnessed the extermination of mass-murdering diseases like
polio, smallpox, and measles. Today’s anti-vaxxers are no more eloquent
than their forebears, and they have a much harder job.

So can these far-fetched conspiracy theorists really be succeeding on
the basis of superior arguments?

Some people think so. Today, there is a widespread belief that machine
learning and commercial surveillance can turn even the most
fumble-tongued conspiracy theorist into a svengali who can warp your
perceptions and win your belief by locating vulnerable people and then
pitching them with A.I.-refined arguments that bypass their rational
faculties and turn everyday people into flat Earthers, anti-vaxxers, or
even Nazis. When the RAND Corporation `blames Facebook for
“radicalization” <https://www.rand.org/content/dam/rand/pubs/research_reports/RR400/RR453/RAND_RR453.pdf>`__
and when Facebook’s role in spreading coronavirus misinformation is
`blamed on its
algorithm <https://secure.avaaz.org/campaign/en/facebook_threat_health/>`__,
the implicit message is that machine learning and surveillance are
causing the changes in our consensus about what’s true.

After all, in a world where sprawling and incoherent conspiracy theories
like Pizzagate and its successor, QAnon, have widespread followings,
*something* must be afoot.

But what if there’s another explanation? What if it’s the material
circumstances, and not the arguments, that are making the difference for
these conspiracy pitchmen? What if the trauma of living through *real
conspiracies* all around us — conspiracies among wealthy people, their
lobbyists, and lawmakers to bury inconvenient facts and evidence of
wrongdoing (these conspiracies are commonly known as “corruption”) — is
making people vulnerable to conspiracy theories?

If it’s trauma and not contagion — material conditions and not ideology
— that is making the difference today and enabling a rise of repulsive
misinformation in the face of easily observed facts, that doesn’t mean
our computer networks are blameless. They’re still doing the heavy work
of locating vulnerable people and guiding them through a series of
ever-more-extreme ideas and communities.

Belief in conspiracy is a raging fire that has done real damage and
poses real danger to our planet and species, from epidemics `kicked off
by vaccine denial <https://www.cdc.gov/measles/cases-outbreaks.html>`__
to genocides `kicked off by racist
conspiracies <https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html>`__
to planetary meltdown caused by denial-inspired climate inaction. Our
world is on fire, and so we have to put the fires out — to figure out
how to help people see the truth of the world through the conspiracies
they’ve been confused by.

But firefighting is reactive. We need fire *prevention*. We need to
strike at the traumatic material conditions that make people vulnerable
to the contagion of conspiracy. Here, too, tech has a role to play.

There’s no shortage of proposals to address this. From the EU’s
`Terrorist Content Regulation <https://edri.org/tag/terreg/>`__, which
requires platforms to police and remove “extremist” content, to the U.S.
proposals to `force tech companies to spy on their
users <https://www.eff.org/deeplinks/2020/03/earn-it-act-violates-constitution>`__
and hold them liable `for their users’ bad
speech <https://www.natlawreview.com/article/repeal-cda-section-230>`__,
there’s a lot of energy to force tech companies to solve the problems
they created.

There’s a critical piece missing from the debate, though. All these
solutions assume that tech companies are a fixture, that their dominance
over the internet is a permanent fact. Proposals to replace Big Tech
with a more diffused, pluralistic internet are nowhere to be found.
Worse: The “solutions” on the table today *require* Big Tech to stay big
because only the very largest companies can afford to implement the
systems these laws demand.

Figuring out what we want our tech to look like is crucial if we’re
going to get out of this mess. Today, we’re at a crossroads where we’re
trying to figure out if we want to fix the Big Tech companies that
dominate our internet or if we want to fix the internet itself by
unshackling it from Big Tech’s stranglehold. We can’t do both, so we
have to choose.

I want us to choose wisely. Taming Big Tech is integral to fixing the
Internet, and for that, we need digital rights activism.

Digital rights activism, a quarter-century on
---------------------------------------------

Digital rights activism is more than 30 years old now. The Electronic
Frontier Foundation turned 30 this year; the Free Software Foundation
launched in 1985. For most of the history of the movement, the most
prominent criticism leveled against it was that it was irrelevant: The
real activist causes were real-world causes (think of the skepticism
when `Finland declared broadband a human right in
2010 <https://www.loc.gov/law/foreign-news/article/finland-legal-right-to-broadband-for-all-citizens/#:~:text=Global%20Legal%20Monitor,-Home%20%7C%20Search%20%7C%20Browse&text=(July%206%2C%202010)%20On,connection%20100%20MBPS%20by%202015.>`__),
and real-world activism was shoe-leather activism (think of Malcolm
Gladwell’s `contempt for
“clicktivism” <https://www.newyorker.com/magazine/2010/10/04/small-change-malcolm-gladwell>`__).
But as tech has grown more central to our daily lives, these accusations
of irrelevance have given way first to accusations of insincerity (“You
only care about tech because you’re `shilling for tech
companies <https://www.ipwatchdog.com/2018/06/04/report-engine-eff-shills-google-patent-reform/id=98007/>`__\ ”)
and then to accusations of negligence (“Why didn’t you foresee that tech
could be such a destructive force?”). But digital rights activism is
right where it’s always been: looking out for the humans in a world
where tech is inexorably taking over.

The latest version of this critique comes in the form of “surveillance
capitalism,” a term coined by business professor Shoshana Zuboff in her
long and influential 2019 book, *The Age of Surveillance Capitalism: The
Fight for a Human Future at the New Frontier of Power*. Zuboff argues
that “surveillance capitalism” is a unique creature of the tech industry
and that it is unlike any other abusive commercial practice in history,
one that is “constituted by unexpected and often illegible mechanisms of
extraction, commodification, and control that effectively exile persons
from their own behavior while producing new markets of behavioral
prediction and modification. Surveillance capitalism challenges
democratic norms and departs in key ways from the centuries-long
evolution of market capitalism.” It is a new and deadly form of
capitalism, a “rogue capitalism,” and our lack of understanding of its
unique capabilities and dangers represents an existential, species-wide
threat. She’s right that capitalism today threatens our species, and
she’s right that tech poses unique challenges to our species and
civilization, but she’s really wrong about how tech is different and why
it threatens our species.

What’s more, I think that her incorrect diagnosis will lead us down a
path that ends up making Big Tech stronger, not weaker. We need to take
down Big Tech, and to do that, we need to start by correctly identifying
the problem.

Tech exceptionalism, then and now
---------------------------------

Early critics of the digital rights movement — a movement perhaps best
represented by campaigning organizations like the Electronic Frontier
Foundation, the Free Software Foundation, Public Knowledge, and others
that focused on preserving and enhancing basic human rights in the
digital realm — damned activists for practicing “tech exceptionalism.”
Around the turn of the millennium, serious people ridiculed any claim
that tech policy mattered in the “real world.” Claims that tech rules
had implications for speech, association, privacy, search and seizure,
and fundamental rights and equities were treated as ridiculous, an
elevation of the concerns of sad nerds arguing about *Star Trek* on
bulletin board systems above the struggles of the Freedom Riders, Nelson
Mandela, or the Warsaw ghetto uprising.

In the decades since, accusations of “tech exceptionalism” have only
sharpened as tech’s role in everyday life has expanded: Now that tech
has infiltrated every corner of our life and our online lives have been
monopolized by a handful of giants, defenders of digital freedoms are
accused of carrying water for Big Tech, providing cover for its
self-interested negligence (or worse, nefarious plots).

From my perspective, the digital rights movement has remained stationary
while the rest of the world has moved. From the earliest days, the
movement’s concern was users and the toolsmiths who provided the code
they needed to realize their fundamental rights. Digital rights
activists only cared about companies to the extent that companies were
acting to uphold users’ rights (or, just as often, when companies were
acting so foolishly that they threatened to bring down new rules that
would also make it harder for good actors to help users).

The “surveillance capitalism” critique recasts the digital rights
movement in a new light again: not as alarmists who overestimate the
importance of their shiny toys nor as shills for big tech but as serene
deck-chair rearrangers whose long-standing activism is a liability
because it makes them incapable of perceiving novel threats as they
continue to fight the last century’s tech battles.

But tech exceptionalism is a sin no matter who practices it.

Don’t believe the hype
-----------------------

You’ve probably heard that “if you’re not paying for the product, you’re
the product.” As we’ll see below, that’s true, if incomplete. But what
is *absolutely* true is that ad-driven Big Tech’s customers are
advertisers, and what companies like Google and Facebook sell is their
ability to convince *you* to buy stuff. Big Tech’s product is
persuasion. The services — social media, search engines, maps,
messaging, and more — are delivery systems for persuasion.

The fear of surveillance capitalism starts from the (correct)
presumption that everything Big Tech says about itself is probably a
lie. But the surveillance capitalism critique makes an exception for the
claims Big Tech makes in its sales literature — the breathless hype in
the pitches to potential advertisers online and in ad-tech seminars
about the efficacy of its products: It assumes that Big Tech is as good
at influencing us as they claim they are when they’re selling
influencing products to credulous customers. That’s a mistake because
sales literature is not a reliable indicator of a product’s efficacy.

Surveillance capitalism assumes that because advertisers buy a lot of
what Big Tech is selling, Big Tech must be selling something real. But
Big Tech’s massive sales could just as easily be the result of a popular
delusion or something even more pernicious: monopolistic control over
our communications and commerce.

Being watched changes your behavior, and not for the better. It creates
risks for our social progress. Zuboff’s book features beautifully
wrought explanations of these phenomena. But Zuboff also claims that
surveillance literally robs us of our free will — that when our personal
data is mixed with machine learning, it creates a system of persuasion
so devastating that we are helpless before it. That is, Facebook uses an
algorithm to analyze the data it nonconsensually extracts from your
daily life and uses it to customize your feed in ways that get you to
buy stuff. It is a mind-control ray out of a 1950s comic book, wielded
by mad scientists whose supercomputers guarantee them perpetual and
total world domination.

What is persuasion?
-------------------

To understand why you shouldn’t worry about mind-control rays — but why
you *should* worry about surveillance *and* Big Tech — we must start by
unpacking what we mean by “persuasion.”

Google, Facebook, and other surveillance capitalists promise their
customers (the advertisers) that if they use machine-learning tools
trained on unimaginably large data sets of nonconsensually harvested
personal information, they will be able to uncover ways to bypass the
rational faculties of the public and direct their behavior, creating a
stream of purchases, votes, and other desired outcomes.

But there’s little evidence that this is happening. Instead, the
predictions that surveillance capitalism delivers to its customers are
much less impressive. Rather than finding ways to bypass our rational
faculties, surveillance capitalists like Mark Zuckerberg mostly do one
or more of these things:

1. Segmenting
~~~~~~~~~~~~~

If you’re selling diapers, you have better luck if you pitch them to
people in maternity wards. Not everyone who enters or leaves a maternity
ward just had a baby, and not everyone who just had a baby is in the
market for diapers. But having a baby is a really reliable correlate of
being in the market for diapers, and being in a maternity ward is highly
correlated with having a baby. Hence diaper ads around maternity wards
(and even pitchmen for baby products, who haunt maternity wards with
baskets full of freebies).

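The yield improvement from stacking correlates like these is just
conditional-probability arithmetic. Here is a toy sketch in Python;
every rate below is an invented assumption for illustration, not real
ad-industry data:

```python
# Toy illustration of segmenting: all rates are invented for illustration.
base_rate = 0.01             # assumed share of the general public shopping for diapers
p_baby_given_ward = 0.60     # assumed share of maternity-ward visitors who just had a baby
p_diapers_given_baby = 0.80  # assumed share of new parents in the market for diapers

# Chaining the two imperfect correlates gives the hit rate for ads
# shown around the maternity ward.
segmented_rate = p_baby_given_ward * p_diapers_given_baby

print(f"untargeted hit rate: {base_rate:.0%}")
print(f"ward-targeted rate:  {segmented_rate:.0%}")
print(f"improvement:         {segmented_rate / base_rate:.0f}x")
```

Neither correlate identifies any individual buyer with certainty, which
is the point of the paragraph above; multiplied together, though, they
still raise the hit rate dramatically.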
Surveillance capitalism is segmenting times a billion. Diaper vendors
can go way beyond people in maternity wards (though they can do that,
too, with things like location-based mobile ads). They can target you
based on whether you’re reading articles about child-rearing, diapers,
or a host of other subjects, and data mining can suggest unobvious
keywords to advertise against. They can target you based on the articles
you’ve recently read. They can target you based on what you’ve recently
purchased. They can target you based on whether you receive emails or
private messages about these subjects — or even if you speak aloud about
them (though Facebook and the like convincingly claim that’s not
happening — yet).

This is seriously creepy.

But it’s not mind control.

It doesn’t deprive you of your free will. It doesn’t trick you.

Think of how surveillance capitalism works in politics. Surveillance
capitalist companies sell political operatives the power to locate
people who might be receptive to their pitch. Candidates campaigning on
finance industry corruption seek people struggling with debt; candidates
campaigning on xenophobia seek out racists. Political operatives have
always targeted their message whether their intentions were honorable or
not: Union organizers set up pitches at factory gates, and white
supremacists hand out fliers at John Birch Society meetings.

But this is an inexact and thus wasteful practice. The union organizer
can’t know which worker to approach on the way out of the factory gates
and may waste their time on a covert John Birch Society member; the
white supremacist doesn’t know which of the Birchers are so delusional
that making it to a meeting is as much as they can manage and which ones
might be convinced to cross the country to carry a tiki torch through
the streets of Charlottesville, Virginia.

Because targeting improves the yields on political pitches, it can
accelerate the pace of political upheaval by making it possible for
everyone who has secretly wished for the toppling of an autocrat — or
just an 11-term incumbent politician — to find everyone else who feels
the same way at very low cost. This has been critical to the rapid
crystallization of recent political movements including Black Lives
Matter and Occupy Wall Street as well as less savory players like the
far-right white nationalist movements that marched in Charlottesville.

It’s important to differentiate this kind of political organizing from
influence campaigns; finding people who secretly agree with you isn’t
the same as convincing people to agree with you. The rise of phenomena
like nonbinary or otherwise nonconforming gender identities is often
characterized by reactionaries as the result of online brainwashing
campaigns that convince impressionable people that they have been
secretly queer all along.

But the personal accounts of those who have come out tell a different
story where people who long harbored a secret about their gender were
emboldened by others coming forward and where people who knew that they
were different but lacked a vocabulary for discussing that difference
learned the right words from these low-cost means of finding people and
learning about their ideas.

2. Deception
~~~~~~~~~~~~

Lies and fraud are pernicious, and surveillance capitalism supercharges
them through targeting. If you want to sell a fraudulent payday loan or
subprime mortgage, surveillance capitalism can help you find people who
are both desperate and unsophisticated and thus receptive to your pitch.
This accounts for the rise of many phenomena, like multilevel marketing
schemes, in which deceptive claims about potential earnings and the
efficacy of sales techniques are targeted at desperate people by
advertising against search queries that indicate, for example, someone
struggling with ill-advised loans.

Surveillance capitalism also abets fraud by making it easy to locate
other people who have been similarly deceived, forming a community of
people who reinforce one another’s false beliefs. Think of `the
forums <https://www.vulture.com/2020/01/the-dream-podcast-review.html>`__
where people who are being victimized by multilevel marketing frauds
gather to trade tips on how to improve their luck in peddling the
product.

Sometimes, online deception involves replacing someone’s correct beliefs
with incorrect ones, as it does in the anti-vaccination movement, whose
victims are often people who start out believing in vaccines but are
convinced by seemingly plausible evidence that leads them into the false
belief that vaccines are harmful.

But it’s much more common for fraud to succeed when it doesn’t have to
displace a true belief. When my daughter contracted head lice at
daycare, one of the daycare workers told me I could get rid of them by
treating her hair and scalp with olive oil. I didn’t know anything about
head lice, and I assumed that the daycare worker did, so I tried it (it
didn’t work, and it doesn’t work). It’s easy to end up with false
beliefs when you simply don’t know any better and when those beliefs are
conveyed by someone who seems to know what they’re doing.

This is pernicious and difficult — and it’s also the kind of thing the
internet can help guard against by making true information available,
especially in a form that exposes the underlying deliberations among
parties with sharply divergent views, such as Wikipedia. But it’s not
brainwashing; it’s fraud. In the `majority of
cases <https://datasociety.net/library/data-voids/>`__, the victims of
these fraud campaigns have an informational void filled in the customary
way, by consulting a seemingly reliable source. If I look up the length
of the Brooklyn Bridge and learn that it is 5,800 feet long, but in
reality, it is 5,989 feet long, the underlying deception is a problem,
but it’s a problem with a simple remedy. It’s a very different problem
from the anti-vax issue in which someone’s true belief is displaced by a
false one by means of sophisticated persuasion.

3. Domination
~~~~~~~~~~~~~

Surveillance capitalism is the result of monopoly: monopoly is the
cause, and surveillance capitalism and its negative outcomes are its
effects. I’ll get into this in depth later, but for now, suffice it to
say that the tech industry has grown up with a radical theory of
antitrust that has allowed companies to grow by merging with their
rivals, buying up their nascent competitors, and expanding to control
whole market verticals.

One example of how monopolism aids in persuasion is through dominance:
Google makes editorial decisions about its algorithms that determine the
sort order of the responses to our queries. If a cabal of fraudsters
has set out to trick the world into thinking that the Brooklyn Bridge
is 5,800 feet long, and if Google gives a high search rank to this group
in response to queries like “How long is the Brooklyn Bridge?” then the
first eight or 10 screens’ worth of Google results could be wrong. And
since most people don’t go beyond the first couple of results — let
alone the first *page* of results — Google’s choice means that many
people will be deceived.

Google’s dominance over search — more than 86% of web searches are
performed through Google — means that the way it orders its search
results has an outsized effect on public beliefs. Ironically, Google
claims this is why it can’t afford to have any transparency in its
algorithm design: Google’s search dominance makes the results of its
sorting too important to risk telling the world how it arrives at those
results lest some bad actor discover a flaw in the ranking system and
exploit it to push its point of view to the top of the search results.
There’s an obvious remedy to a company that is too big to audit: break
it up into smaller pieces.

Zuboff calls surveillance capitalism a “rogue capitalism” whose
data-hoarding and machine-learning techniques rob us of our free will.
But influence campaigns that seek to displace existing, correct beliefs
with false ones have an effect that is small and temporary while
monopolistic dominance over informational systems has massive, enduring
effects. Controlling the results to the world’s search queries means
controlling access both to arguments and their rebuttals and, thus,
control over much of the world’s beliefs. If our concern is how
corporations are foreclosing on our ability to make up our own minds and
determine our own futures, the impact of dominance far exceeds the
impact of manipulation and should be central to our analysis and any
remedies we seek.

4. Bypassing our rational faculties
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

*This* is the good stuff: using machine learning, “dark patterns,”
engagement hacking, and other techniques to get us to do things that run
counter to our better judgment. This is mind control.

Some of these techniques have proven devastatingly effective (if only in
the short term). The use of countdown timers on a purchase completion
page can create a sense of urgency that causes you to ignore the nagging
internal voice suggesting that you should shop around or sleep on your
decision. The use of people from your social graph in ads can provide
“social proof” that a purchase is worth making. Even the auction system
pioneered by eBay is calculated to play on our cognitive blind spots,
letting us feel like we “own” something because we bid on it, thus
encouraging us to bid again when we are outbid to ensure that “our”
things stay ours.

Games are extraordinarily good at this. “Free to play” games manipulate
us through many techniques, such as presenting players with a series of
smoothly escalating challenges that create a sense of mastery and
accomplishment but which sharply transition into a set of challenges
that are impossible to overcome without paid upgrades. Add some social
proof to the mix — a stream of notifications about how well your friends
are faring — and before you know it, you’re buying virtual power-ups to
get to the next level.

Companies have risen and fallen on these techniques, and the “fallen”
part is worth paying attention to. In general, living things adapt to
stimulus: Something that is very compelling or noteworthy when you first
encounter it fades with repetition until you stop noticing it
altogether. Consider the refrigerator hum that irritates you when it
starts up but disappears into the background so thoroughly that you only
notice it when it stops again.

That’s why behavioral conditioning uses “intermittent reinforcement
schedules.” Instead of giving you a steady drip of encouragement or
setbacks, games and gamified services scatter rewards on a randomized
schedule — often enough to keep you interested and random enough that
you can never quite find the pattern that would make it boring.

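That randomized schedule is easy to see in miniature. A minimal sketch
in Python; the 15% reward probability and the tap count are invented for
illustration, since real services tune such values against engagement
metrics:

```python
import random

def intermittent_reward(p_reward=0.15, rng=random):
    """Return True when an action earns a reward.

    Rewards arrive at random rather than on a fixed schedule, so the
    user can never learn a pattern that would let them tune out.
    """
    return rng.random() < p_reward

random.seed(1)  # fixed seed so the sketch is repeatable

# Simulate 1,000 taps on a gamified app and note which ones pay off.
rewarded_taps = [i for i in range(1000) if intermittent_reward()]
gaps = [b - a for a, b in zip(rewarded_taps, rewarded_taps[1:])]

print(f"{len(rewarded_taps)} of 1,000 taps rewarded")
print(f"gaps between rewards range from {min(gaps)} to {max(gaps)} taps")
```

The average payout rate is steady, but the spacing is irregular, and it
is the irregularity that keeps the reward from fading into background
hum the way the refrigerator does.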
Intermittent reinforcement is a powerful behavioral tool, but it also
represents a collective action problem for surveillance capitalism. The
“engagement techniques” invented by the behaviorists of surveillance
capitalist companies are quickly copied across the whole sector so that
what starts as a mysteriously compelling fillip in the design of a
service — like “pull to refresh” or alerts when someone likes your posts
or side quests that your characters get invited to while in the midst of
main quests — quickly becomes dully ubiquitous. The
impossible-to-nail-down nonpattern of randomized drips from your phone
becomes a grey-noise wall of sound as every single app and site starts
to make use of whatever seems to be working at the time.

From the surveillance capitalist’s point of view, our adaptive capacity
is like a harmful bacterium that deprives it of its food source — our
attention — and novel techniques for snagging that attention are like
new antibiotics that can be used to breach our defenses and destroy our
self-determination. And there *are* techniques like that. Who can forget
the Great Zynga Epidemic, when all of our friends were caught in
*FarmVille*\ ’s endless, mindless dopamine loops? But every new
attention-commanding technique is jumped on by the whole industry and
used so indiscriminately that antibiotic resistance sets in. Given
enough repetition, almost all of us develop immunity to even the most
powerful techniques — by 2013, two years after Zynga’s peak, its user
base had halved.

Not everyone, of course. Some people never adapt to stimulus, just as
some people never stop hearing the hum of the refrigerator. This is why
most people who are exposed to slot machines play them for a while and
then move on while a small and tragic minority liquidate their kids’
college funds, buy adult diapers, and position themselves in front of a
machine until they collapse.

But surveillance capitalism’s margins on behavioral modification suck.
Tripling the rate at which someone buys a widget sounds great `unless
the base rate is way less than
1% <https://www.forbes.com/sites/priceonomics/2018/03/09/the-advertising-conversion-rates-for-every-major-tech-platform/#2f6a67485957>`__
with an improved rate of… still less than 1%. Even penny slot machines
pull down pennies for every spin while surveillance capitalism rakes in
infinitesimal penny fractions.

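The arithmetic behind that complaint is worth making concrete. A quick
sketch, using an assumed 0.5% baseline conversion rate (a made-up figure
in the sub-1% range the linked piece describes) across a million
impressions:

```python
audience = 1_000_000   # assumed number of ad impressions
base_rate = 0.005      # assumed 0.5% baseline conversion rate
tripled_rate = base_rate * 3

base_buyers = round(audience * base_rate)
tripled_buyers = round(audience * tripled_rate)

print(f"baseline: {base_buyers:,} buyers ({base_rate:.1%} of the audience)")
print(f"tripled:  {tripled_buyers:,} buyers ({tripled_rate:.1%} of the audience)")
print(f"even after tripling, {1 - tripled_rate:.1%} of the audience buys nothing")
```

A tripled relative rate is still a rounding error in absolute terms,
which is the sense in which the margins on behavioral modification
“suck.”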
Slot machines’ high returns mean that they can be profitable just by
draining the fortunes of the small rump of people who are pathologically
vulnerable to them and unable to adapt to their tricks. But surveillance
capitalism can’t survive on the fractional pennies it brings down from
that vulnerable sliver — that’s why, after the Great Zynga Epidemic had
finally burned itself out, the small number of still-addicted players
left behind couldn’t sustain it as a global phenomenon. And new powerful
attention weapons aren’t easy to find, as is evidenced by the long years
since the last time Zynga had a hit. Despite the hundreds of millions of
dollars that Zynga has to spend on developing new tools to blast through
our adaptation, it has never managed to repeat the lucky accident that
let it snag so much of our attention for a brief moment in 2009.
Powerhouses like Supercell have fared a little better, but they are rare
and throw away many failures for every success.

The vulnerability of small segments of the population to dramatic,
efficient corporate manipulation is a real concern that’s worthy of our
attention and energy. But it’s not an existential threat to society.

If data is the new oil, then surveillance capitalism’s engine has a leak
-------------------------------------------------------------------------

This adaptation problem offers an explanation for one of surveillance
capitalism’s most alarming traits: its relentless hunger for data and
its endless expansion of data-gathering capabilities through the spread
of sensors, online surveillance, and acquisition of data streams from
third parties.

Zuboff observes this phenomenon and concludes that data must be very
valuable if surveillance capitalism is so hungry for it. (In her words:
“Just as industrial capitalism was driven to the continuous
intensification of the means of production, so surveillance capitalists
and their market players are now locked into the continuous
intensification of the means of behavioral modification and the
gathering might of instrumentarian power.”) But what if the voracious
appetite is because data has such a short half-life — because people
become inured so quickly to new, data-driven persuasion techniques —
that the companies are locked in an arms race with our limbic system?
What if it’s all a Red Queen’s race where they have to run ever faster —
collect ever-more data — just to stay in the same spot?

Of course, all of Big Tech’s persuasion techniques work in concert with
one another, and collecting data is useful beyond mere behavioral
574 trickery.
575
576 If someone wants to recruit you to buy a refrigerator or join a pogrom,
577 they might use profiling and targeting to send messages to people they
578 judge to be good sales prospects. The messages themselves may be
579 deceptive, making claims about things you’re not very knowledgeable
580 about (food safety and energy efficiency or eugenics and historical
581 claims about racial superiority). They might use search engine
582 optimization and/or armies of fake reviewers and commenters and/or paid
583 placement to dominate the discourse so that any search for further
584 information takes you back to their messages. And finally, they may
585 refine the different pitches using machine learning and other techniques
586 to figure out what kind of pitch works best on someone like you.
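
That last step — figuring out which pitch works best on someone like you
— is, mechanically, a multi-armed bandit problem. A minimal
epsilon-greedy sketch (the pitch names and response rates below are
invented for illustration; a real system estimates them from live
traffic):

```python
import random

random.seed(0)

# Hypothetical pitches and per-pitch response rates for one audience
# segment; in a real system these rates are unknown and estimated online.
TRUE_RATES = {"fear": 0.02, "bargain": 0.05, "status": 0.01}

counts = {p: 0 for p in TRUE_RATES}
successes = {p: 0 for p in TRUE_RATES}

def estimated_rate(pitch):
    return successes[pitch] / counts[pitch] if counts[pitch] else 0.0

def choose_pitch(eps=0.1):
    """Epsilon-greedy: mostly exploit the best-looking pitch, sometimes explore."""
    if random.random() < eps:
        return random.choice(list(TRUE_RATES))
    return max(TRUE_RATES, key=estimated_rate)

for _ in range(20_000):                      # each loop = one ad impression
    pitch = choose_pitch()
    counts[pitch] += 1
    if random.random() < TRUE_RATES[pitch]:  # simulated click/conversion
        successes[pitch] += 1
```

The point of the sketch is that the system never needs to understand
anyone; it simply reallocates impressions toward whatever pitch
empirically converts best.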

Each phase of this process benefits from surveillance: The more data
they have, the more precisely they can profile you and target you with
specific messages. Think of how you’d sell a fridge if you knew that the
warranty on your prospect’s fridge just expired and that they were
expecting a tax rebate in April.

Also, the more data they have, the better they can craft deceptive
messages — if I know that you’re into genealogy, I might not try to feed
you pseudoscience about genetic differences between “races,” sticking
instead to conspiratorial secret histories of “demographic replacement”
and the like.

Facebook also helps you locate people who have the same odious or
antisocial views as you. It makes it possible to find other people who
want to carry tiki torches through the streets of Charlottesville in
Confederate cosplay. It can help you find other people who want to join
your militia and go to the border to look for undocumented migrants to
terrorize. It can help you find people who share your belief that
vaccines are poison and that the Earth is flat.

There is one way in which targeted advertising uniquely benefits those
advocating for socially unacceptable causes: It is invisible. Racism is
widely geographically dispersed, and there are few places where racists
— and only racists — gather. This is similar to the problem of selling
refrigerators in that potential refrigerator purchasers are
geographically dispersed and there are few places where you can buy an
ad that will be primarily seen by refrigerator customers. But buying a
refrigerator is socially acceptable while being a Nazi is not, so you
can buy a billboard or advertise in the newspaper sports section for
your refrigerator business, and the only potential downside is that your
ad will be seen by a lot of people who don’t want refrigerators,
resulting in a lot of wasted expense.

But even if you wanted to advertise your Nazi movement on a billboard or
prime-time TV or the sports section, you would struggle to find anyone
willing to sell you the space for your ad partly because they disagree
with your views and partly because they fear censure (boycott,
reputational damage, etc.) from other people who disagree with your
views.

Targeted ads solve this problem: On the internet, every ad unit can be
different for every person, meaning that you can buy ads that are only
shown to people who appear to be Nazis and not to people who hate Nazis.
When there’s spillover — when someone who hates racism is shown a racist
recruiting ad — there is some fallout; the platform or publication might
get an angry public or private denunciation. But the nature of the risk
assumed by an online ad buyer is different than the risks to a
traditional publisher or billboard owner who might want to run a Nazi
ad.

Online ads are placed by algorithms that broker between a diverse
ecosystem of self-serve ad platforms that anyone can buy an ad through,
so the Nazi ad that slips onto your favorite online publication isn’t
seen as their moral failing but rather as a failure in some distant,
upstream ad supplier. When a publication gets a complaint about an
offensive ad that’s appearing in one of its units, it can take some
steps to block that ad, but the Nazi might buy a slightly different ad
from a different broker serving the same unit. And in any event,
internet users increasingly understand that when they see an ad, it’s
likely that the advertiser did not choose that publication and that the
publication has no idea who its advertisers are.

These layers of indirection between advertisers and publishers serve as
moral buffers: Today’s moral consensus is largely that publishers
shouldn’t be held responsible for the ads that appear on their pages
because they’re not actively choosing to put those ads there. Because of
this, Nazis are able to overcome significant barriers to organizing
their movement.

Data has a complex relationship with domination. Being able to spy on
your customers can alert you to their preferences for your rivals and
allow you to head off your rivals at the pass.

More importantly, if you can dominate the information space while also
gathering data, then you make other deceptive tactics stronger because
it’s harder to break out of the web of deceit you’re spinning.
Domination — that is, ultimately becoming a monopoly — and not the data
itself is the supercharger that makes every tactic worth pursuing
because monopolistic domination deprives your target of an escape route.

If you’re a Nazi who wants to ensure that your prospects primarily see
deceptive, confirming information when they search for more, you can
improve your odds by seeding the search terms they use through your
initial communications. You don’t need to own the top 10 results for
“voter suppression” if you can convince your marks to confine their
search terms to “voter fraud,” which throws up a very different set of
search results.

Surveillance capitalists are like stage mentalists who claim that their
extraordinary insights into human behavior let them guess the word that
you wrote down and folded up in your pocket but who really use shills,
hidden cameras, sleight of hand, and brute-force memorization to amaze
you.

Or perhaps they’re more like pick-up artists, the misogynistic cult that
promises to help awkward men have sex with women by teaching them
“neurolinguistic programming” phrases, body language techniques, and
psychological manipulation tactics like “negging” — offering unsolicited
negative feedback to women to lower their self-esteem and prick their
interest.

Some pick-up artists eventually manage to convince women to go home with
them, but it’s not because these men have figured out how to bypass
women’s critical faculties. Rather, pick-up artists’ “success” stories
are a mix of women who were incapable of giving consent, women who were
coerced, women who were intoxicated, self-destructive women, and a few
women who were sober and in command of their faculties but who didn’t
realize straightaway that they were with terrible men but rectified the
error as soon as they could.

Pick-up artists *believe* they have figured out a secret back door that
bypasses women’s critical faculties, but they haven’t. Many of the
tactics they deploy, like negging, became the butt of jokes (just like
people joke about bad ad targeting), and there’s a good chance that
anyone they try these tactics on will immediately recognize them and
dismiss the men who use them as irredeemable losers.

Pick-up artists are proof that people can believe they have developed a
system of mind control *even when it doesn’t work*. Pick-up artists
simply exploit the fact that one-in-a-million chances can come through
for you if you make a million attempts, and then they assume that the
other 999,999 times, they simply performed the technique incorrectly and
commit themselves to doing better next time. There’s only one group of
people who find pick-up artist lore reliably convincing: other would-be
pick-up artists whose anxiety and insecurity make them vulnerable to
scammers and delusional men who convince them that if they pay for
tutelage and follow instructions, then they will someday succeed.
Pick-up artists assume they fail to entice women because they are bad at
being pick-up artists, not because pick-up artistry is bullshit. Pick-up
artists are bad at selling themselves to women, but they’re much better
at selling themselves to men who pay to learn the secrets of pick-up
artistry.
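
The "one-in-a-million chances… a million attempts" line is literal
probability, and the numbers bear the argument out:

```python
p = 1e-6       # chance that any single attempt "works"
n = 1_000_000  # number of attempts

# Probability of at least one success = 1 - P(every attempt fails)
at_least_one = 1 - (1 - p) ** n
# ≈ 1 - 1/e ≈ 0.63: failure is the norm, yet persistence alone
# manufactures occasional "proof" that the technique works
```

A roughly 63% chance of at least one success is exactly the kind of
track record that lets a believer file away 999,999 failures as bad
execution rather than a bad theory.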

Department store pioneer John Wanamaker is said to have lamented, “Half
the money I spend on advertising is wasted; the trouble is I don’t know
which half.” The fact that Wanamaker thought that only half of his
advertising spending was wasted is a tribute to the persuasiveness of
advertising executives, who are *much* better at convincing potential
clients to buy their services than they are at convincing the general
public to buy their clients’ wares.

What is Facebook?
-----------------

Facebook is heralded as the origin of all of our modern plagues, and
it’s not hard to see why. Some tech companies want to lock their users
in but make their money by monopolizing access to the market for apps
for their devices and gouging them on prices rather than by spying on
them (like Apple). Some companies don’t care about locking in users
because they’ve figured out how to spy on them no matter where they are
and what they’re doing and can turn that surveillance into money
(Google). Facebook alone among the Western tech giants has built a
business based on locking in its users *and* spying on them all the
time.

Facebook’s surveillance regime is really without parallel in the Western
world. Though Facebook tries to prevent itself from being visible on the
public web, hiding most of what goes on there from people unless they’re
logged into Facebook, the company has nevertheless booby-trapped the
entire web with surveillance tools in the form of Facebook “Like”
buttons that web publishers include on their sites to boost their
Facebook profiles. Facebook also makes various libraries and other
useful code snippets available to web publishers that act as
surveillance tendrils on the sites where they’re used, funneling
information about visitors to the site — newspapers, dating sites,
message boards — to Facebook.
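
Mechanically, none of this requires a click. A conceptual model (every
hostname and field name below is hypothetical) of what a browser hands
over just by *rendering* a page that embeds a third-party button:

```python
from typing import Optional

def embedded_widget_request(page_url: str, fb_cookie: Optional[str]) -> dict:
    """Sketch of the HTTP request a browser makes to fetch an embedded
    "Like" button. The widget host learns which page you were reading
    (the Referer header) and, via its cookie, who you are — before any
    click ever happens."""
    return {
        "host": "widgets.facebook.example",   # hypothetical widget host
        "path": "/plugins/like_button.js",
        "headers": {
            "Referer": page_url,              # the article you were reading
            "Cookie": fb_cookie or "",        # ties the visit to a profile
        },
    }

# A logged-out visitor still gets logged; a cookie just attaches a name.
req = embedded_widget_request("https://news.example/story-123", "uid=alice")
```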

Big Tech is able to practice surveillance not just because it is tech
but because it is *big*.

Facebook offers similar tools to app developers, so the apps — games,
fart machines, business review services, apps for keeping abreast of
your kid’s schooling — you use will send information about your
activities to Facebook even if you don’t have a Facebook account and
even if you don’t download or use Facebook apps. On top of all that,
Facebook buys data from third-party brokers on shopping habits, physical
location, use of “loyalty” programs, financial transactions, etc., and
cross-references that with the dossiers it develops on activity on
Facebook and with apps and the public web.

Though it’s easy to integrate the web with Facebook — linking to news
stories and such — Facebook products are generally not available to be
integrated back into the web itself. You can embed a tweet in a Facebook
post, but if you embed a Facebook post in a tweet, you just get a link
back to Facebook and must log in before you can see it. Facebook has
used extreme technological and legal countermeasures to prevent rivals
from allowing their users to embed Facebook snippets in competing
services or to create alternative interfaces to Facebook that merge your
Facebook inbox with those of other services that you use.

And Facebook is incredibly popular, with 2.3 billion claimed users
(though many believe this figure to be inflated). Facebook has been used
to organize genocidal pogroms, racist riots, anti-vaccination movements,
flat Earth cults, and the political lives of some of the world’s
ugliest, most brutal autocrats. There are some really alarming things
going on in the world, and Facebook is implicated in many of them, so
it’s easy to conclude that these bad things are the result of Facebook’s
mind-control system, which it rents out to anyone with a few bucks to
spend.

To understand what role Facebook plays in the formulation and
mobilization of antisocial movements, we need to understand the dual
nature of Facebook.

Because it has a lot of users and a lot of data about those users,
Facebook is a very efficient tool for locating people with hard-to-find
traits, the kinds of traits that are widely diffused in the population
such that advertisers have historically struggled to find a
cost-effective way to reach them. Think back to refrigerators: Most of
us only replace our major appliances a few times in our entire lives. If
you’re a refrigerator manufacturer or retailer, you have these brief
windows in the life of a consumer during which they are pondering a
purchase, and you have to somehow reach them. Anyone who’s ever
registered a title change after buying a house can attest that appliance
manufacturers are incredibly desperate to reach anyone who has even the
slenderest chance of being in the market for a new fridge.

Facebook makes finding people shopping for refrigerators a *lot* easier.
It can target ads to people who’ve registered a new home purchase, to
people who’ve searched for refrigerator buying advice, to people who
have complained about their fridge dying, or any combination thereof. It
can even target people who’ve recently bought *other* kitchen appliances
on the theory that someone who’s just replaced their stove and
dishwasher might be in a fridge-buying kind of mood. The vast majority
of people who are reached by these ads will not be in the market for a
new fridge, but — crucially — the percentage of people who *are* looking
for fridges that these ads reach is *much* larger than it is for any
group that might be subjected to traditional, offline targeted
refrigerator marketing.
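
The gain here is precision, not certainty. With invented but plausible
numbers:

```python
# Invented rates for illustration: targeting multiplies the odds that an
# impression lands on a real prospect, even though most impressions still miss.
population_rate = 0.005   # 0.5% of the general public is fridge-shopping
targeted_rate = 0.05      # assumed rate among new-homeowner/fridge-searchers

reach = 100_000           # impressions bought in each channel

billboard_prospects = reach * population_rate  # ~500 real prospects reached
targeted_prospects = reach * targeted_rate     # ~5,000 real prospects reached

precision_gain = targeted_rate / population_rate  # ~10x better per impression
miss_share = 1 - targeted_rate                    # yet ~95% of impressions miss
```

A tenfold improvement in odds per impression is why these ads are worth
buying — and it is base-rate arithmetic, not mind control.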

Facebook also makes it a lot easier to find people who have the same
rare disease as you, which might have been impossible in earlier eras —
the closest fellow sufferer might otherwise be hundreds of miles away.
It makes it easier to find people who went to the same high school as
you even though decades have passed and your former classmates have all
been scattered to the four corners of the Earth.

Facebook also makes it much easier to find people who hold the same rare
political beliefs as you. If you’ve always harbored a secret affinity
for socialism but never dared utter this aloud lest you be demonized by
your neighbors, Facebook can help you discover other people who feel the
same way (and it might just demonstrate to you that your affinity is
more widespread than you ever suspected). It can make it easier to find
people who share your sexual identity. And again, it can help you to
understand that what you thought was a shameful secret that affected
only you was really a widely shared trait, giving you both comfort and
the courage to come out to the people in your life.

All of this presents a dilemma for Facebook: Targeting makes the
company’s ads more effective than traditional ads, but it also lets
advertisers see just how effective their ads are. While advertisers are
pleased to learn that Facebook ads are more effective than ads on
systems with less sophisticated targeting, advertisers can also see that
in nearly every case, the people who see their ads ignore them. Or, at
best, the ads work on a subconscious level, creating nebulous
unmeasurables like “brand recognition.” This means that the price per ad
is very low in nearly every case.

To make things worse, many Facebook groups spark precious little
discussion. Your little-league soccer team, the people with the same
rare disease as you, and the people you share a political affinity with
may exchange the odd flurry of messages at critical junctures, but on a
daily basis, there’s not much to say to your old high school chums or
other hockey-card collectors.

With nothing but “organic” discussion, Facebook would not generate
enough traffic to sell enough ads to make the money it needs to
continually expand by buying up its competitors while returning handsome
sums to its investors.

So Facebook has to gin up traffic by sidetracking its own forums: Every
time Facebook’s algorithm injects controversial materials — inflammatory
political articles, conspiracy theories, outrage stories — into a group,
it can hijack that group’s nominal purpose with its desultory
discussions and supercharge those discussions by turning them into
bitter, unproductive arguments that drag on and on. Facebook is
optimized for engagement, not happiness, and it turns out that automated
systems are pretty good at figuring out things that people will get
angry about.

Facebook *can* modify our behavior but only in a couple of trivial ways.
First, it can lock in all your friends and family members so that you
check and check and check with Facebook to find out what they are up to;
and second, it can make you angry and anxious. It can force you to
choose between being interrupted constantly by updates — a process that
breaks your concentration and makes it hard to be introspective — and
staying in touch with your friends. This is a very limited form of mind
control, and it can only really make us miserable, angry, and anxious.

This is why Facebook’s targeting systems — both the ones it shows to
advertisers and the ones that let users find people who share their
interests — are so next-gen and smooth and easy to use as well as why
its message boards have a toolset that seems like it hasn’t changed
since the mid-2000s. If Facebook delivered an equally flexible,
sophisticated message-reading system to its users, those users could
defend themselves against being nonconsensually eyeball-fucked with
Donald Trump headlines.

The more time you spend on Facebook, the more ads it gets to show you.
The solution to Facebook’s ads only working one in a thousand times is
for the company to try to increase how much time you spend on Facebook
by a factor of a thousand. Rather than thinking of Facebook as a company
that has figured out how to show you exactly the right ad in exactly the
right way to get you to do what its advertisers want, think of it as a
company that has figured out how to make you slog through an endless
torrent of arguments even though they make you miserable, spending so
much time on the site that it eventually shows you at least one ad that
you respond to.

Monopoly and the right to the future tense
------------------------------------------

Zuboff and her cohort are particularly alarmed at the extent to which
surveillance allows corporations to influence our decisions, taking away
something she poetically calls “the right to the future tense” — that
is, the right to decide for yourself what you will do in the future.

It’s true that advertising can tip the scales one way or another: When
you’re thinking of buying a fridge, a timely fridge ad might end the
search on the spot. But Zuboff puts enormous and undue weight on the
persuasive power of surveillance-based influence techniques. Most of
these don’t work very well, and the ones that do won’t work for very
long. The makers of these influence tools are confident they will
someday refine them into systems of total control, but they are hardly
unbiased observers, and the risks from their dreams coming true are very
speculative.

By contrast, Zuboff is rather sanguine about 40 years of lax antitrust
practice that has allowed a handful of companies to dominate the
internet, ushering in an information age with, `as one person on Twitter
noted <https://twitter.com/tveastman/status/1069674780826071040>`__,
five giant websites each filled with screenshots of the other four.

However, if we are to be alarmed that we might lose the right to choose
for ourselves what our future will hold, then monopoly’s nonspeculative,
concrete, here-and-now harms should be front and center in our debate
over tech policy.

Start with “digital rights management.” In 1998, Bill Clinton signed the
Digital Millennium Copyright Act (DMCA) into law. It’s a complex piece
of legislation with many controversial clauses but none more so than
Section 1201, the “anti-circumvention” rule.

This is a blanket ban on tampering with systems that restrict access to
copyrighted works. The ban is so thoroughgoing that it prohibits
removing a copyright lock even when no copyright infringement takes
place. This is by design: The activities that the DMCA’s Section 1201
sets out to ban are not copyright infringements; rather, they are legal
activities that frustrate manufacturers’ commercial plans.

For example, Section 1201’s first major application was on DVD players
as a means of enforcing the region coding built into those devices.
DVD-CCA, the body that standardized DVDs and DVD players, divided the
world into six regions and specified that DVD players must check each
disc to determine which regions it was authorized to be played in. DVD
players would have their own corresponding region (a DVD player bought
in the U.S. would be region 1 while one bought in India would be region
5). If the player and the disc’s region matched, the player would play
the disc; otherwise, it would reject it.
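
The check being enforced here is trivially simple, which is rather the
point: the lock exists for legal leverage, not technical necessity. A
minimal sketch of the region test just described:

```python
# Sketch of the DVD-CCA region check: a compliant player refuses any disc
# whose authorized-region flags don't include the player's own region.
def can_play(player_region: int, disc_regions: set) -> bool:
    return player_region in disc_regions

US, INDIA = 1, 5          # region 1 covers the U.S.; region 5 includes India
us_disc = {1}             # a disc flagged for region 1 only

can_play(US, us_disc)     # True: regions match, the disc plays
can_play(INDIA, us_disc)  # False: lawful disc, lawful player, no playback
```

A "noncompliant" region-free player simply returns ``True``
unconditionally — and Section 1201 is what made selling that one-line
change legally dangerous.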
947
948 However, watching a lawfully produced disc in a country other than the
949 one where you purchased it is not copyright infringement — it’s the
950 opposite. Copyright law imposes this duty on customers for a movie: You
951 must go into a store, find a licensed disc, and pay the asking price. Do
952 that — and *nothing else* — and you and copyright are square with one
953 another.
954
955 The fact that a movie studio wants to charge Indians less than Americans
956 or release in Australia later than it releases in the U.K. has no
957 bearing on copyright law. Once you lawfully acquire a DVD, it is no
958 copyright infringement to watch it no matter where you happen to be.
959
960 So DVD and DVD player manufacturers would not be able to use accusations
961 of abetting copyright infringement to punish manufacturers who made
962 noncompliant players that would play discs from any region or repair
963 shops that modified players to let you watch out-of-region discs or
964 software programmers who created programs to let you do this.
965
966 That’s where Section 1201 of the DMCA comes in: By banning tampering
967 with an “access control,” the rule gave manufacturers and rights holders
968 standing to sue competitors who released superior products with lawful
969 features that the market demanded (in this case, region-free players).
970
971 This is an odious scam against consumers, but as time went by, Section
972 1201 grew to encompass a rapidly expanding constellation of devices and
973 services as canny manufacturers have realized certain things:
974
975 - Any device with software in it contains a “copyrighted work” — i.e.,
976 the software.
977 - A device can be designed so that reconfiguring the software requires
978 bypassing an “access control for copyrighted works,” which is a
979 potential felony under Section 1201.
980 - Thus, companies can control their customers’ behavior after they take
981 home their purchases by designing products so that all unpermitted
982 uses require modifications that fall afoul of Section 1201.
983
984 Section 1201 then becomes a means for manufacturers of all descriptions
985 to force their customers to arrange their affairs to benefit the
986 manufacturers’ shareholders instead of themselves.
987
988 This manifests in many ways: from a new generation of inkjet printers
989 that use countermeasures to prevent third-party ink that cannot be
990 bypassed without legal risks to similar systems in tractors that prevent
991 third-party technicians from swapping in the manufacturer’s own parts
992 that are not recognized by the tractor’s control system until it is
993 supplied with a manufacturer’s unlock code.
994
995 Closer to home, Apple’s iPhones use these measures to prevent both
996 third-party service and third-party software installation. This allows
997 Apple to decide when an iPhone is beyond repair and must be shredded and
998 landfilled as opposed to the iPhone’s purchaser. (Apple is notorious for
999 its environmentally catastrophic policy of destroying old electronics
1000 rather than permitting them to be cannibalized for parts.) This is a
1001 very useful power to wield, especially in light of CEO Tim Cook’s
1002 January 2019 warning to investors that the company’s profits are
1003 endangered by customers choosing to hold onto their phones for longer
1004 rather than replacing them.
1005
1006 Apple’s use of copyright locks also allows it to establish a monopoly
1007 over how its customers acquire software for their mobile devices. The
1008 App Store’s commercial terms guarantee Apple a share of all revenues
1009 generated by the apps sold there, meaning that Apple gets paid when you
1010 buy an app from its store and then continues to get paid every time you
1011 buy something using that app. This comes out of the bottom line of
1012 software developers, who must either charge more or accept lower profits
1013 for their products.
1014
1015 Crucially, Apple’s use of copyright locks gives it the power to make
1016 editorial decisions about which apps you may and may not install on your
1017 own device. Apple has used this power to `reject
1018 dictionaries <https://www.telegraph.co.uk/technology/apple/5982243/Apple-bans-dictionary-from-App-Store-over-swear-words.html>`__
1019 for containing obscene words; to `limit political
1020 speech <https://www.vice.com/en_us/article/538kan/apple-just-banned-the-app-that-tracks-us-drone-strikes-again>`__,
1021 especially from apps that make sensitive political commentary such as an
1022 app that notifies you every time a U.S. drone kills someone somewhere in
1023 the world; and to `object to a
1024 game <https://www.eurogamer.net/articles/2016-05-19-palestinian-indie-game-must-not-be-called-a-game-apple-says>`__
1025 that commented on the Israel-Palestine conflict.
1026
1027 Apple often justifies monopoly power over software installation in the
1028 name of security, arguing that its vetting of apps for its store means
1029 that it can guard its users against apps that contain surveillance code.
1030 But this cuts both ways. In China, the government `ordered Apple to
1031 prohibit the sale of privacy
1032 tools <https://www.ft.com/content/ad42e536-cf36-11e7-b781-794ce08b24dc>`__
1033 like VPNs with the exception of VPNs that had deliberately introduced
1034 flaws designed to let the Chinese state eavesdrop on users. Because
1035 Apple uses technological countermeasures — with legal backstops — to
1036 block customers from installing unauthorized apps, Chinese iPhone owners
1037 cannot readily (or legally) acquire VPNs that would protect them from
1038 Chinese state snooping.
1039
1040 Zuboff calls surveillance capitalism a “rogue capitalism.” Theoreticians
1041 of capitalism claim that its virtue is that it `aggregates information
1042 in the form of consumers’
1043 decisions <https://en.wikipedia.org/wiki/Price_signal>`__, producing
1044 efficient markets. Surveillance capitalism’s supposed power to rob its
1045 victims of their free will through computationally supercharged
1046 influence campaigns means that our markets no longer aggregate
1047 customers’ decisions because we customers no longer decide — we are
1048 given orders by surveillance capitalism’s mind-control rays.
1049
1050 If our concern is that markets cease to function when consumers can no
1051 longer make choices, then copyright locks should concern us at *least*
1052 as much as influence campaigns. An influence campaign might nudge you to
1053 buy a certain brand of phone; but the copyright locks on that phone
1054 absolutely determine where you get it serviced, which apps can run on
1055 it, and when you have to throw it away rather than fixing it.

Search order and the right to the future tense
----------------------------------------------

Markets are posed as a kind of magic: By discovering otherwise hidden
information conveyed by the free choices of consumers, those consumers’
local knowledge is integrated into a self-correcting system that makes
efficient allocations — more efficient than any computer could
calculate. But monopolies are incompatible with that notion. When you
only have one app store, the owner of the store — not the consumer —
decides on the range of choices. As Boss Tweed once said, “I don’t care
who does the electing, so long as I get to do the nominating.” A
monopolized market is an election whose candidates are chosen by the
monopolist.

This ballot rigging is made more pernicious by the existence of
monopolies over search order. Google’s search market share is about 90%.
When Google’s ranking algorithm puts a result for a popular search term
in its top 10, that helps determine the behavior of millions of people.
If Google’s answer to “Are vaccines dangerous?” is a page that rebuts
anti-vax conspiracy theories, then a sizable portion of the public will
learn that vaccines are safe. If, on the other hand, Google sends those
people to a site affirming the anti-vax conspiracies, a sizable portion
of those millions will come away convinced that vaccines are dangerous.

Google’s algorithm is often tricked into serving disinformation as a
prominent search result. But in these cases, Google isn’t persuading
people to change their minds; it’s just presenting something untrue as
fact when the user has no cause to doubt it.

This is true whether the search is for “Are vaccines dangerous?” or
“best restaurants near me.” Most users will never look past the first
page of search results, and when the overwhelming majority of people all
use the same search engine, the ranking algorithm deployed by that
search engine will determine myriad outcomes (whether to adopt a child,
whether to have cancer surgery, where to eat dinner, where to move,
where to apply for a job) to a degree that vastly outstrips any
behavioral outcomes dictated by algorithmic persuasion techniques.

Many of the questions we ask search engines have no empirically correct
answers: “Where should I eat dinner?” is not an objective question. Even
questions that do have correct answers (“Are vaccines dangerous?”) don’t
have one empirically superior source for that answer. Many pages affirm
the safety of vaccines, so which one goes first? Under conditions of
competition, consumers can choose from many search engines and stick
with the one whose algorithmic judgment suits them best, but under
conditions of monopoly, we all get our answers from the same place.

Google’s search dominance isn’t a matter of pure merit: The company has
leveraged many tactics that would have been prohibited under classical,
pre-Ronald-Reagan antitrust enforcement standards to attain its
dominance. After all, this is a company that has developed two major
products: a really good search engine and a pretty good Hotmail clone.
Every other major success it’s had — Android, YouTube, Google Maps, etc.
— has come through an acquisition of a nascent competitor. Many of the
company’s key divisions, such as the advertising technology of
DoubleClick, violate the historical antitrust principle of structural
separation, which forbade firms from owning subsidiaries that competed
with their customers. Railroads, for example, were barred from owning
freight companies that competed with the shippers whose freight they
carried.

If we’re worried about giant companies subverting markets by stripping
consumers of their ability to make free choices, then vigorous antitrust
enforcement seems like an excellent remedy. If we’d denied Google the
right to effect its many mergers, we would also have probably denied it
its total search dominance. Without that dominance, the pet theories,
biases, errors (and good judgment, too) of Google search engineers and
product managers would not have such an outsized effect on consumer
choice.

This goes for many other companies. Amazon, a classic surveillance
capitalist, is obviously the dominant tool for searching Amazon — though
many people find their way to Amazon through Google searches and
Facebook posts — and obviously, Amazon controls Amazon search. That
means that Amazon’s own self-serving editorial choices — like promoting
its own house brands over rival goods from its sellers, as well as its
own pet theories, biases, and errors — determine much of what we buy on
Amazon. And since Amazon is the dominant e-commerce retailer outside of
China and since it attained that dominance by buying up both large
rivals and nascent competitors in defiance of historical antitrust
rules, we can blame the monopoly for stripping consumers of their right
to the future tense and the ability to shape markets by making informed
choices.

Not every monopolist is a surveillance capitalist, but that doesn’t mean
they’re not able to shape consumer choices in wide-ranging ways. Zuboff
lauds Apple for its App Store and iTunes Store, insisting that adding
price tags to the features on its platforms has been the secret to
resisting surveillance and thus creating markets. But Apple is the only
retailer allowed to sell on its platforms, and it’s the second-largest
mobile device vendor in the world. The independent software vendors that
sell through Apple’s marketplace accuse the company of the same
surveillance sins as Amazon and other big retailers: spying on its
customers to find lucrative new products to launch, effectively using
independent software vendors as free-market researchers, then forcing
them out of any markets they discover.

Because of its use of copyright locks, Apple’s mobile customers are not
legally allowed to switch to a rival retailer for their apps if they
want to do so on an iPhone. Apple, obviously, is the only entity that
gets to decide how it ranks the results of search queries in its stores.
These decisions ensure that some apps are often installed (because they
appear on page one) and others are never installed (because they appear
on page one million). Apple’s search-ranking design decisions have a
vastly more significant effect on consumer behaviors than influence
campaigns delivered by surveillance capitalism’s ad-serving bots.

Monopolists can afford sleeping pills for watchdogs
---------------------------------------------------

Only the most extreme market ideologues think that markets can
self-regulate without state oversight. Markets need watchdogs —
regulators, lawmakers, and other elements of democratic control — to
keep them honest. When these watchdogs sleep on the job, markets cease
to aggregate consumer choices because those choices are constrained by
illegitimate and deceptive activities that companies are able to get
away with because no one is holding them to account.

But this kind of regulatory capture doesn’t come cheap. In competitive
sectors, where rivals are constantly eroding one another’s margins,
individual firms lack the surplus capital to effectively lobby for laws
and regulations that serve their ends.

Many of the harms of surveillance capitalism are the result of weak or
nonexistent regulation. Those regulatory vacuums spring from the power
of monopolists to resist stronger regulation and to tailor what
regulation exists to permit their existing businesses.

Here’s an example: When firms over-collect and over-retain our data,
they are at increased risk of suffering a breach — you can’t leak data
you never collected, and once you delete all copies of that data, you
can no longer leak it. For more than a decade, we’ve lived through an
endless parade of ever-worsening data breaches, each one uniquely
horrible in the scale of data breached and the sensitivity of that data.

But still, firms continue to over-collect and over-retain our data for
three reasons:

**1. They are locked in the aforementioned limbic arms race with our
capacity to shore up our attentional defense systems to resist their new
persuasion techniques.** They’re also locked in an arms race with their
competitors to find new ways to target people for sales pitches. As soon
as they discover a soft spot in our attentional defenses (a
counterintuitive, unobvious way to target potential refrigerator
buyers), the public begins to wise up to the tactic, and their
competitors leap on it, hastening the day in which all potential
refrigerator buyers have been inured to the pitch.

**2. They believe the surveillance capitalism story.** Data is cheap to
aggregate and store, and both proponents and opponents of surveillance
capitalism have assured managers and product designers that if you
collect enough data, you will be able to perform sorcerous acts of mind
control, thus supercharging your sales. Even if you never figure out how
to profit from the data, someone else will eventually offer to buy it
from you to give it a try. This is the hallmark of all economic bubbles:
acquiring an asset on the assumption that someone else will buy it from
you for more than you paid for it, often to sell to someone else at an
even greater price.

**3. The penalties for leaking data are negligible.** Most countries
limit these penalties to actual damages, meaning that consumers who’ve
had their data breached have to show actual monetary harms to get a
reward. In 2014, Home Depot disclosed that it had lost credit-card data
for 53 million of its customers, but it settled the matter by paying
those customers about $0.34 each — and a third of that $0.34 wasn’t even
paid in cash. It took the form of a credit to procure a largely
ineffectual credit-monitoring service.

But the harms from breaches are much more extensive than these
actual-damages rules capture. Identity thieves and fraudsters are wily
and endlessly inventive. All the vast breaches of our century are being
continuously recombined, the data sets merged and mined for new ways to
victimize the people whose data was present in them. Any reasonable,
evidence-based theory of deterrence and compensation for breaches would
not confine damages to actual damages but rather would allow users to
claim these future harms.

However, even the most ambitious privacy rules, such as the EU General
Data Protection Regulation, fall far short of capturing the negative
externalities of the platforms’ negligent over-collection and
over-retention, and what penalties they do provide are not aggressively
pursued by regulators.

This tolerance of — or indifference to — data over-collection and
over-retention can be ascribed in part to the sheer lobbying muscle of
the platforms. They are so profitable that they can handily afford to
divert gigantic sums to fight any real change — that is, change that
would force them to internalize the costs of their surveillance
activities.

And then there’s state surveillance, which the surveillance capitalism
story dismisses as a relic of another era when the big worry was being
jailed for your dissident speech, not having your free will stripped
away with machine learning.

But state surveillance and private surveillance are intimately related.
As we saw when Apple was conscripted by the Chinese government as a
vital collaborator in state surveillance, the only really affordable and
tractable way to conduct mass surveillance on the scale practiced by
modern states — both “free” and autocratic states — is to suborn
commercial services.

Whether it’s Google being used as a location tracking tool by local law
enforcement across the U.S. or the use of social media tracking by the
Department of Homeland Security to build dossiers on participants in
protests against Immigration and Customs Enforcement’s family separation
practices, any hard limits on surveillance capitalism would hamstring
the state’s own surveillance capability. Without Palantir, Amazon,
Google, and other major tech contractors, U.S. cops would not be able to
spy on Black people, ICE would not be able to manage the caging of
children at the U.S. border, and state welfare systems would not be able
to purge their rolls by dressing up cruelty as empiricism and claiming
that poor and vulnerable people are ineligible for assistance. At least
some of the states’ unwillingness to take meaningful action to curb
surveillance should be attributed to this symbiotic relationship. There
is no mass state surveillance without mass commercial surveillance.

Monopolism is key to the project of mass state surveillance. It’s true
that smaller tech firms are apt to be less well-defended than Big Tech,
whose security experts are drawn from the tops of their field and who
are given enormous resources to secure and monitor their systems against
intruders. But smaller firms also have less to protect: fewer users,
whose data is fragmented across more systems that have to be suborned
one at a time by state actors.

A concentrated tech sector that works with authorities is a much more
powerful ally in the project of mass state surveillance than a
fragmented one composed of smaller actors. The U.S. tech sector is small
enough that all of its top executives fit around a single boardroom
table, as they did in Trump Tower in 2017, shortly after Trump’s
inauguration. Most of its biggest players bid to win JEDI, the
Pentagon’s $10 billion Joint Enterprise Defense Infrastructure cloud
contract. Like other highly concentrated industries, Big Tech rotates
its key employees in and out of government service, sending them to
serve in the Department of Defense and the White House, then hiring
ex-Pentagon and ex-DOD top staffers and officers to work in their own
government relations departments.

They can even make a good case for doing this: After all, when there are
only four or five big companies in an industry, everyone qualified to
regulate those companies has served as an executive in at least a couple
of them — because, likewise, when there are only five companies in an
industry, everyone qualified for a senior role at any of them is by
definition working at one of the other ones.

While surveillance doesn’t cause monopolies, monopolies certainly
abet surveillance.

Industries that are competitive are fragmented — composed of companies
that are at each other’s throats all the time and eroding one another’s
margins in bids to steal their best customers. This leaves them with
much more limited capital to use to lobby for favorable rules and a much
harder job of getting everyone to agree to pool their resources to
benefit the industry as a whole.

Surveillance combined with machine learning is supposed to be an
existential crisis, a species-defining moment at which our free will is
just a few more advances in the field from being stripped away. I am
skeptical of this claim, but I *do* think that tech poses an existential
threat to our society and possibly our species.

But that threat grows out of monopoly.

One of the consequences of tech’s regulatory capture is that it can
shift liability for poor security decisions onto its customers and the
wider society. It is absolutely normal in tech for companies to
obfuscate the workings of their products, to make them deliberately hard
to understand, and to threaten security researchers who seek to
independently audit those products.

IT is the only field in which this is practiced: No one builds a bridge
or a hospital and keeps the composition of the steel or the equations
used to calculate load stresses a secret. It is a frankly bizarre
practice that leads, time and again, to grotesque security defects on
farcical scales, with whole classes of devices being revealed as
vulnerable long after they are deployed in the field and put into
sensitive places.

The monopoly power that keeps any meaningful consequences for breaches
at bay means that tech companies continue to build terrible products
that are insecure by design and that end up integrated into our lives,
in possession of our data, and connected to our physical world. For
years, Boeing has struggled with the aftermath of a series of bad
technology decisions that made its 737 fleet a global pariah, a rare
instance in which bad tech decisions have been seriously punished in the
market.

These bad security decisions are compounded yet again by the use of
copyright locks to enforce business-model decisions against consumers.
Recall that these locks have become the go-to means for shaping consumer
behavior, making it technically impossible to use third-party ink,
insulin, apps, or service depots in connection with your lawfully
acquired property.

Recall also that these copyright locks are backstopped by legislation
(such as Section 1201 of the DMCA or Article 6 of the 2001 EU Copyright
Directive) that bans tampering with (“circumventing”) them, and these
statutes have been used to threaten security researchers who make
disclosures about vulnerabilities without permission from manufacturers.

This amounts to a manufacturer’s veto over safety warnings and
criticism. While this is far from the legislative intent of the DMCA and
its sister statutes around the world, Congress has not intervened to
clarify the statute, nor will it, because doing so would run counter to
the interests of powerful, large firms whose lobbying muscle is
unstoppable.

Copyright locks are a double whammy: They create bad security decisions
that can’t be freely investigated or discussed. If markets are supposed
to be machines for aggregating information (and if surveillance
capitalism’s notional mind-control rays are what make it a “rogue
capitalism” because it denies consumers the power to make decisions),
then a program of legally enforced ignorance of the risks of products
makes monopolism even more of a “rogue capitalism” than surveillance
capitalism’s influence campaigns.

And unlike mind-control rays, enforced silence over security is an
immediate, documented problem, and it *does* constitute an existential
threat to our civilization and possibly our species. The proliferation
of insecure devices — especially devices that spy on us and especially
when those devices also can manipulate the physical world by, say,
steering your car or flipping a breaker at a power station — is a kind
of technology debt.

In software design, “technology debt” refers to old, baked-in decisions
that turn out to be bad ones in hindsight. Perhaps a long-ago developer
decided to incorporate a networking protocol made by a vendor that has
since stopped supporting it. But everything in the product still relies
on that superannuated protocol, and so, with each revision, the product
team has to work around this obsolete core, adding compatibility layers,
surrounding it with security checks that try to shore up its defenses,
and so on. These Band-Aid measures compound the debt because every
subsequent revision has to make allowances for *them*, too, like
interest mounting on a predatory subprime loan. And like a subprime
loan, the interest mounts faster than you can hope to pay it off: The
product team has to put so much energy into maintaining this complex,
brittle system that they don’t have any time left over to refactor the
product from the ground up and “pay off the debt” once and for all.
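The dynamic described above can be sketched in a few lines of code. This is a purely hypothetical toy (every name here is invented for illustration, not taken from any real product): an obsolete record format survives at the core, and each later revision wraps it in another compatibility layer rather than replacing it, so every new layer must account for all the layers beneath it.

```python
def legacy_encode(message: str) -> bytes:
    """The superannuated vendor protocol: fixed 16-byte ASCII records.
    Nothing else in the product can change until this does -- and it never does."""
    return message.encode("ascii", errors="replace").ljust(16)[:16]

def v2_compat(message: str) -> bytes:
    # Band-Aid #1: longer messages are split into legacy-sized records
    # instead of fixing the 16-byte limit at the core.
    chunks = [message[i:i + 16] for i in range(0, len(message), 16)]
    return b"".join(legacy_encode(chunk) for chunk in chunks)

def v3_compat(message: str) -> bytes:
    # Band-Aid #2: a checksum bolted on because the core format has no
    # integrity checks -- and now every future revision must preserve
    # both the chunking layer and this trailing byte.
    payload = v2_compat(message)
    return payload + bytes([sum(payload) % 256])
```

Each wrapper is individually reasonable; the compounding cost is that ``v3_compat`` can never be simplified without understanding and re-testing everything below it, which is the “interest” the paragraph above describes.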

Typically, technology debt results in a technological bankruptcy: The
product gets so brittle and unsustainable that it fails
catastrophically. Think of the antiquated COBOL-based banking and
accounting systems that fell over at the start of the pandemic emergency
when confronted with surges of unemployment claims. Sometimes that ends
the product; sometimes it takes the company down with it. Being caught
in the default of a technology debt is scary and traumatic, just like
losing your house due to bankruptcy is scary and traumatic.

But the technology debt created by copyright locks isn’t individual
debt; it’s systemic. Everyone in the world is exposed to this
over-leverage, as was the case with the 2008 financial crisis. When that
debt comes due — when we face a cascade of security breaches that
threaten global shipping and logistics, the food supply, pharmaceutical
production pipelines, emergency communications, and other critical
systems that are accumulating technology debt in part due to the
presence of deliberately insecure and deliberately unauditable copyright
locks — it will indeed pose an existential risk.

Privacy and monopoly
--------------------

Many tech companies are gripped by an orthodoxy that holds that if they
just gather enough data on enough of our activities, everything else is
possible — the mind control and endless profits. This is an
unfalsifiable hypothesis: If data gives a tech company even a tiny
improvement in behavior prediction and modification, the company
declares that it has taken the first step toward global domination with
no end in sight. If a company *fails* to attain any improvements from
gathering and analyzing data, it declares success to be just around the
corner, attainable once more data is in hand.

Surveillance tech is far from the first industry to embrace a
nonsensical, self-serving belief that harms the rest of the world, and
it is not the first industry to profit handsomely from such a delusion.
Long before hedge-fund managers were claiming (falsely) that they could
beat the S&P 500, there were plenty of other “respectable” industries
that were revealed as quacks in hindsight. From the makers of radium
suppositories (a real thing!) to the cruel sociopaths who claimed they
could “cure” gay people, history is littered with the formerly
respectable titans of discredited industries.

This is not to say that there’s nothing wrong with Big Tech and its
ideological addiction to data. While surveillance’s benefits are mostly
overstated, its harms are, if anything, *understated*.

There’s real irony here. The belief in surveillance capitalism as a
“rogue capitalism” is driven by the belief that markets wouldn’t
tolerate firms that are gripped by false beliefs. An oil company that
has false beliefs about where the oil is will eventually go broke
digging dry wells, after all.

But monopolists get to do terrible things for a long time before they
pay the price. Think of how concentration in the finance sector allowed
the subprime crisis to fester as bond-rating agencies, regulators,
investors, and critics all fell under the sway of a false belief that
complex mathematics could construct “fully hedged” debt instruments that
could not possibly default. A small bank that engaged in this kind of
malfeasance would simply have gone broke rather than outrunning the
inevitable crisis or growing so big that it averted it altogether. But
large banks were able to continue to attract investors, and when they
finally *did* come a-cropper, the world’s governments bailed them out.
The worst offenders of the subprime crisis are bigger than they were in
2008, bringing home more profits and paying their execs even larger
sums.

Big Tech is able to practice surveillance not just because it is tech
but because it is *big*. The reason every web publisher embeds a
Facebook “Like” button is that Facebook dominates the internet’s social
media referrals — and every one of those “Like” buttons spies on
everyone who lands on a page that contains them (see also: Google
Analytics embeds, Twitter buttons, etc.).

The reason the world’s governments have been slow to create meaningful
penalties for privacy breaches is that Big Tech’s concentration produces
huge profits that can be used to lobby against those penalties — and Big
Tech’s concentration means that the companies involved are able to
arrive at a unified negotiating position that supercharges the lobbying.

The reason that the smartest engineers in the world want to work for Big
Tech is that Big Tech commands the lion’s share of tech industry jobs.

The reason people who are aghast at Facebook’s and Google’s and Amazon’s
data-handling practices continue to use these services is that all their
friends are on Facebook; Google dominates search; and Amazon has put all
the local merchants out of business.

Competitive markets would weaken the companies’ lobbying muscle by
reducing their profits and pitting them against each other in regulatory
forums. They would give customers other places to go to get their online
services. They would make the companies small enough to regulate and
pave the way to meaningful penalties for breaches. They would let
engineers with ideas that challenged the surveillance orthodoxy raise
capital to compete with the incumbents. They would give web publishers
multiple ways to reach audiences and make the case against Facebook and
Google and Twitter embeds.

In other words, while surveillance doesn’t cause monopolies, monopolies
certainly abet surveillance.

Ronald Reagan, pioneer of tech monopolism
-----------------------------------------

Technology exceptionalism is a sin, whether it’s practiced by
technology’s blind proponents or by its critics. Both of these camps are
prone to explaining away monopolistic concentration by citing some
special characteristic of the tech industry, like network effects or
first-mover advantage. The only real difference between these two groups
is that the tech apologists say monopoly is inevitable so we should just
let tech get away with its abuses while competition regulators in the
U.S. and the EU say monopoly is inevitable so we should punish tech for
its abuses but not try to break up the monopolies.

To understand how tech became so monopolistic, it’s useful to look at
the dawn of the consumer tech industry: 1979, the year the Apple II Plus
launched and became the first successful home computer. That also
happens to be the year that Ronald Reagan hit the campaign trail for the
1980 presidential race — a race he won, leading to a radical shift in
the way that antitrust concerns are handled in America. Reagan’s cohort
of politicians — including Margaret Thatcher in the U.K., Brian Mulroney
in Canada, Helmut Kohl in Germany, and Augusto Pinochet in Chile — went
on to enact similar reforms that eventually spread around the world.

Antitrust’s story began nearly a century before all that with laws like
the Sherman Act, which took aim at monopolists on the grounds that
monopolies were bad in and of themselves — squeezing out competitors,
creating “diseconomies of scale” (when a company is so big that its
constituent parts go awry and it is seemingly helpless to address the
problems), and capturing their regulators to such a degree that they can
get away with a host of evils.

Then came a fabulist named Robert Bork, a former solicitor general whom
Reagan appointed to the powerful U.S. Court of Appeals for the D.C.
Circuit and who had created an alternate legislative history of the
Sherman Act and its successors out of whole cloth. Bork insisted that
these statutes were never targeted at monopolies (despite a wealth of
evidence to the contrary, including the transcribed speeches of the
acts’ authors) but, rather, that they were intended to prevent “consumer
harm” — in the form of higher prices.

Bork was a crank, but he was a crank with a theory that rich people
really liked. Monopolies are a great way to make rich people richer by
allowing them to receive “monopoly rents” (that is, bigger profits) and
capture regulators, leading to a weaker, more favorable regulatory
environment with fewer protections for customers, suppliers, the
environment, and workers.

Bork’s theories were especially palatable to the same power brokers who
backed Reagan, and Reagan’s Department of Justice and other agencies
began to incorporate Bork’s antitrust doctrine into their enforcement
decisions (Reagan even put Bork up for a Supreme Court seat, but Bork
flunked the Senate confirmation hearing so badly that, decades later,
D.C. insiders use the term “borked” to refer to any catastrophically bad
political performance).

Little by little, Bork’s theories entered the mainstream, and their
backers began to infiltrate the legal education field, even putting on
junkets where members of the judiciary were treated to lavish meals, fun
outdoor activities, and seminars where they were indoctrinated into the
consumer harm theory of antitrust. The more Bork’s theories took hold,
the more money the monopolists were making — and the more surplus
capital they had at their disposal to lobby for even more Borkian
antitrust influence campaigns.

The history of Bork’s antitrust theories is a really good example of the
kind of covertly engineered shifts in public opinion that Zuboff warns
us against, where fringe ideas become mainstream orthodoxy. But Bork
didn’t change the world overnight. He played a very long game, for over
a generation, and he had a tailwind because the same forces that backed
oligarchic antitrust theories also backed many other oligarchic shifts
in public opinion — for example, the ideas that taxation is theft and
that wealth is a sign of virtue. All of these theories meshed to form a
coherent ideology that elevated inequality to a virtue.

Today, many fear that machine learning allows surveillance capitalism to
sell “Bork-as-a-Service,” at internet speeds, so that you can contract a
machine-learning company to engineer *rapid* shifts in public sentiment
without needing the capital to sustain a multipronged, multigenerational
project working at the local, state, national, and global levels in
business, law, and philosophy. I do not believe that such a project is
plausible, though I agree that this is basically what the platforms
claim to be selling. They’re just lying about it. Big Tech lies all the
time, *including* in their sales literature.

The idea that tech forms “natural monopolies” (monopolies that are the
inevitable result of the realities of an industry, such as the
monopolies that accrue to the first company to run long-haul phone lines
or rail lines) is belied by tech’s own history: In the absence of
anti-competitive tactics, Google was able to unseat AltaVista and Yahoo;
Facebook was able to head off Myspace. There are some advantages to
gathering mountains of data, but those mountains of data also have
disadvantages: liability (from leaking), diminishing returns (from old
data), and institutional inertia (big companies, like science, progress
one funeral at a time).

Indeed, the birth of the web saw a mass-extinction event for the
existing giant, wildly profitable proprietary technologies that had
capital, network effects, and walls and moats surrounding their
businesses. The web showed that when a new industry is built around a
protocol, rather than a product, the combined might of everyone who uses
the protocol to reach their customers or users or communities outweighs
even the most massive products. CompuServe, AOL, MSN, and a host of
other proprietary walled gardens learned this lesson the hard way: Each
believed it could stay separate from the web, offering “curation” and a
guarantee of consistency and quality instead of the chaos of an open
system. Each was wrong and ended up being absorbed into the public web.

Yes, tech is heavily monopolized and is now closely associated with
industry concentration, but this has more to do with timing than with
tech’s intrinsically monopolistic tendencies. Tech was born at the
moment that antitrust enforcement was being dismantled, and tech fell
into exactly the same pathologies that antitrust was supposed to guard
against. To a first approximation, it is reasonable to assume that
tech’s monopolies are the result of a lack of anti-monopoly action and
not the much-touted unique characteristics of tech, such as network
effects, first-mover advantage, and so on.
1610
1611 In support of this thesis, I offer the concentration that every *other*
1612 industry has undergone over the same period. From professional wrestling
1613 to consumer packaged goods to commercial property leasing to banking to
1614 sea freight to oil to record labels to newspaper ownership to theme
1615 parks, *every* industry has undergone a massive shift toward
1616 concentration. There’s no obvious network effects or first-mover
1617 advantage at play in these industries. However, in every case, these
1618 industries attained their concentrated status through tactics that were
1619 prohibited before Bork’s triumph: merging with major competitors, buying
1620 out innovative new market entrants, horizontal and vertical integration,
1621 and a suite of anti-competitive tactics that were once illegal but are
1622 not any longer.
1623
1624 Again: When you change the laws intended to prevent monopolies and then
1625 monopolies form in exactly the way the law was supposed to prevent, it
1626 is reasonable to suppose that these facts are related. Tech’s
1627 concentration can be readily explained without recourse to radical
1628 theories of network effects — but only if you’re willing to indict
1629 unregulated markets as tending toward monopoly. Just as a lifelong
1630 smoker can give you a hundred reasons why their smoking didn’t cause
1631 their cancer (“It was the environmental toxins”), true believers in
1632 unregulated markets have a whole suite of unconvincing explanations for
1633 monopoly in tech that leave capitalism intact.
1634
Steering with the windshield wipers
-----------------------------------

It’s been 40 years since Bork’s project to rehabilitate monopolies
achieved liftoff, and that is a generation and a half, which is plenty
of time to take a common idea and make it seem outlandish and vice
versa. Before the 1940s, affluent Americans dressed their baby boys in
pink while baby girls wore blue (a “delicate and dainty” color). While
gendered colors are obviously totally arbitrary, many still greet this
news with amazement and find it hard to imagine a time when pink
connoted masculinity.

After 40 years of studiously ignoring antitrust analysis and
enforcement, it’s not surprising that we’ve all but forgotten that
antitrust exists, that in living memory, growth through mergers and
acquisitions was largely prohibited under law, and that market-cornering
strategies like vertical integration could land a company in court.

Antitrust is a market society’s steering wheel, the control of first
resort to keep would-be masters of the universe in their lanes. But Bork
and his cohort ripped out our steering wheel 40 years ago. The car is
still barreling along, and so we’re yanking as hard as we can on all the
*other* controls in the car as well as desperately flapping the doors
and rolling the windows up and down in the hopes that one of these other
controls can be repurposed to let us choose where we’re heading before
we careen off a cliff.

It’s like a 1960s science-fiction plot come to life: People stuck in a
“generation ship,” plying its way across the stars, a ship once piloted
by their ancestors; and now, after a great cataclysm, the ship’s crew
have forgotten that they’re in a ship at all and no longer remember
where the control room is. Adrift, the ship is racing toward its
extinction, and unless we can seize the controls and execute an
emergency course correction, we’re all headed for a fiery death in the
heart of a sun.

Surveillance still matters
--------------------------

None of this is to minimize the problems with surveillance. Surveillance
matters, and Big Tech’s use of surveillance *is* an existential risk to
our species, but that’s not because surveillance and machine learning
rob us of our free will.

Surveillance has become *much* more efficient thanks to Big Tech. In
1989, the Stasi — the East German secret police — had the whole country
under surveillance, a massive undertaking that recruited one out of
every 60 people to serve as an informant or intelligence operative.

Today, we know that the NSA is spying on a significant fraction of the
entire world’s population, and its ratio of surveillance operatives to
the surveilled is more like 1:10,000 (that’s probably on the low side
since it assumes that every American with top-secret clearance is
working for the NSA on this project — we don’t know how many of those
cleared people are involved in NSA spying, but it’s definitely not all
of them).

How did the ratio of watchers to watched go from 1:60 to 1:10,000 in
less than 30 years? It’s thanks to Big Tech. Our devices and services
gather most of the data that the NSA mines for its surveillance project.
We pay for these devices and the services they connect to, and then we
painstakingly perform the data-entry tasks associated with logging facts
about our lives, opinions, and preferences. This mass surveillance
project has been largely useless for fighting terrorism: The NSA can
`only point to a single minor success
story <https://www.washingtonpost.com/world/national-security/nsa-cites-case-as-success-of-phone-data-collection-program/2013/08/08/fc915e5a-feda-11e2-96a8-d3b921c0924a_story.html>`__
in which it used its data collection program to foil an attempt by a
U.S. resident to wire a few thousand dollars to an overseas terror
group. It’s ineffective for much the same reason that commercial
surveillance projects are largely ineffective at targeting advertising:
The people who want to commit acts of terror, like people who want to
buy a refrigerator, are extremely rare. If you’re trying to detect a
phenomenon whose base rate is one in a million with an instrument whose
accuracy is only 99%, then every true positive will come at the cost of
9,999 false positives.

Let me explain that again: If one in a million people is a terrorist,
then there will only be about one terrorist in a random sample of one
million people. If your test for detecting terrorists is 99% accurate,
it will flag 10,000 people in your million-person sample as terrorists
(1% of one million is 10,000). For every true positive, you’ll get 9,999
false positives.

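The arithmetic above is the classic base-rate problem, and it can be
checked in a few lines. This is a sketch using the text’s illustrative
numbers only (a one-in-a-million base rate, with “99% accurate” read as
a 1% misflag rate), not real intelligence-agency figures:

.. code:: python

    # Base-rate sketch: a "99% accurate" detector hunting a one-in-a-million trait.
    population = 1_000_000
    terrorists = 1                     # base rate: one in a million
    error_rate = 0.01                  # "99% accurate" -> 1% of people misflagged

    flagged = int(population * error_rate)    # 10,000 people flagged in total
    true_positives = terrorists               # at best, the one real target
    false_positives = flagged - true_positives

    print(flagged)          # 10000
    print(false_positives)  # 9999

Even a much better detector would still drown its single true positive
in false alarms: with a rare enough phenomenon, the base rate — not the
instrument’s accuracy — dominates the outcome.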
In reality, the accuracy of algorithmic terrorism detection falls far
short of the 99% mark, as does refrigerator ad targeting. The difference
is that being falsely accused of wanting to buy a fridge is a minor
nuisance while being falsely accused of planning a terror attack can
destroy your life and the lives of everyone you love.

Mass state surveillance is only feasible because of surveillance
capitalism and its extremely low-yield ad-targeting systems, which
require a constant feed of personal data to remain barely viable.
Surveillance capitalism’s primary failure mode is mistargeted ads while
mass state surveillance’s primary failure mode is grotesque human rights
abuses, tending toward totalitarianism.

State surveillance is no mere parasite on Big Tech, sucking up its data
and giving nothing in return. In truth, the two are symbiotes: Big Tech
sucks up our data for spy agencies, and spy agencies ensure that
governments don’t limit Big Tech’s activities so severely that it would
no longer serve the spy agencies’ needs. There is no firm distinction
between state surveillance and surveillance capitalism; they are
dependent on one another.

To see this at work today, look no further than Amazon’s home
surveillance device, the Ring doorbell, and its associated app,
Neighbors. Ring — a product that Amazon acquired and did not develop in
house — makes a camera-enabled doorbell that streams footage from your
front door to your mobile device. The Neighbors app allows you to form a
neighborhood-wide surveillance grid with your fellow Ring owners through
which you can share clips of “suspicious characters.” If you’re thinking
that this sounds like a recipe for letting curtain-twitching racists
supercharge their suspicions of people with brown skin who walk down
their blocks, `you’re
right <https://www.eff.org/deeplinks/2020/07/amazons-ring-enables-over-policing-efforts-some-americas-deadliest-law-enforcement>`__.
Ring has become a *de facto*, off-the-books arm of the police without
any of the pesky oversight or rules.

In mid-2019, a series of public records requests revealed that Amazon
had struck confidential deals with more than 400 local law enforcement
agencies through which the agencies would promote Ring and Neighbors and
in exchange get access to footage from Ring cameras. In theory, cops
would need to request this footage through Amazon (and internal
documents reveal that Amazon devotes substantial resources to coaching
cops on how to spin a convincing story when doing so), but in practice,
when a Ring customer turns down a police request, Amazon only requires
the agency to formally request the footage from the company, which it
will then produce.

Ring and law enforcement have found many ways to intertwine their
activities. Ring strikes secret deals to acquire real-time access to 911
dispatch and then streams alarming crime reports to Neighbors users,
which serve as convincers for anyone who’s contemplating a surveillance
doorbell but isn’t sure whether their neighborhood is dangerous enough
to warrant it.

The more the cops buzz-market the surveillance capitalist Ring, the more
surveillance capability the state gets. Cops who rely on private
entities for law-enforcement roles then brief against any controls on
the deployment of that technology while the companies return the favor
by lobbying against rules requiring public oversight of police
surveillance technology. The more the cops rely on Ring and Neighbors,
the harder it will be to pass laws to curb them. The fewer laws there
are against them, the more the cops will rely on them.

Dignity and sanctuary
---------------------

But even if we could exercise democratic control over our states and
force them to stop raiding surveillance capitalism’s reservoirs of
behavioral data, surveillance capitalism would still harm us.

This is an area where Zuboff shines. Her chapter on “sanctuary” — the
feeling of being unobserved — is a beautiful hymn to introspection,
calmness, mindfulness, and tranquility.

When you are watched, something changes. Anyone who has ever raised a
child knows this. You might look up from your book (or more
realistically, from your phone) and catch your child in a moment of
profound realization and growth, a moment where they are learning
something that is right at the edge of their abilities, requiring their
entire ferocious concentration. For a moment, you’re transfixed,
watching that rare and beautiful moment of focus playing out before your
eyes, and then your child looks up and sees you seeing them, and the
moment collapses. To grow, you need to be and expose your authentic
self, and in that moment, you are vulnerable like a hermit crab
scuttling from one shell to the next. The tender, unprotected tissues
you expose in that moment are too delicate to reveal in the presence of
another, even someone you trust as implicitly as a child trusts their
parent.

In the digital age, our authentic selves are inextricably tied to our
digital lives. Your search history is a running ledger of the questions
you’ve pondered. Your location history is a record of the places you’ve
sought out and the experiences you’ve had there. Your social graph
reveals the different facets of your identity, the people you’ve
connected with.

To be observed in these activities is to lose the sanctuary of your
authentic self.

There’s another way in which surveillance capitalism robs us of our
capacity to be our authentic selves: by making us anxious. Surveillance
capitalism isn’t really a mind-control ray, but you don’t need a
mind-control ray to make someone anxious. After all, another word for
anxiety is agitation, and to make someone experience agitation, you need
merely to agitate them: poke them and prod them and beep at them and
buzz at them and bombard them on an intermittent schedule that is just
random enough that their limbic systems never quite become inured to it.

Our devices and services are “general purpose” in that they can connect
anything or anyone to anything or anyone else and that they can run any
program that can be written. This means that the distraction rectangles
in our pockets hold our most precious moments with our most beloved
people and their most urgent or time-sensitive communications (from
“running late can you get the kid?” to “doctor gave me bad news and I
need to talk to you RIGHT NOW”) as well as ads for refrigerators and
recruiting messages from Nazis.

All day and all night, our pockets buzz, shattering our concentration
and tearing apart the fragile webs of connection we spin as we think
through difficult ideas. If you locked someone in a cell and agitated
them like this, we’d call it “sleep deprivation torture,” and it would
be `a war crime under the Geneva
Conventions <https://www.youtube.com/watch?v=1SKpRbvnx6g>`__.

Afflicting the afflicted
------------------------

The effects of surveillance on our ability to be our authentic selves
are not equal for all people. Some of us are lucky enough to live in a
time and place in which all the most important facts of our lives are
widely and roundly socially acceptable and can be publicly displayed
without the risk of social consequence.

But for many of us, this is not true. Recall that in living memory, many
of the ways of being that we think of as socially acceptable today were
once cause for dire social sanction or even imprisonment. If you are 65
years old, you have lived through a time in which people living in “free
societies” could be imprisoned or sanctioned for engaging in homosexual
activity, for falling in love with a person whose skin was a different
color than their own, or for smoking weed.

Today, these activities aren’t just decriminalized in much of the world,
they’re considered normal, and the fallen prohibitions are viewed as
shameful, regrettable relics of the past.

How did we get from prohibition to normalization? Through private,
personal activity: People who were secretly gay or secret pot-smokers or
who secretly loved someone with a different skin color were vulnerable
to retaliation if they made their true selves known and were limited in
how much they could advocate for their own right to exist in the world
and be true to themselves. But because there was a private sphere, these
people could form alliances with their friends and loved ones who did
not share their disfavored traits by having private conversations in
which they came out, disclosing their true selves to the people around
them and bringing them to their cause one conversation at a time.

The right to choose the time and manner of these conversations was key
to their success. It’s one thing to come out to your dad while you’re on
a fishing trip away from the world and another thing entirely to blurt
it out over the Christmas dinner table while your racist Facebook uncle
is there to make a scene.

Without a private sphere, there’s a chance that none of these changes
would have come to pass and that the people who benefited from these
changes would have either faced social sanction for coming out to a
hostile world or would have never been able to reveal their true selves
to the people they love.

The corollary is that, unless you think that our society has attained
social perfection — that your grandchildren in 50 years will ask you to
tell them the story of how, in 2020, every injustice had been righted
and no further change had to be made — then you should expect that right
now, at this minute, there are people you love, whose happiness is key
to your own, who have a secret in their hearts that stops them from ever
being their authentic selves with you. These people are sorrowing and
will go to their graves with that secret sorrow in their hearts, and the
source of that sorrow will be the falsity of their relationship to you.

A private realm is necessary for human progress.

Any data you collect and retain will eventually leak
----------------------------------------------------

The lack of a private life can rob vulnerable people of the chance to be
their authentic selves and constrain our actions by depriving us of
sanctuary, but there is another risk that is borne by everyone, not just
people with a secret: crime.

Personally identifying information is of very limited use for the
purpose of controlling people’s minds, but identity theft — really a
catchall term for a whole constellation of terrible criminal activities
that can destroy your finances, compromise your personal integrity, ruin
your reputation, or even expose you to physical danger — thrives on it.

Attackers are not limited to using data from one breached source,
either. Multiple services have suffered breaches that exposed names,
addresses, phone numbers, passwords, sexual tastes, school grades, work
performance, brushes with the criminal justice system, family details,
genetic information, fingerprints and other biometrics, reading habits,
search histories, literary tastes, pseudonymous identities, and other
sensitive information. Attackers can merge data from these different
breaches to build up extremely detailed dossiers on random subjects and
then use different parts of the data for different criminal purposes.

For example, attackers can use leaked username and password combinations
to hijack whole fleets of commercial vehicles that `have been fitted
with anti-theft GPS trackers and
immobilizers <https://www.vice.com/en_us/article/zmpx4x/hacker-monitor-cars-kill-engine-gps-tracking-apps>`__
or to hijack baby monitors in order to `terrorize toddlers with the
audio tracks from
pornography <https://www.washingtonpost.com/technology/2019/04/23/how-nest-designed-keep-intruders-out-peoples-homes-effectively-allowed-hackers-get/?utm_term=.15220e98c550>`__.
Attackers use leaked data to trick phone companies into giving them your
phone number, then they intercept SMS-based two-factor authentication
codes in order to take over your email, bank account, and/or
cryptocurrency wallets.

Attackers are endlessly inventive in the pursuit of creative ways to
weaponize leaked data. One common use of leaked data is to penetrate
companies in order to access *more* data.

Like spies, online fraudsters are totally dependent on companies
over-collecting and over-retaining our data. Spy agencies sometimes pay
companies for access to their data or intimidate them into giving it up,
but sometimes they work just like criminals do — by `sneaking data out
of companies’
databases <https://www.bbc.com/news/world-us-canada-24751821>`__.

The over-collection of data has a host of terrible social consequences,
from the erosion of our authentic selves to the undermining of social
progress, from state surveillance to an epidemic of online crime.
Commercial surveillance is also a boon to people running influence
campaigns, but that’s the least of our troubles.

Critical tech exceptionalism is still tech exceptionalism
---------------------------------------------------------

Big Tech has long practiced technology exceptionalism: the idea that it
should not be subject to the mundane laws and norms of “meatspace.”
Mottoes like Facebook’s “move fast and break things” attracted
justifiable scorn for the companies’ self-serving rhetoric.

Tech exceptionalism got us all into a lot of trouble, so it’s ironic and
distressing to see Big Tech’s critics committing the same sin.

Big Tech is not a “rogue capitalism” that cannot be cured through the
traditional anti-monopoly remedies of trustbusting (forcing companies to
divest of competitors they have acquired) and bans on mergers to
monopoly and other anti-competitive tactics. Big Tech does not have the
power to use machine learning to influence our behavior so thoroughly
that markets lose the ability to punish bad actors and reward superior
competitors. Big Tech has no rule-writing mind-control ray that
necessitates ditching our old toolbox.

The thing is, people have been claiming to have perfected mind-control
rays for centuries, and every time, it turned out to be a con — though
sometimes the con artists were also conning themselves.

For generations, the advertising industry has been steadily improving
its ability to sell advertising services to businesses while only making
marginal gains in selling those businesses’ products to prospective
customers. John Wanamaker’s lament that “50% of my advertising budget is
wasted, I just don’t know which 50%” is a testament to the triumph of
*ad executives*, who successfully convinced Wanamaker that only half of
the money he spent went to waste.

The tech industry has made enormous improvements in the science of
convincing businesses that they’re good at advertising while their
actual improvements to advertising — as opposed to targeting — have been
pretty ho-hum. The vogue for machine learning — and the mystical
invocation of “artificial intelligence” as a synonym for straightforward
statistical inference techniques — has greatly boosted the efficacy of
Big Tech’s sales pitch as marketers have exploited potential customers’
lack of technical sophistication to get away with breathtaking acts of
overpromising and underdelivering.

It’s tempting to think that if businesses are willing to pour billions
into a venture, the venture must be a good one. Yet there are plenty of
times when this rule of thumb has led us astray. For example, it’s
virtually unheard of for managed investment funds to outperform simple
index funds, and investors who put their money into the hands of expert
money managers overwhelmingly fare worse than those who entrust their
savings to index funds. But managed funds still account for the majority
of the money invested in the markets, and they are patronized by some of
the richest, most sophisticated investors in the world. Their vote of
confidence in an underperforming sector is a parable about the role of
luck in wealth accumulation, not a sign that managed funds are a good
buy.

The claims of Big Tech’s mind-control system are full of tells that the
enterprise is a con. For example, consider `the reliance on the “Big
Five” personality
traits <https://www.frontiersin.org/articles/10.3389/fpsyg.2020.01415/full>`__
as a primary means of influencing people, even though the “Big Five”
theory is unsupported by any large-scale, peer-reviewed studies and is
`mostly the realm of marketing hucksters and pop
psych <https://www.wired.com/story/the-noisy-fallacies-of-psychographic-targeting/>`__.

Big Tech’s promotional materials also claim that their algorithms can
accurately perform “sentiment analysis” or detect people’s moods based
on their “microexpressions,” but `these are marketing claims, not
scientific
ones <https://www.npr.org/2018/09/12/647040758/advertising-on-facebook-is-it-worth-it>`__.
These methods are largely untested by independent scientific experts,
and where they have been tested, they’ve been found sorely wanting.
Microexpressions are particularly suspect, as the companies that
specialize in training people to detect them `have been
shown <https://theintercept.com/2017/02/08/tsas-own-files-show-doubtful-science-behind-its-behavior-screening-program/>`__
to underperform relative to random chance.

Big Tech has been so good at marketing its own supposed superpowers that
it’s easy to believe that they can market everything else with similar
acumen, but it’s a mistake to believe the hype. Any statement a company
makes about the quality of its products is clearly not impartial. The
fact that we distrust all the things that Big Tech says about its data
handling, compliance with privacy laws, etc., is only reasonable — but
why on Earth would we treat Big Tech’s marketing literature as the
gospel truth? Big Tech lies about just about *everything*, including how
well its machine-learning-fueled persuasion systems work.

That skepticism should infuse all of our evaluations of Big Tech and its
supposed abilities, including our perusal of its patents. Zuboff vests
these patents with enormous significance, pointing out that Google
claimed extensive new persuasion capabilities in `its patent
filings <https://patents.google.com/patent/US20050131762A1/en>`__. These
claims are doubly suspect: first, because they are so self-serving, and
second, because patent applications are so notoriously an invitation to
exaggeration.

Patent applications take the form of a series of claims and range from
broad to narrow. A typical patent starts out by claiming that its
authors have invented a method or system for doing every conceivable
thing that anyone might do, ever, with any tool or device. Then it
narrows that claim in successive stages until we get to the actual
“invention” that is the true subject of the patent. The hope is that the
patent examiner — who is almost certainly overworked and underinformed —
will miss the fact that some or all of these claims are ridiculous, or
at least suspect, and grant the patent’s broader claims. Patents for
unpatentable things are still incredibly useful because they can be
wielded against competitors who might license that patent or steer clear
of its claims rather than endure the lengthy, expensive process of
contesting it.

What’s more, software patents are routinely granted even though the
filer doesn’t have any evidence that they can do the thing claimed by
the patent. That is, you can patent an “invention” that you haven’t
actually made and that you don’t know how to make.

With these considerations in hand, it becomes obvious that the fact that
a Big Tech company has patented what it *says* is an effective
mind-control ray is largely irrelevant to whether Big Tech can in fact
control our minds.

Big Tech collects our data for many reasons, including the diminishing
returns on existing stores of data. But many tech companies also collect
data out of a mistaken tech exceptionalist belief in the network effects
of data. Network effects occur when each new user in a system increases
its value. The classic example is fax machines: A single fax machine is
of no use, two fax machines are of limited use, but every new fax
machine that’s put to use multiplies the number of possible fax-to-fax
links, since each new machine can reach every machine that came before
it.

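That growth can be made concrete with a few lines of arithmetic. This is
an illustrative sketch (the function name is mine): among ``n`` machines
there are “n choose 2” possible point-to-point links, so each new
machine adds ``n - 1`` links and the network’s possible connections grow
quadratically with its users.

.. code:: python

    # Network-effect sketch: possible point-to-point links among n fax machines.
    def fax_links(n: int) -> int:
        """Each unordered pair of machines is one possible link: n choose 2."""
        return n * (n - 1) // 2

    # The n-th machine creates n - 1 brand-new links, so value compounds
    # with every user -- unlike the flat returns on duplicate data.
    for n in (1, 2, 3, 10, 100):
        print(n, fax_links(n))

Contrast this with the Netflix example that follows: the millionth
viewer’s data adds almost nothing, which is why hoarded data rarely
earns a true network effect.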
Data mined for predictive systems doesn’t necessarily produce these
dividends. Think of Netflix: The predictive value of the data mined from
a million English-speaking Netflix viewers is hardly improved by the
addition of one more user’s viewing data. Most of the data Netflix
acquires after that first minimum viable sample duplicates existing data
and produces only minimal gains. Meanwhile, retraining models with new
data gets progressively more expensive as the number of data points
increases, and manual tasks like labeling and validating data do not get
cheaper at scale.

Businesses pursue fads to the detriment of their profits all the time,
especially when the businesses and their investors are not motivated by
the prospect of becoming profitable but rather by the prospect of being
acquired by a Big Tech giant or by having an IPO. For these firms,
ticking faddish boxes like “collects as much data as possible” might
realize a bigger return on investment than “collects a
business-appropriate quantity of data.”

This is another harm of tech exceptionalism: The belief that more data
always produces more profits in the form of more insights that can be
translated into better mind-control rays drives firms to over-collect
and over-retain data beyond all rationality. And since the firms are
behaving irrationally, a good number of them will go out of business and
become ghost ships whose cargo holds are stuffed full of data that can
harm people in myriad ways — but which no one is responsible for any
longer. Even if the companies don’t go under, the data they collect is
maintained behind the minimum viable security — just enough security to
keep the company viable while it waits to get bought out by a tech
giant, an amount calculated to spend not one penny more than is
necessary on protecting data.

How monopolies, not mind control, drive surveillance capitalism: The Snapchat story
-----------------------------------------------------------------------------------

For the first decade of its existence, Facebook competed with the social
media giants of the day (Myspace, Orkut, etc.) by presenting itself as
the pro-privacy alternative. Indeed, Facebook justified its walled
garden — which let users bring in data from the web but blocked web
services like Google Search from indexing and caching Facebook pages —
as a pro-privacy measure that protected users from the
surveillance-happy winners of the social media wars like Myspace.

Despite frequent promises that it would never collect or analyze its
users’ data, Facebook periodically created initiatives that did just
that, like the creepy, ham-fisted Beacon tool, which spied on you as you
moved around the web and then added your online activities to your
public timeline, allowing your friends to monitor your browsing habits.
Beacon sparked a user revolt. Every time, Facebook backed off from its
surveillance initiative, but not all the way; inevitably, the new
Facebook would be more surveilling than the old Facebook, though not
quite as surveilling as the intermediate Facebook following the launch
of the new product or service.

The pace at which Facebook ramped up its surveillance efforts seems to
have been set by Facebook’s competitive landscape. The more competitors
Facebook had, the better it behaved. Every time a major competitor
foundered, Facebook’s behavior `got markedly
worse <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247362>`__.

All the while, Facebook was prodigiously acquiring companies, including
a company called Onavo. Nominally, Onavo made a battery-monitoring
mobile app. But the permissions that Onavo required were so expansive
that the app was able to gather fine-grained telemetry on everything
users did with their phones, including which apps they used and how they
were using them.

Through Onavo, Facebook discovered that it was losing market share to
Snapchat, an app that — like Facebook a decade before — billed itself as
the pro-privacy alternative to the status quo. Through Onavo, Facebook
was able to mine data from the devices of Snapchat users, including both
current and former Snapchat users. This spurred Facebook to acquire
Instagram — some features of which competed with Snapchat — and then
allowed Facebook to fine-tune Instagram’s features and sales pitch to
erode Snapchat’s gains and ensure that Facebook would not have to face
the kinds of competitive pressures it had earlier inflicted on Myspace
and Orkut.

The story of how Facebook crushed Snapchat reveals the relationship
between monopoly and surveillance capitalism. Facebook combined
surveillance with lax antitrust enforcement to spot the competitive
threat of Snapchat on its horizon and then take decisive action against
it. Facebook’s surveillance capitalism let it avert competitive pressure
with anti-competitive tactics. Facebook users still want privacy —
Facebook hasn’t used surveillance to brainwash them out of it — but they
can’t get it because Facebook’s surveillance lets it destroy any hope of
a rival service emerging that competes on privacy features.

A monopoly over your friends
----------------------------

A decentralization movement has tried to erode the dominance of Facebook
and other Big Tech companies by fielding “indieweb” alternatives —
Mastodon as a Twitter alternative, Diaspora as a Facebook alternative,
etc. — but these efforts have failed to attain any kind of liftoff.

Fundamentally, each of these services is hamstrung by the same problem:
Every potential user for a Facebook or Twitter alternative has to
convince all their friends to follow them to a decentralized web
alternative in order to continue to realize the benefit of social media.
For many of us, the only reason to have a Facebook account is that our
friends have Facebook accounts, and the reason they have Facebook
accounts is that *we* have Facebook accounts.

All of this has conspired to make Facebook — and other dominant
platforms — into “kill zones” that investors will not fund new entrants
for.

And yet, all of today’s tech giants came into existence despite the
entrenched advantage of the companies that came before them. To
understand how that happened, you have to understand both
interoperability and adversarial interoperability.

The hard problem of our species is coordination.

“Interoperability” is the ability of two technologies to work with one
another: Anyone can make an LP that will play on any record player,
anyone can make a filter you can install in your stove’s extractor fan,
anyone can make gasoline for your car, anyone can make a USB phone
charger that fits in your car’s cigarette lighter receptacle, anyone can
make a light bulb that works in your light socket, anyone can make bread
that will toast in your toaster.

Interoperability is often a source of innovation and consumer benefit:
Apple made the first commercially successful PC, but millions of
independent software vendors made interoperable programs that ran on the
Apple II Plus. The simple analog antenna inputs on the back of TVs first
allowed cable operators to connect directly to TVs, then they allowed
game console companies and then personal computer companies to use
standard televisions as displays. Standard RJ-11 telephone jacks allowed
for the production of phones from a variety of vendors in a variety of
forms, from the free football-shaped phone that came with a *Sports
Illustrated* subscription to business phones with speakers, hold
functions, and so on and then answering machines and finally modems,
paving the way for the internet revolution.

“Interoperability” is often used interchangeably with “standardization,”
which is the process by which manufacturers and other stakeholders
hammer out a set of agreed-upon rules for implementing a technology,
such as the electrical plug on your wall, the CAN bus used by your car’s
computer systems, or the HTML instructions that your browser interprets.

But interoperability doesn’t require standardization — indeed,
standardization often proceeds from the chaos of ad hoc interoperability
measures. The inventor of the cigarette-lighter USB charger didn’t need
to get permission from car manufacturers or even the manufacturers of
the dashboard lighter subcomponent. The automakers didn’t take any
countermeasures to prevent the use of these aftermarket accessories by
their customers, but they also didn’t do anything to make life easier
for the chargers’ manufacturers. This is a kind of “neutral
interoperability.”

Beyond neutral interoperability, there is “adversarial
interoperability.” That’s when a manufacturer makes a product that
interoperates with another manufacturer’s product *despite the second
manufacturer’s objections* and *even if that means bypassing a security
system designed to prevent interoperability*.

Probably the most familiar form of adversarial interoperability is
third-party printer ink. Printer manufacturers claim that they sell
printers below cost and that the only way they can recoup the losses
they incur is by charging high markups on ink. To prevent the owners of
printers from buying ink elsewhere, the printer companies deploy a suite
of anti-customer security systems that detect and reject both refilled
and third-party cartridges.

Owners of printers take the position that HP and Epson and Brother are
not charities and that customers for their wares have no obligation to
help them survive, and so if the companies choose to sell their products
at a loss, that’s their foolish choice and their consequences to live
with. Likewise, competitors who make ink or refill kits observe that
they don’t owe printer companies anything, and their erosion of printer
companies’ margins is the printer companies’ problem, not their
competitors’. After all, the printer companies shed no tears when they
drive a refiller out of business, so why should the refillers concern
themselves with the economic fortunes of the printer companies?

Adversarial interoperability has played an outsized role in the history
of the tech industry: from the founding of the “alt.*” Usenet hierarchy
(which was started against the wishes of Usenet’s maintainers and which
grew to be bigger than all of Usenet combined) to the browser wars (when
Netscape and Microsoft devoted massive engineering efforts to making
their browsers incompatible with the other’s special commands and
peccadilloes) to Facebook (whose success was built in part by helping
its new users stay in touch with friends they’d left behind on Myspace
because Facebook supplied them with a tool that scraped waiting messages
from Myspace and imported them into Facebook, effectively creating a
Facebook-based Myspace reader).

Today, incumbency is seen as an unassailable advantage. Facebook is
where all of your friends are, so no one can start a Facebook
competitor. But adversarial compatibility reverses the competitive
advantage: If you were allowed to compete with Facebook by providing a
tool that imported all your users’ waiting Facebook messages into an
environment that competed on lines that Facebook couldn’t cross, like
eliminating surveillance and ads, then Facebook would be at a huge
disadvantage. It would have assembled all possible ex-Facebook users
into a single, easy-to-find service; it would have educated them on how
a Facebook-like service worked and what its potential benefits were; and
it would have provided an easy means for disgruntled Facebook users to
tell their friends where they might expect better treatment.

Adversarial interoperability was once the norm and a key contributor to
the dynamic, vibrant tech scene, but now it is stuck behind a thicket of
laws and regulations that add legal risks to the tried-and-true tactics
of adversarial interoperability. New rules and new interpretations of
existing rules mean that a would-be adversarial interoperator needs to
steer clear of claims under copyright, terms of service, trade secrecy,
tortious interference, and patent.

In the absence of a competitive market, lawmakers have resorted to
assigning expensive, state-like duties to Big Tech firms, such as
automatically filtering user contributions for copyright infringement or
terrorist and extremist content or detecting and preventing harassment
in real time or controlling access to sexual material.

These measures put a floor under how small we can make Big Tech because
only the very largest companies can afford the humans and automated
filters needed to perform these duties.

But that’s not the only way in which making platforms responsible for
policing their users undermines competition. A platform that is expected
to police its users’ conduct must prevent many vital adversarial
interoperability techniques lest these subvert its policing measures.
For example, if someone using a Twitter replacement like Mastodon is
able to push messages into Twitter and read messages out of Twitter,
they could avoid being caught by automated systems that detect and
prevent harassment (such as systems that use the timing of messages or
IP-based rules to make guesses about whether someone is a harasser).

To the extent that we are willing to let Big Tech police itself — rather
than making Big Tech small enough that users can leave bad platforms for
better ones and small enough that a regulation that simply puts a
platform out of business will not destroy billions of users’ access to
their communities and data — we build the case that Big Tech should be
able to block its competitors and make it easier for Big Tech to demand
legal enforcement tools to ban and punish attempts at adversarial
interoperability.

Ultimately, we can try to fix Big Tech by making it responsible for bad
acts by its users, or we can try to fix the internet by cutting Big Tech
down to size. But we can’t do both. To replace today’s giant products
with pluralistic protocols, we need to clear the legal thicket that
prevents adversarial interoperability so that tomorrow’s nimble,
personal, small-scale products can federate themselves with giants like
Facebook, allowing the users who’ve left to continue to communicate with
users who haven’t left yet, reaching tendrils over Facebook’s garden
wall that Facebook’s trapped users can use to scale the walls and escape
to the global, open web.

Fake news is an epistemological crisis
--------------------------------------

Tech is not the only industry that has undergone massive concentration
since the Reagan era. Virtually every major industry — from oil to
newspapers to meatpacking to sea freight to eyewear to online
pornography — has become a clubby oligarchy that just a few players
dominate.

At the same time, every industry has become something of a tech industry
as general-purpose computers and general-purpose networks and the
promise of efficiencies through data-driven analysis infuse every
device, process, and firm with tech.

This phenomenon of industrial concentration is part of a wider story
about wealth concentration overall as a smaller and smaller number of
people own more and more of our world. This concentration of both wealth
and industries means that our political outcomes are increasingly
beholden to the parochial interests of the people and companies with all
the money.

That means that whenever a regulator asks a question with an obvious,
empirical answer (“Are humans causing climate change?” or “Should we let
companies conduct commercial mass surveillance?” or “Does society
benefit from allowing network neutrality violations?”), the answer that
comes out is only correct if that correctness meets with the approval of
rich people and the industries that made them so wealthy.

Rich people have always played an outsized role in politics, and more so
since the Supreme Court’s *Citizens United* decision eliminated key
controls over political spending. Widening inequality and wealth
concentration means that the very richest people are now a lot richer
and can afford to spend a lot more money on political projects than ever
before. Think of the Koch brothers or George Soros or Bill Gates.

But the policy distortions of rich individuals pale in comparison to the
policy distortions that concentrated industries are capable of. The
companies in highly concentrated industries are much more profitable
than companies in competitive industries — no competition means not
having to reduce prices or improve quality to win customers — leaving
them with bigger capital surpluses to spend on lobbying.

Concentrated industries also find it easier to collaborate on policy
objectives than competitive ones. When all the top execs from your
industry can fit around a single boardroom table, they often do. And
*when* they do, they can forge a consensus position on regulation.

Rising through the ranks in a concentrated industry generally means
working at two or three of the big companies. When there are only
relatively few companies in a given industry, each company has a more
ossified executive rank, leaving ambitious execs with fewer paths to
higher positions unless they are recruited to a rival. This means that
the top execs in concentrated industries are likely to have been
colleagues at some point and socialize in the same circles — connected
through social ties or, say, serving as trustees for each other’s
estates. These tight social bonds foster a collegial, rather than
competitive, attitude.

Highly concentrated industries also present a regulatory conundrum. When
an industry is dominated by just four or five companies, the only people
who are likely to truly understand the industry’s practices are its
veteran executives. This means that top regulators are often former
execs of the companies they are supposed to be regulating. These turns
in government are often tacitly understood to be leaves of absence from
industry, with former employers welcoming their erstwhile watchdogs back
into their executive ranks once their terms have expired.

All this is to say that the tight social bonds, small number of firms,
and regulatory capture of concentrated industries give the companies
that comprise them the power to dictate many, if not all, of the
regulations that bind them.

This is increasingly obvious. Whether it’s payday lenders `winning the
right to practice predatory
lending <https://www.washingtonpost.com/business/2019/02/25/how-payday-lending-industry-insider-tilted-academic-research-its-favor/>`__
or Apple `winning the right to decide who can fix your
phone <https://www.vice.com/en_us/article/mgxayp/source-apple-will-fight-right-to-repair-legislation>`__
or Google and Facebook winning the right to breach your private data
without suffering meaningful consequences or victories for pipeline
companies or impunity for opioid manufacturers or massive tax subsidies
for incredibly profitable dominant businesses, it’s increasingly
apparent that many of our official, evidence-based truth-seeking
processes are, in fact, auctions for sale to the highest bidder.

It’s really impossible to overstate what a terrifying prospect this is.
We live in an incredibly high-tech society, and none of us could acquire
the expertise to evaluate every technological proposition that stands
between us and our untimely, horrible deaths. You might devote your life
to acquiring the media literacy to distinguish good scientific journals
from corrupt pay-for-play lookalikes and the statistical literacy to
evaluate the quality of the analysis in the journals as well as the
microbiology and epidemiology knowledge to determine whether you can
trust claims about the safety of vaccines — but that would still leave
you unqualified to judge whether the wiring in your home will give you a
lethal shock *and* whether your car’s brakes’ software will cause them
to fail unpredictably *and* whether the hygiene standards at your
butcher are sufficient to keep you from dying after you finish your
dinner.

In a world as complex as this one, we have to defer to authorities, and
we keep them honest by making those authorities accountable to us and
binding them with rules to prevent conflicts of interest. We can’t
possibly acquire the expertise to adjudicate conflicting claims about
the best way to make the world safe and prosperous, but we *can*
determine whether the adjudication process itself is trustworthy.

Right now, it’s obviously not.

The past 40 years of rising inequality and industry concentration,
together with increasingly weak accountability and transparency for
expert agencies, has created an increasingly urgent sense of impending
doom, the sense that there are vast conspiracies afoot that operate with
tacit official approval despite the likelihood they are working to
better themselves by ruining the rest of us.

For example, it’s been decades since Exxon’s own scientists concluded
that its products would render the Earth uninhabitable by humans. And
yet those decades were lost to us, in large part because Exxon lobbied
governments and sowed doubt about the dangers of its products and did so
with the cooperation of many public officials. When the survival of you
and everyone you love is threatened by conspiracies, it’s not
unreasonable to start questioning the things you think you know in an
attempt to determine whether they, too, are the outcome of another
conspiracy.

The collapse of the credibility of our systems for divining and
upholding truths has left us in a state of epistemological chaos. Once,
most of us might have assumed that the system was working and that our
regulations reflected our best understanding of the empirical truths of
the world as they were best understood — now we have to find our own
experts to help us sort the true from the false.

If you’re like me, you probably believe that vaccines are safe, but you
(like me) probably also can’t explain the microbiology or statistics.
Few of us have the math skills to review the literature on vaccine
safety and describe why its statistical reasoning is sound. Likewise,
few of us can review the stats in the (now discredited) literature on
opioid safety and explain how those stats were manipulated. Both
vaccines and opioids were embraced by medical authorities, after all,
and one is safe while the other could ruin your life. You’re left with a
kind of inchoate constellation of rules of thumb about which experts you
trust to fact-check controversial claims and then to explain how all
those respectable doctors with their peer-reviewed research on opioid
safety *were* an aberration and then how you know that the doctors
writing about vaccine safety are *not* an aberration.

I’m 100% certain that vaccinating is safe and effective, but I’m also at
something of a loss to explain exactly, *precisely,* why I believe this,
given all the corruption I know about and the many times the stamp of
certainty has turned out to be a parochial lie told to further enrich
the super rich.

Fake news — conspiracy theories, racist ideologies, scientific denialism
— has always been with us. What’s changed today is not the mix of ideas
in the public discourse but the popularity of the worst ideas in that
mix. Conspiracy and denial have skyrocketed in lockstep with the growth
of Big Inequality, which has also tracked the rise of Big Tech and Big
Pharma and Big Wrestling and Big Car and Big Movie Theater and Big
Everything Else.

No one can say for certain why this has happened, but the two dominant
camps are idealism (the belief that the people who argue for these
conspiracies have gotten better at explaining them, maybe with the help
of machine-learning tools) and materialism (the ideas have become more
attractive because of material conditions in the world).

I’m a materialist. I’ve been exposed to the arguments of conspiracy
theorists all my life, and I have not experienced any qualitative leap
in the quality of those arguments.

The major difference is in the world, not the arguments. In a time where
actual conspiracies are commonplace, conspiracy theories acquire a ring
of plausibility.

We have always had disagreements about what’s true, but today, we have a
disagreement over how we know whether something is true. This is an
epistemological crisis, not a crisis over belief. It’s a crisis over the
credibility of our truth-seeking exercises, from scientific journals (in
an era where the biggest journal publishers have been caught producing
pay-to-play journals for junk science) to regulations (in an era where
regulators are routinely cycling in and out of business) to education
(in an era where universities are dependent on corporate donations to
keep their lights on).

Targeting — surveillance capitalism — makes it easier to find people who
are undergoing this epistemological crisis, but it doesn’t create the
crisis. For that, you need to look to corruption.

And, conveniently enough, it’s corruption that allows surveillance
capitalism to grow by dismantling monopoly protections, by permitting
reckless collection and retention of personal data, by allowing ads to
be targeted in secret, and by foreclosing on the possibility of going
somewhere else where you might continue to enjoy your friends without
subjecting yourself to commercial surveillance.

Tech is different
-----------------

I reject both iterations of technological exceptionalism. I reject the
idea that tech is uniquely terrible and led by people who are greedier
or worse than the leaders of other industries, and I reject the idea
that tech is so good — or so intrinsically prone to concentration — that
it can’t be blamed for its present-day monopolistic status.

I think tech is just another industry, albeit one that grew up in the
absence of real monopoly constraints. It may have been first, but it
isn’t the worst, nor will it be the last.

But there’s one way in which I *am* a tech exceptionalist. I believe
that online tools are the key to overcoming problems that are much more
urgent than tech monopolization: climate change, inequality, misogyny,
and discrimination on the basis of race, gender identity, and other
factors. The internet is how we will recruit people to fight those
fights, and how we will coordinate their labor. Tech is not a substitute
for democratic accountability, the rule of law, fairness, or stability —
but it’s a means to achieve these things.

The hard problem of our species is coordination. Everything from climate
change to social change to running a business to making a family work
can be viewed as a collective action problem.

The internet makes it easier than at any time before to find people who
want to work on a project with you — hence the success of free and
open-source software, crowdfunding, and racist terror groups — and
easier than ever to coordinate the work you do.

The internet and the computers we connect to it also possess an
exceptional quality: general-purposeness. The internet is designed to
allow any two parties to communicate any data, using any protocol,
without permission from anyone else. The only production design we have
for computers is the general-purpose, “Turing complete” computer that
can run every program we can express in symbolic logic.

This means that every time someone with a special communications need
invests in infrastructure and techniques to make the internet faster,
cheaper, and more robust, this benefit redounds to everyone else who is
using the internet to communicate. And this also means that every time
someone with a special computing need invests to make computers faster,
cheaper, and more robust, every other computing application is a
potential beneficiary of this work.

For these reasons, every type of communication is gradually absorbed
into the internet, and every type of device — from airplanes to
pacemakers — eventually becomes a computer in a fancy case.

While these considerations don’t preclude regulating networks and
computers, they do call for gravitas and caution when doing so because
changes to regulatory frameworks could ripple out to have unintended
consequences in many, many other domains.

The upshot of this is that our best hope of solving the big coordination
problems — climate change, inequality, etc. — is with free, fair, and
open tech. Our best hope of keeping tech free, fair, and open is to
exercise caution in how we regulate tech and to attend closely to the
ways in which interventions to solve one problem might create problems
in other domains.

Ownership of facts
------------------

2587 Big Tech has a funny relationship with information. When you’re
2588 generating information — anything from the location data streaming off
2589 your mobile device to the private messages you send to friends on a
2590 social network — it claims the rights to make unlimited use of that
2591 data.
2592
2593 But when you have the audacity to turn the tables — to use a tool that
2594 blocks ads or slurps your waiting updates out of a social network and
2595 puts them in another app that lets you set your own priorities and
2596 suggestions or crawls their system to allow you to start a rival
2597 business — they claim that you’re stealing from them.
2598
2599 The thing is, information is a very bad fit for any kind of private
2600 property regime. Property rights are useful for establishing markets
that can lead to the effective development of fallow assets. These
markets depend on clear titles to ensure that the things being bought
and sold in them can, in fact, be bought and sold.

Information rarely has such a clear title. Take phone numbers: There’s
clearly something going wrong when Facebook slurps up millions of users’
address books and uses the phone numbers it finds in them to plot out
social graphs and fill in missing information about other users.

But the phone numbers Facebook nonconsensually acquires in this
transaction are not the “property” of the users they’re taken from nor
do they belong to the people whose phones ring when you dial those
numbers. The numbers are mere integers, 10 digits in the U.S. and
Canada, and they appear in millions of places, including somewhere deep
in pi as well as numerous other contexts. Giving people ownership titles
to integers is an obviously terrible idea.

Likewise for the facts that Facebook and other commercial surveillance
operators acquire about us, like that we are the children of our parents
or the parents to our children or that we had a conversation with
someone else or went to a public place. These data points can’t be
property in the sense that your house or your shirt is your property
because the title to them is intrinsically muddy: Does your mom own the
fact that she is your mother? Do you? Do both of you? What about your
dad — does he own this fact too, or does he have to license the fact
from you (or your mom or both of you) in order to use this fact? What
about the hundreds or thousands of other people who know these facts?

If you go to a Black Lives Matter demonstration, do the other
demonstrators need your permission to post their photos from the event?
The online fights over `when and how to post photos from
demonstrations <https://www.wired.com/story/how-to-take-photos-at-protests/>`__
reveal a nuanced, complex issue that cannot be easily hand-waved away by
giving one party a property right that everyone else in the mix has to
respect.

The fact that information isn’t a good fit with property and markets
doesn’t mean that it’s not valuable. Babies aren’t property, but they’re
inarguably valuable. In fact, we have a whole set of rules just for
babies as well as a subset of those rules that apply to humans more
generally. Someone who argues that babies won’t be truly valuable until
they can be bought and sold like loaves of bread would be instantly and
rightfully condemned as a monster.

It’s tempting to reach for the property hammer when Big Tech treats your
information like a nail — not least because Big Tech are such prolific
abusers of property hammers when it comes to *their* information. But
this is a mistake. If we allow markets to dictate the use of our
information, then we’ll find that we’re sellers in a buyers’ market
where the Big Tech monopolies set a price for our data that is so low as
to be insignificant or, more likely, set at a nonnegotiable price of
zero in a click-through agreement that you don’t have the opportunity to
modify.

Meanwhile, establishing property rights over information will create
insurmountable barriers to independent data processing. Imagine that we
require a license to be negotiated when a translated document is
compared with its original, something Google has done and continues to
do billions of times to train its automated language translation tools.
Google can afford this, but independent third parties cannot. Google can
staff a clearances department to negotiate one-time payments to the
likes of the EU (one of the major repositories of translated documents)
while independent watchdogs wanting to verify that the translations are
well-prepared, or to root out bias in translations, will find themselves
needing a staffed-up legal department and millions for licenses before
they can even get started.

The same goes for things like search indexes of the web or photos of
people’s houses, which have become contentious thanks to Google’s Street
View project. Whatever problems may exist with Google’s photographing of
street scenes, resolving them by letting people decide who can take
pictures of the facades of their homes from a public street will surely
create even worse ones. Think of how street photography is important for
newsgathering — including informal newsgathering, like photographing
abuses of authority — and how being able to document housing and street
life is important for contesting eminent domain, advocating for social
aid, reporting planning and zoning violations, documenting
discriminatory and unequal living conditions, and more.

The ownership of facts is antithetical to many kinds of human progress.
It’s hard to imagine a rule that limits Big Tech’s exploitation of our
collective labors without inadvertently banning people from gathering
data on online harassment or compiling indexes of changes in language or
simply investigating how the platforms are shaping our discourse — all
of which require scraping data that other people have created and
subjecting it to scrutiny and analysis.

Persuasion works… slowly
-------------------------

The platforms may oversell their ability to persuade people, but
obviously, persuasion works sometimes. Whether it’s the private realm
that LGBTQ people used to recruit allies and normalize sexual diversity
or the decadeslong project to convince people that markets are the only
efficient way to solve complicated resource allocation problems, it’s
clear that our societal attitudes *can* change.

The project of shifting societal attitudes is a game of inches and
years. For centuries, svengalis have purported to be able to accelerate
this process, but even the most brutal forms of propaganda have
struggled to make permanent changes. Joseph Goebbels was able to subject
Germans to daily, mandatory, hourslong radio broadcasts, to round up and
torture and murder dissidents, and to seize full control over their
children’s education while banning any literature, broadcasts, or films
that did not comport with his worldview.

Yet, after 12 years of terror, once the war ended, Nazi ideology was
largely discredited in both East and West Germany, and a program of
national truth and reconciliation was put in its place. Racism and
authoritarianism were never fully abolished in Germany, but neither were
the majority of Germans irrevocably convinced of Nazism — and the rise
of racist authoritarianism in Germany today tells us that the liberal
attitudes that replaced Nazism were no more permanent than Nazism
itself.

Racism and authoritarianism have also always been with us. Anyone who’s
reviewed the kind of messages and arguments that racists put forward
today would be hard-pressed to say that they have gotten better at
presenting their ideas. The same pseudoscience, appeals to fear, and
circular logic that racists presented in the 1980s, when the cause of
white supremacy was on the wane, are to be found in the communications
of leading white nationalists today.

If racists haven’t gotten more convincing in the past decade, then how
is it that more people were convinced to be openly racist at that time?
I believe that the answer lies in the material world, not the world of
ideas. The ideas haven’t gotten more convincing, but people have become
more afraid. Afraid that the state can’t be trusted to act as an honest
broker in life-or-death decisions, from those regarding the management
of the economy to the regulation of painkillers to the rules for
handling private information. Afraid that the world has become a game of
musical chairs in which the chairs are being taken away at a
never-before-seen rate. Afraid that justice for others will come at
their expense. Monopolism isn’t the cause of these fears, but the
inequality, material desperation, and policy malpractice that monopolism
contributes to are significant drivers of these conditions. Inequality
creates the conditions for both conspiracies and violent racist
ideologies, and then surveillance capitalism lets opportunists target
the fearful and the conspiracy-minded.

Paying won’t help
------------------

As the old saw goes, “If you’re not paying for the product, you’re the
product.”

It’s a commonplace belief today that the advent of free, ad-supported
media was the original sin of surveillance capitalism. The reasoning is
that the companies that charged for access couldn’t “compete with free”
and so they were driven out of business. Their ad-supported competitors,
meanwhile, declared open season on their users’ data in a bid to improve
their ad targeting and make more money and then resorted to the most
sensationalist tactics to generate clicks on those ads. If only we’d pay
for media again, we’d have a better, more responsible, more sober
discourse that would be better for democracy.

But the degradation of news products long precedes the advent of
ad-supported online news. Long before newspapers were online, lax
antitrust enforcement had opened the door for unprecedented waves of
consolidation and roll-ups in newsrooms. Rival newspapers were merged,
reporters and ad sales staff were laid off, physical plants were sold
and leased back, leaving the companies loaded up with debt through
leveraged buyouts and subsequent profit-taking by the new owners. In
other words, it wasn’t merely shifts in the classified advertising
market, which was long held to be the primary driver in the decline of
the traditional newsroom, that made news companies unable to adapt to
the internet — it was monopolism.

Then, as news companies *did* come online, the ad revenues they
commanded dropped even as the number of internet users (and thus
potential online readers) increased. That shift was a function of
consolidation in the ad sales market, with Google and Facebook emerging
as duopolists who made more money every year from advertising while
paying less and less of it to the publishers whose work the ads appeared
alongside. Monopolism created a buyer’s market for ad inventory with
Facebook and Google acting as gatekeepers.

Paid services continue to exist alongside free ones, and often it is
these paid services — anxious to prevent people from bypassing their
paywalls or sharing paid media with freeloaders — that exert the most
control over their customers. Apple’s iTunes and App Stores are paid
services, but to maximize their profitability, Apple has to lock its
platforms so that third parties can’t make compatible software without
permission. These locks allow the company to exercise both editorial
control (enabling it to exclude `controversial political
material <https://ncac.org/news/blog/does-apples-strict-app-store-content-policy-limit-freedom-of-expression>`__)
and technological control, including control over who can repair the
devices it makes. If we’re worried that ad-supported products deprive
people of their right to self-determination by using persuasion
techniques to nudge their purchase decisions a few degrees in one
direction or the other, then the near-total control a single company
holds over the decision of who gets to sell you software, parts, and
service for your iPhone should have us very worried indeed.

We shouldn’t just be concerned about payment and control: The idea that
paying will improve discourse is also dangerously wrong. The poor
success rate of targeted advertising means that the platforms have to
incentivize you to “engage” with posts at extremely high levels to
generate enough pageviews to safeguard their profits. As discussed
earlier, to increase engagement, platforms like Facebook use machine
learning to guess which messages will be most inflammatory and make a
point of shoving those into your eyeballs at every turn so that you will
hate-click and argue with people.

Perhaps paying would fix this, the reasoning goes. If platforms could be
economically viable even if you stopped clicking on them once your
intellectual and social curiosity had been slaked, then they would have
no reason to algorithmically enrage you to get more clicks out of you,
right?

There may be something to that argument, but it still ignores the wider
economic and political context of the platforms and the world that
allowed them to grow so dominant.

Platforms are world-spanning and all-encompassing because they are
monopolies, and they are monopolies because we have gutted our most
important and reliable anti-monopoly rules. Antitrust was neutered as a
key part of the project to make the wealthy wealthier, and that project
has worked. The vast majority of people on Earth have a negative net
worth, and even the dwindling middle class is in a precarious state,
undersaved for retirement, underinsured for medical disasters, and
undersecured against climate and technology shocks.

In this wildly unequal world, paying doesn’t improve the discourse; it
simply prices discourse out of the range of the majority of people.
Paying for the product is dandy, if you can afford it.

If you think today’s filter bubbles are a problem for our discourse,
imagine what they’d be like if rich people inhabited free-flowing
Athenian marketplaces of ideas where you have to pay for admission while
everyone else lives in online spaces that are subsidized by wealthy
benefactors who relish the chance to establish conversational spaces
where the “house rules” forbid questioning the status quo. That is,
imagine if the rich seceded from Facebook, and then, instead of running
ads that made money for shareholders, Facebook became a billionaire’s
vanity project that also happened to ensure that nobody talked about
whether it was fair that only billionaires could afford to hang out in
the rarified corners of the internet.

Behind the idea of paying for access is a belief that free markets will
address Big Tech’s dysfunction. After all, to the extent that people
have a view of surveillance at all, it is generally an unfavorable one,
and the longer and more thoroughly one is surveilled, the less one tends
to like it. Same goes for lock-in: If HP’s ink or Apple’s App Store were
really obviously fantastic, they wouldn’t need technical measures to
prevent users from choosing a rival’s product. The only reason these
technical countermeasures exist is that the companies don’t believe
their customers would *voluntarily* submit to their terms, and they want
to deprive them of the choice to take their business elsewhere.

Advocates for markets laud their ability to aggregate the diffused
knowledge of buyers and sellers across a whole society through demand
signals, price signals, and so on. The argument for surveillance
capitalism being a “rogue capitalism” is that machine-learning-driven
persuasion techniques distort decision-making by consumers, leading to
incorrect signals — consumers don’t buy what they prefer, they buy what
they’re tricked into preferring. It follows that the monopolistic
practices of lock-in, which do far more to constrain consumers’ free
choices, are even more of a “rogue capitalism.”

The profitability of any business is constrained by the possibility that
its customers will take their business elsewhere. Both surveillance and
lock-in are anti-features that no customer wants. But monopolies can
capture their regulators, crush their competitors, insert themselves
into their customers’ lives, and corral people into “choosing” their
services regardless of whether they want them — it’s fine to be terrible
when there is no alternative.

Ultimately, surveillance and lock-in are both simply business strategies
that monopolists can choose. Surveillance companies like Google are
perfectly capable of deploying lock-in technologies — just look at the
onerous Android licensing terms that require device-makers to bundle in
Google’s suite of applications. And lock-in companies like Apple are
perfectly capable of subjecting their users to surveillance if it means
keeping the Chinese government happy and preserving ongoing access to
Chinese markets. Monopolies may be made up of good, ethical people, but
as institutions, they are not your friend — they will do whatever they
can get away with to maximize their profits, and the more monopolistic
they are, the more they *can* get away with.

An “ecology” moment for trustbusting
---------------------------------------

If we’re going to break Big Tech’s death grip on our digital lives,
we’re going to have to fight monopolies. That may sound pretty mundane
and old-fashioned, something out of the New Deal era, while ending the
use of automated behavioral modification feels like the plotline of a
really cool cyberpunk novel.

Meanwhile, breaking up monopolies is something we seem to have forgotten
how to do. There is a bipartisan, trans-Atlantic consensus that breaking
up companies is a fool’s errand at best — liable to mire your federal
prosecutors in decades of litigation — and counterproductive at worst,
eroding the “consumer benefits” of large companies with massive
efficiencies of scale.

But trustbusters once strode the nation, brandishing law books,
terrorizing robber barons, and shattering the illusion of monopolies’
all-powerful grip on our society. The trustbusting era could not begin
until we found the political will — until the people convinced
politicians they’d have their backs when they went up against the
richest, most powerful men in the world.

Could we find that political will again?

Copyright scholar James Boyle has described how the term “ecology”
marked a turning point in environmental activism. Prior to the adoption
of this term, people who wanted to preserve whale populations didn’t
necessarily see themselves as fighting the same battle as people who
wanted to protect the ozone layer or fight freshwater pollution or beat
back smog or acid rain.

But the term “ecology” welded these disparate causes together into a
single movement, and the members of this movement found solidarity with
one another. The people who cared about smog signed petitions circulated
by the people who wanted to end whaling, and the anti-whalers marched
alongside the people demanding action on acid rain. This uniting behind
a common cause completely changed the dynamics of environmentalism,
setting the stage for today’s climate activism and the sense that
preserving the habitability of the planet Earth is a shared duty among
all people.

I believe we are on the verge of a new “ecology” moment dedicated to
combating monopolies. After all, tech isn’t the only concentrated
industry nor is it even the *most* concentrated of industries.

You can find partisans for trustbusting in every sector of the economy.
Everywhere you look, you can find people who’ve been wronged by
monopolists who’ve trashed their finances, their health, their privacy,
their educations, and the lives of people they love. Those people have
the same cause as the people who want to break up Big Tech and the same
enemies. When most of the world’s wealth is in the hands of a very few,
it follows that nearly every large company will have overlapping
shareholders.

That’s the good news: With a little bit of work and a little bit of
coalition building, we have more than enough political will to break up
Big Tech and every other concentrated industry besides. First we take
Facebook, then we take AT&T/WarnerMedia.

But here’s the bad news: Much of what we’re doing to tame Big Tech
*instead* of breaking up the big companies also forecloses on the
possibility of breaking them up later.

Big Tech’s concentration currently means that their inaction on
harassment, for example, leaves users with an impossible choice: absent
themselves from public discourse by, say, quitting Twitter, or endure
vile, constant abuse. Big Tech’s over-collection and over-retention of
data results in horrific identity theft. And their inaction on extremist
recruitment means that white supremacists who livestream their shooting
rampages can reach an audience of billions. The combination of tech
concentration and media concentration means that artists’ incomes are
falling even as the revenue generated by their creations is increasing.

Yet governments confronting all of these problems inevitably converge on
the same solution: deputize the Big Tech giants to police their users
and render them liable for their users’ bad actions. The drive to force
Big Tech to use automated filters to block everything from copyright
infringement to sex-trafficking to violent extremism means that tech
companies will have to allocate hundreds of millions to run these
compliance systems.

These rules — the EU’s new Directive on Copyright, Australia’s new
terror regulation, America’s FOSTA/SESTA sex-trafficking law, and more —
are not just death warrants for small, upstart competitors that might
challenge Big Tech’s dominance but that lack the deep pockets of
established incumbents to pay for all these automated systems. Worse
still, these rules put a floor under how small we can hope to make Big
Tech.

That’s because any move to break up Big Tech and cut it down to size
will have to cope with the hard limit of not making these companies so
small that they can no longer afford to perform these duties — and it’s
*expensive* to invest in those automated filters and outsource content
moderation. It’s already going to be hard to unwind these deeply
concentrated, chimeric behemoths that have been welded together in the
pursuit of monopoly profits. Doing so while simultaneously finding some
way to fill the regulatory void that would be left behind if these
self-policing rulers were forced to suddenly abdicate will be much, much
harder.

Allowing the platforms to grow to their present size has given them a
dominance that is nearly insurmountable — deputizing them with public
duties to redress the pathologies created by their size makes it
virtually impossible to reduce that size. Lather, rinse, repeat: If the
platforms don’t get smaller, they will get larger, and as they get
larger, they will create more problems, which will give rise to more
public duties for the companies, which will make them bigger still.

We can work to fix the internet by breaking up Big Tech and depriving
them of monopoly profits, or we can work to fix Big Tech by making them
spend their monopoly profits on governance. But we can’t do both. We
have to choose between a vibrant, open internet and a dominated,
monopolized internet commanded by Big Tech giants that we struggle with
constantly to get them to behave themselves.

Make Big Tech small again
-------------------------

Trustbusting is hard. Breaking big companies into smaller ones is
expensive and time-consuming. So time-consuming that by the time you’re
done, the world has often moved on and rendered years of litigation
irrelevant. From 1969 to 1982, the U.S. government pursued an antitrust
case against IBM over its dominance of mainframe computing — but the
case collapsed in 1982 because mainframes were being speedily replaced
by PCs.

It’s far easier to prevent concentration than to fix it, and reinstating
the traditional contours of U.S. antitrust enforcement will, at the very
least, prevent further concentration. That means bans on mergers between
large companies, on big companies acquiring nascent competitors, and on
platform companies competing directly with the companies that rely on
the platforms.

These powers are all in the plain language of U.S. antitrust laws, so in
theory, a future U.S. president could simply direct their attorney
general to enforce the law as it was written. But after decades of
judicial “education” in the benefits of monopolies, after multiple
administrations that have packed the federal courts with
lifetime-appointed monopoly cheerleaders, it’s not clear that mere
administrative action would do the trick.

If the courts frustrate the Justice Department and the president, the
next stop would be Congress, which could eliminate any doubt about how
antitrust law should be enforced in the U.S. by passing new laws that
boil down to saying, “Knock it off. We all know what the Sherman Act
says. Robert Bork was a deranged fantasist. For avoidance of doubt,
*fuck that guy*.” In other words, the problem with monopolies is
*monopolism* — the concentration of power into too few hands, which
erodes our right to self-determination. If there is a monopoly, the law
wants it gone, period. Sure, get rid of monopolies that create “consumer
harm” in the form of higher prices, but also, *get rid of other
monopolies, too*.

But this only prevents things from getting worse. To help them get
better, we will have to build coalitions with other activists in the
anti-monopoly ecology movement — a pluralism movement or a
self-determination movement — and target existing monopolies in every
industry for breakup and structural separation rules that prevent, for
example, the giant eyewear monopolist Luxottica from dominating both the
sale and the manufacture of spectacles.

In an important sense, it doesn’t matter which industry the breakups
begin in. Once they start, shareholders in *every* industry will start
to eye their investments in monopolists skeptically. As trustbusters
ride into town and start making lives miserable for monopolists, the
debate around every corporate boardroom’s table will shift. People
within corporations who’ve always felt uneasy about monopolism will gain
a powerful new argument to fend off their evil rivals in the corporate
hierarchy: “If we do it my way, we make less money; if we do it your
way, a judge will fine us billions and expose us to ridicule and public
disapprobation. So even though I get that it would be really cool to do
that merger, lock out that competitor, or buy that little company and
kill it before it can threaten us, we really shouldn’t — not if we don’t
want to get tied to the DOJ’s bumper and get dragged up and down
Trustbuster Road for the next 10 years.”

20 GOTO 10
----------

Fixing Big Tech will require a lot of iteration. As cyber lawyer
Lawrence Lessig wrote in his 1999 book, *Code and Other Laws of
Cyberspace*, our lives are regulated by four forces: law (what’s legal),
code (what’s technologically possible), norms (what’s socially
acceptable), and markets (what’s profitable).

If you could wave a wand and get Congress to pass a law that re-fanged
the Sherman Act tomorrow, you could use the impending breakups to
convince venture capitalists to fund competitors to Facebook, Google,
Twitter, and Apple that would be waiting in the wings after they were
cut down to size.

But getting Congress to act will require a massive normative shift, a
mass movement of people who care about monopolies — and pulling them
apart.

Getting people to care about monopolies will take technological
interventions that help them to see what a world free from Big Tech
might look like. Imagine if someone could make a beloved (but
unauthorized) third-party Facebook or Twitter client that dampens the
anxiety-producing algorithmic drumbeat and still lets you talk to your
friends without being spied upon — something that made social media more
sociable and less toxic. Now imagine that it gets shut down in a brutal
legal battle. It’s always easier to convince people that something must
be done to save a thing they love than it is to excite them about
something that doesn’t even exist yet.

Neither tech nor law nor code nor markets are sufficient to reform Big
Tech. But a profitable competitor to Big Tech could bankroll a
legislative push; legal reform can embolden a toolsmith to make a better
tool; the tool can create customers for a potential business who value
the benefits of the internet but want them delivered without Big Tech;
and that business can get funded and divert some of its profits to legal
reform. 20 GOTO 10 (or lather, rinse, repeat). Do it again, but this
time, get farther! After all, this time you’re starting with weaker Big
Tech adversaries, a constituency that understands things can be better,
Big Tech rivals who’ll help ensure their own future by bankrolling
reform, and code that other programmers can build on to weaken Big Tech
even further.

The surveillance capitalism hypothesis — that Big Tech’s products really
work as well as they say they do and that’s why everything is so screwed
up — is way too easy on surveillance and even easier on capitalism.
Companies spy because they believe their own BS, and companies spy
because governments let them, and companies spy because any advantage
from spying is so short-lived and minor that they have to do more and
more of it just to stay in place.

As to why things are so screwed up? Capitalism. Specifically, the
monopolism that creates inequality and the inequality that creates
monopolism. It’s a form of capitalism that rewards sociopaths who
destroy the real economy to inflate the bottom line, and they get away
with it for the same reason companies get away with spying: because our
governments are in thrall both to the ideology that says monopolies are
actually just fine and to the ideology that says that in a monopolistic
world, you’d better not piss off the monopolists.

Surveillance doesn’t make capitalism rogue. Capitalism’s unchecked rule
begets surveillance. Surveillance isn’t bad because it lets people
manipulate us. It’s bad because it crushes our ability to be our
authentic selves — and because it lets the rich and powerful figure out
who might be thinking of building guillotines and what dirt they can use
to discredit those embryonic guillotine-builders before they can even
get to the lumberyard.

Up and through
--------------

With all the problems of Big Tech, it’s tempting to imagine solving the
problem by returning to a world without tech at all. Resist that
temptation.

The only way out of our Big Tech problem is up and through. If our
future is not reliant upon high tech, it will be because civilization
has fallen. Big Tech wired together a planetary, species-wide nervous
system that, with the proper reforms and course corrections, is capable
of seeing us through the existential challenge of our species and
planet. Now it’s up to us to seize the means of computation, putting
that electronic nervous system under democratic, accountable control.

I am, secretly, despite what I have said earlier, a tech exceptionalist.
Not in the sense of thinking that tech should be given a free pass to
monopolize because it has “economies of scale” or some other nebulous
feature. I’m a tech exceptionalist because I believe that getting tech
right matters and that getting it wrong will be an unmitigated
catastrophe — and doing it right can give us the power to work together
to save our civilization, our species, and our planet.