1 How to Destroy Surveillance Capitalism
2 ======================================
3
4 The net of a thousand lies
5 --------------------------
6
7 The most surprising thing about the rebirth of flat Earthers in the 21st
8 century is just how widespread the evidence against them is. You can
9 understand how, centuries ago, people who’d never gained a high-enough
10 vantage point from which to see the Earth’s curvature might come to the
11 commonsense belief that the flat-seeming Earth was, indeed, flat.
12
13 But today, when elementary schools routinely dangle GoPro cameras from
14 balloons and loft them high enough to photograph the Earth’s curve — to
15 say nothing of the unexceptional sight of the curved Earth from an
16 airplane window — it takes a heroic effort to maintain the belief that
17 the world is flat.
18
19 Likewise for white nationalism and eugenics: In an age where you can
20 become a computational genomics datapoint by swabbing your cheek and
21 mailing it to a gene-sequencing company along with a modest sum of
22 money, “race science” has never been easier to refute.
23
24 We are living through a golden age of both readily available facts and
25 denial of those facts. Terrible ideas that have lingered on the fringes
26 for decades or even centuries have gone mainstream seemingly overnight.
27
28 When an obscure idea gains currency, there are only two things that can
29 explain its ascendance: Either the person expressing that idea has
30 gotten a lot better at stating their case, or the proposition has become
31 harder to deny in the face of mounting evidence. In other words, if we
32 want people to take climate change seriously, we can get a bunch of
33 Greta Thunbergs to make eloquent, passionate arguments from podiums,
34 winning our hearts and minds, or we can wait for flood, fire, broiling
35 sun, and pandemics to make the case for us. In practice, we’ll probably
36 have to do some of both: The more we’re boiling and burning and drowning
37 and wasting away, the easier it will be for the Greta Thunbergs of the
38 world to convince us.
39
40 The arguments for ridiculous beliefs in odious conspiracies like
41 anti-vaccination, climate denial, a flat Earth, and eugenics are no
42 better than they were a generation ago. Indeed, they’re worse because
43 they are being pitched to people who have at least a background
44 awareness of the refuting facts.
45
46 Anti-vax has been around since the first vaccines, but the early
47 anti-vaxxers were pitching people who were less equipped to understand
48 even the most basic ideas from microbiology, and moreover, those people
49 had not witnessed the extermination of mass-murdering diseases like
50 polio, smallpox, and measles. Today’s anti-vaxxers are no more eloquent
51 than their forebears, and they have a much harder job.
52
53 So can these far-fetched conspiracy theorists really be succeeding on
54 the basis of superior arguments?
55
56 Some people think so. Today, there is a widespread belief that machine
57 learning and commercial surveillance can turn even the most
58 fumble-tongued conspiracy theorist into a svengali who can warp your
59 perceptions and win your belief by locating vulnerable people and then
60 pitching them with A.I.-refined arguments that bypass their rational
61 faculties and turn everyday people into flat Earthers, anti-vaxxers, or
62 even Nazis. When the RAND Corporation `blames Facebook for
63 “radicalization” <https://www.rand.org/content/dam/rand/pubs/research_reports/RR400/RR453/RAND_RR453.pdf>`__
64 and when Facebook’s role in spreading coronavirus misinformation is
65 `blamed on its
66 algorithm <https://secure.avaaz.org/campaign/en/facebook_threat_health/>`__,
67 the implicit message is that machine learning and surveillance are
68 causing the changes in our consensus about what’s true.
69
70 After all, in a world where sprawling and incoherent conspiracy theories
71 like Pizzagate and its successor, QAnon, have widespread followings,
72 *something* must be afoot.
73
74 But what if there’s another explanation? What if it’s the material
75 circumstances, and not the arguments, that are making the difference for
76 these conspiracy pitchmen? What if the trauma of living through *real
77 conspiracies* all around us — conspiracies among wealthy people, their
78 lobbyists, and lawmakers to bury inconvenient facts and evidence of
79 wrongdoing (these conspiracies are commonly known as “corruption”) — is
80 making people vulnerable to conspiracy theories?
81
82 If it’s trauma and not contagion — material conditions and not ideology
83 — that is making the difference today and enabling a rise of repulsive
84 misinformation in the face of easily observed facts, that doesn’t mean
85 our computer networks are blameless. They’re still doing the heavy work
86 of locating vulnerable people and guiding them through a series of
87 ever-more-extreme ideas and communities.
88
89 Belief in conspiracy is a raging fire that has done real damage and
90 poses real danger to our planet and species, from epidemics `kicked off
91 by vaccine denial <https://www.cdc.gov/measles/cases-outbreaks.html>`__
92 to genocides `kicked off by racist
93 conspiracies <https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html>`__
94 to planetary meltdown caused by denial-inspired climate inaction. Our
95 world is on fire, and so we have to put the fires out — to figure out
96 how to help people see the truth of the world through the conspiracies
97 they’ve been confused by.
98
99 But firefighting is reactive. We need fire *prevention*. We need to
100 strike at the traumatic material conditions that make people vulnerable
101 to the contagion of conspiracy. Here, too, tech has a role to play.
102
103 There’s no shortage of proposals to address this. From the EU’s
104 `Terrorist Content Regulation <https://edri.org/tag/terreg/>`__, which
105 requires platforms to police and remove “extremist” content, to the U.S.
106 proposals to `force tech companies to spy on their
107 users <https://www.eff.org/deeplinks/2020/03/earn-it-act-violates-constitution>`__
108 and hold them liable `for their users’ bad
109 speech <https://www.natlawreview.com/article/repeal-cda-section-230>`__,
110 there’s a lot of energy to force tech companies to solve the problems
111 they created.
112
113 There’s a critical piece missing from the debate, though. All these
114 solutions assume that tech companies are a fixture, that their dominance
115 over the internet is a permanent fact. Proposals to replace Big Tech
116 with a more diffused, pluralistic internet are nowhere to be found.
117 Worse: The “solutions” on the table today *require* Big Tech to stay big
118 because only the very largest companies can afford to implement the
119 systems these laws demand.
120
121 Figuring out what we want our tech to look like is crucial if we’re
122 going to get out of this mess. Today, we’re at a crossroads where we’re
123 trying to figure out if we want to fix the Big Tech companies that
124 dominate our internet or if we want to fix the internet itself by
125 unshackling it from Big Tech’s stranglehold. We can’t do both, so we
126 have to choose.
127
128 I want us to choose wisely. Taming Big Tech is integral to fixing the
129 internet, and for that, we need digital rights activism.
130
131 Digital rights activism, a quarter-century on
132 ---------------------------------------------
133
134 Digital rights activism is more than 30 years old now. The Electronic
135 Frontier Foundation turned 30 this year; the Free Software Foundation
136 launched in 1985. For most of the history of the movement, the most
137 prominent criticism leveled against it was that it was irrelevant: The
138 real activist causes were real-world causes (think of the skepticism
139 when `Finland declared broadband a human right in
140 2010 <https://www.loc.gov/law/foreign-news/article/finland-legal-right-to-broadband-for-all-citizens/>`__),
141 and real-world activism was shoe-leather activism (think of Malcolm
142 Gladwell’s `contempt for
143 “clicktivism” <https://www.newyorker.com/magazine/2010/10/04/small-change-malcolm-gladwell>`__).
144 But as tech has grown more central to our daily lives, these accusations
145 of irrelevance have given way first to accusations of insincerity (“You
146 only care about tech because you’re `shilling for tech
147 companies <https://www.ipwatchdog.com/2018/06/04/report-engine-eff-shills-google-patent-reform/id=98007/>`__\ ”)
148 and then to accusations of negligence (“Why didn’t you foresee that tech could be
149 such a destructive force?”). But digital rights activism is right where
150 it’s always been: looking out for the humans in a world where tech is
151 inexorably taking over.
152
153 The latest version of this critique comes in the form of “surveillance
154 capitalism,” a term coined by business professor Shoshana Zuboff in her
155 long and influential 2019 book, *The Age of Surveillance Capitalism: The
156 Fight for a Human Future at the New Frontier of Power*. Zuboff argues
157 that “surveillance capitalism” is a unique creature of the tech industry
158 and that it is unlike any other abusive commercial practice in history,
159 one that is “constituted by unexpected and often illegible mechanisms of
160 extraction, commodification, and control that effectively exile persons
161 from their own behavior while producing new markets of behavioral
162 prediction and modification. Surveillance capitalism challenges
163 democratic norms and departs in key ways from the centuries-long
164 evolution of market capitalism.” It is a new and deadly form of
165 capitalism, a “rogue capitalism,” and our lack of understanding of its
166 unique capabilities and dangers represents an existential, species-wide
167 threat. She’s right that capitalism today threatens our species, and
168 she’s right that tech poses unique challenges to our species and
169 civilization, but she’s really wrong about how tech is different and why
170 it threatens our species.
171
172 What’s more, I think that her incorrect diagnosis will lead us down a
173 path that ends up making Big Tech stronger, not weaker. We need to take
174 down Big Tech, and to do that, we need to start by correctly identifying
175 the problem.
176
177 Tech exceptionalism, then and now
178 ---------------------------------
179
180 Early critics of the digital rights movement — a movement perhaps best
181 represented by campaigning organizations like the Electronic Frontier Foundation,
182 the Free Software Foundation, Public Knowledge, and others that focused
183 on preserving and enhancing basic human rights in the digital realm —
184 damned activists for practicing “tech exceptionalism.” Around the turn
185 of the millennium, serious people ridiculed any claim that tech policy
186 mattered in the “real world.” Claims that tech rules had implications
187 for speech, association, privacy, search and seizure, and fundamental
188 rights and equities were treated as ridiculous, an elevation of the
189 concerns of sad nerds arguing about *Star Trek* on bulletin board
190 systems above the struggles of the Freedom Riders, Nelson Mandela, or
191 the Warsaw ghetto uprising.
192
193 In the decades since, accusations of “tech exceptionalism” have only
194 sharpened as tech’s role in everyday life has expanded: Now that tech
195 has infiltrated every corner of our life and our online lives have been
196 monopolized by a handful of giants, defenders of digital freedoms are
197 accused of carrying water for Big Tech, providing cover for its
198 self-interested negligence (or worse, nefarious plots).
199
200 From my perspective, the digital rights movement has remained stationary
201 while the rest of the world has moved. From the earliest days, the
202 movement’s concern was users and the toolsmiths who provided the code
203 they needed to realize their fundamental rights. Digital rights
204 activists only cared about companies to the extent that companies were
205 acting to uphold users’ rights (or, just as often, when companies were
206 acting so foolishly that they threatened to bring down new rules that
207 would also make it harder for good actors to help users).
208
209 The “surveillance capitalism” critique recasts the digital rights
210 movement in a new light again: not as alarmists who overestimate the
211 importance of their shiny toys nor as shills for big tech but as serene
212 deck-chair rearrangers whose long-standing activism is a liability
213 because it makes them incapable of perceiving novel threats as they
214 continue to fight the last century’s tech battles.
215
216 But tech exceptionalism is a sin no matter who practices it.
217
218 Don’t believe the hype
219 -----------------------
220
221 You’ve probably heard that “if you’re not paying for the product, you’re
222 the product.” As we’ll see below, that’s true, if incomplete. But what
223 is *absolutely* true is that ad-driven Big Tech’s customers are
224 advertisers, and what companies like Google and Facebook sell is their
225 ability to convince *you* to buy stuff. Big Tech’s product is
226 persuasion. The services — social media, search engines, maps,
227 messaging, and more — are delivery systems for persuasion.
228
229 The fear of surveillance capitalism starts from the (correct)
230 presumption that everything Big Tech says about itself is probably a
231 lie. But the surveillance capitalism critique makes an exception for the
232 claims Big Tech makes in its sales literature — the breathless hype in
233 the pitches to potential advertisers online and in ad-tech seminars
234 about the efficacy of its products: It assumes that Big Tech is as good
235 at influencing us as it claims to be when it’s selling
236 influencing products to credulous customers. That’s a mistake because
237 sales literature is not a reliable indicator of a product’s efficacy.
238
239 Surveillance capitalism assumes that because advertisers buy a lot of
240 what Big Tech is selling, Big Tech must be selling something real. But
241 Big Tech’s massive sales could just as easily be the result of a popular
242 delusion or something even more pernicious: monopolistic control over
243 our communications and commerce.
244
245 Being watched changes your behavior, and not for the better. It creates
246 risks for our social progress. Zuboff’s book features beautifully
247 wrought explanations of these phenomena. But Zuboff also claims that
248 surveillance literally robs us of our free will — that when our personal
249 data is mixed with machine learning, it creates a system of persuasion
250 so devastating that we are helpless before it. That is, Facebook uses an
251 algorithm to analyze the data it nonconsensually extracts from your
252 daily life and uses it to customize your feed in ways that get you to
253 buy stuff. It is a mind-control ray out of a 1950s comic book, wielded
254 by mad scientists whose supercomputers guarantee them perpetual and
255 total world domination.
256
257 What is persuasion?
258 -------------------
259
260 To understand why you shouldn’t worry about mind-control rays — but why
261 you *should* worry about surveillance *and* Big Tech — we must start by
262 unpacking what we mean by “persuasion.”
263
264 Google, Facebook, and other surveillance capitalists promise their
265 customers (the advertisers) that if they use machine-learning tools
266 trained on unimaginably large data sets of nonconsensually harvested
267 personal information, they will be able to uncover ways to bypass the
268 rational faculties of the public and direct their behavior, creating a
269 stream of purchases, votes, and other desired outcomes.
270
274 But there’s little evidence that this is happening. Instead, the
275 predictions that surveillance capitalism delivers to its customers are
276 much less impressive. Rather than finding ways to bypass our rational
277 faculties, surveillance capitalists like Mark Zuckerberg mostly do one
278 or more of three things:
279
280 1. Segmenting
281 -------------
282
283 If you’re selling diapers, you have better luck if you pitch them to
284 people in maternity wards. Not everyone who enters or leaves a maternity
285 ward just had a baby, and not everyone who just had a baby is in the
286 market for diapers. But having a baby is a really reliable correlate of
287 being in the market for diapers, and being in a maternity ward is highly
288 correlated with having a baby. Hence diaper ads around maternity wards
289 (and even pitchmen for baby products, who haunt maternity wards with
290 baskets full of freebies).
291
292 Surveillance capitalism is segmenting times a billion. Diaper vendors
293 can go way beyond people in maternity wards (though they can do that,
294 too, with things like location-based mobile ads). They can target you
295 based on whether you’re reading articles about child-rearing, diapers,
296 or a host of other subjects, and data mining can suggest unobvious
297 keywords to advertise against. They can target you based on the articles
298 you’ve recently read. They can target you based on what you’ve recently
299 purchased. They can target you based on whether you receive emails or
300 private messages about these subjects — or even if you speak aloud about
301 them (though Facebook and the like convincingly claim that’s not
302 happening — yet).
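
To make the segmenting mechanics concrete, here is a minimal sketch in
Python, using entirely made-up profiles, of how a data-mining pipeline
might surface those “unobvious” keywords: score each browsing topic by
how over-represented it is among known buyers.

.. code:: python

    # Hypothetical sketch of keyword mining for ad targeting; the
    # profiles and topics are invented for illustration.
    from collections import Counter

    profiles = [
        {"topics": {"child-rearing", "minivans"}, "bought_diapers": True},
        {"topics": {"minivans", "mortgages"}, "bought_diapers": True},
        {"topics": {"guitars", "mortgages"}, "bought_diapers": False},
        {"topics": {"guitars"}, "bought_diapers": False},
    ]

    buyers = [p for p in profiles if p["bought_diapers"]]
    topic_all = Counter(t for p in profiles for t in p["topics"])
    topic_buyers = Counter(t for p in buyers for t in p["topics"])

    def lift(topic):
        # How much more common is this topic among buyers than overall?
        overall = topic_all[topic] / len(profiles)
        among_buyers = topic_buyers[topic] / len(buyers)
        return among_buyers / overall

    # Topics with lift > 1 correlate with buying, including unobvious
    # ones (like minivans here) that a human media buyer might not guess.
    for topic in sorted(topic_all, key=lift, reverse=True):
        print(f"{topic}: lift {lift(topic):.2f}")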
303
304 This is seriously creepy.
305
306 But it’s not mind control.
307
308 It doesn’t deprive you of your free will. It doesn’t trick you.
309
310 Think of how surveillance capitalism works in politics. Surveillance
311 capitalist companies sell political operatives the power to locate
312 people who might be receptive to their pitch. Candidates campaigning on
313 finance industry corruption seek people struggling with debt; candidates
314 campaigning on xenophobia seek out racists. Political operatives have
315 always targeted their message whether their intentions were honorable or
316 not: Union organizers set up pitches at factory gates, and white
317 supremacists hand out fliers at John Birch Society meetings.
318
319 But this is an inexact and thus wasteful practice. The union organizer
320 can’t know which worker to approach on the way out of the factory gates
321 and may waste their time on a covert John Birch Society member; the
322 white supremacist doesn’t know which of the Birchers are so delusional
323 that making it to a meeting is as much as they can manage and which ones
324 might be convinced to cross the country to carry a tiki torch through
325 the streets of Charlottesville, Virginia.
326
327 Because targeting improves the yields on political pitches, it can
328 accelerate the pace of political upheaval by making it possible for
329 everyone who has secretly wished for the toppling of an autocrat — or
330 just an 11-term incumbent politician — to find everyone else who feels
331 the same way at very low cost. This has been critical to the rapid
332 crystallization of recent political movements including Black Lives
333 Matter and Occupy Wall Street as well as less savory players like the
334 far-right white nationalist movements that marched in Charlottesville.
335
336 It’s important to differentiate this kind of political organizing from
337 influence campaigns; finding people who secretly agree with you isn’t
338 the same as convincing people to agree with you. The rise of phenomena
339 like nonbinary or otherwise nonconforming gender identities is often
340 characterized by reactionaries as the result of online brainwashing
341 campaigns that convince impressionable people that they have been
342 secretly queer all along.
343
344 But the personal accounts of those who have come out tell a different
345 story where people who long harbored a secret about their gender were
346 emboldened by others coming forward and where people who knew that they
347 were different but lacked a vocabulary for discussing that difference
348 learned the right words from these low-cost means of finding people and
349 learning about their ideas.
350
351 2. Deception
352 ------------
353
354 Lies and fraud are pernicious, and surveillance capitalism supercharges
355 them through targeting. If you want to sell a fraudulent payday loan or
356 subprime mortgage, surveillance capitalism can help you find people who
357 are both desperate and unsophisticated and thus receptive to your pitch.
358 This accounts for the rise of many phenomena, like multilevel marketing
359 schemes, in which deceptive claims about potential earnings and the
360 efficacy of sales techniques are targeted at desperate people by
361 advertising against search queries that indicate, for example, someone
362 struggling with ill-advised loans.
363
364 Surveillance capitalism also abets fraud by making it easy to locate
365 other people who have been similarly deceived, forming a community of
366 people who reinforce one another’s false beliefs. Think of `the
367 forums <https://www.vulture.com/2020/01/the-dream-podcast-review.html>`__
368 where people who are being victimized by multilevel marketing frauds
369 gather to trade tips on how to improve their luck in peddling the
370 product.
371
372 Sometimes, online deception involves replacing someone’s correct beliefs
373 with incorrect ones, as it does in the anti-vaccination movement, whose
374 victims are often people who start out believing in vaccines but are
375 convinced by seemingly plausible evidence that leads them into the false
376 belief that vaccines are harmful.
377
378 But it’s much more common for fraud to succeed when it doesn’t have to
379 displace a true belief. When my daughter contracted head lice at
380 daycare, one of the daycare workers told me I could get rid of them by
381 treating her hair and scalp with olive oil. I didn’t know anything about
382 head lice, and I assumed that the daycare worker did, so I tried it (it
383 didn’t work, and it doesn’t work). It’s easy to end up with false
384 beliefs when you simply don’t know any better and when those beliefs are
385 conveyed by someone who seems to know what they’re doing.
386
387 This is pernicious and difficult — and it’s also the kind of thing the
388 internet can help guard against by making true information available,
389 especially in a form that exposes the underlying deliberations among
390 parties with sharply divergent views, such as Wikipedia. But it’s not
391 brainwashing; it’s fraud. In the `majority of
392 cases <https://datasociety.net/library/data-voids/>`__, the victims of
393 these fraud campaigns have an informational void filled in the customary
394 way, by consulting a seemingly reliable source. If I look up the length
395 of the Brooklyn Bridge and learn that it is 5,800 feet long, but in
396 reality, it is 5,989 feet long, the underlying deception is a problem,
397 but it’s a problem with a simple remedy. It’s a very different problem
398 from the anti-vax issue in which someone’s true belief is displaced by a
399 false one by means of sophisticated persuasion.
400
401 3. Domination
402 -------------
403
404 Surveillance capitalism is the result of monopoly. Monopoly is the
405 cause, and surveillance capitalism and its negative outcomes are the
406 effects of monopoly. I’ll get into this in depth later, but for now,
407 suffice it to say that the tech industry has grown up with a radical
408 theory of antitrust that has allowed companies to grow by merging with
409 their rivals, buying up their nascent competitors, and expanding to
410 control whole market verticals.
411
412 One example of how monopolism aids in persuasion is through dominance:
413 Google makes editorial decisions about its algorithms that determine the
414 sort order of the responses to our queries. If a cabal of fraudsters
415 has set out to trick the world into thinking that the Brooklyn Bridge
416 is 5,800 feet long, and if Google gives a high search rank to this group
417 in response to queries like “How long is the Brooklyn Bridge?” then the
418 first eight or 10 screens’ worth of Google results could be wrong. And
419 since most people don’t go beyond the first couple of results — let
420 alone the first *page* of results — Google’s choice means that many
421 people will be deceived.
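
A back-of-the-envelope sketch shows how a single ranking choice scales
into mass deception. Every figure below is an assumption made up for
illustration, except Google’s roughly 86% search share, which the next
paragraph cites.

.. code:: python

    # Illustrative arithmetic only; the query volume and scrolling
    # behavior are assumptions, not measurements.
    daily_queries = 100_000  # hypothetical searches for the bridge's length
    google_share = 0.86      # Google's approximate share of web search
    stop_at_top = 0.90       # assumed share who never read past the top results

    misled_per_day = daily_queries * google_share * stop_at_top
    print(f"~{misled_per_day:,.0f} searchers per day see only the wrong answer")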
422
423 Google’s dominance over search — more than 86% of web searches are
424 performed through Google — means that the way it orders its search
425 results has an outsized effect on public beliefs. Ironically, Google
426 claims this is why it can’t afford to have any transparency in its
427 algorithm design: Google’s search dominance makes the results of its
428 sorting too important to risk telling the world how it arrives at those
429 results lest some bad actor discover a flaw in the ranking system and
430 exploit it to push its point of view to the top of the search results.
431 There’s an obvious remedy for a company that is too big to audit: break
432 it up into smaller pieces.
433
434 Zuboff calls surveillance capitalism a “rogue capitalism” whose
435 data-hoarding and machine-learning techniques rob us of our free will.
436 But influence campaigns that seek to displace existing, correct beliefs
437 with false ones have an effect that is small and temporary while
438 monopolistic dominance over informational systems has massive, enduring
439 effects. Controlling the results to the world’s search queries means
440 controlling access both to arguments and their rebuttals and, thus,
441 control over much of the world’s beliefs. If our concern is how
442 corporations are foreclosing on our ability to make up our own minds and
443 determine our own futures, the impact of dominance far exceeds the
444 impact of manipulation and should be central to our analysis and any
445 remedies we seek.
446
447 4. Bypassing our rational faculties
448 -----------------------------------
449
450 *This* is the good stuff: using machine learning, “dark patterns,”
451 engagement hacking, and other techniques to get us to do things that run
452 counter to our better judgment. This is mind control.
453
454 Some of these techniques have proven devastatingly effective (if only in
455 the short term). The use of countdown timers on a purchase completion
456 page can create a sense of urgency that causes you to ignore the nagging
457 internal voice suggesting that you should shop around or sleep on your
458 decision. The use of people from your social graph in ads can provide
459 “social proof” that a purchase is worth making. Even the auction system
460 pioneered by eBay is calculated to play on our cognitive blind spots,
461 letting us feel like we “own” something because we bid on it, thus
462 encouraging us to bid again when we are outbid to ensure that “our”
463 things stay ours.
464
465 Games are extraordinarily good at this. “Free to play” games manipulate
466 us through many techniques, such as presenting players with a series of
467 smoothly escalating challenges that create a sense of mastery and
468 accomplishment but which sharply transition into a set of challenges
469 that are impossible to overcome without paid upgrades. Add some social
470 proof to the mix — a stream of notifications about how well your friends
471 are faring — and before you know it, you’re buying virtual power-ups to
472 get to the next level.
473
474 Companies have risen and fallen on these techniques, and the “fallen”
475 part is worth paying attention to. In general, living things adapt to
476 stimulus: Something that is very compelling or noteworthy when you first
477 encounter it fades with repetition until you stop noticing it
478 altogether. Consider the refrigerator hum that irritates you when it
479 starts up but disappears into the background so thoroughly that you only
480 notice it when it stops again.
481
482 That’s why behavioral conditioning uses “intermittent reinforcement
483 schedules.” Instead of giving you a steady drip of encouragement or
484 setbacks, games and gamified services scatter rewards on a randomized
485 schedule — often enough to keep you interested and random enough that
486 you can never quite find the pattern that would make it boring.
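
A variable-ratio schedule is simple enough to fit in a few lines. Here
is a minimal sketch, with the one-in-eight odds chosen arbitrarily for
illustration:

.. code:: python

    # Sketch of an intermittent reinforcement schedule: each refresh
    # *might* pay off, and the unpredictability is the whole point.
    import random

    def pull_to_refresh(reward_probability=1 / 8):
        if random.random() < reward_probability:
            return "3 new likes!"    # the occasional reward...
        return "nothing new"         # ...that keeps you refreshing

    for attempt in range(10):
        print(attempt, pull_to_refresh())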
487
488 Intermittent reinforcement is a powerful behavioral tool, but it also
489 represents a collective action problem for surveillance capitalism. The
490 “engagement techniques” invented by the behaviorists of surveillance
491 capitalist companies are quickly copied across the whole sector so that
492 what starts as a mysteriously compelling fillip in the design of a
493 service — like “pull to refresh” or alerts when someone likes your posts
494 or side quests that your characters get invited to while in the midst of
495 main quests — quickly becomes dully ubiquitous. The
496 impossible-to-nail-down nonpattern of randomized drips from your phone
497 becomes a grey-noise wall of sound as every single app and site starts
498 to make use of whatever seems to be working at the time.
499
500 From the surveillance capitalist’s point of view, our adaptive capacity
501 is like a harmful bacterium that deprives it of its food source — our
502 attention — and novel techniques for snagging that attention are like
503 new antibiotics that can be used to breach our defenses and destroy our
504 self-determination. And there *are* techniques like that. Who can forget
505 the Great Zynga Epidemic, when all of our friends were caught in
506 *FarmVille*\ ’s endless, mindless dopamine loops? But every new
507 attention-commanding technique is jumped on by the whole industry and
508 used so indiscriminately that antibiotic resistance sets in. Given
509 enough repetition, almost all of us develop immunity to even the most
510 powerful techniques — by 2013, two years after Zynga’s peak, its user
511 base had halved.
512
513 Not everyone, of course. Some people never adapt to stimulus, just as
514 some people never stop hearing the hum of the refrigerator. This is why
515 most people who are exposed to slot machines play them for a while and
516 then move on while a small and tragic minority liquidate their kids’
517 college funds, buy adult diapers, and position themselves in front of a
518 machine until they collapse.
519
520 But surveillance capitalism’s margins on behavioral modification suck.
521 Tripling the rate at which someone buys a widget sounds great `unless
522 the base rate is way less than
523 1% <https://www.forbes.com/sites/priceonomics/2018/03/09/the-advertising-conversion-rates-for-every-major-tech-platform/>`__
524 with an improved rate of… still less than 1%. Even penny slot machines
525 pull down pennies for every spin while surveillance capitalism rakes in
526 infinitesimal penny fractions.
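
The arithmetic is worth spelling out. Assuming an illustrative base
rate of 0.3%, in line with the sub-1% conversion rates linked above:

.. code:: python

    # Worked example with an assumed base rate; only the "less than 1%"
    # bound comes from the cited conversion-rate data.
    base_rate = 0.003                  # 0.3% buy with untargeted ads
    improved_rate = base_rate * 3      # targeting "triples" conversions

    print(f"improved rate: {improved_rate:.1%}")   # 0.9%, still under 1%
    extra_buyers = 1_000_000 * (improved_rate - base_rate)
    print(f"extra buyers per million impressions: {extra_buyers:,.0f}")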
527
528 Slot machines’ high returns mean that they can be profitable just by
529 draining the fortunes of the small rump of people who are pathologically
530 vulnerable to them and unable to adapt to their tricks. But surveillance
531 capitalism can’t survive on the fractional pennies it brings down from
532 that vulnerable sliver — that’s why, after the Great Zynga Epidemic had
533 finally burned itself out, the small number of still-addicted players
534 left behind couldn’t sustain it as a global phenomenon. And new powerful
535 attention weapons aren’t easy to find, as is evidenced by the long years
536 since the last time Zynga had a hit. Despite the hundreds of millions of
537 dollars that Zynga has to spend on developing new tools to blast through
538 our adaptation, it has never managed to repeat the lucky accident that
539 let it snag so much of our attention for a brief moment in 2009.
540 Powerhouses like Supercell have fared a little better, but they are rare
541 and throw away many failures for every success.
542
543 The vulnerability of small segments of the population to dramatic,
544 efficient corporate manipulation is a real concern that’s worthy of our
545 attention and energy. But it’s not an existential threat to society.
546
547 If data is the new oil, then surveillance capitalism’s engine has a leak
548 -------------------------------------------------------------------------
549
550 This adaptation problem offers an explanation for one of surveillance
551 capitalism’s most alarming traits: its relentless hunger for data and
552 its endless expansion of data-gathering capabilities through the spread
553 of sensors, online surveillance, and acquisition of data streams from
554 third parties.
555
556 Zuboff observes this phenomenon and concludes that data must be very
557 valuable if surveillance capitalism is so hungry for it. (In her words:
558 “Just as industrial capitalism was driven to the continuous
559 intensification of the means of production, so surveillance capitalists
560 and their market players are now locked into the continuous
561 intensification of the means of behavioral modification and the
562 gathering might of instrumentarian power.”) But what if the voracious
563 appetite is because data has such a short half-life — because people
564 become inured so quickly to new, data-driven persuasion techniques —
565 that the companies are locked in an arms race with our limbic system?
566 What if it’s all a Red Queen’s race where they have to run ever faster —
567 collect ever-more data — just to stay in the same spot?
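
The half-life hypothesis is easy to state formally. In the sketch
below, the six-month half-life is an arbitrary assumption; the point is
the shape of the curve, not the numbers.

.. code:: python

    # Sketch of the "short half-life of data" hypothesis: a persuasion
    # technique decays as audiences adapt, so fresh data and fresh
    # techniques are needed just to stay in the same spot.
    def effectiveness(initial, months, half_life=6.0):
        return initial * 0.5 ** (months / half_life)

    for month in (0, 6, 12, 24):
        print(f"month {month:2d}: {effectiveness(1.0, month):.3f}")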
568
569 Of course, all of Big Tech’s persuasion techniques work in concert with
570 one another, and collecting data is useful beyond mere behavioral
571 trickery.
572
573 If someone wants to recruit you to buy a refrigerator or join a pogrom,
574 they might use profiling and targeting to send messages to people they
575 judge to be good sales prospects. The messages themselves may be
576 deceptive, making claims about things you’re not very knowledgeable
577 about (food safety and energy efficiency or eugenics and historical
578 claims about racial superiority). They might use search engine
579 optimization and/or armies of fake reviewers and commenters and/or paid
580 placement to dominate the discourse so that any search for further
581 information takes you back to their messages. And finally, they may
582 refine the different pitches using machine learning and other techniques
583 to figure out what kind of pitch works best on someone like you.
584
585 Each phase of this process benefits from surveillance: The more data
586 they have, the more precisely they can profile you and target you with
587 specific messages. Think of how you’d sell a fridge if you knew that the
588 warranty on your prospect’s fridge just expired and that they were
589 expecting a tax rebate in April.
590
591 Also, the more data they have, the better they can craft deceptive
592 messages — if I know that you’re into genealogy, I might not try to feed
593 you pseudoscience about genetic differences between “races,” sticking
594 instead to conspiratorial secret histories of “demographic replacement”
595 and the like.
596
597 Facebook also helps you locate people who have the same odious or
598 antisocial views as you. It makes it possible to find other people who
599 want to carry tiki torches through the streets of Charlottesville in
600 Confederate cosplay. It can help you find other people who want to join
601 your militia and go to the border to look for undocumented migrants to
602 terrorize. It can help you find people who share your belief that
603 vaccines are poison and that the Earth is flat.
604
605 There is one way in which targeted advertising uniquely benefits those
606 advocating for socially unacceptable causes: It is invisible. Racism is
607 widely geographically dispersed, and there are few places where racists
608 — and only racists — gather. This is similar to the problem of selling
609 refrigerators in that potential refrigerator purchasers are
610 geographically dispersed and there are few places where you can buy an
611 ad that will be primarily seen by refrigerator customers. But buying a
612 refrigerator is socially acceptable while being a Nazi is not, so you
613 can buy a billboard or advertise in the newspaper sports section for
614 your refrigerator business, and the only potential downside is that your
615 ad will be seen by a lot of people who don’t want refrigerators,
616 resulting in a lot of wasted expense.
617
618 But even if you wanted to advertise your Nazi movement on a billboard or
619 prime-time TV or the sports section, you would struggle to find anyone
620 willing to sell you the space for your ad partly because they disagree
621 with your views and partly because they fear censure (boycott,
622 reputational damage, etc.) from other people who disagree with your
623 views.
624
625 Targeted ads solve this problem: On the internet, every ad unit can be
626 different for every person, meaning that you can buy ads that are only
627 shown to people who appear to be Nazis and not to people who hate Nazis.
628 When there’s spillover — when someone who hates racism is shown a racist
629 recruiting ad — there is some fallout; the platform or publication might
630 get an angry public or private denunciation. But the nature of the risk
631 assumed by an online ad buyer is different from the risks to a
632 traditional publisher or billboard owner who might want to run a Nazi
633 ad.
634
635 Online ads are placed by algorithms that broker between a diverse
636 ecosystem of self-serve ad platforms that anyone can buy an ad through,
637 so the Nazi ad that slips onto your favorite online publication isn’t
638 seen as their moral failing but rather as a failure in some distant,
639 upstream ad supplier. When a publication gets a complaint about an
640 offensive ad that’s appearing in one of its units, it can take some
641 steps to block that ad, but the Nazi might buy a slightly different ad
642 from a different broker serving the same unit. And in any event,
643 internet users increasingly understand that when they see an ad, it’s
644 likely that the advertiser did not choose that publication and that the
645 publication has no idea who its advertisers are.
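
A toy model of that indirection, with invented broker names, makes the
moral buffer visible: the publisher’s page simply renders whichever bid
wins, from a party it never chose.

.. code:: python

    # Hypothetical sketch of programmatic ad placement; no real ad
    # exchange's API is depicted here.
    def fill_ad_unit(unit, bids):
        winner = max(bids, key=lambda bid: bid["price"])
        # The publisher sees the winning creative only after the fact.
        return {"unit": unit, "broker": winner["broker"], "ad": winner["creative"]}

    bids = [
        {"broker": "exchange-a", "price": 0.80, "creative": "fridge sale"},
        {"broker": "exchange-b", "price": 1.10, "creative": "recruiting ad"},
    ]
    print(fill_ad_unit("sidebar-1", bids))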
646
647 These layers of indirection between advertisers and publishers serve as
648 moral buffers: Today’s moral consensus is largely that publishers
649 shouldn’t be held responsible for the ads that appear on their pages
650 because they’re not actively choosing to put those ads there. Because of
651 this, Nazis are able to overcome significant barriers to organizing
652 their movement.
653
654 Data has a complex relationship with domination. Being able to spy on
655 your customers can alert you to their preferences for your rivals and
656 allow you to head off your rivals at the pass.
657
658 More importantly, if you can dominate the information space while also
659 gathering data, then you make other deceptive tactics stronger because
660 it’s harder to break out of the web of deceit you’re spinning.
661 Domination — that is, ultimately becoming a monopoly — and not the data
662 itself is the supercharger that makes every tactic worth pursuing
663 because monopolistic domination deprives your target of an escape route.
664
665 If you’re a Nazi who wants to ensure that your prospects primarily see
666 deceptive, confirming information when they search for more, you can
667 improve your odds by seeding the search terms they use through your
668 initial communications. You don’t need to own the top 10 results for
669 “voter suppression” if you can convince your marks to confine their
670 search terms to “voter fraud,” which throws up a very different set of
671 search results.
672
673 Surveillance capitalists are like stage mentalists who claim that their
674 extraordinary insights into human behavior let them guess the word that
675 you wrote down and folded up in your pocket but who really use shills,
676 hidden cameras, sleight of hand, and brute-force memorization to amaze
677 you.
678
679 Or perhaps they’re more like pick-up artists, the misogynistic cult that
680 promises to help awkward men have sex with women by teaching them
681 “neurolinguistic programming” phrases, body language techniques, and
682 psychological manipulation tactics like “negging” — offering unsolicited
683 negative feedback to women to lower their self-esteem and prick their
684 interest.
685
686 Some pick-up artists eventually manage to convince women to go home with
687 them, but it’s not because these men have figured out how to bypass
688 women’s critical faculties. Rather, pick-up artists’ “success” stories
689 are a mix of women who were incapable of giving consent, women who were
690 coerced, women who were intoxicated, self-destructive women, and a few
691 women who were sober and in command of their faculties but who didn’t
692 realize straightaway that they were with terrible men but rectified the
693 error as soon as they could.
694
695 Pick-up artists *believe* they have figured out a secret back door that
696 bypasses women’s critical faculties, but they haven’t. Many of the
697 tactics they deploy, like negging, became the butt of jokes (just like
698 people joke about bad ad targeting), and there’s a good chance that
699 anyone they try these tactics on will immediately recognize them and
700 dismiss the men who use them as irredeemable losers.
701
702 Pick-up artists are proof that people can believe they have developed a
703 system of mind control *even when it doesn’t work*. Pick-up artists
704 simply exploit the fact that one-in-a-million chances can come through
705 for you if you make a million attempts, and then they assume that the
706 other 999,999 times, they simply performed the technique incorrectly and
707 commit themselves to doing better next time. There’s only one group of
708 people who find pick-up artist lore reliably convincing: other would-be
709 pick-up artists whose anxiety and insecurity make them vulnerable to
710 scammers and delusional men who convince them that if they pay for
711 tutelage and follow instructions, then they will someday succeed.
712 Pick-up artists assume they fail to entice women because they are bad at
713 being pick-up artists, not because pick-up artistry is bullshit. Pick-up
714 artists are bad at selling themselves to women, but they’re much better
715 at selling themselves to men who pay to learn the secrets of pick-up
716 artistry.
717
718 Department store pioneer John Wanamaker is said to have lamented, “Half
719 the money I spend on advertising is wasted; the trouble is I don’t know
720 which half.” The fact that Wanamaker thought that only half of his
721 advertising spending was wasted is a tribute to the persuasiveness of
722 advertising executives, who are *much* better at convincing potential
723 clients to buy their services than they are at convincing the general
724 public to buy their clients’ wares.
725
726 What is Facebook?
727 -----------------
728
729 Facebook is heralded as the origin of all of our modern plagues, and
730 it’s not hard to see why. Some tech companies want to lock their users
731 in but make their money by monopolizing access to the market for apps
732 for their devices and gouging them on prices rather than by spying on
733 them (like Apple). Some companies don’t care about locking in users
734 because they’ve figured out how to spy on them no matter where they are
735 and what they’re doing and can turn that surveillance into money
736 (Google). Facebook alone among the Western tech giants has built a
737 business based on locking in its users *and* spying on them all the
738 time.
739
740 Facebook’s surveillance regime is really without parallel in the Western
741 world. Though Facebook tries to prevent itself from being visible on the
742 public web, hiding most of what goes on there from people unless they’re
743 logged into Facebook, the company has nevertheless booby-trapped the
744 entire web with surveillance tools in the form of Facebook “Like”
745 buttons that web publishers include on their sites to boost their
746 Facebook profiles. Facebook also makes various libraries and other
747 useful code snippets available to web publishers that act as
748 surveillance tendrils on the sites where they’re used, funneling
749 information about visitors to the site — newspapers, dating sites,
750 message boards — to Facebook.
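
The mechanism is mundane. When your browser loads an embedded button,
the request itself tells the platform who you are and what you are
reading. Here is a conceptual sketch, with hypothetical names; this is
not Facebook’s actual code.

.. code:: python

    # What a platform can learn from a single widget load; the request
    # structure is a simplified stand-in for a real HTTP request.
    def log_widget_request(request):
        return {
            "who": request["cookies"].get("platform_session"),  # identity cookie
            "what": request["headers"]["Referer"],    # the page being read
            "where": request["remote_addr"],          # approximate location
        }

    hit = log_widget_request({
        "cookies": {"platform_session": "user-1234"},
        "headers": {"Referer": "https://example-newspaper.com/some-article"},
        "remote_addr": "203.0.113.7",
    })
    print(hit)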
751
752 Big Tech is able to practice surveillance not just because it is tech
753 but because it is *big*.
754
755 Facebook offers similar tools to app developers, so the apps — games,
756 fart machines, business review services, apps for keeping abreast of
757 your kid’s schooling — you use will send information about your
758 activities to Facebook even if you don’t have a Facebook account and
759 even if you don’t download or use Facebook apps. On top of all that,
760 Facebook buys data from third-party brokers on shopping habits, physical
761 location, use of “loyalty” programs, financial transactions, etc., and
762 cross-references that with the dossiers it develops on activity on
763 Facebook and with apps and the public web.
764
765 Though it’s easy to integrate the web with Facebook — linking to news
766 stories and such — Facebook products are generally not available to be
767 integrated back into the web itself. You can embed a tweet in a Facebook
768 post, but if you embed a Facebook post in a tweet, you just get a link
769 back to Facebook and must log in before you can see it. Facebook has
770 used extreme technological and legal countermeasures to prevent rivals
771 from allowing their users to embed Facebook snippets in competing
772 services or to create alternative interfaces to Facebook that merge your
773 Facebook inbox with those of other services that you use.
774
775 And Facebook is incredibly popular, with 2.3 billion claimed users
776 (though many believe this figure to be inflated). Facebook has been used
777 to organize genocidal pogroms, racist riots, anti-vaccination movements,
778 flat Earth cults, and the political lives of some of the world’s
779 ugliest, most brutal autocrats. There are some really alarming things
780 going on in the world, and Facebook is implicated in many of them, so
781 it’s easy to conclude that these bad things are the result of Facebook’s
782 mind-control system, which it rents out to anyone with a few bucks to
783 spend.
784
785 To understand what role Facebook plays in the formulation and
786 mobilization of antisocial movements, we need to understand the dual
787 nature of Facebook.
788
789 Because it has a lot of users and a lot of data about those users,
790 Facebook is a very efficient tool for locating people with hard-to-find
791 traits, the kinds of traits that are widely diffused in the population
792 such that advertisers have historically struggled to find a
793 cost-effective way to reach them. Think back to refrigerators: Most of
794 us only replace our major appliances a few times in our entire lives. If
795 you’re a refrigerator manufacturer or retailer, you have these brief
796 windows in the life of a consumer during which they are pondering a
797 purchase, and you have to somehow reach them. Anyone who’s ever
798 registered a title change after buying a house can attest that appliance
799 manufacturers are incredibly desperate to reach anyone who has even the
800 slenderest chance of being in the market for a new fridge.
801
802 Facebook makes finding people shopping for refrigerators a *lot* easier.
803 It can target ads to people who’ve registered a new home purchase, to
804 people who’ve searched for refrigerator buying advice, to people who
805 have complained about their fridge dying, or any combination thereof. It
806 can even target people who’ve recently bought *other* kitchen appliances
807 on the theory that someone who’s just replaced their stove and
808 dishwasher might be in a fridge-buying kind of mood. The vast majority
809 of people who are reached by these ads will not be in the market for a
810 new fridge, but — crucially — the percentage of people who *are* looking
811 for fridges that these ads reach is *much* larger than for
812 any group that might be subjected to traditional, offline targeted
813 refrigerator marketing.
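
In a self-serve ad platform, that kind of targeting amounts to set
operations over behavioral signals. A minimal sketch with invented
users and signals:

.. code:: python

    # Hypothetical audience segments; "any combination thereof" is just
    # set algebra over surveillance-derived signals.
    audiences = {
        "registered_new_home": {"ann", "bob", "carol"},
        "read_fridge_advice": {"bob", "dave"},
        "complained_fridge_dying": {"carol", "erin"},
        "bought_other_appliance": {"carol", "frank"},
    }

    target = (audiences["registered_new_home"]
              | audiences["read_fridge_advice"]
              | audiences["complained_fridge_dying"]
              | audiences["bought_other_appliance"])
    print(sorted(target))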
814
815 Facebook also makes it a lot easier to find people who have the same
816 rare disease as you, which might have been impossible in earlier eras —
817 the closest fellow sufferer might otherwise be hundreds of miles away.
818 It makes it easier to find people who went to the same high school as
819 you even though decades have passed and your former classmates have all
820 been scattered to the four corners of the Earth.
821
822 Facebook also makes it much easier to find people who hold the same rare
823 political beliefs as you. If you’ve always harbored a secret affinity
824 for socialism but never dared utter this aloud lest you be demonized by
825 your neighbors, Facebook can help you discover other people who feel the
826 same way (and it might just demonstrate to you that your affinity is
827 more widespread than you ever suspected). It can make it easier to find
828 people who share your sexual identity. And again, it can help you to
829 understand that what you thought was a shameful secret that affected
830 only you was really a widely shared trait, giving you both comfort and
831 the courage to come out to the people in your life.
832
833 All of this presents a dilemma for Facebook: Targeting makes the
834 company’s ads more effective than traditional ads, but it also lets
835 advertisers see just how effective their ads are. While advertisers are
836 pleased to learn that Facebook ads are more effective than ads on
837 systems with less sophisticated targeting, advertisers can also see that
838 in nearly every case, the people who see their ads ignore them. Or, at
839 best, the ads work on a subconscious level, creating nebulous
840 unmeasurables like “brand recognition.” This means that the price per ad
841 is very low in nearly every case.
842
843 To make things worse, many Facebook groups spark precious little
844 discussion. Your little-league soccer team, the people with the same
845 rare disease as you, and the people you share a political affinity with
846 may exchange the odd flurry of messages at critical junctures, but on a
847 daily basis, there’s not much to say to your old high school chums or
848 other hockey-card collectors.
849
850 With nothing but “organic” discussion, Facebook would not generate
851 enough traffic to sell enough ads to make the money it needs to
852 continually expand by buying up its competitors while returning handsome
853 sums to its investors.
854
855 So Facebook has to gin up traffic by sidetracking its own forums: Every
856 time Facebook’s algorithm injects controversial materials — inflammatory
857 political articles, conspiracy theories, outrage stories — into a group,
858 it can hijack that group’s nominal purpose with its desultory
859 discussions and supercharge those discussions by turning them into
860 bitter, unproductive arguments that drag on and on. Facebook is
861 optimized for engagement, not happiness, and it turns out that automated
862 systems are pretty good at figuring out things that people will get
863 angry about.
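
Nothing exotic is needed to build such a system. A feed ranker that
sorts by predicted interaction will surface whatever makes people
react, and anger reacts reliably. A sketch with hypothetical engagement
scores:

.. code:: python

    # Toy feed ranker; the probabilities stand in for an engagement
    # model's predictions and are invented for illustration.
    posts = [
        {"text": "soccer practice moved to 6 p.m.", "p_engage": 0.02},
        {"text": "inflammatory political article", "p_engage": 0.31},
        {"text": "conspiracy theory du jour", "p_engage": 0.24},
    ]

    for post in sorted(posts, key=lambda p: p["p_engage"], reverse=True):
        print(f'{post["p_engage"]:.2f}  {post["text"]}')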
864
865 Facebook *can* modify our behavior but only in a couple of trivial ways.
866 First, it can lock in all your friends and family members so that you
867 check and check and check with Facebook to find out what they are up to;
868 and second, it can make you angry and anxious. It can force you to
869 choose between being interrupted constantly by updates — a process that
870 breaks your concentration and makes it hard to be introspective — and
871 staying in touch with your friends. This is a very limited form of mind
872 control, and it can only really make us miserable, angry, and anxious.
873
874 This is why Facebook’s targeting systems — both the ones it shows to
875 advertisers and the ones that let users find people who share their
876 interests — are so next-gen and smooth and easy to use as well as why
877 its message boards have a toolset that seems like it hasn’t changed
878 since the mid-2000s. If Facebook delivered an equally flexible,
879 sophisticated message-reading system to its users, those users could
880 defend themselves against being nonconsensually eyeball-fucked with
881 Donald Trump headlines.
882
883 The more time you spend on Facebook, the more ads it gets to show you.
884 The solution to Facebook’s ads only working one in a thousand times is
885 for the company to try to increase how much time you spend on Facebook
886 by a factor of a thousand. Rather than thinking of Facebook as a company
887 that has figured out how to show you exactly the right ad in exactly the
888 right way to get you to do what its advertisers want, think of it as a
889 company that has figured out how to make you slog through an endless
890 torrent of arguments even though they make you miserable, spending so
891 much time on the site that it eventually shows you at least one ad that
892 you respond to.
893
894 Monopoly and the right to the future tense
895 ------------------------------------------
896
897 Zuboff and her cohort are particularly alarmed at the extent to which
898 surveillance allows corporations to influence our decisions, taking away
899 something she poetically calls “the right to the future tense” — that
900 is, the right to decide for yourself what you will do in the future.
901
902 It’s true that advertising can tip the scales one way or another: When
903 you’re thinking of buying a fridge, a timely fridge ad might end the
904 search on the spot. But Zuboff puts enormous and undue weight on the
905 persuasive power of surveillance-based influence techniques. Most of
906 these don’t work very well, and the ones that do won’t work for very
907 long. The makers of these influence tools are confident they will
908 someday refine them into systems of total control, but they are hardly
909 unbiased observers, and the risks from their dreams coming true are very
910 speculative.
911
912 By contrast, Zuboff is rather sanguine about 40 years of lax antitrust
913 practice that has allowed a handful of companies to dominate the
914 internet, ushering in an information age with, `as one person on Twitter
915 noted <https://twitter.com/tveastman/status/1069674780826071040>`__,
916 five giant websites each filled with screenshots of the other four.
917
918 However, if we are to be alarmed that we might lose the right to choose
919 for ourselves what our future will hold, then monopoly’s nonspeculative,
920 concrete, here-and-now harms should be front and center in our debate
921 over tech policy.
922
923 Start with “digital rights management.” In 1998, Bill Clinton signed the
924 Digital Millennium Copyright Act (DMCA) into law. It’s a complex piece
925 of legislation with many controversial clauses but none more so than
926 Section 1201, the “anti-circumvention” rule.
927
928 This is a blanket ban on tampering with systems that restrict access to
929 copyrighted works. The ban is so thoroughgoing that it prohibits
930 removing a copyright lock even when no copyright infringement takes
931 place. This is by design: The activities that the DMCA’s Section 1201
932 sets out to ban are not copyright infringements; rather, they are legal
933 activities that frustrate manufacturers’ commercial plans.
934
935 For example, Section 1201’s first major application was on DVD players
936 as a means of enforcing the region coding built into those devices.
937 DVD-CCA, the body that standardized DVDs and DVD players, divided the
938 world into six regions and specified that DVD players must check each
939 disc to determine which regions it was authorized to be played in. DVD
940 players would have their own corresponding region (a DVD player bought
941 in the U.S. would be region 1 while one bought in India would be region
942 5). If the player and the disc’s region matched, the player would play
943 the disc; otherwise, it would reject it.
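
The check itself is trivial, which is the point: it enforces market
segmentation, not copyright. A sketch of the logic described above:

.. code:: python

    # Region coding in miniature; real players enforce this in firmware,
    # but this sketch only mirrors the logic described above.
    PLAYER_REGION = 1                 # a player bought in the U.S.

    def play(disc_regions):
        if PLAYER_REGION in disc_regions:
            return "playing"
        return "rejected: wrong region"   # a lawful disc, refused anyway

    print(play({1, 4}))   # U.S. pressing: plays
    print(play({5}))      # Indian pressing: rejected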
944
945 However, watching a lawfully produced disc in a country other than the
946 one where you purchased it is not copyright infringement — it’s the
947 opposite. Copyright law imposes this duty on customers for a movie: You
948 must go into a store, find a licensed disc, and pay the asking price. Do
949 that — and *nothing else* — and you and copyright are square with one
950 another.
951
952 The fact that a movie studio wants to charge Indians less than Americans
953 or release in Australia later than it releases in the U.K. has no
954 bearing on copyright law. Once you lawfully acquire a DVD, it is no
955 copyright infringement to watch it no matter where you happen to be.
956
So DVD-CCA and the studios would not be able to use accusations of
abetting copyright infringement to punish manufacturers who made
noncompliant players that would play discs from any region, repair
shops that modified players to let you watch out-of-region discs, or
software programmers who created programs to let you do this.
962
963 That’s where Section 1201 of the DMCA comes in: By banning tampering
964 with an “access control,” the rule gave manufacturers and rights holders
965 standing to sue competitors who released superior products with lawful
966 features that the market demanded (in this case, region-free players).
967
This is an odious scam against consumers, but as time went by, Section
1201 grew to encompass a rapidly expanding constellation of devices and
services as canny manufacturers realized certain things:
971
972 - Any device with software in it contains a “copyrighted work” — i.e.,
973 the software.
974 - A device can be designed so that reconfiguring the software requires
975 bypassing an “access control for copyrighted works,” which is a
976 potential felony under Section 1201.
977 - Thus, companies can control their customers’ behavior after they take
978 home their purchases by designing products so that all unpermitted
979 uses require modifications that fall afoul of Section 1201.
980
981 Section 1201 then becomes a means for manufacturers of all descriptions
982 to force their customers to arrange their affairs to benefit the
983 manufacturers’ shareholders instead of themselves.
984
This manifests in many ways: from a new generation of inkjet printers
that use countermeasures, which cannot be bypassed without legal risk,
to block third-party ink, to similar systems in tractors that prevent
third-party technicians from swapping in even the manufacturer’s own
parts, which the tractor’s control system refuses to recognize until it
is supplied with a manufacturer’s unlock code.
991
992 Closer to home, Apple’s iPhones use these measures to prevent both
third-party service and third-party software installation. This allows
Apple, rather than the iPhone’s purchaser, to decide when an iPhone is
beyond repair and must be shredded and landfilled. (Apple is notorious for
996 its environmentally catastrophic policy of destroying old electronics
997 rather than permitting them to be cannibalized for parts.) This is a
998 very useful power to wield, especially in light of CEO Tim Cook’s
999 January 2019 warning to investors that the company’s profits are
1000 endangered by customers choosing to hold onto their phones for longer
1001 rather than replacing them.
1002
1003 Apple’s use of copyright locks also allows it to establish a monopoly
1004 over how its customers acquire software for their mobile devices. The
1005 App Store’s commercial terms guarantee Apple a share of all revenues
1006 generated by the apps sold there, meaning that Apple gets paid when you
1007 buy an app from its store and then continues to get paid every time you
1008 buy something using that app. This comes out of the bottom line of
1009 software developers, who must either charge more or accept lower profits
1010 for their products.
1011
1012 Crucially, Apple’s use of copyright locks gives it the power to make
1013 editorial decisions about which apps you may and may not install on your
1014 own device. Apple has used this power to `reject
1015 dictionaries <https://www.telegraph.co.uk/technology/apple/5982243/Apple-bans-dictionary-from-App-Store-over-swear-words.html>`__
1016 for containing obscene words; to `limit political
1017 speech <https://www.vice.com/en_us/article/538kan/apple-just-banned-the-app-that-tracks-us-drone-strikes-again>`__,
1018 especially from apps that make sensitive political commentary such as an
1019 app that notifies you every time a U.S. drone kills someone somewhere in
1020 the world; and to `object to a
1021 game <https://www.eurogamer.net/articles/2016-05-19-palestinian-indie-game-must-not-be-called-a-game-apple-says>`__
1022 that commented on the Israel-Palestine conflict.
1023
1024 Apple often justifies monopoly power over software installation in the
1025 name of security, arguing that its vetting of apps for its store means
1026 that it can guard its users against apps that contain surveillance code.
1027 But this cuts both ways. In China, the government `ordered Apple to
1028 prohibit the sale of privacy
1029 tools <https://www.ft.com/content/ad42e536-cf36-11e7-b781-794ce08b24dc>`__
1030 like VPNs with the exception of VPNs that had deliberately introduced
1031 flaws designed to let the Chinese state eavesdrop on users. Because
1032 Apple uses technological countermeasures — with legal backstops — to
1033 block customers from installing unauthorized apps, Chinese iPhone owners
1034 cannot readily (or legally) acquire VPNs that would protect them from
1035 Chinese state snooping.
1036
1037 Zuboff calls surveillance capitalism a “rogue capitalism.” Theoreticians
1038 of capitalism claim that its virtue is that it `aggregates information
1039 in the form of consumers’
1040 decisions <https://en.wikipedia.org/wiki/Price_signal>`__, producing
1041 efficient markets. Surveillance capitalism’s supposed power to rob its
1042 victims of their free will through computationally supercharged
1043 influence campaigns means that our markets no longer aggregate
1044 customers’ decisions because we customers no longer decide — we are
1045 given orders by surveillance capitalism’s mind-control rays.
1046
1047 If our concern is that markets cease to function when consumers can no
1048 longer make choices, then copyright locks should concern us at *least*
1049 as much as influence campaigns. An influence campaign might nudge you to
1050 buy a certain brand of phone; but the copyright locks on that phone
1051 absolutely determine where you get it serviced, which apps can run on
1052 it, and when you have to throw it away rather than fixing it.
1053
1054 Search order and the right to the future tense
1055 ----------------------------------------------
1056
1057 Markets are posed as a kind of magic: By discovering otherwise hidden
1058 information conveyed by the free choices of consumers, those consumers’
1059 local knowledge is integrated into a self-correcting system that makes
efficient allocations — more efficient than any computer could calculate.
1061 But monopolies are incompatible with that notion. When you only have one
1062 app store, the owner of the store — not the consumer — decides on the
1063 range of choices. As Boss Tweed once said, “I don’t care who does the
1064 electing, so long as I get to do the nominating.” A monopolized market
1065 is an election whose candidates are chosen by the monopolist.
1066
1067 This ballot rigging is made more pernicious by the existence of
1068 monopolies over search order. Google’s search market share is about 90%.
1069 When Google’s ranking algorithm puts a result for a popular search term
1070 in its top 10, that helps determine the behavior of millions of people.
1071 If Google’s answer to “Are vaccines dangerous?” is a page that rebuts
1072 anti-vax conspiracy theories, then a sizable portion of the public will
1073 learn that vaccines are safe. If, on the other hand, Google sends those
1074 people to a site affirming the anti-vax conspiracies, a sizable portion
1075 of those millions will come away convinced that vaccines are dangerous.
1076
1077 Google’s algorithm is often tricked into serving disinformation as a
1078 prominent search result. But in these cases, Google isn’t persuading
1079 people to change their minds; it’s just presenting something untrue as
1080 fact when the user has no cause to doubt it.
1081
1082 This is true whether the search is for “Are vaccines dangerous?” or
1083 “best restaurants near me.” Most users will never look past the first
1084 page of search results, and when the overwhelming majority of people all
1085 use the same search engine, the ranking algorithm deployed by that
1086 search engine will determine myriad outcomes (whether to adopt a child,
1087 whether to have cancer surgery, where to eat dinner, where to move,
1088 where to apply for a job) to a degree that vastly outstrips any
1089 behavioral outcomes dictated by algorithmic persuasion techniques.
1090
1091 Many of the questions we ask search engines have no empirically correct
1092 answers: “Where should I eat dinner?” is not an objective question. Even
1093 questions that do have correct answers (“Are vaccines dangerous?”) don’t
1094 have one empirically superior source for that answer. Many pages affirm
1095 the safety of vaccines, so which one goes first? Under conditions of
1096 competition, consumers can choose from many search engines and stick
1097 with the one whose algorithmic judgment suits them best, but under
1098 conditions of monopoly, we all get our answers from the same place.
1099
1100 Google’s search dominance isn’t a matter of pure merit: The company has
1101 leveraged many tactics that would have been prohibited under classical,
1102 pre-Ronald-Reagan antitrust enforcement standards to attain its
1103 dominance. After all, this is a company that has developed two major
1104 products: a really good search engine and a pretty good Hotmail clone.
1105 Every other major success it’s had — Android, YouTube, Google Maps, etc.
1106 — has come through an acquisition of a nascent competitor. Many of the
1107 company’s key divisions, such as the advertising technology of
1108 DoubleClick, violate the historical antitrust principle of structural
1109 separation, which forbade firms from owning subsidiaries that competed
1110 with their customers. Railroads, for example, were barred from owning
1111 freight companies that competed with the shippers whose freight they
1112 carried.
1113
1114 If we’re worried about giant companies subverting markets by stripping
1115 consumers of their ability to make free choices, then vigorous antitrust
1116 enforcement seems like an excellent remedy. If we’d denied Google the
1117 right to effect its many mergers, we would also have probably denied it
1118 its total search dominance. Without that dominance, the pet theories,
1119 biases, errors (and good judgment, too) of Google search engineers and
1120 product managers would not have such an outsized effect on consumer
1121 choice.
1122
1123 This goes for many other companies. Amazon, a classic surveillance
1124 capitalist, is obviously the dominant tool for searching Amazon — though
1125 many people find their way to Amazon through Google searches and
1126 Facebook posts — and obviously, Amazon controls Amazon search. That
means that Amazon’s own self-serving editorial choices — like promoting
its own house brands over rival goods from its sellers as well as its
own pet theories, biases, and errors — determine much of what we buy on
1130 Amazon. And since Amazon is the dominant e-commerce retailer outside of
1131 China and since it attained that dominance by buying up both large
1132 rivals and nascent competitors in defiance of historical antitrust
1133 rules, we can blame the monopoly for stripping consumers of their right
1134 to the future tense and the ability to shape markets by making informed
1135 choices.
1136
1137 Not every monopolist is a surveillance capitalist, but that doesn’t mean
1138 they’re not able to shape consumer choices in wide-ranging ways. Zuboff
1139 lauds Apple for its App Store and iTunes Store, insisting that adding
1140 price tags to the features on its platforms has been the secret to
1141 resisting surveillance and thus creating markets. But Apple is the only
1142 retailer allowed to sell on its platforms, and it’s the second-largest
1143 mobile device vendor in the world. The independent software vendors that
1144 sell through Apple’s marketplace accuse the company of the same
1145 surveillance sins as Amazon and other big retailers: spying on its
1146 customers to find lucrative new products to launch, effectively using
1147 independent software vendors as free-market researchers, then forcing
1148 them out of any markets they discover.
1149
Because of its use of copyright locks, Apple’s mobile customers are not
legally allowed to switch to a rival retailer for their apps if they
want to stay on an iPhone. Apple, obviously, is the only entity that gets to
1153 decide how it ranks the results of search queries in its stores. These
1154 decisions ensure that some apps are often installed (because they appear
1155 on page one) and others are never installed (because they appear on page
1156 one million). Apple’s search-ranking design decisions have a vastly more
1157 significant effect on consumer behaviors than influence campaigns
1158 delivered by surveillance capitalism’s ad-serving bots.
1159
1160 Monopolists can afford sleeping pills for watchdogs
1161 ---------------------------------------------------
1162
1163 Only the most extreme market ideologues think that markets can
1164 self-regulate without state oversight. Markets need watchdogs —
1165 regulators, lawmakers, and other elements of democratic control — to
keep them honest. When these watchdogs sleep on the job, markets
1167 cease to aggregate consumer choices because those choices are
1168 constrained by illegitimate and deceptive activities that companies are
1169 able to get away with because no one is holding them to account.
1170
1171 But this kind of regulatory capture doesn’t come cheap. In competitive
1172 sectors, where rivals are constantly eroding one another’s margins,
1173 individual firms lack the surplus capital to effectively lobby for laws
1174 and regulations that serve their ends.
1175
1176 Many of the harms of surveillance capitalism are the result of weak or
1177 nonexistent regulation. Those regulatory vacuums spring from the power
1178 of monopolists to resist stronger regulation and to tailor what
1179 regulation exists to permit their existing businesses.
1180
1181 Here’s an example: When firms over-collect and over-retain our data,
1182 they are at increased risk of suffering a breach — you can’t leak data
1183 you never collected, and once you delete all copies of that data, you
1184 can no longer leak it. For more than a decade, we’ve lived through an
1185 endless parade of ever-worsening data breaches, each one uniquely
1186 horrible in the scale of data breached and the sensitivity of that data.
1187
1188 But still, firms continue to over-collect and over-retain our data for
1189 three reasons:
1190
1191 **1. They are locked in the aforementioned limbic arms race with our
1192 capacity to shore up our attentional defense systems to resist their new
1193 persuasion techniques.** They’re also locked in an arms race with their
1194 competitors to find new ways to target people for sales pitches. As soon
1195 as they discover a soft spot in our attentional defenses (a
1196 counterintuitive, unobvious way to target potential refrigerator
1197 buyers), the public begins to wise up to the tactic, and their
competitors leap on it, hastening the day when all potential
1199 refrigerator buyers have been inured to the pitch.
1200
1201 **2. They believe the surveillance capitalism story.** Data is cheap to
1202 aggregate and store, and both proponents and opponents of surveillance
1203 capitalism have assured managers and product designers that if you
1204 collect enough data, you will be able to perform sorcerous acts of mind
1205 control, thus supercharging your sales. Even if you never figure out how
1206 to profit from the data, someone else will eventually offer to buy it
1207 from you to give it a try. This is the hallmark of all economic bubbles:
1208 acquiring an asset on the assumption that someone else will buy it from
1209 you for more than you paid for it, often to sell to someone else at an
1210 even greater price.
1211
1212 **3. The penalties for leaking data are negligible.** Most countries
1213 limit these penalties to actual damages, meaning that consumers who’ve
had their data breached have to show actual monetary harms to recover
anything. In 2014, Home Depot disclosed that it had lost credit-card data
1216 for 53 million of its customers, but it settled the matter by paying
1217 those customers about $0.34 each — and a third of that $0.34 wasn’t even
1218 paid in cash. It took the form of a credit to procure a largely
1219 ineffectual credit-monitoring service.
1220
1221 But the harms from breaches are much more extensive than these
1222 actual-damages rules capture. Identity thieves and fraudsters are wily
1223 and endlessly inventive. All the vast breaches of our century are being
1224 continuously recombined, the data sets merged and mined for new ways to
1225 victimize the people whose data was present in them. Any reasonable,
1226 evidence-based theory of deterrence and compensation for breaches would
1227 not confine damages to actual damages but rather would allow users to
1228 claim these future harms.
1229
1230 However, even the most ambitious privacy rules, such as the EU General
1231 Data Protection Regulation, fall far short of capturing the negative
1232 externalities of the platforms’ negligent over-collection and
1233 over-retention, and what penalties they do provide are not aggressively
1234 pursued by regulators.
1235
1236 This tolerance of — or indifference to — data over-collection and
1237 over-retention can be ascribed in part to the sheer lobbying muscle of
1238 the platforms. They are so profitable that they can handily afford to
1239 divert gigantic sums to fight any real change — that is, change that
1240 would force them to internalize the costs of their surveillance
1241 activities.
1242
1243 And then there’s state surveillance, which the surveillance capitalism
1244 story dismisses as a relic of another era when the big worry was being
1245 jailed for your dissident speech, not having your free will stripped
1246 away with machine learning.
1247
1248 But state surveillance and private surveillance are intimately related.
1249 As we saw when Apple was conscripted by the Chinese government as a
1250 vital collaborator in state surveillance, the only really affordable and
1251 tractable way to conduct mass surveillance on the scale practiced by
1252 modern states — both “free” and autocratic states — is to suborn
1253 commercial services.
1254
1255 Whether it’s Google being used as a location tracking tool by local law
1256 enforcement across the U.S. or the use of social media tracking by the
1257 Department of Homeland Security to build dossiers on participants in
1258 protests against Immigration and Customs Enforcement’s family separation
1259 practices, any hard limits on surveillance capitalism would hamstring
1260 the state’s own surveillance capability. Without Palantir, Amazon,
1261 Google, and other major tech contractors, U.S. cops would not be able to
1262 spy on Black people, ICE would not be able to manage the caging of
1263 children at the U.S. border, and state welfare systems would not be able
1264 to purge their rolls by dressing up cruelty as empiricism and claiming
1265 that poor and vulnerable people are ineligible for assistance. At least
1266 some of the states’ unwillingness to take meaningful action to curb
1267 surveillance should be attributed to this symbiotic relationship. There
1268 is no mass state surveillance without mass commercial surveillance.
1269
1270 Monopolism is key to the project of mass state surveillance. It’s true
1271 that smaller tech firms are apt to be less well-defended than Big Tech,
1272 whose security experts are drawn from the tops of their field and who
1273 are given enormous resources to secure and monitor their systems against
intruders. But smaller firms also have less to protect: fewer users,
whose data is fragmented across more systems that each have to be
suborned one at a time by state actors.
1277
1278 A concentrated tech sector that works with authorities is a much more
1279 powerful ally in the project of mass state surveillance than a
1280 fragmented one composed of smaller actors. The U.S. tech sector is small
1281 enough that all of its top executives fit around a single boardroom
table in Trump Tower in December 2016, shortly after Trump’s election. Most
1283 of its biggest players bid to win JEDI, the Pentagon’s $10 billion Joint
1284 Enterprise Defense Infrastructure cloud contract. Like other highly
1285 concentrated industries, Big Tech rotates its key employees in and out
1286 of government service, sending them to serve in the Department of
Defense and the White House, then hiring ex-Pentagon and ex-White House top
1288 staffers and officers to work in their own government relations
1289 departments.
1290
1291 They can even make a good case for doing this: After all, when there are
1292 only four or five big companies in an industry, everyone qualified to
1293 regulate those companies has served as an executive in at least a couple
1294 of them — because, likewise, when there are only five companies in an
1295 industry, everyone qualified for a senior role at any of them is by
1296 definition working at one of the other ones.
1297
1298 While surveillance doesn’t cause monopolies, monopolies certainly
1299 abet surveillance.
1300
1301 Industries that are competitive are fragmented — composed of companies
1302 that are at each other’s throats all the time and eroding one another’s
1303 margins in bids to steal their best customers. This leaves them with
1304 much more limited capital to use to lobby for favorable rules and a much
1305 harder job of getting everyone to agree to pool their resources to
1306 benefit the industry as a whole.
1307
1308 Surveillance combined with machine learning is supposed to be an
1309 existential crisis, a species-defining moment at which our free will is
1310 just a few more advances in the field from being stripped away. I am
1311 skeptical of this claim, but I *do* think that tech poses an existential
1312 threat to our society and possibly our species.
1313
1314 But that threat grows out of monopoly.
1315
1316 One of the consequences of tech’s regulatory capture is that it can
1317 shift liability for poor security decisions onto its customers and the
1318 wider society. It is absolutely normal in tech for companies to
1319 obfuscate the workings of their products, to make them deliberately hard
1320 to understand, and to threaten security researchers who seek to
1321 independently audit those products.
1322
1323 IT is the only field in which this is practiced: No one builds a bridge
1324 or a hospital and keeps the composition of the steel or the equations
1325 used to calculate load stresses a secret. It is a frankly bizarre
1326 practice that leads, time and again, to grotesque security defects on
1327 farcical scales, with whole classes of devices being revealed as
1328 vulnerable long after they are deployed in the field and put into
1329 sensitive places.
1330
1331 The monopoly power that keeps any meaningful consequences for breaches
1332 at bay means that tech companies continue to build terrible products
1333 that are insecure by design and that end up integrated into our lives,
1334 in possession of our data, and connected to our physical world. For
1335 years, Boeing has struggled with the aftermath of a series of bad
1336 technology decisions that made its 737 fleet a global pariah, a rare
1337 instance in which bad tech decisions have been seriously punished in the
1338 market.
1339
1340 These bad security decisions are compounded yet again by the use of
1341 copyright locks to enforce business-model decisions against consumers.
1342 Recall that these locks have become the go-to means for shaping consumer
1343 behavior, making it technically impossible to use third-party ink,
1344 insulin, apps, or service depots in connection with your lawfully
1345 acquired property.
1346
1347 Recall also that these copyright locks are backstopped by legislation
1348 (such as Section 1201 of the DMCA or Article 6 of the 2001 EU Copyright
Directive) that bans tampering with (“circumventing”) them, and these
1350 statutes have been used to threaten security researchers who make
1351 disclosures about vulnerabilities without permission from manufacturers.
1352
1353 This amounts to a manufacturer’s veto over safety warnings and
1354 criticism. While this is far from the legislative intent of the DMCA and
1355 its sister statutes around the world, Congress has not intervened to
clarify the statute, nor will it, because doing so would run counter to
1357 the interests of powerful, large firms whose lobbying muscle is
1358 unstoppable.
1359
1360 Copyright locks are a double whammy: They create bad security decisions
1361 that can’t be freely investigated or discussed. If markets are supposed
1362 to be machines for aggregating information (and if surveillance
1363 capitalism’s notional mind-control rays are what make it a “rogue
1364 capitalism” because it denies consumers the power to make decisions),
1365 then a program of legally enforced ignorance of the risks of products
1366 makes monopolism even more of a “rogue capitalism” than surveillance
1367 capitalism’s influence campaigns.
1368
1369 And unlike mind-control rays, enforced silence over security is an
1370 immediate, documented problem, and it *does* constitute an existential
1371 threat to our civilization and possibly our species. The proliferation
1372 of insecure devices — especially devices that spy on us and especially
when those devices can also manipulate the physical world by, say,
1374 steering your car or flipping a breaker at a power station — is a kind
1375 of technology debt.
1376
1377 In software design, “technology debt” refers to old, baked-in decisions
1378 that turn out to be bad ones in hindsight. Perhaps a long-ago developer
1379 decided to incorporate a networking protocol made by a vendor that has
1380 since stopped supporting it. But everything in the product still relies
1381 on that superannuated protocol, and so, with each revision, the product
1382 team has to work around this obsolete core, adding compatibility layers,
1383 surrounding it with security checks that try to shore up its defenses,
1384 and so on. These Band-Aid measures compound the debt because every
1385 subsequent revision has to make allowances for *them*, too, like
1386 interest mounting on a predatory subprime loan. And like a subprime
1387 loan, the interest mounts faster than you can hope to pay it off: The
1388 product team has to put so much energy into maintaining this complex,
1389 brittle system that they don’t have any time left over to refactor the
1390 product from the ground up and “pay off the debt” once and for all.
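
To make the compounding pattern concrete, here is a purely hypothetical
sketch of how those Band-Aid layers accrete; none of the classes or
quirks below refer to a real protocol or vendor:

.. code:: python

    # Hypothetical illustration of compounding technology debt: an
    # abandoned vendor protocol buried under layers that every later
    # revision must preserve.

    class LegacyVendorProtocol:
        """The superannuated core that everything still relies on."""
        def send(self, payload: bytes) -> None:
            pass  # speaks a wire format its vendor stopped supporting

    class EncodingShim:
        """First Band-Aid: translate modern text into the old byte format."""
        def __init__(self, core: LegacyVendorProtocol) -> None:
            self.core = core
        def send(self, message: str) -> None:
            self.core.send(message.encode("latin-1"))  # quirk the core expects

    class SecurityWrapper:
        """Second Band-Aid: bolt checks on the outside; the core has none."""
        def __init__(self, shim: EncodingShim) -> None:
            self.shim = shim
        def send(self, message: str) -> None:
            if len(message) > 1024:  # the old core misbehaves past 1 KB
                raise ValueError("message too long for legacy core")
            self.shim.send(message)

    # Every new revision must now honor the quirks of the core *and*
    # both wrappers: the interest on the debt compounds.
    transport = SecurityWrapper(EncodingShim(LegacyVendorProtocol()))
    transport.send("hello")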
1391
1392 Typically, technology debt results in a technological bankruptcy: The
1393 product gets so brittle and unsustainable that it fails
1394 catastrophically. Think of the antiquated COBOL-based banking and
1395 accounting systems that fell over at the start of the pandemic emergency
1396 when confronted with surges of unemployment claims. Sometimes that ends
1397 the product; sometimes it takes the company down with it. Being caught
1398 in the default of a technology debt is scary and traumatic, just like
1399 losing your house due to bankruptcy is scary and traumatic.
1400
1401 But the technology debt created by copyright locks isn’t individual
1402 debt; it’s systemic. Everyone in the world is exposed to this
1403 over-leverage, as was the case with the 2008 financial crisis. When that
1404 debt comes due — when we face a cascade of security breaches that
1405 threaten global shipping and logistics, the food supply, pharmaceutical
1406 production pipelines, emergency communications, and other critical
1407 systems that are accumulating technology debt in part due to the
1408 presence of deliberately insecure and deliberately unauditable copyright
1409 locks — it will indeed pose an existential risk.
1410
1411 Privacy and monopoly
1412 --------------------
1413
1414 Many tech companies are gripped by an orthodoxy that holds that if they
1415 just gather enough data on enough of our activities, everything else is
1416 possible — the mind control and endless profits. This is an
1417 unfalsifiable hypothesis: If data gives a tech company even a tiny
1418 improvement in behavior prediction and modification, the company
1419 declares that it has taken the first step toward global domination with
1420 no end in sight. If a company *fails* to attain any improvements from
1421 gathering and analyzing data, it declares success to be just around the
1422 corner, attainable once more data is in hand.
1423
1424 Surveillance tech is far from the first industry to embrace a
1425 nonsensical, self-serving belief that harms the rest of the world, and
1426 it is not the first industry to profit handsomely from such a delusion.
1427 Long before hedge-fund managers were claiming (falsely) that they could
1428 beat the S&P 500, there were plenty of other “respectable” industries
that were revealed as quackery in hindsight. From the makers of
1430 radium suppositories (a real thing!) to the cruel sociopaths who claimed
1431 they could “cure” gay people, history is littered with the formerly
1432 respectable titans of discredited industries.
1433
1434 This is not to say that there’s nothing wrong with Big Tech and its
1435 ideological addiction to data. While surveillance’s benefits are mostly
1436 overstated, its harms are, if anything, *understated*.
1437
1438 There’s real irony here. The belief in surveillance capitalism as a
1439 “rogue capitalism” is driven by the belief that markets wouldn’t
tolerate firms that are gripped by false beliefs. After all, an oil
company with false beliefs about where the oil is will eventually go
broke digging dry wells.
1443
1444 But monopolists get to do terrible things for a long time before they
1445 pay the price. Think of how concentration in the finance sector allowed
1446 the subprime crisis to fester as bond-rating agencies, regulators,
1447 investors, and critics all fell under the sway of a false belief that
1448 complex mathematics could construct “fully hedged” debt instruments that
could not possibly default. A small bank that engaged in this kind of
malfeasance would simply have gone broke; it could not have outrun the
inevitable crisis by growing so big that it averted the reckoning
altogether. But large
1452 banks were able to continue to attract investors, and when they finally
1453 *did* come a-cropper, the world’s governments bailed them out. The worst
1454 offenders of the subprime crisis are bigger than they were in 2008,
1455 bringing home more profits and paying their execs even larger sums.
1456
1457 Big Tech is able to practice surveillance not just because it is tech
1458 but because it is *big*. The reason every web publisher embeds a
1459 Facebook “Like” button is that Facebook dominates the internet’s social
1460 media referrals — and every one of those “Like” buttons spies on
1461 everyone who lands on a page that contains them (see also: Google
1462 Analytics embeds, Twitter buttons, etc.).
1463
1464 The reason the world’s governments have been slow to create meaningful
1465 penalties for privacy breaches is that Big Tech’s concentration produces
1466 huge profits that can be used to lobby against those penalties — and Big
1467 Tech’s concentration means that the companies involved are able to
1468 arrive at a unified negotiating position that supercharges the lobbying.
1469
1470 The reason that the smartest engineers in the world want to work for Big
1471 Tech is that Big Tech commands the lion’s share of tech industry jobs.
1472
1473 The reason people who are aghast at Facebook’s and Google’s and Amazon’s
1474 data-handling practices continue to use these services is that all their
1475 friends are on Facebook; Google dominates search; and Amazon has put all
1476 the local merchants out of business.
1477
A competitive market would weaken the companies’ lobbying muscle by
reducing their profits and pitting them against each other in regulatory
forums. It would give customers other places to go to get their online
1481 services. It would make the companies small enough to regulate and pave
1482 the way to meaningful penalties for breaches. It would let engineers
1483 with ideas that challenged the surveillance orthodoxy raise capital to
1484 compete with the incumbents. It would give web publishers multiple ways
1485 to reach audiences and make the case against Facebook and Google and
1486 Twitter embeds.
1487
1488 In other words, while surveillance doesn’t cause monopolies, monopolies
1489 certainly abet surveillance.
1490
1491 Ronald Reagan, pioneer of tech monopolism
1492 -----------------------------------------
1493
1494 Technology exceptionalism is a sin, whether it’s practiced by
1495 technology’s blind proponents or by its critics. Both of these camps are
1496 prone to explaining away monopolistic concentration by citing some
1497 special characteristic of the tech industry, like network effects or
1498 first-mover advantage. The only real difference between these two groups
1499 is that the tech apologists say monopoly is inevitable so we should just
1500 let tech get away with its abuses while competition regulators in the
1501 U.S. and the EU say monopoly is inevitable so we should punish tech for
1502 its abuses but not try to break up the monopolies.
1503
1504 To understand how tech became so monopolistic, it’s useful to look at
1505 the dawn of the consumer tech industry: 1979, the year the Apple II Plus
1506 launched and became the first successful home computer. That also
1507 happens to be the year that Ronald Reagan hit the campaign trail for the
1508 1980 presidential race — a race he won, leading to a radical shift in
1509 the way that antitrust concerns are handled in America. Reagan’s cohort
1510 of politicians — including Margaret Thatcher in the U.K., Brian Mulroney
1511 in Canada, Helmut Kohl in Germany, and Augusto Pinochet in Chile — went
1512 on to enact similar reforms that eventually spread around the world.
1513
1514 Antitrust’s story began nearly a century before all that with laws like
1515 the Sherman Act, which took aim at monopolists on the grounds that
1516 monopolies were bad in and of themselves — squeezing out competitors,
1517 creating “diseconomies of scale” (when a company is so big that its
1518 constituent parts go awry and it is seemingly helpless to address the
1519 problems), and capturing their regulators to such a degree that they can
1520 get away with a host of evils.
1521
Then came a fabulist named Robert Bork, a former solicitor general whom
1523 Reagan appointed to the powerful U.S. Court of Appeals for the D.C.
1524 Circuit and who had created an alternate legislative history of the
1525 Sherman Act and its successors out of whole cloth. Bork insisted that
1526 these statutes were never targeted at monopolies (despite a wealth of
1527 evidence to the contrary, including the transcribed speeches of the
1528 acts’ authors) but, rather, that they were intended to prevent “consumer
1529 harm” — in the form of higher prices.
1530
1531 Bork was a crank, but he was a crank with a theory that rich people
1532 really liked. Monopolies are a great way to make rich people richer by
1533 allowing them to receive “monopoly rents” (that is, bigger profits) and
1534 capture regulators, leading to a weaker, more favorable regulatory
1535 environment with fewer protections for customers, suppliers, the
1536 environment, and workers.
1537
1538 Bork’s theories were especially palatable to the same power brokers who
1539 backed Reagan, and Reagan’s Department of Justice and other agencies
1540 began to incorporate Bork’s antitrust doctrine into their enforcement
1541 decisions (Reagan even put Bork up for a Supreme Court seat, but Bork
flunked the Senate confirmation hearing so badly that, decades later,
1543 D.C. insiders use the term “borked” to refer to any catastrophically bad
1544 political performance).
1545
1546 Little by little, Bork’s theories entered the mainstream, and their
1547 backers began to infiltrate the legal education field, even putting on
1548 junkets where members of the judiciary were treated to lavish meals, fun
1549 outdoor activities, and seminars where they were indoctrinated into the
1550 consumer harm theory of antitrust. The more Bork’s theories took hold,
1551 the more money the monopolists were making — and the more surplus
1552 capital they had at their disposal to lobby for even more Borkian
1553 antitrust influence campaigns.
1554
1555 The history of Bork’s antitrust theories is a really good example of the
1556 kind of covertly engineered shifts in public opinion that Zuboff warns
1557 us against, where fringe ideas become mainstream orthodoxy. But Bork
1558 didn’t change the world overnight. He played a very long game, for over
1559 a generation, and he had a tailwind because the same forces that backed
1560 oligarchic antitrust theories also backed many other oligarchic shifts
1561 in public opinion. For example, the idea that taxation is theft, that
1562 wealth is a sign of virtue, and so on — all of these theories meshed to
1563 form a coherent ideology that elevated inequality to a virtue.
1564
1565 Today, many fear that machine learning allows surveillance capitalism to
1566 sell “Bork-as-a-Service,” at internet speeds, so that you can contract a
1567 machine-learning company to engineer *rapid* shifts in public sentiment
1568 without needing the capital to sustain a multipronged, multigenerational
1569 project working at the local, state, national, and global levels in
1570 business, law, and philosophy. I do not believe that such a project is
1571 plausible, though I agree that this is basically what the platforms
1572 claim to be selling. They’re just lying about it. Big Tech lies all the
time, *including* in its sales literature.
1574
1575 The idea that tech forms “natural monopolies” (monopolies that are the
1576 inevitable result of the realities of an industry, such as the
monopolies that accrue to the first company to run long-haul phone lines or
1578 rail lines) is belied by tech’s own history: In the absence of
1579 anti-competitive tactics, Google was able to unseat AltaVista and Yahoo;
1580 Facebook was able to head off Myspace. There are some advantages to
1581 gathering mountains of data, but those mountains of data also have
1582 disadvantages: liability (from leaking), diminishing returns (from old
1583 data), and institutional inertia (big companies, like science, progress
1584 one funeral at a time).
1585
1586 Indeed, the birth of the web saw a mass-extinction event for the
1587 existing giant, wildly profitable proprietary technologies that had
1588 capital, network effects, and walls and moats surrounding their
1589 businesses. The web showed that when a new industry is built around a
1590 protocol, rather than a product, the combined might of everyone who uses
1591 the protocol to reach their customers or users or communities outweighs
1592 even the most massive products. CompuServe, AOL, MSN, and a host of
1593 other proprietary walled gardens learned this lesson the hard way: Each
1594 believed it could stay separate from the web, offering “curation” and a
1595 guarantee of consistency and quality instead of the chaos of an open
1596 system. Each was wrong and ended up being absorbed into the public web.
1597
1598 Yes, tech is heavily monopolized and is now closely associated with
industry concentration, but this has more to do with timing than with
tech’s intrinsically monopolistic tendencies. Tech was born at the
1601 moment that antitrust enforcement was being dismantled, and tech fell
1602 into exactly the same pathologies that antitrust was supposed to guard
1603 against. To a first approximation, it is reasonable to assume that
1604 tech’s monopolies are the result of a lack of anti-monopoly action and
1605 not the much-touted unique characteristics of tech, such as network
1606 effects, first-mover advantage, and so on.
1607
1608 In support of this thesis, I offer the concentration that every *other*
1609 industry has undergone over the same period. From professional wrestling
1610 to consumer packaged goods to commercial property leasing to banking to
1611 sea freight to oil to record labels to newspaper ownership to theme
1612 parks, *every* industry has undergone a massive shift toward
concentration. There is no obvious network effect or first-mover
advantage at play in these industries. However, in every case, these
1615 industries attained their concentrated status through tactics that were
1616 prohibited before Bork’s triumph: merging with major competitors, buying
1617 out innovative new market entrants, horizontal and vertical integration,
1618 and a suite of anti-competitive tactics that were once illegal but are
1619 not any longer.
1620
1621 Again: When you change the laws intended to prevent monopolies and then
1622 monopolies form in exactly the way the law was supposed to prevent, it
1623 is reasonable to suppose that these facts are related. Tech’s
1624 concentration can be readily explained without recourse to radical
1625 theories of network effects — but only if you’re willing to indict
1626 unregulated markets as tending toward monopoly. Just as a lifelong
1627 smoker can give you a hundred reasons why their smoking didn’t cause
1628 their cancer (“It was the environmental toxins”), true believers in
1629 unregulated markets have a whole suite of unconvincing explanations for
1630 monopoly in tech that leave capitalism intact.
1631
1632 Steering with the windshield wipers
1633 -----------------------------------
1634
1635 It’s been 40 years since Bork’s project to rehabilitate monopolies
1636 achieved liftoff, and that is a generation and a half, which is plenty
1637 of time to take a common idea and make it seem outlandish and vice
1638 versa. Before the 1940s, affluent Americans dressed their baby boys in
1639 pink while baby girls wore blue (a “delicate and dainty” color). While
1640 gendered colors are obviously totally arbitrary, many still greet this
1641 news with amazement and find it hard to imagine a time when pink
1642 connoted masculinity.
1643
1644 After 40 years of studiously ignoring antitrust analysis and
1645 enforcement, it’s not surprising that we’ve all but forgotten that
1646 antitrust exists, that in living memory, growth through mergers and
acquisitions was largely prohibited under law, that market-cornering
1648 strategies like vertical integration could land a company in court.
1649
1650 Antitrust is a market society’s steering wheel, the control of first
1651 resort to keep would-be masters of the universe in their lanes. But Bork
1652 and his cohort ripped out our steering wheel 40 years ago. The car is
1653 still barreling along, and so we’re yanking as hard as we can on all the
1654 *other* controls in the car as well as desperately flapping the doors
1655 and rolling the windows up and down in the hopes that one of these other
1656 controls can be repurposed to let us choose where we’re heading before
1657 we careen off a cliff.
1658
1659 It’s like a 1960s science-fiction plot come to life: People stuck in a
1660 “generation ship,” plying its way across the stars, a ship once piloted
1661 by their ancestors; and now, after a great cataclysm, the ship’s crew
1662 have forgotten that they’re in a ship at all and no longer remember
1663 where the control room is. Adrift, the ship is racing toward its
extinction, and unless we can seize the controls and execute an emergency
1665 course correction, we’re all headed for a fiery death in the heart of a
1666 sun.
1667
1668 Surveillance still matters
1669 --------------------------
1670
1671 None of this is to minimize the problems with surveillance. Surveillance
1672 matters, and Big Tech’s use of surveillance *is* an existential risk to
1673 our species, but that’s not because surveillance and machine learning
1674 rob us of our free will.
1675
1676 Surveillance has become *much* more efficient thanks to Big Tech. In
1677 1989, the Stasi — the East German secret police — had the whole country
1678 under surveillance, a massive undertaking that recruited one out of
1679 every 60 people to serve as an informant or intelligence operative.
1680
1681 Today, we know that the NSA is spying on a significant fraction of the
1682 entire world’s population, and its ratio of surveillance operatives to
1683 the surveilled is more like 1:10,000 (that’s probably on the low side
1684 since it assumes that every American with top-secret clearance is
1685 working for the NSA on this project — we don’t know how many of those
1686 cleared people are involved in NSA spying, but it’s definitely not all
1687 of them).
1688
How did the ratio of watchers to watched stretch from 1:60 to 1:10,000
1690 in less than 30 years? It’s thanks to Big Tech. Our devices and services
1691 gather most of the data that the NSA mines for its surveillance project.
1692 We pay for these devices and the services they connect to, and then we
1693 painstakingly perform the data-entry tasks associated with logging facts
1694 about our lives, opinions, and preferences. This mass surveillance
1695 project has been largely useless for fighting terrorism: The NSA can
1696 `only point to a single minor success
1697 story <https://www.washingtonpost.com/world/national-security/nsa-cites-case-as-success-of-phone-data-collection-program/2013/08/08/fc915e5a-feda-11e2-96a8-d3b921c0924a_story.html>`__
1698 in which it used its data collection program to foil an attempt by a
1699 U.S. resident to wire a few thousand dollars to an overseas terror
1700 group. It’s ineffective for much the same reason that commercial
1701 surveillance projects are largely ineffective at targeting advertising:
1702 The people who want to commit acts of terror, like people who want to
1703 buy a refrigerator, are extremely rare. If you’re trying to detect a
1704 phenomenon whose base rate is one in a million with an instrument whose
1705 accuracy is only 99%, then every true positive will come at the cost of
1706 9,999 false positives.
1707
1708 Let me explain that again: If one in a million people is a terrorist,
1709 then there will only be about one terrorist in a random sample of one
1710 million people. If your test for detecting terrorists is 99% accurate,
it will flag 10,000 people as terrorists in your million-person sample (1% of
1712 one million is 10,000). For every true positive, you’ll get 9,999 false
1713 positives.
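
The arithmetic is easy to check. A quick sketch using the figures
stipulated above (a one-in-a-million base rate and a test that errs 1%
of the time):

.. code:: python

    # The base-rate problem: a 99%-accurate test hunting a one-in-a-million
    # phenomenon buries its lone true positive in false positives.

    population = 1_000_000
    terrorists = 1      # base rate: one in a million
    error_rate = 0.01   # a "99% accurate" test is wrong 1% of the time

    flagged = round(population * error_rate)  # people the test accuses
    true_positives = terrorists               # at best, the one real terrorist
    false_positives = flagged - true_positives

    print(flagged)          # 10000
    print(false_positives)  # 9999 innocent people accused per real terrorist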
1714
1715 In reality, the accuracy of algorithmic terrorism detection falls far
1716 short of the 99% mark, as does refrigerator ad targeting. The difference
1717 is that being falsely accused of wanting to buy a fridge is a minor
1718 nuisance while being falsely accused of planning a terror attack can
1719 destroy your life and the lives of everyone you love.
1720
1721 Mass state surveillance is only feasible because of surveillance
1722 capitalism and its extremely low-yield ad-targeting systems, which
1723 require a constant feed of personal data to remain barely viable.
1724 Surveillance capitalism’s primary failure mode is mistargeted ads while
1725 mass state surveillance’s primary failure mode is grotesque human rights
1726 abuses, tending toward totalitarianism.
1727
1728 State surveillance is no mere parasite on Big Tech, sucking up its data
1729 and giving nothing in return. In truth, the two are symbiotes: Big Tech
1730 sucks up our data for spy agencies, and spy agencies ensure that
1731 governments don’t limit Big Tech’s activities so severely that it would
1732 no longer serve the spy agencies’ needs. There is no firm distinction
1733 between state surveillance and surveillance capitalism; they are
1734 dependent on one another.
1735
1736 To see this at work today, look no further than Amazon’s home
1737 surveillance device, the Ring doorbell, and its associated app,
1738 Neighbors. Ring — a product that Amazon acquired and did not develop in
1739 house — makes a camera-enabled doorbell that streams footage from your
1740 front door to your mobile device. The Neighbors app allows you to form a
1741 neighborhood-wide surveillance grid with your fellow Ring owners through
1742 which you can share clips of “suspicious characters.” If you’re thinking
1743 that this sounds like a recipe for letting curtain-twitching racists
1744 supercharge their suspicions of people with brown skin who walk down
1745 their blocks, `you’re
1746 right <https://www.eff.org/deeplinks/2020/07/amazons-ring-enables-over-policing-efforts-some-americas-deadliest-law-enforcement>`__.
Ring has become a *de facto*, off-the-books arm of the police without
1748 any of the pesky oversight or rules.
1749
1750 In mid-2019, a series of public records requests revealed that Amazon
1751 had struck confidential deals with more than 400 local law enforcement
1752 agencies through which the agencies would promote Ring and Neighbors and
1753 in exchange get access to footage from Ring cameras. In theory, cops
would need to request this footage from Ring owners (and internal
1755 documents reveal that Amazon devotes substantial resources to coaching
1756 cops on how to spin a convincing story when doing so), but in practice,
1757 when a Ring customer turns down a police request, Amazon only requires
1758 the agency to formally request the footage from the company, which it
1759 will then produce.
1760
1761 Ring and law enforcement have found many ways to intertwine their
1762 activities. Ring strikes secret deals to acquire real-time access to 911
1763 dispatch and then streams alarming crime reports to Neighbors users,
1764 which serve as convincers for anyone who’s contemplating a surveillance
1765 doorbell but isn’t sure whether their neighborhood is dangerous enough
1766 to warrant it.
1767
1768 The more the cops buzz-market the surveillance capitalist Ring, the more
1769 surveillance capability the state gets. Cops who rely on private
1770 entities for law-enforcement roles then brief against any controls on
1771 the deployment of that technology while the companies return the favor
1772 by lobbying against rules requiring public oversight of police
1773 surveillance technology. The more the cops rely on Ring and Neighbors,
1774 the harder it will be to pass laws to curb them. The fewer laws there
1775 are against them, the more the cops will rely on them.
1776
1777 Dignity and sanctuary
1778 ---------------------
1779
1780 But even if we could exercise democratic control over our states and
1781 force them to stop raiding surveillance capitalism’s reservoirs of
1782 behavioral data, surveillance capitalism would still harm us.
1783
1784 This is an area where Zuboff shines. Her chapter on “sanctuary” — the
1785 feeling of being unobserved — is a beautiful hymn to introspection,
1786 calmness, mindfulness, and tranquility.
1787
1788 When you are watched, something changes. Anyone who has ever raised a
1789 child knows this. You might look up from your book (or more
1790 realistically, from your phone) and catch your child in a moment of
1791 profound realization and growth, a moment where they are learning
1792 something that is right at the edge of their abilities, requiring their
1793 entire ferocious concentration. For a moment, you’re transfixed,
1794 watching that rare and beautiful moment of focus playing out before your
1795 eyes, and then your child looks up and sees you seeing them, and the
1796 moment collapses. To grow, you need to be and expose your authentic
1797 self, and in that moment, you are vulnerable like a hermit crab
1798 scuttling from one shell to the next. The tender, unprotected tissues
1799 you expose in that moment are too delicate to reveal in the presence of
1800 another, even someone you trust as implicitly as a child trusts their
1801 parent.
1802
1803 In the digital age, our authentic selves are inextricably tied to our
1804 digital lives. Your search history is a running ledger of the questions
1805 you’ve pondered. Your location history is a record of the places you’ve
1806 sought out and the experiences you’ve had there. Your social graph
1807 reveals the different facets of your identity, the people you’ve
1808 connected with.
1809
1810 To be observed in these activities is to lose the sanctuary of your
1811 authentic self.
1812
1813 There’s another way in which surveillance capitalism robs us of our
1814 capacity to be our authentic selves: by making us anxious. Surveillance
1815 capitalism isn’t really a mind-control ray, but you don’t need a
1816 mind-control ray to make someone anxious. After all, another word for
1817 anxiety is agitation, and to make someone experience agitation, you need
1818 merely to agitate them. To poke them and prod them and beep at them and
1819 buzz at them and bombard them on an intermittent schedule that is just
1820 random enough that our limbic systems never quite become inured to it.
1821
1822 Our devices and services are “general purpose” in that they can connect
1823 anything or anyone to anything or anyone else and that they can run any
1824 program that can be written. This means that the distraction rectangles
1825 in our pockets hold our most precious moments with our most beloved
1826 people and their most urgent or time-sensitive communications (from
1827 “running late can you get the kid?” to “doctor gave me bad news and I
1828 need to talk to you RIGHT NOW”) as well as ads for refrigerators and
1829 recruiting messages from Nazis.
1830
1831 All day and all night, our pockets buzz, shattering our concentration
1832 and tearing apart the fragile webs of connection we spin as we think
1833 through difficult ideas. If you locked someone in a cell and agitated
1834 them like this, we’d call it “sleep deprivation torture,” and it would
1835 be `a war crime under the Geneva
1836 Conventions <https://www.youtube.com/watch?v=1SKpRbvnx6g>`__.
1837
1838 Afflicting the afflicted
1839 ------------------------
1840
1841 The effects of surveillance on our ability to be our authentic selves
1842 are not equal for all people. Some of us are lucky enough to live in a
1843 time and place in which all the most important facts of our lives are
1844 widely and roundly socially acceptable and can be publicly displayed
1845 without the risk of social consequence.
1846
1847 But for many of us, this is not true. Recall that in living memory, many
1848 of the ways of being that we think of as socially acceptable today were
1849 once cause for dire social sanction or even imprisonment. If you are 65
1850 years old, you have lived through a time in which people living in “free
1851 societies” could be imprisoned or sanctioned for engaging in homosexual
1852 activity, for falling in love with a person whose skin was a different
1853 color than their own, or for smoking weed.
1854
1855 Today, these activities aren’t just decriminalized in much of the world,
1856 they’re considered normal, and the fallen prohibitions are viewed as
1857 shameful, regrettable relics of the past.
1858
1859 How did we get from prohibition to normalization? Through private,
1860 personal activity: People who were secretly gay or secret pot-smokers or
1861 who secretly loved someone with a different skin color were vulnerable
1862 to retaliation if they made their true selves known and were limited in
1863 how much they could advocate for their own right to exist in the world
1864 and be true to themselves. But because there was a private sphere, these
1865 people could form alliances with their friends and loved ones who did
1866 not share their disfavored traits by having private conversations in
1867 which they came out, disclosing their true selves to the people around
1868 them and bringing them to their cause one conversation at a time.
1869
1870 The right to choose the time and manner of these conversations was key
1871 to their success. It’s one thing to come out to your dad while you’re on
1872 a fishing trip away from the world and another thing entirely to blurt
1873 it out over the Christmas dinner table while your racist Facebook uncle
1874 is there to make a scene.
1875
1876 Without a private sphere, there’s a chance that none of these changes
1877 would have come to pass and that the people who benefited from these
1878 changes would have either faced social sanction for coming out to a
1879 hostile world or would have never been able to reveal their true selves
1880 to the people they love.
1881
1882 The corollary is that, unless you think that our society has attained
1883 social perfection — that your grandchildren in 50 years will ask you to
1884 tell them the story of how, in 2020, every injustice had been righted
1885 and no further change had to be made — then you should expect that right
1886 now, at this minute, there are people you love, whose happiness is key
1887 to your own, who have a secret in their hearts that stops them from ever
1888 being their authentic selves with you. These people are sorrowing and
1889 will go to their graves with that secret sorrow in their hearts, and the
1890 source of that sorrow will be the falsity of their relationship to you.
1891
1892 A private realm is necessary for human progress.
1893
1894 Any data you collect and retain will eventually leak
1895 ----------------------------------------------------
1896
1897 The lack of a private life can rob vulnerable people of the chance to be
1898 their authentic selves and constrain our actions by depriving us of
1899 sanctuary, but there is another risk that is borne by everyone, not just
1900 people with a secret: crime.
1901
1902 Personally identifying information is of very limited use for the
purpose of controlling people’s minds, but identity theft — really a
1904 catchall term for a whole constellation of terrible criminal activities
1905 that can destroy your finances, compromise your personal integrity, ruin
1906 your reputation, or even expose you to physical danger — thrives on it.
1907
1908 Attackers are not limited to using data from one breached source,
1909 either. Multiple services have suffered breaches that exposed names,
1910 addresses, phone numbers, passwords, sexual tastes, school grades, work
1911 performance, brushes with the criminal justice system, family details,
1912 genetic information, fingerprints and other biometrics, reading habits,
1913 search histories, literary tastes, pseudonymous identities, and other
1914 sensitive information. Attackers can merge data from these different
1915 breaches to build up extremely detailed dossiers on random subjects and
1916 then use different parts of the data for different criminal purposes.
1917
1918 For example, attackers can use leaked username and password combinations
1919 to hijack whole fleets of commercial vehicles that `have been fitted
1920 with anti-theft GPS trackers and
1921 immobilizers <https://www.vice.com/en_us/article/zmpx4x/hacker-monitor-cars-kill-engine-gps-tracking-apps>`__
1922 or to hijack baby monitors in order to `terrorize toddlers with the
1923 audio tracks from
1924 pornography <https://www.washingtonpost.com/technology/2019/04/23/how-nest-designed-keep-intruders-out-peoples-homes-effectively-allowed-hackers-get/?utm_term=.15220e98c550>`__.
1925 Attackers use leaked data to trick phone companies into giving them your
1926 phone number, then they intercept SMS-based two-factor authentication
1927 codes in order to take over your email, bank account, and/or
1928 cryptocurrency wallets.
1929
1930 Attackers are endlessly inventive in the pursuit of creative ways to
1931 weaponize leaked data. One common use of leaked data is to penetrate
1932 companies in order to access *more* data.
1933
1934 Like spies, online fraudsters are totally dependent on companies
1935 over-collecting and over-retaining our data. Spy agencies sometimes pay
1936 companies for access to their data or intimidate them into giving it up,
1937 but sometimes they work just like criminals do — by `sneaking data out
1938 of companies’
1939 databases <https://www.bbc.com/news/world-us-canada-24751821>`__.
1940
1941 The over-collection of data has a host of terrible social consequences,
1942 from the erosion of our authentic selves to the undermining of social
1943 progress, from state surveillance to an epidemic of online crime.
1944 Commercial surveillance is also a boon to people running influence
1945 campaigns, but that’s the least of our troubles.
1946
1947 Critical tech exceptionalism is still tech exceptionalism
1948 ---------------------------------------------------------
1949
1950 Big Tech has long practiced technology exceptionalism: the idea that it
1951 should not be subject to the mundane laws and norms of “meatspace.”
1952 Mottoes like Facebook’s “move fast and break things” attracted
justifiable scorn for the companies’ self-serving rhetoric.
1954
1955 Tech exceptionalism got us all into a lot of trouble, so it’s ironic and
1956 distressing to see Big Tech’s critics committing the same sin.
1957
1958 Big Tech is not a “rogue capitalism” that cannot be cured through the
1959 traditional anti-monopoly remedies of trustbusting (forcing companies to
1960 divest of competitors they have acquired) and bans on mergers to
1961 monopoly and other anti-competitive tactics. Big Tech does not have the
1962 power to use machine learning to influence our behavior so thoroughly
1963 that markets lose the ability to punish bad actors and reward superior
1964 competitors. Big Tech has no rule-writing mind-control ray that
1965 necessitates ditching our old toolbox.
1966
1967 The thing is, people have been claiming to have perfected mind-control
1968 rays for centuries, and every time, it turned out to be a con — though
1969 sometimes the con artists were also conning themselves.
1970
1971 For generations, the advertising industry has been steadily improving
1972 its ability to sell advertising services to businesses while only making
1973 marginal gains in selling those businesses’ products to prospective
1974 customers. John Wanamaker’s lament that “50% of my advertising budget is
1975 wasted, I just don’t know which 50%” is a testament to the triumph of
1976 *ad executives*, who successfully convinced Wanamaker that only half of
1977 the money he spent went to waste.
1978
1979 The tech industry has made enormous improvements in the science of
1980 convincing businesses that they’re good at advertising while their
1981 actual improvements to advertising — as opposed to targeting — have been
1982 pretty ho-hum. The vogue for machine learning — and the mystical
1983 invocation of “artificial intelligence” as a synonym for straightforward
1984 statistical inference techniques — has greatly boosted the efficacy of
1985 Big Tech’s sales pitch as marketers have exploited potential customers’
1986 lack of technical sophistication to get away with breathtaking acts of
1987 overpromising and underdelivering.
1988
It’s tempting to think that if businesses are willing to pour billions
into a venture, the venture must be a good one. Yet there are plenty
1991 of times when this rule of thumb has led us astray. For example, it’s
1992 virtually unheard of for managed investment funds to outperform simple
1993 index funds, and investors who put their money into the hands of expert
1994 money managers overwhelmingly fare worse than those who entrust their
1995 savings to index funds. But managed funds still account for the majority
1996 of the money invested in the markets, and they are patronized by some of
1997 the richest, most sophisticated investors in the world. Their vote of
1998 confidence in an underperforming sector is a parable about the role of
1999 luck in wealth accumulation, not a sign that managed funds are a good
2000 buy.
2001
2002 The claims of Big Tech’s mind-control system are full of tells that the
enterprise is a con. For example, consider `the reliance on the “Big
Five” personality
traits <https://www.frontiersin.org/articles/10.3389/fpsyg.2020.01415/full>`__
as a primary means of influencing people even though the use of “Big
Five” traits for targeted persuasion is unsupported by any large-scale,
peer-reviewed studies and is `mostly the realm of marketing hucksters
and pop
psych <https://www.wired.com/story/the-noisy-fallacies-of-psychographic-targeting/>`__.
2010
2011 Big Tech’s promotional materials also claim that their algorithms can
accurately perform “sentiment analysis” or detect people’s moods based
2013 on their “microexpressions,” but `these are marketing claims, not
2014 scientific
2015 ones <https://www.npr.org/2018/09/12/647040758/advertising-on-facebook-is-it-worth-it>`__.
2016 These methods are largely untested by independent scientific experts,
2017 and where they have been tested, they’ve been found sorely wanting.
Microexpressions are particularly suspect: the people trained by
companies that specialize in detecting them `have been
shown <https://theintercept.com/2017/02/08/tsas-own-files-show-doubtful-science-behind-its-behavior-screening-program/>`__
to underperform relative to random chance.
2022
2023 Big Tech has been so good at marketing its own supposed superpowers that
it’s easy to believe that it can market everything else with similar
2025 acumen, but it’s a mistake to believe the hype. Any statement a company
2026 makes about the quality of its products is clearly not impartial. The
2027 fact that we distrust all the things that Big Tech says about its data
2028 handling, compliance with privacy laws, etc., is only reasonable — but
2029 why on Earth would we treat Big Tech’s marketing literature as the
2030 gospel truth? Big Tech lies about just about *everything*, including how
well its machine-learning-fueled persuasion systems work.
2032
2033 That skepticism should infuse all of our evaluations of Big Tech and its
2034 supposed abilities, including our perusal of its patents. Zuboff vests
2035 these patents with enormous significance, pointing out that Google
2036 claimed extensive new persuasion capabilities in `its patent
2037 filings <https://patents.google.com/patent/US20050131762A1/en>`__. These
claims are doubly suspect: first, because they are so self-serving, and
second, because patents are notoriously an invitation to exaggeration.
2041
2042 Patent applications take the form of a series of claims and range from
2043 broad to narrow. A typical patent starts out by claiming that its
2044 authors have invented a method or system for doing every conceivable
2045 thing that anyone might do, ever, with any tool or device. Then it
2046 narrows that claim in successive stages until we get to the actual
2047 “invention” that is the true subject of the patent. The hope is that the
2048 patent examiner — who is almost certainly overworked and underinformed —
2049 will miss the fact that some or all of these claims are ridiculous, or
2050 at least suspect, and grant the patent’s broader claims. Patents for
2051 unpatentable things are still incredibly useful because they can be
2052 wielded against competitors who might license that patent or steer clear
2053 of its claims rather than endure the lengthy, expensive process of
2054 contesting it.
2055
2056 What’s more, software patents are routinely granted even though the
2057 filer doesn’t have any evidence that they can do the thing claimed by
2058 the patent. That is, you can patent an “invention” that you haven’t
2059 actually made and that you don’t know how to make.
2060
2061 With these considerations in hand, it becomes obvious that the fact that
2062 a Big Tech company has patented what it *says* is an effective
2063 mind-control ray is largely irrelevant to whether Big Tech can in fact
2064 control our minds.
2065
2066 Big Tech collects our data for many reasons, including the diminishing
2067 returns on existing stores of data. But many tech companies also collect
2068 data out of a mistaken tech exceptionalist belief in the network effects
2069 of data. Network effects occur when each new user in a system increases
2070 its value. The classic example is fax machines: A single fax machine is
of no use, two fax machines are of limited use, but every new fax
machine that’s put to use adds as many possible fax-to-fax links as
there are machines already on the network.
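
A few lines of code make that arithmetic concrete; this is a toy
sketch, assuming only that any machine can dial any other:

.. code-block:: python

    # Toy model of network effects: with n machines, each able to dial any
    # other, there are n * (n - 1) / 2 possible links, so the nth machine
    # adds n - 1 new links, one for every machine already on the network.

    def possible_links(n: int) -> int:
        return n * (n - 1) // 2

    for n in range(1, 7):
        added = possible_links(n) - possible_links(n - 1)
        print(f"{n} machines: {possible_links(n):2d} possible links (+{added} new)")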
2074
2075 Data mined for predictive systems doesn’t necessarily produce these
2076 dividends. Think of Netflix: The predictive value of the data mined from
2077 a million English-speaking Netflix viewers is hardly improved by the
2078 addition of one more user’s viewing data. Most of the data Netflix
2079 acquires after that first minimum viable sample duplicates existing data
2080 and produces only minimal gains. Meanwhile, retraining models with new
2081 data gets progressively more expensive as the number of data points
2082 increases, and manual tasks like labeling and validating data do not get
2083 cheaper at scale.
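
A toy calculation shows the shape of those diminishing returns; it
assumes, purely for illustration, that prediction error shrinks in
proportion to the square root of the sample size, a standard
statistical rule of thumb:

.. code-block:: python

    # Toy model of diminishing returns to data: if prediction error shrinks
    # roughly as 1/sqrt(n) -- assumed here for illustration -- then each
    # tenfold increase in users buys a smaller absolute improvement than
    # the last.

    import math

    def error(n: int) -> float:
        return 1 / math.sqrt(n)

    for n in (1_000, 10_000, 100_000, 1_000_000):
        improvement = error(n) - error(10 * n)
        print(f"{n:>9,} viewers: error ~{error(n):.4f}; "
              f"10x more data improves it by only {improvement:.4f}")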
2084
2085 Businesses pursue fads to the detriment of their profits all the time,
2086 especially when the businesses and their investors are not motivated by
2087 the prospect of becoming profitable but rather by the prospect of being
2088 acquired by a Big Tech giant or by having an IPO. For these firms,
2089 ticking faddish boxes like “collects as much data as possible” might
2090 realize a bigger return on investment than “collects a
2091 business-appropriate quantity of data.”
2092
2093 This is another harm of tech exceptionalism: The belief that more data
2094 always produces more profits in the form of more insights that can be
2095 translated into better mind-control rays drives firms to over-collect
2096 and over-retain data beyond all rationality. And since the firms are
2097 behaving irrationally, a good number of them will go out of business and
2098 become ghost ships whose cargo holds are stuffed full of data that can
harm people in myriad ways — but which no one is responsible for any
2100 longer. Even if the companies don’t go under, the data they collect is
2101 maintained behind the minimum viable security — just enough security to
2102 keep the company viable while it waits to get bought out by a tech
2103 giant, an amount calculated to spend not one penny more than is
2104 necessary on protecting data.
2105
2106 How monopolies, not mind control, drive surveillance capitalism: The Snapchat story
2107 -----------------------------------------------------------------------------------
2108
2109 For the first decade of its existence, Facebook competed with the social
2110 media giants of the day (Myspace, Orkut, etc.) by presenting itself as
2111 the pro-privacy alternative. Indeed, Facebook justified its walled
2112 garden — which let users bring in data from the web but blocked web
2113 services like Google Search from indexing and caching Facebook pages —
2114 as a pro-privacy measure that protected users from the
2115 surveillance-happy winners of the social media wars like Myspace.
2116
2117 Despite frequent promises that it would never collect or analyze its
2118 users’ data, Facebook periodically created initiatives that did just
2119 that, like the creepy, ham-fisted Beacon tool, which spied on you as you
2120 moved around the web and then added your online activities to your
2121 public timeline, allowing your friends to monitor your browsing habits.
2122 Beacon sparked a user revolt. Every time, Facebook backed off from its
2123 surveillance initiative, but not all the way; inevitably, the new
2124 Facebook would be more surveilling than the old Facebook, though not
2125 quite as surveilling as the intermediate Facebook following the launch
2126 of the new product or service.
2127
2128 The pace at which Facebook ramped up its surveillance efforts seems to
2129 have been set by Facebook’s competitive landscape. The more competitors
2130 Facebook had, the better it behaved. Every time a major competitor
2131 foundered, Facebook’s behavior `got markedly
2132 worse <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247362>`__.
2133
2134 All the while, Facebook was prodigiously acquiring companies, including
2135 a company called Onavo. Nominally, Onavo made a battery-monitoring
2136 mobile app. But the permissions that Onavo required were so expansive
2137 that the app was able to gather fine-grained telemetry on everything
2138 users did with their phones, including which apps they used and how they
2139 were using them.
2140
2141 Through Onavo, Facebook discovered that it was losing market share to
2142 Snapchat, an app that — like Facebook a decade before — billed itself as
2143 the pro-privacy alternative to the status quo. Through Onavo, Facebook
2144 was able to mine data from the devices of Snapchat users, including both
2145 current and former Snapchat users. This spurred Facebook to acquire
2146 Instagram — some features of which competed with Snapchat — and then
2147 allowed Facebook to fine-tune Instagram’s features and sales pitch to
2148 erode Snapchat’s gains and ensure that Facebook would not have to face
2149 the kinds of competitive pressures it had earlier inflicted on Myspace
2150 and Orkut.
2151
2152 The story of how Facebook crushed Snapchat reveals the relationship
2153 between monopoly and surveillance capitalism. Facebook combined
2154 surveillance with lax antitrust enforcement to spot the competitive
2155 threat of Snapchat on its horizon and then take decisive action against
2156 it. Facebook’s surveillance capitalism let it avert competitive pressure
2157 with anti-competitive tactics. Facebook users still want privacy —
2158 Facebook hasn’t used surveillance to brainwash them out of it — but they
2159 can’t get it because Facebook’s surveillance lets it destroy any hope of
2160 a rival service emerging that competes on privacy features.
2161
2162 A monopoly over your friends
2163 ----------------------------
2164
2165 A decentralization movement has tried to erode the dominance of Facebook
2166 and other Big Tech companies by fielding “indieweb” alternatives —
2167 Mastodon as a Twitter alternative, Diaspora as a Facebook alternative,
2168 etc. — but these efforts have failed to attain any kind of liftoff.
2169
2170 Fundamentally, each of these services is hamstrung by the same problem:
2171 Every potential user for a Facebook or Twitter alternative has to
2172 convince all their friends to follow them to a decentralized web
2173 alternative in order to continue to realize the benefit of social media.
2174 For many of us, the only reason to have a Facebook account is that our
2175 friends have Facebook accounts, and the reason they have Facebook
2176 accounts is that *we* have Facebook accounts.
2177
All of this has conspired to make Facebook — and other dominant
platforms — into “kill zones” in which investors refuse to fund new
entrants.
2181
2182 And yet, all of today’s tech giants came into existence despite the
2183 entrenched advantage of the companies that came before them. To
2184 understand how that happened, you have to understand both
2185 interoperability and adversarial interoperability.
2186
2189 “Interoperability” is the ability of two technologies to work with one
2190 another: Anyone can make an LP that will play on any record player,
2191 anyone can make a filter you can install in your stove’s extractor fan,
2192 anyone can make gasoline for your car, anyone can make a USB phone
2193 charger that fits in your car’s cigarette lighter receptacle, anyone can
2194 make a light bulb that works in your light socket, anyone can make bread
2195 that will toast in your toaster.
2196
2197 Interoperability is often a source of innovation and consumer benefit:
2198 Apple made the first commercially successful PC, but millions of
2199 independent software vendors made interoperable programs that ran on the
2200 Apple II Plus. The simple analog antenna inputs on the back of TVs first
2201 allowed cable operators to connect directly to TVs, then they allowed
2202 game console companies and then personal computer companies to use
2203 standard televisions as displays. Standard RJ-11 telephone jacks allowed
2204 for the production of phones from a variety of vendors in a variety of
2205 forms, from the free football-shaped phone that came with a *Sports
2206 Illustrated* subscription to business phones with speakers, hold
2207 functions, and so on and then answering machines and finally modems,
2208 paving the way for the internet revolution.
2209
2210 “Interoperability” is often used interchangeably with “standardization,”
which is the process by which manufacturers and other stakeholders hammer
2212 out a set of agreed-upon rules for implementing a technology, such as
2213 the electrical plug on your wall, the CAN bus used by your car’s
2214 computer systems, or the HTML instructions that your browser interprets.
2215
2216 But interoperability doesn’t require standardization — indeed,
2217 standardization often proceeds from the chaos of ad hoc interoperability
2218 measures. The inventor of the cigarette-lighter USB charger didn’t need
2219 to get permission from car manufacturers or even the manufacturers of
2220 the dashboard lighter subcomponent. The automakers didn’t take any
2221 countermeasures to prevent the use of these aftermarket accessories by
2222 their customers, but they also didn’t do anything to make life easier
2223 for the chargers’ manufacturers. This is a kind of “neutral
2224 interoperability.”
2225
2226 Beyond neutral interoperability, there is “adversarial
2227 interoperability.” That’s when a manufacturer makes a product that
2228 interoperates with another manufacturer’s product *despite the second
2229 manufacturer’s objections* and *even if that means bypassing a security
2230 system designed to prevent interoperability*.
2231
2232 Probably the most familiar form of adversarial interoperability is
2233 third-party printer ink. Printer manufacturers claim that they sell
2234 printers below cost and that the only way they can recoup the losses
2235 they incur is by charging high markups on ink. To prevent the owners of
2236 printers from buying ink elsewhere, the printer companies deploy a suite
2237 of anti-customer security systems that detect and reject both refilled
2238 and third-party cartridges.
2239
2240 Owners of printers take the position that HP and Epson and Brother are
2241 not charities and that customers for their wares have no obligation to
2242 help them survive, and so if the companies choose to sell their products
2243 at a loss, that’s their foolish choice and their consequences to live
2244 with. Likewise, competitors who make ink or refill kits observe that
they don’t owe printer companies anything, and their erosion of printer
companies’ margins is the printer companies’ problem, not their
2247 competitors’. After all, the printer companies shed no tears when they
2248 drive a refiller out of business, so why should the refillers concern
2249 themselves with the economic fortunes of the printer companies?
2250
2251 Adversarial interoperability has played an outsized role in the history
2252 of the tech industry: from the founding of the “alt.*” Usenet hierarchy
2253 (which was started against the wishes of Usenet’s maintainers and which
2254 grew to be bigger than all of Usenet combined) to the browser wars (when
2255 Netscape and Microsoft devoted massive engineering efforts to making
2256 their browsers incompatible with the other’s special commands and
2257 peccadilloes) to Facebook (whose success was built in part by helping
2258 its new users stay in touch with friends they’d left behind on Myspace
2259 because Facebook supplied them with a tool that scraped waiting messages
from Myspace and imported them into Facebook, effectively creating a
2261 Facebook-based Myspace reader).
2262
2263 Today, incumbency is seen as an unassailable advantage. Facebook is
2264 where all of your friends are, so no one can start a Facebook
2265 competitor. But adversarial compatibility reverses the competitive
2266 advantage: If you were allowed to compete with Facebook by providing a
2267 tool that imported all your users’ waiting Facebook messages into an
2268 environment that competed on lines that Facebook couldn’t cross, like
2269 eliminating surveillance and ads, then Facebook would be at a huge
2270 disadvantage. It would have assembled all possible ex-Facebook users
2271 into a single, easy-to-find service; it would have educated them on how
2272 a Facebook-like service worked and what its potential benefits were; and
2273 it would have provided an easy means for disgruntled Facebook users to
2274 tell their friends where they might expect better treatment.
2275
2276 Adversarial interoperability was once the norm and a key contributor to
2277 the dynamic, vibrant tech scene, but now it is stuck behind a thicket of
2278 laws and regulations that add legal risks to the tried-and-true tactics
2279 of adversarial interoperability. New rules and new interpretations of
2280 existing rules mean that a would-be adversarial interoperator needs to
2281 steer clear of claims under copyright, terms of service, trade secrecy,
2282 tortious interference, and patent.
2283
2284 In the absence of a competitive market, lawmakers have resorted to
2285 assigning expensive, state-like duties to Big Tech firms, such as
2286 automatically filtering user contributions for copyright infringement or
2287 terrorist and extremist content or detecting and preventing harassment
2288 in real time or controlling access to sexual material.
2289
2290 These measures put a floor under how small we can make Big Tech because
2291 only the very largest companies can afford the humans and automated
2292 filters needed to perform these duties.
2293
2294 But that’s not the only way in which making platforms responsible for
2295 policing their users undermines competition. A platform that is expected
2296 to police its users’ conduct must prevent many vital adversarial
2297 interoperability techniques lest these subvert its policing measures.
2298 For example, if someone using a Twitter replacement like Mastodon is
2299 able to push messages into Twitter and read messages out of Twitter,
2300 they could avoid being caught by automated systems that detect and
2301 prevent harassment (such as systems that use the timing of messages or
2302 IP-based rules to make guesses about whether someone is a harasser).
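
To make that concrete, here is a hypothetical sketch of such a
timing-and-volume heuristic; the threshold and names are invented for
illustration. Messages relayed through a federated bridge would all
arrive from the bridge’s IP address on the bridge’s schedule, starving
a heuristic like this of both signals:

.. code-block:: python

    # Hypothetical sketch of a timing/IP-based harassment heuristic of the
    # kind gestured at above; the threshold and names are invented for
    # illustration, not taken from any real platform.

    from collections import defaultdict

    MAX_PER_MINUTE = 10  # assumed burst threshold; invented for illustration

    def suspicious_senders(events):
        """events: iterable of (sender_ip, minute_bucket) pairs."""
        counts = defaultdict(int)
        for ip, minute in events:
            counts[(ip, minute)] += 1
        # Flag any IP that exceeds the burst threshold within one minute.
        return {ip for (ip, _minute), n in counts.items() if n > MAX_PER_MINUTE}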
2303
2304 To the extent that we are willing to let Big Tech police itself — rather
2305 than making Big Tech small enough that users can leave bad platforms for
2306 better ones and small enough that a regulation that simply puts a
2307 platform out of business will not destroy billions of users’ access to
2308 their communities and data — we build the case that Big Tech should be
2309 able to block its competitors and make it easier for Big Tech to demand
2310 legal enforcement tools to ban and punish attempts at adversarial
2311 interoperability.
2312
2313 Ultimately, we can try to fix Big Tech by making it responsible for bad
2314 acts by its users, or we can try to fix the internet by cutting Big Tech
2315 down to size. But we can’t do both. To replace today’s giant products
2316 with pluralistic protocols, we need to clear the legal thicket that
2317 prevents adversarial interoperability so that tomorrow’s nimble,
2318 personal, small-scale products can federate themselves with giants like
2319 Facebook, allowing the users who’ve left to continue to communicate with
2320 users who haven’t left yet, reaching tendrils over Facebook’s garden
2321 wall that Facebook’s trapped users can use to scale the walls and escape
2322 to the global, open web.
2323
2324 Fake news is an epistemological crisis
2325 --------------------------------------
2326
2327 Tech is not the only industry that has undergone massive concentration
2328 since the Reagan era. Virtually every major industry — from oil to
2329 newspapers to meatpacking to sea freight to eyewear to online
2330 pornography — has become a clubby oligarchy that just a few players
2331 dominate.
2332
2333 At the same time, every industry has become something of a tech industry
2334 as general-purpose computers and general-purpose networks and the
2335 promise of efficiencies through data-driven analysis infuse every
2336 device, process, and firm with tech.
2337
2338 This phenomenon of industrial concentration is part of a wider story
2339 about wealth concentration overall as a smaller and smaller number of
2340 people own more and more of our world. This concentration of both wealth
2341 and industries means that our political outcomes are increasingly
2342 beholden to the parochial interests of the people and companies with all
2343 the money.
2344
2345 That means that whenever a regulator asks a question with an obvious,
2346 empirical answer (“Are humans causing climate change?” or “Should we let
2347 companies conduct commercial mass surveillance?” or “Does society
2348 benefit from allowing network neutrality violations?”), the answer that
2349 comes out is only correct if that correctness meets with the approval of
2350 rich people and the industries that made them so wealthy.
2351
Rich people have always played an outsized role in politics, and even more
so since the Supreme Court’s *Citizens United* decision eliminated key
2354 controls over political spending. Widening inequality and wealth
2355 concentration means that the very richest people are now a lot richer
2356 and can afford to spend a lot more money on political projects than ever
2357 before. Think of the Koch brothers or George Soros or Bill Gates.
2358
2359 But the policy distortions of rich individuals pale in comparison to the
2360 policy distortions that concentrated industries are capable of. The
2361 companies in highly concentrated industries are much more profitable
2362 than companies in competitive industries — no competition means not
2363 having to reduce prices or improve quality to win customers — leaving
2364 them with bigger capital surpluses to spend on lobbying.
2365
2366 Concentrated industries also find it easier to collaborate on policy
2367 objectives than competitive ones. When all the top execs from your
2368 industry can fit around a single boardroom table, they often do. And
2369 *when* they do, they can forge a consensus position on regulation.
2370
2371 Rising through the ranks in a concentrated industry generally means
2372 working at two or three of the big companies. When there are only
2373 relatively few companies in a given industry, each company has a more
2374 ossified executive rank, leaving ambitious execs with fewer paths to
2375 higher positions unless they are recruited to a rival. This means that
2376 the top execs in concentrated industries are likely to have been
2377 colleagues at some point and socialize in the same circles — connected
through social ties or, say, serving as trustees for each other’s
2379 estates. These tight social bonds foster a collegial, rather than
2380 competitive, attitude.
2381
2382 Highly concentrated industries also present a regulatory conundrum. When
2383 an industry is dominated by just four or five companies, the only people
2384 who are likely to truly understand the industry’s practices are its
2385 veteran executives. This means that top regulators are often former
2386 execs of the companies they are supposed to be regulating. These turns
2387 in government are often tacitly understood to be leaves of absence from
2388 industry, with former employers welcoming their erstwhile watchdogs back
2389 into their executive ranks once their terms have expired.
2390
2391 All this is to say that the tight social bonds, small number of firms,
and regulatory capture of concentrated industries give the companies
that make them up the power to dictate many, if not all, of the
2394 regulations that bind them.
2395
2396 This is increasingly obvious. Whether it’s payday lenders `winning the
2397 right to practice predatory
2398 lending <https://www.washingtonpost.com/business/2019/02/25/how-payday-lending-industry-insider-tilted-academic-research-its-favor/>`__
2399 or Apple `winning the right to decide who can fix your
2400 phone <https://www.vice.com/en_us/article/mgxayp/source-apple-will-fight-right-to-repair-legislation>`__
2401 or Google and Facebook winning the right to breach your private data
2402 without suffering meaningful consequences or victories for pipeline
2403 companies or impunity for opioid manufacturers or massive tax subsidies
2404 for incredibly profitable dominant businesses, it’s increasingly
2405 apparent that many of our official, evidence-based truth-seeking
2406 processes are, in fact, auctions for sale to the highest bidder.
2407
2408 It’s really impossible to overstate what a terrifying prospect this is.
2409 We live in an incredibly high-tech society, and none of us could acquire
2410 the expertise to evaluate every technological proposition that stands
2411 between us and our untimely, horrible deaths. You might devote your life
2412 to acquiring the media literacy to distinguish good scientific journals
2413 from corrupt pay-for-play lookalikes and the statistical literacy to
2414 evaluate the quality of the analysis in the journals as well as the
2415 microbiology and epidemiology knowledge to determine whether you can
2416 trust claims about the safety of vaccines — but that would still leave
2417 you unqualified to judge whether the wiring in your home will give you a
2418 lethal shock *and* whether your car’s brakes’ software will cause them
2419 to fail unpredictably *and* whether the hygiene standards at your
2420 butcher are sufficient to keep you from dying after you finish your
2421 dinner.
2422
2423 In a world as complex as this one, we have to defer to authorities, and
2424 we keep them honest by making those authorities accountable to us and
2425 binding them with rules to prevent conflicts of interest. We can’t
2426 possibly acquire the expertise to adjudicate conflicting claims about
2427 the best way to make the world safe and prosperous, but we *can*
2428 determine whether the adjudication process itself is trustworthy.
2429
2430 Right now, it’s obviously not.
2431
2432 The past 40 years of rising inequality and industry concentration,
2433 together with increasingly weak accountability and transparency for
expert agencies, have created an increasingly urgent sense of impending
2435 doom, the sense that there are vast conspiracies afoot that operate with
2436 tacit official approval despite the likelihood they are working to
2437 better themselves by ruining the rest of us.
2438
2439 For example, it’s been decades since Exxon’s own scientists concluded
2440 that its products would render the Earth uninhabitable by humans. And
2441 yet those decades were lost to us, in large part because Exxon lobbied
2442 governments and sowed doubt about the dangers of its products and did so
2443 with the cooperation of many public officials. When the survival of you
2444 and everyone you love is threatened by conspiracies, it’s not
2445 unreasonable to start questioning the things you think you know in an
2446 attempt to determine whether they, too, are the outcome of another
2447 conspiracy.
2448
2449 The collapse of the credibility of our systems for divining and
2450 upholding truths has left us in a state of epistemological chaos. Once,
2451 most of us might have assumed that the system was working and that our
2452 regulations reflected our best understanding of the empirical truths of
2453 the world as they were best understood — now we have to find our own
2454 experts to help us sort the true from the false.
2455
2456 If you’re like me, you probably believe that vaccines are safe, but you
2457 (like me) probably also can’t explain the microbiology or statistics.
2458 Few of us have the math skills to review the literature on vaccine
safety and explain why its statistical reasoning is sound. Likewise,
2460 few of us can review the stats in the (now discredited) literature on
2461 opioid safety and explain how those stats were manipulated. Both
2462 vaccines and opioids were embraced by medical authorities, after all,
2463 and one is safe while the other could ruin your life. You’re left with a
2464 kind of inchoate constellation of rules of thumb about which experts you
2465 trust to fact-check controversial claims and then to explain how all
2466 those respectable doctors with their peer-reviewed research on opioid
2467 safety *were* an aberration and then how you know that the doctors
2468 writing about vaccine safety are *not* an aberration.
2469
2470 I’m 100% certain that vaccinating is safe and effective, but I’m also at
2471 something of a loss to explain exactly, *precisely,* why I believe this,
2472 given all the corruption I know about and the many times the stamp of
2473 certainty has turned out to be a parochial lie told to further enrich
2474 the super rich.
2475
2476 Fake news — conspiracy theories, racist ideologies, scientific denialism
2477 — has always been with us. What’s changed today is not the mix of ideas
2478 in the public discourse but the popularity of the worst ideas in that
2479 mix. Conspiracy and denial have skyrocketed in lockstep with the growth
2480 of Big Inequality, which has also tracked the rise of Big Tech and Big
2481 Pharma and Big Wrestling and Big Car and Big Movie Theater and Big
2482 Everything Else.
2483
2484 No one can say for certain why this has happened, but the two dominant
2485 camps are idealism (the belief that the people who argue for these
2486 conspiracies have gotten better at explaining them, maybe with the help
of machine-learning tools) and materialism (the ideas have become more
2488 attractive because of material conditions in the world).
2489
2490 I’m a materialist. I’ve been exposed to the arguments of conspiracy
theorists all my life, and I have not experienced any qualitative leap
in those arguments.
2493
2494 The major difference is in the world, not the arguments. In a time where
2495 actual conspiracies are commonplace, conspiracy theories acquire a ring
2496 of plausibility.
2497
2498 We have always had disagreements about what’s true, but today, we have a
2499 disagreement over how we know whether something is true. This is an
2500 epistemological crisis, not a crisis over belief. It’s a crisis over the
2501 credibility of our truth-seeking exercises, from scientific journals (in
2502 an era where the biggest journal publishers have been caught producing
2503 pay-to-play journals for junk science) to regulations (in an era where
2504 regulators are routinely cycling in and out of business) to education
2505 (in an era where universities are dependent on corporate donations to
2506 keep their lights on).
2507
2508 Targeting — surveillance capitalism — makes it easier to find people who
2509 are undergoing this epistemological crisis, but it doesn’t create the
2510 crisis. For that, you need to look to corruption.
2511
2512 And, conveniently enough, it’s corruption that allows surveillance
2513 capitalism to grow by dismantling monopoly protections, by permitting
2514 reckless collection and retention of personal data, by allowing ads to
2515 be targeted in secret, and by foreclosing on the possibility of going
2516 somewhere else where you might continue to enjoy your friends without
2517 subjecting yourself to commercial surveillance.
2518
2519 Tech is different
2520 -----------------
2521
2522 I reject both iterations of technological exceptionalism. I reject the
2523 idea that tech is uniquely terrible and led by people who are greedier
2524 or worse than the leaders of other industries, and I reject the idea
2525 that tech is so good — or so intrinsically prone to concentration — that
2526 it can’t be blamed for its present-day monopolistic status.
2527
2528 I think tech is just another industry, albeit one that grew up in the
absence of real monopoly constraints. It may have been first, but it
isn’t the worst, nor will it be the last.
2531
2532 But there’s one way in which I *am* a tech exceptionalist. I believe
2533 that online tools are the key to overcoming problems that are much more
2534 urgent than tech monopolization: climate change, inequality, misogyny,
2535 and discrimination on the basis of race, gender identity, and other
2536 factors. The internet is how we will recruit people to fight those
2537 fights, and how we will coordinate their labor. Tech is not a substitute
2538 for democratic accountability, the rule of law, fairness, or stability —
2539 but it’s a means to achieve these things.
2540
2541 The hard problem of our species is coordination. Everything from climate
2542 change to social change to running a business to making a family work
2543 can be viewed as a collective action problem.
2544
2545 The internet makes it easier than at any time before to find people who
2546 want to work on a project with you — hence the success of free and
2547 open-source software, crowdfunding, and racist terror groups — and
2548 easier than ever to coordinate the work you do.
2549
2550 The internet and the computers we connect to it also possess an
2551 exceptional quality: general-purposeness. The internet is designed to
2552 allow any two parties to communicate any data, using any protocol,
2553 without permission from anyone else. The only production design we have
for computers is the general-purpose, “Turing-complete” computer that
2555 can run every program we can express in symbolic logic.
2556
2557 This means that every time someone with a special communications need
2558 invests in infrastructure and techniques to make the internet faster,
2559 cheaper, and more robust, this benefit redounds to everyone else who is
2560 using the internet to communicate. And this also means that every time
2561 someone with a special computing need invests to make computers faster,
2562 cheaper, and more robust, every other computing application is a
2563 potential beneficiary of this work.
2564
2565 For these reasons, every type of communication is gradually absorbed
2566 into the internet, and every type of device — from airplanes to
2567 pacemakers — eventually becomes a computer in a fancy case.
2568
2569 While these considerations don’t preclude regulating networks and
2570 computers, they do call for gravitas and caution when doing so because
2571 changes to regulatory frameworks could ripple out to have unintended
2572 consequences in many, many other domains.
2573
2574 The upshot of this is that our best hope of solving the big coordination
2575 problems — climate change, inequality, etc. — is with free, fair, and
2576 open tech. Our best hope of keeping tech free, fair, and open is to
2577 exercise caution in how we regulate tech and to attend closely to the
2578 ways in which interventions to solve one problem might create problems
2579 in other domains.
2580
2581 Ownership of facts
2582 ------------------
2583
2584 Big Tech has a funny relationship with information. When you’re
2585 generating information — anything from the location data streaming off
2586 your mobile device to the private messages you send to friends on a
2587 social network — it claims the rights to make unlimited use of that
2588 data.
2589
2590 But when you have the audacity to turn the tables — to use a tool that
2591 blocks ads or slurps your waiting updates out of a social network and
2592 puts them in another app that lets you set your own priorities and
2593 suggestions or crawls their system to allow you to start a rival
2594 business — they claim that you’re stealing from them.
2595
2596 The thing is, information is a very bad fit for any kind of private
2597 property regime. Property rights are useful for establishing markets
2598 that can lead to the effective development of fallow assets. These
2599 markets depend on clear titles to ensure that the things being bought
2600 and sold in them can, in fact, be bought and sold.
2601
2602 Information rarely has such a clear title. Take phone numbers: There’s
2603 clearly something going wrong when Facebook slurps up millions of users’
2604 address books and uses the phone numbers it finds in them to plot out
2605 social graphs and fill in missing information about other users.
2606
2607 But the phone numbers Facebook nonconsensually acquires in this
2608 transaction are not the “property” of the users they’re taken from nor
2609 do they belong to the people whose phones ring when you dial those
numbers. The numbers are mere integers, 10 digits in the U.S. and
Canada, and they appear in millions of other contexts, including
somewhere deep in the digits of pi. Giving people ownership titles
2613 to integers is an obviously terrible idea.
2614
2615 Likewise for the facts that Facebook and other commercial surveillance
2616 operators acquire about us, like that we are the children of our parents
2617 or the parents to our children or that we had a conversation with
2618 someone else or went to a public place. These data points can’t be
2619 property in the sense that your house or your shirt is your property
2620 because the title to them is intrinsically muddy: Does your mom own the
2621 fact that she is your mother? Do you? Do both of you? What about your
2622 dad — does he own this fact too, or does he have to license the fact
2623 from you (or your mom or both of you) in order to use this fact? What
2624 about the hundreds or thousands of other people who know these facts?
2625
2626 If you go to a Black Lives Matter demonstration, do the other
2627 demonstrators need your permission to post their photos from the event?
2628 The online fights over `when and how to post photos from
2629 demonstrations <https://www.wired.com/story/how-to-take-photos-at-protests/>`__
2630 reveal a nuanced, complex issue that cannot be easily hand-waved away by
2631 giving one party a property right that everyone else in the mix has to
2632 respect.
2633
2634 The fact that information isn’t a good fit with property and markets
2635 doesn’t mean that it’s not valuable. Babies aren’t property, but they’re
2636 inarguably valuable. In fact, we have a whole set of rules just for
2637 babies as well as a subset of those rules that apply to humans more
2638 generally. Someone who argues that babies won’t be truly valuable until
2639 they can be bought and sold like loaves of bread would be instantly and
2640 rightfully condemned as a monster.
2641
2642 It’s tempting to reach for the property hammer when Big Tech treats your
2643 information like a nail — not least because Big Tech are such prolific
2644 abusers of property hammers when it comes to *their* information. But
2645 this is a mistake. If we allow markets to dictate the use of our
2646 information, then we’ll find that we’re sellers in a buyers’ market
2647 where the Big Tech monopolies set a price for our data that is so low as
2648 to be insignificant or, more likely, set at a nonnegotiable price of
2649 zero in a click-through agreement that you don’t have the opportunity to
2650 modify.
2651
2652 Meanwhile, establishing property rights over information will create
2653 insurmountable barriers to independent data processing. Imagine that we
2654 require a license to be negotiated when a translated document is
2655 compared with its original, something Google has done and continues to
2656 do billions of times to train its automated language translation tools.
2657 Google can afford this, but independent third parties cannot. Google can
2658 staff a clearances department to negotiate one-time payments to the
2659 likes of the EU (one of the major repositories of translated documents)
2660 while independent watchdogs wanting to verify that the translations are
2661 well-prepared, or to root out bias in translations, will find themselves
2662 needing a staffed-up legal department and millions for licenses before
2663 they can even get started.
2664
2665 The same goes for things like search indexes of the web or photos of
people’s houses, which have become contentious thanks to Google’s Street
2667 View project. Whatever problems may exist with Google’s photographing of
2668 street scenes, resolving them by letting people decide who can take
2669 pictures of the facades of their homes from a public street will surely
2670 create even worse ones. Think of how street photography is important for
2671 newsgathering — including informal newsgathering, like photographing
2672 abuses of authority — and how being able to document housing and street
2673 life are important for contesting eminent domain, advocating for social
2674 aid, reporting planning and zoning violations, documenting
2675 discriminatory and unequal living conditions, and more.
2676
2677 The ownership of facts is antithetical to many kinds of human progress.
2678 It’s hard to imagine a rule that limits Big Tech’s exploitation of our
2679 collective labors without inadvertently banning people from gathering
2680 data on online harassment or compiling indexes of changes in language or
2681 simply investigating how the platforms are shaping our discourse — all
2682 of which require scraping data that other people have created and
2683 subjecting it to scrutiny and analysis.
2684
2685 Persuasion works… slowly
2686 -------------------------
2687
2688 The platforms may oversell their ability to persuade people, but
2689 obviously, persuasion works sometimes. Whether it’s the private realm
2690 that LGBTQ people used to recruit allies and normalize sexual diversity
2691 or the decadeslong project to convince people that markets are the only
2692 efficient way to solve complicated resource allocation problems, it’s
2693 clear that our societal attitudes *can* change.
2694
2695 The project of shifting societal attitudes is a game of inches and
2696 years. For centuries, svengalis have purported to be able to accelerate
2697 this process, but even the most brutal forms of propaganda have
2698 struggled to make permanent changes. Joseph Goebbels was able to subject
2699 Germans to daily, mandatory, hourslong radio broadcasts, to round up and
2700 torture and murder dissidents, and to seize full control over their
2701 children’s education while banning any literature, broadcasts, or films
2702 that did not comport with his worldview.
2703
2704 Yet, after 12 years of terror, once the war ended, Nazi ideology was
2705 largely discredited in both East and West Germany, and a program of
2706 national truth and reconciliation was put in its place. Racism and
2707 authoritarianism were never fully abolished in Germany, but neither were
2708 the majority of Germans irrevocably convinced of Nazism — and the rise
2709 of racist authoritarianism in Germany today tells us that the liberal
2710 attitudes that replaced Nazism were no more permanent than Nazism
2711 itself.
2712
2713 Racism and authoritarianism have also always been with us. Anyone who’s
2714 reviewed the kind of messages and arguments that racists put forward
2715 today would be hard-pressed to say that they have gotten better at
2716 presenting their ideas. The same pseudoscience, appeals to fear, and
2717 circular logic that racists presented in the 1980s, when the cause of
2718 white supremacy was on the wane, are to be found in the communications
2719 of leading white nationalists today.
2720
If racists haven’t gotten more convincing in the past decade, then how
is it that more people have been convinced to be openly racist in that
same period?
2723 I believe that the answer lies in the material world, not the world of
2724 ideas. The ideas haven’t gotten more convincing, but people have become
2725 more afraid. Afraid that the state can’t be trusted to act as an honest
2726 broker in life-or-death decisions, from those regarding the management
2727 of the economy to the regulation of painkillers to the rules for
2728 handling private information. Afraid that the world has become a game of
2729 musical chairs in which the chairs are being taken away at a
2730 never-before-seen rate. Afraid that justice for others will come at
their expense. Monopolism isn’t the cause of these fears, but the
inequality, material desperation, and policy malpractice that
monopolism fuels are significant contributors to them. Inequality
creates the conditions for both conspiracies and
2735 violent racist ideologies, and then surveillance capitalism lets
2736 opportunists target the fearful and the conspiracy-minded.
2737
2738 Paying won’t help
2739 ------------------
2740
2741 As the old saw goes, “If you’re not paying for the product, you’re the
2742 product.”
2743
2744 It’s a commonplace belief today that the advent of free, ad-supported
2745 media was the original sin of surveillance capitalism. The reasoning is
2746 that the companies that charged for access couldn’t “compete with free”
2747 and so they were driven out of business. Their ad-supported competitors,
2748 meanwhile, declared open season on their users’ data in a bid to improve
2749 their ad targeting and make more money and then resorted to the most
2750 sensationalist tactics to generate clicks on those ads. If only we’d pay
2751 for media again, we’d have a better, more responsible, more sober
2752 discourse that would be better for democracy.
2753
2754 But the degradation of news products long precedes the advent of
2755 ad-supported online news. Long before newspapers were online, lax
2756 antitrust enforcement had opened the door for unprecedented waves of
2757 consolidation and roll-ups in newsrooms. Rival newspapers were merged,
2758 reporters and ad sales staff were laid off, physical plants were sold
2759 and leased back, leaving the companies loaded up with debt through
2760 leveraged buyouts and subsequent profit-taking by the new owners. In
2761 other words, it wasn’t merely shifts in the classified advertising
2762 market, which was long held to be the primary driver in the decline of
2763 the traditional newsroom, that made news companies unable to adapt to
2764 the internet — it was monopolism.
2765
2766 Then, as news companies *did* come online, the ad revenues they
2767 commanded dropped even as the number of internet users (and thus
2768 potential online readers) increased. That shift was a function of
2769 consolidation in the ad sales market, with Google and Facebook emerging
2770 as duopolists who made more money every year from advertising while
2771 paying less and less of it to the publishers whose work the ads appeared
2772 alongside. Monopolism created a buyer’s market for ad inventory with
2773 Facebook and Google acting as gatekeepers.
2774
2775 Paid services continue to exist alongside free ones, and often it is
2776 these paid services — anxious to prevent people from bypassing their
2777 paywalls or sharing paid media with freeloaders — that exert the most
2778 control over their customers. Apple’s iTunes and App Stores are paid
2779 services, but to maximize their profitability, Apple has to lock its
2780 platforms so that third parties can’t make compatible software without
2781 permission. These locks allow the company to exercise both editorial
2782 control (enabling it to exclude `controversial political
2783 material <https://ncac.org/news/blog/does-apples-strict-app-store-content-policy-limit-freedom-of-expression>`__)
2784 and technological control, including control over who can repair the
2785 devices it makes. If we’re worried that ad-supported products deprive
2786 people of their right to self-determination by using persuasion
2787 techniques to nudge their purchase decisions a few degrees in one
2788 direction or the other, then the near-total control a single company
2789 holds over the decision of who gets to sell you software, parts, and
2790 service for your iPhone should have us very worried indeed.
2791
We shouldn’t just be concerned about payment and control: The idea that
paying will improve discourse is also dangerously wrong. The poor
success rate of targeted advertising means that the platforms have to
incentivize you to “engage” with posts at extremely high levels to
generate enough pageviews to safeguard their profits. As discussed
earlier, to increase engagement, platforms like Facebook use machine
learning to guess which messages will be most inflammatory and make a
point of shoving those into your eyeballs at every turn so that you will
hate-click and argue with people.

Perhaps paying would fix this, the reasoning goes. If platforms could be
economically viable even if you stopped clicking on them once your
intellectual and social curiosity had been slaked, then they would have
no reason to algorithmically enrage you to get more clicks out of you,
right?

There may be something to that argument, but it still ignores the wider
economic and political context of the platforms and the world that
allowed them to grow so dominant.

Platforms are world-spanning and all-encompassing because they are
monopolies, and they are monopolies because we have gutted our most
important and reliable anti-monopoly rules. Antitrust was neutered as a
key part of the project to make the wealthy wealthier, and that project
has worked. The vast majority of people on Earth have a negative net
worth, and even the dwindling middle class is in a precarious state,
undersaved for retirement, underinsured for medical disasters, and
undersecured against climate and technology shocks.

In this wildly unequal world, paying doesn’t improve the discourse; it
simply prices discourse out of the range of the majority of people.
Paying for the product is dandy, if you can afford it.

If you think today’s filter bubbles are a problem for our discourse,
imagine what they’d be like if rich people inhabited free-flowing
Athenian marketplaces of ideas where you have to pay for admission while
everyone else lives in online spaces subsidized by wealthy benefactors
who relish the chance to establish conversational spaces where the
“house rules” forbid questioning the status quo. That is, imagine if the
rich seceded from Facebook and then, instead of running ads that made
money for shareholders, Facebook became a billionaire’s vanity project
that also happened to ensure that nobody talked about whether it was
fair that only billionaires could afford to hang out in the rarefied
corners of the internet.

Behind the idea of paying for access is a belief that free markets will
address Big Tech’s dysfunction. After all, to the extent that people
have a view of surveillance at all, it is generally an unfavorable one,
and the longer and more thoroughly one is surveilled, the less one tends
to like it. Same goes for lock-in: If HP’s ink or Apple’s App Store were
really obviously fantastic, they wouldn’t need technical measures to
prevent users from choosing a rival’s product. The only reason these
technical countermeasures exist is that the companies don’t believe
their customers would *voluntarily* submit to their terms, and they want
to deprive them of the choice to take their business elsewhere.

Advocates for markets laud their ability to aggregate the diffused
knowledge of buyers and sellers across a whole society through demand
signals, price signals, and so on. The argument for surveillance
capitalism being a “rogue capitalism” is that machine-learning-driven
persuasion techniques distort decision-making by consumers, leading to
incorrect signals — consumers don’t buy what they prefer, they buy what
they’re tricked into preferring. It follows that the monopolistic
practices of lock-in, which do far more to constrain consumers’ free
choices, are even more of a “rogue capitalism.”

The profitability of any business is constrained by the possibility that
its customers will take their business elsewhere. Both surveillance and
lock-in are anti-features that no customer wants. But monopolies can
capture their regulators, crush their competitors, insert themselves
into their customers’ lives, and corral people into “choosing” their
services regardless of whether they want them — it’s fine to be terrible
when there is no alternative.

Ultimately, surveillance and lock-in are both simply business strategies
that monopolists can choose. Surveillance companies like Google are
perfectly capable of deploying lock-in technologies — just look at the
onerous Android licensing terms that require device-makers to bundle in
Google’s suite of applications. And lock-in companies like Apple are
perfectly capable of subjecting their users to surveillance if it means
keeping the Chinese government happy and preserving ongoing access to
Chinese markets. Monopolies may be made up of good, ethical people, but
as institutions, they are not your friend — they will do whatever they
can get away with to maximize their profits, and the more monopolistic
they are, the more they *can* get away with.

An “ecology” moment for trustbusting
---------------------------------------

If we’re going to break Big Tech’s death grip on our digital lives,
we’re going to have to fight monopolies. That may sound pretty mundane
and old-fashioned, something out of the New Deal era, while ending the
use of automated behavioral modification feels like the plotline of a
really cool cyberpunk novel.

Meanwhile, breaking up monopolies is something we seem to have forgotten
how to do. There is a bipartisan, trans-Atlantic consensus that breaking
up companies is a fool’s errand at best — liable to mire your federal
prosecutors in decades of litigation — and counterproductive at worst,
eroding the “consumer benefits” of large companies with massive
efficiencies of scale.

But trustbusters once strode the nation, brandishing law books,
terrorizing robber barons, and shattering the illusion of monopolies’
all-powerful grip on our society. The trustbusting era could not begin
until we found the political will — until the people convinced
politicians they’d have their backs when they went up against the
richest, most powerful men in the world.

Could we find that political will again?

Copyright scholar James Boyle has described how the term “ecology”
marked a turning point in environmental activism. Prior to the adoption
of this term, people who wanted to preserve whale populations didn’t
necessarily see themselves as fighting the same battle as people who
wanted to protect the ozone layer or fight freshwater pollution or beat
back smog or acid rain.

But the term “ecology” welded these disparate causes together into a
single movement, and the members of this movement found solidarity with
one another. The people who cared about smog signed petitions circulated
by the people who wanted to end whaling, and the anti-whalers marched
alongside the people demanding action on acid rain. This uniting behind
a common cause completely changed the dynamics of environmentalism,
setting the stage for today’s climate activism and the sense that
preserving the habitability of the planet Earth is a shared duty among
all people.

I believe we are on the verge of a new “ecology” moment dedicated to
combating monopolies. After all, tech isn’t the only concentrated
industry, nor is it even the *most* concentrated of industries.

You can find partisans for trustbusting in every sector of the economy.
Everywhere you look, you can find people who’ve been wronged by
monopolists who’ve trashed their finances, their health, their privacy,
their educations, and the lives of people they love. Those people have
the same cause as the people who want to break up Big Tech and the same
enemies. When most of the world’s wealth is in the hands of a very few,
it follows that nearly every large company will have overlapping
shareholders.

That’s the good news: With a little bit of work and a little bit of
coalition building, we have more than enough political will to break up
Big Tech and every other concentrated industry besides. First we take
Facebook, then we take AT&T/WarnerMedia.

But here’s the bad news: Much of what we’re doing to tame Big Tech
*instead* of breaking up the big companies also forecloses on the
possibility of breaking them up later.

Big Tech’s concentration currently means that their inaction on
harassment, for example, leaves users with an impossible choice: absent
themselves from public discourse by, say, quitting Twitter, or endure
vile, constant abuse. Big Tech’s over-collection and over-retention of
data results in horrific identity theft. And their inaction on extremist
recruitment means that white supremacists who livestream their shooting
rampages can reach an audience of billions. The combination of tech
concentration and media concentration means that artists’ incomes are
falling even as the revenue generated by their creations is increasing.

Yet governments confronting all of these problems inevitably converge on
the same solution: deputize the Big Tech giants to police their users
and render them liable for their users’ bad actions. The drive to force
Big Tech to use automated filters to block everything from copyright
infringement to sex trafficking to violent extremism means that tech
companies will have to allocate hundreds of millions of dollars to run
these compliance systems.

These rules — the EU’s new Directive on Copyright, Australia’s new
terror regulation, America’s FOSTA/SESTA sex-trafficking law, and more —
are not just death warrants for small, upstart competitors that might
challenge Big Tech’s dominance but that lack the established incumbents’
deep pockets to pay for all these automated systems. Worse still, these
rules put a floor under how small we can hope to make Big Tech.

That’s because any move to break up Big Tech and cut it down to size
will have to cope with the hard limit of not making these companies so
small that they can no longer afford to perform these duties — and it’s
*expensive* to invest in those automated filters and outsource content
moderation. It’s already going to be hard to unwind these deeply
concentrated, chimeric behemoths that have been welded together in the
pursuit of monopoly profits. Doing so while simultaneously finding some
way to fill the regulatory void that will be left behind if these
self-policing rulers were forced to suddenly abdicate will be much, much
harder.

Allowing the platforms to grow to their present size has given them a
dominance that is nearly insurmountable — deputizing them with public
duties to redress the pathologies created by their size makes it
virtually impossible to reduce that size. Lather, rinse, repeat: If the
platforms don’t get smaller, they will get larger, and as they get
larger, they will create more problems, which will give rise to more
public duties for the companies, which will make them bigger still.

We can work to fix the internet by breaking up Big Tech and depriving
them of monopoly profits, or we can work to fix Big Tech by making them
spend their monopoly profits on governance. But we can’t do both. We
have to choose between a vibrant, open internet and a dominated,
monopolized internet commanded by Big Tech giants that we struggle with
constantly, trying to get them to behave themselves.

Make Big Tech small again
-------------------------

Trustbusting is hard. Breaking big companies into smaller ones is
expensive and time-consuming. So time-consuming that by the time you’re
done, the world has often moved on and rendered years of litigation
irrelevant. From 1969 to 1982, the U.S. government pursued an antitrust
case against IBM over its dominance of mainframe computing — but the
case collapsed in 1982 because mainframes were being speedily replaced
by PCs.

It’s far easier to prevent concentration than to fix it, and reinstating
the traditional contours of U.S. antitrust enforcement will, at the very
least, prevent further concentration. That means bans on mergers between
large companies, on big companies acquiring nascent competitors, and on
platform companies competing directly with the companies that rely on
the platforms.

These powers are all in the plain language of U.S. antitrust laws, so in
theory, a future U.S. president could simply direct their attorney
general to enforce the law as it was written. But after decades of
judicial “education” in the benefits of monopolies, after multiple
administrations that have packed the federal courts with
lifetime-appointed monopoly cheerleaders, it’s not clear that mere
administrative action would do the trick.

If the courts frustrate the Justice Department and the president, the
next stop would be Congress, which could eliminate any doubt about how
antitrust law should be enforced in the U.S. by passing new laws that
boil down to saying, “Knock it off. We all know what the Sherman Act
says. Robert Bork was a deranged fantasist. For avoidance of doubt,
*fuck that guy*.” In other words, the problem with monopolies is
*monopolism* — the concentration of power into too few hands, which
erodes our right to self-determination. If there is a monopoly, the law
wants it gone, period. Sure, get rid of monopolies that create “consumer
harm” in the form of higher prices, but also, *get rid of other
monopolies, too.*

But this only prevents things from getting worse. To help them get
better, we will have to build coalitions with other activists in the
anti-monopoly ecology movement — a pluralism movement or a
self-determination movement — and target existing monopolies in every
industry for breakup and structural separation rules that prevent, for
example, the giant eyewear monopolist Luxottica from dominating both the
sale and the manufacture of spectacles.

In an important sense, it doesn’t matter which industry the breakups
begin in. Once they start, shareholders in *every* industry will start
to eye their investments in monopolists skeptically. As trustbusters
ride into town and start making lives miserable for monopolists, the
debate around every corporate boardroom’s table will shift. People
within corporations who’ve always felt uneasy about monopolism will gain
a powerful new argument to fend off their evil rivals in the corporate
hierarchy: “If we do it my way, we make less money; if we do it your
way, a judge will fine us billions and expose us to ridicule and public
disapprobation. So even though I get that it would be really cool to do
that merger, lock out that competitor, or buy that little company and
kill it before it can threaten us, we really shouldn’t — not if we don’t
want to get tied to the DOJ’s bumper and get dragged up and down
Trustbuster Road for the next 10 years.”

20 GOTO 10
----------

Fixing Big Tech will require a lot of iteration. As cyber lawyer
Lawrence Lessig wrote in his 1999 book, *Code and Other Laws of
Cyberspace*, our lives are regulated by four forces: law (what’s legal),
code (what’s technologically possible), norms (what’s socially
acceptable), and markets (what’s profitable).

If you could wave a wand and get Congress to pass a law that re-fanged
the Sherman Act tomorrow, you could use the impending breakups to
convince venture capitalists to fund competitors to Facebook, Google,
Twitter, and Apple that would be waiting in the wings after they were
cut down to size.

But getting Congress to act will require a massive normative shift, a
mass movement of people who care about monopolies — and pulling them
apart.

Getting people to care about monopolies will take technological
interventions that help them to see what a world free from Big Tech
might look like. Imagine if someone could make a beloved (but
unauthorized) third-party Facebook or Twitter client that dampens the
anxiety-producing algorithmic drumbeat and still lets you talk to your
friends without being spied upon — something that made social media more
sociable and less toxic. Now imagine that it gets shut down in a brutal
legal battle. It’s always easier to convince people that something must
be done to save a thing they love than it is to excite them about
something that doesn’t even exist yet.

Neither tech nor law nor code nor markets is sufficient to reform Big
Tech. But a profitable competitor to Big Tech could bankroll a
legislative push; legal reform can embolden a toolsmith to make a better
tool; the tool can create customers for a potential business who value
the benefits of the internet but want them delivered without Big Tech;
and that business can get funded and divert some of its profits to legal
reform. 20 GOTO 10 (or lather, rinse, repeat). Do it again, but this
time, get farther! After all, this time you’re starting with weaker Big
Tech adversaries, a constituency that understands things can be better,
Big Tech rivals who’ll help ensure their own future by bankrolling
reform, and code that other programmers can build on to weaken Big Tech
even further.

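For readers who never wrote line-numbered BASIC, this section’s title is
the classic two-line infinite loop. A minimal sketch (the PRINT string
is my own illustrative choice, not anything from the text) looks like
this::

    10 PRINT "FIX THE INTERNET"
    20 GOTO 10

Line 20 jumps back to line 10, forever: every pass through the loop
begins where the last one ended, which is the shape of the iterative
reform cycle described above.
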
The surveillance capitalism hypothesis — that Big Tech’s products really
work as well as they say they do and that’s why everything is so screwed
up — is way too easy on surveillance and even easier on capitalism.
Companies spy because they believe their own BS, and companies spy
because governments let them, and companies spy because any advantage
from spying is so short-lived and minor that they have to do more and
more of it just to stay in place.

As to why things are so screwed up? Capitalism. Specifically, the
monopolism that creates inequality and the inequality that creates
monopolism. It’s a form of capitalism that rewards sociopaths who
destroy the real economy to inflate the bottom line, and they get away
with it for the same reason companies get away with spying: because our
governments are in thrall both to the ideology that says monopolies are
actually just fine and to the ideology that says that in a monopolistic
world, you’d better not piss off the monopolists.

Surveillance doesn’t make capitalism rogue. Capitalism’s unchecked rule
begets surveillance. Surveillance isn’t bad because it lets people
manipulate us. It’s bad because it crushes our ability to be our
authentic selves — and because it lets the rich and powerful figure out
who might be thinking of building guillotines and what dirt they can use
to discredit those embryonic guillotine-builders before they can even
get to the lumberyard.

Up and through
--------------

With all the problems of Big Tech, it’s tempting to imagine solving them
by returning to a world without tech at all. Resist that temptation.

The only way out of our Big Tech problem is up and through. If our
future is not reliant upon high tech, it will be because civilization
has fallen. Big Tech wired together a planetary, species-wide nervous
system that, with the proper reforms and course corrections, is capable
of seeing us through the existential challenge of our species and
planet. Now it’s up to us to seize the means of computation, putting
that electronic nervous system under democratic, accountable control.

I am, secretly, despite what I have said earlier, a tech exceptionalist.
Not in the sense of thinking that tech should be given a free pass to
monopolize because it has “economies of scale” or some other nebulous
feature. I’m a tech exceptionalist because I believe that getting tech
right matters and that getting it wrong will be an unmitigated
catastrophe — and doing it right can give us the power to work together
to save our civilization, our species, and our planet.