1 How to Destroy Surveillance Capitalism
2 ======================================
6 The net of a thousand lies
7 --------------------------
9 The most surprising thing about the rebirth of flat Earthers in the 21st
10 century is just how widespread the evidence against them is. You can
11 understand how, centuries ago, people who’d never gained a high-enough
12 vantage point from which to see the Earth’s curvature might come to the
13 commonsense belief that the flat-seeming Earth was, indeed, flat.
15 But today, when elementary schools routinely dangle GoPro cameras from
16 balloons and loft them high enough to photograph the Earth’s curve — to
17 say nothing of the unexceptional sight of the curved Earth from an
18 airplane window — it takes a heroic effort to maintain the belief that
21 Likewise for white nationalism and eugenics: In an age where you can
22 become a computational genomics datapoint by swabbing your cheek and
23 mailing it to a gene-sequencing company along with a modest sum of
24 money, “race science” has never been easier to refute.
26 We are living through a golden age of both readily available facts and
27 denial of those facts. Terrible ideas that have lingered on the fringes
28 for decades or even centuries have gone mainstream seemingly overnight.
30 When an obscure idea gains currency, there are only two things that can
31 explain its ascendance: Either the person expressing that idea has
32 gotten a lot better at stating their case, or the proposition has become
33 harder to deny in the face of mounting evidence. In other words, if we
34 want people to take climate change seriously, we can get a bunch of
35 Greta Thunbergs to make eloquent, passionate arguments from podiums,
36 winning our hearts and minds, or we can wait for flood, fire, broiling
37 sun, and pandemics to make the case for us. In practice, we’ll probably
38 have to do some of both: The more we’re boiling and burning and drowning
39 and wasting away, the easier it will be for the Greta Thunbergs of the
42 The arguments for ridiculous beliefs in odious conspiracies like
43 anti-vaccination, climate denial, a flat Earth, and eugenics are no
44 better than they were a generation ago. Indeed, they’re worse because
45 they are being pitched to people who have at least a background
46 awareness of the refuting facts.
48 Anti-vax has been around since the first vaccines, but the early
49 anti-vaxxers were pitching people who were less equipped to understand
50 even the most basic ideas from microbiology, and moreover, those people
51 had not witnessed the extermination of mass-murdering diseases like
52 polio, smallpox, and measles. Today’s anti-vaxxers are no more eloquent
53 than their forebears, and they have a much harder job.
55 So can these far-fetched conspiracy theorists really be succeeding on
56 the basis of superior arguments?
58 Some people think so. Today, there is a widespread belief that machine
59 learning and commercial surveillance can turn even the most
60 fumble-tongued conspiracy theorist into a svengali who can warp your
61 perceptions and win your belief by locating vulnerable people and then
62 pitching them with A.I.-refined arguments that bypass their rational
63 faculties and turn everyday people into flat Earthers, anti-vaxxers, or
64 even Nazis. When the RAND Corporation `blames Facebook for
65 “radicalization” <https://www.rand.org/content/dam/rand/pubs/research_reports/RR400/RR453/RAND_RR453.pdf>`__
66 and when Facebook’s role in spreading coronavirus misinformation is
68 algorithm <https://secure.avaaz.org/campaign/en/facebook_threat_health/>`__,
69 the implicit message is that machine learning and surveillance are
70 causing the changes in our consensus about what’s true.
72 After all, in a world where sprawling and incoherent conspiracy theories
73 like Pizzagate and its successor, QAnon, have widespread followings,
74 *something* must be afoot.
76 But what if there’s another explanation? What if it’s the material
77 circumstances, and not the arguments, that are making the difference for
78 these conspiracy pitchmen? What if the trauma of living through *real
79 conspiracies* all around us — conspiracies among wealthy people, their
80 lobbyists, and lawmakers to bury inconvenient facts and evidence of
81 wrongdoing (these conspiracies are commonly known as “corruption”) — is
82 making people vulnerable to conspiracy theories?
84 If it’s trauma and not contagion — material conditions and not ideology
85 — that is making the difference today and enabling a rise of repulsive
86 misinformation in the face of easily observed facts, that doesn’t mean
87 our computer networks are blameless. They’re still doing the heavy work
88 of locating vulnerable people and guiding them through a series of
89 ever-more-extreme ideas and communities.
91 Belief in conspiracy is a raging fire that has done real damage and
92 poses real danger to our planet and species, from epidemics `kicked off
93 by vaccine denial <https://www.cdc.gov/measles/cases-outbreaks.html>`__
94 to genocides `kicked off by racist
95 conspiracies <https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html>`__
96 to planetary meltdown caused by denial-inspired climate inaction. Our
97 world is on fire, and so we have to put the fires out — to figure out
98 how to help people see the truth of the world through the conspiracies
99 they’ve been confused by.
101 But firefighting is reactive. We need fire *prevention*. We need to
102 strike at the traumatic material conditions that make people vulnerable
103 to the contagion of conspiracy. Here, too, tech has a role to play.
105 There’s no shortage of proposals to address this. From the EU’s
106 `Terrorist Content Regulation <https://edri.org/tag/terreg/>`__, which
107 requires platforms to police and remove “extremist” content, to the U.S.
108 proposals to `force tech companies to spy on their
109 users <https://www.eff.org/deeplinks/2020/03/earn-it-act-violates-constitution>`__
110 and hold them liable `for their users’ bad
111 speech <https://www.natlawreview.com/article/repeal-cda-section-230>`__,
112 there’s a lot of energy to force tech companies to solve the problems
115 There’s a critical piece missing from the debate, though. All these
116 solutions assume that tech companies are a fixture, that their dominance
117 over the internet is a permanent fact. Proposals to replace Big Tech
118 with a more diffused, pluralistic internet are nowhere to be found.
119 Worse: The “solutions” on the table today *require* Big Tech to stay big
120 because only the very largest companies can afford to implement the
121 systems these laws demand.
123 Figuring out what we want our tech to look like is crucial if we’re
124 going to get out of this mess. Today, we’re at a crossroads where we’re
125 trying to figure out if we want to fix the Big Tech companies that
126 dominate our internet or if we want to fix the internet itself by
127 unshackling it from Big Tech’s stranglehold. We can’t do both, so we
130 I want us to choose wisely. Taming Big Tech is integral to fixing the
131 internet, and for that, we need digital rights activism.
133 Digital rights activism, a quarter-century on
134 ---------------------------------------------
136 Digital rights activism is more than 30 years old now. The Electronic
137 Frontier Foundation turned 30 this year; the Free Software Foundation
138 launched in 1985. For most of the history of the movement, the most
139 prominent criticism leveled against it was that it was irrelevant: The
140 real activist causes were real-world causes (think of the skepticism
141 when `Finland declared broadband a human right in
142 2010 <https://www.loc.gov/law/foreign-news/article/finland-legal-right-to-broadband-for-all-citizens/#:~:text=Global%20Legal%20Monitor,-Home%20%7C%20Search%20%7C%20Browse&text=(July%206%2C%202010)%20On,connection%20100%20MBPS%20by%202015.>`__),
143 and real-world activism was shoe-leather activism (think of Malcolm
144 Gladwell’s `contempt for
145 “clicktivism” <https://www.newyorker.com/magazine/2010/10/04/small-change-malcolm-gladwell>`__).
146 But as tech has grown more central to our daily lives, these accusations
147 of irrelevance have given way first to accusations of insincerity (“You
148 only care about tech because you’re `shilling for tech
149 companies <https://www.ipwatchdog.com/2018/06/04/report-engine-eff-shills-google-patent-reform/id=98007/>`__\ ”)
150 to accusations of negligence (“Why didn’t you foresee that tech could be
151 such a destructive force?”). But digital rights activism is right where
152 it’s always been: looking out for the humans in a world where tech is
153 inexorably taking over.
155 The latest version of this critique comes in the form of “surveillance
156 capitalism,” a term coined by business professor Shoshana Zuboff in her
157 long and influential 2019 book, *The Age of Surveillance Capitalism: The
158 Fight for a Human Future at the New Frontier of Power*. Zuboff argues
159 that “surveillance capitalism” is a unique creature of the tech industry
160 and that it is unlike any other abusive commercial practice in history,
161 one that is “constituted by unexpected and often illegible mechanisms of
162 extraction, commodification, and control that effectively exile persons
163 from their own behavior while producing new markets of behavioral
164 prediction and modification. Surveillance capitalism challenges
165 democratic norms and departs in key ways from the centuries-long
166 evolution of market capitalism.” It is a new and deadly form of
167 capitalism, a “rogue capitalism,” and our lack of understanding of its
168 unique capabilities and dangers represents an existential, species-wide
169 threat. She’s right that capitalism today threatens our species, and
170 she’s right that tech poses unique challenges to our species and
171 civilization, but she’s really wrong about how tech is different and why
172 it threatens our species.
174 What’s more, I think that her incorrect diagnosis will lead us down a
175 path that ends up making Big Tech stronger, not weaker. We need to take
176 down Big Tech, and to do that, we need to start by correctly identifying
179 Tech exceptionalism, then and now
180 ---------------------------------
182 Early critics of the digital rights movement — perhaps best represented
183 by campaigning organizations like the Electronic Frontier Foundation,
184 the Free Software Foundation, Public Knowledge, and others that focused
185 on preserving and enhancing basic human rights in the digital realm —
186 damned activists for practicing “tech exceptionalism.” Around the turn
187 of the millennium, serious people ridiculed any claim that tech policy
188 mattered in the “real world.” Claims that tech rules had implications
189 for speech, association, privacy, search and seizure, and fundamental
190 rights and equities were treated as ridiculous, an elevation of the
191 concerns of sad nerds arguing about *Star Trek* on bulletin board
192 systems above the struggles of the Freedom Riders, Nelson Mandela, or
193 the Warsaw ghetto uprising.
195 In the decades since, accusations of “tech exceptionalism” have only
196 sharpened as tech’s role in everyday life has expanded: Now that tech
197 has infiltrated every corner of our life and our online lives have been
198 monopolized by a handful of giants, defenders of digital freedoms are
199 accused of carrying water for Big Tech, providing cover for its
200 self-interested negligence (or worse, nefarious plots).
202 From my perspective, the digital rights movement has remained stationary
203 while the rest of the world has moved. From the earliest days, the
204 movement’s concern was users and the toolsmiths who provided the code
205 they needed to realize their fundamental rights. Digital rights
206 activists only cared about companies to the extent that companies were
207 acting to uphold users’ rights (or, just as often, when companies were
208 acting so foolishly that they threatened to bring down new rules that
209 would also make it harder for good actors to help users).
211 The “surveillance capitalism” critique recasts the digital rights
212 movement in a new light again: not as alarmists who overestimate the
213 importance of their shiny toys nor as shills for big tech but as serene
214 deck-chair rearrangers whose long-standing activism is a liability
215 because it makes them incapable of perceiving novel threats as they
216 continue to fight the last century’s tech battles.
218 But tech exceptionalism is a sin no matter who practices it.
220 Don’t believe the hype
221 -----------------------
223 You’ve probably heard that “if you’re not paying for the product, you’re
224 the product.” As we’ll see below, that’s true, if incomplete. But what
225 is *absolutely* true is that ad-driven Big Tech’s customers are
226 advertisers, and what companies like Google and Facebook sell is their
227 ability to convince *you* to buy stuff. Big Tech’s product is
228 persuasion. The services — social media, search engines, maps,
229 messaging, and more — are delivery systems for persuasion.
231 The fear of surveillance capitalism starts from the (correct)
232 presumption that everything Big Tech says about itself is probably a
233 lie. But the surveillance capitalism critique makes an exception for the
234 claims Big Tech makes in its sales literature — the breathless hype in
235 the pitches to potential advertisers online and in ad-tech seminars
236 about the efficacy of its products: It assumes that Big Tech is as good
237 at influencing us as they claim they are when they’re selling
238 influencing products to credulous customers. That’s a mistake because
239 sales literature is not a reliable indicator of a product’s efficacy.
241 Surveillance capitalism assumes that because advertisers buy a lot of
242 what Big Tech is selling, Big Tech must be selling something real. But
243 Big Tech’s massive sales could just as easily be the result of a popular
244 delusion or something even more pernicious: monopolistic control over
245 our communications and commerce.
247 Being watched changes your behavior, and not for the better. It creates
248 risks for our social progress. Zuboff’s book features beautifully
249 wrought explanations of these phenomena. But Zuboff also claims that
250 surveillance literally robs us of our free will — that when our personal
251 data is mixed with machine learning, it creates a system of persuasion
252 so devastating that we are helpless before it. That is, Facebook uses an
253 algorithm to analyze the data it nonconsensually extracts from your
254 daily life and uses it to customize your feed in ways that get you to
255 buy stuff. It is a mind-control ray out of a 1950s comic book, wielded
256 by mad scientists whose supercomputers guarantee them perpetual and
257 total world domination.
262 To understand why you shouldn’t worry about mind-control rays — but why
263 you *should* worry about surveillance *and* Big Tech — we must start by
264 unpacking what we mean by “persuasion.”
266 Google, Facebook, and other surveillance capitalists promise their
267 customers (the advertisers) that if they use machine-learning tools
268 trained on unimaginably large data sets of nonconsensually harvested
269 personal information, they will be able to uncover ways to bypass the
270 rational faculties of the public and direct their behavior, creating a
271 stream of purchases, votes, and other desired outcomes.
273 The impact of dominance far exceeds the impact of manipulation and
274 should be central to our analysis and any remedies we seek.
276 But there’s little evidence that this is happening. Instead, the
277 predictions that surveillance capitalism delivers to its customers are
278 much less impressive. Rather than finding ways to bypass our rational
279 faculties, surveillance capitalists like Mark Zuckerberg mostly do one
280 or more of three things:
285 If you’re selling diapers, you have better luck if you pitch them to
286 people in maternity wards. Not everyone who enters or leaves a maternity
287 ward just had a baby, and not everyone who just had a baby is in the
288 market for diapers. But having a baby is a really reliable correlate of
289 being in the market for diapers, and being in a maternity ward is highly
290 correlated with having a baby. Hence diaper ads around maternity wards
291 (and even pitchmen for baby products, who haunt maternity wards with
292 baskets full of freebies).
294 Surveillance capitalism is segmenting times a billion. Diaper vendors
295 can go way beyond people in maternity wards (though they can do that,
296 too, with things like location-based mobile ads). They can target you
297 based on whether you’re reading articles about child-rearing, diapers,
298 or a host of other subjects, and data mining can suggest unobvious
299 keywords to advertise against. They can target you based on the articles
300 you’ve recently read. They can target you based on what you’ve recently
301 purchased. They can target you based on whether you receive emails or
302 private messages about these subjects — or even if you speak aloud about
303 them (though Facebook and the like convincingly claim that’s not
306 This is seriously creepy.
308 But it’s not mind control.
310 It doesn’t deprive you of your free will. It doesn’t trick you.
312 Think of how surveillance capitalism works in politics. Surveillance
313 capitalist companies sell political operatives the power to locate
314 people who might be receptive to their pitch. Candidates campaigning on
315 finance industry corruption seek people struggling with debt; candidates
316 campaigning on xenophobia seek out racists. Political operatives have
317 always targeted their message whether their intentions were honorable or
318 not: Union organizers set up pitches at factory gates, and white
319 supremacists hand out fliers at John Birch Society meetings.
321 But this is an inexact and thus wasteful practice. The union organizer
322 can’t know which worker to approach on the way out of the factory gates
323 and may waste their time on a covert John Birch Society member; the
324 white supremacist doesn’t know which of the Birchers are so delusional
325 that making it to a meeting is as much as they can manage and which ones
326 might be convinced to cross the country to carry a tiki torch through
327 the streets of Charlottesville, Virginia.
329 Because targeting improves the yields on political pitches, it can
330 accelerate the pace of political upheaval by making it possible for
331 everyone who has secretly wished for the toppling of an autocrat — or
332 just an 11-term incumbent politician — to find everyone else who feels
333 the same way at very low cost. This has been critical to the rapid
334 crystallization of recent political movements including Black Lives
335 Matter and Occupy Wall Street as well as less savory players like the
336 far-right white nationalist movements that marched in Charlottesville.
338 It’s important to differentiate this kind of political organizing from
339 influence campaigns; finding people who secretly agree with you isn’t
340 the same as convincing people to agree with you. The rise of phenomena
341 like nonbinary or otherwise nonconforming gender identities is often
342 characterized by reactionaries as the result of online brainwashing
343 campaigns that convince impressionable people that they have been
344 secretly queer all along.
346 But the personal accounts of those who have come out tell a different
347 story where people who long harbored a secret about their gender were
348 emboldened by others coming forward and where people who knew that they
349 were different but lacked a vocabulary for discussing that difference
350 learned the right words from these low-cost means of finding people and
351 learning about their ideas.
356 Lies and fraud are pernicious, and surveillance capitalism supercharges
357 them through targeting. If you want to sell a fraudulent payday loan or
358 subprime mortgage, surveillance capitalism can help you find people who
359 are both desperate and unsophisticated and thus receptive to your pitch.
360 This accounts for the rise of many phenomena, like multilevel marketing
361 schemes, in which deceptive claims about potential earnings and the
362 efficacy of sales techniques are targeted at desperate people by
363 advertising against search queries that indicate, for example, someone
364 struggling with ill-advised loans.
366 Surveillance capitalism also abets fraud by making it easy to locate
367 other people who have been similarly deceived, forming a community of
368 people who reinforce one another’s false beliefs. Think of `the
369 forums <https://www.vulture.com/2020/01/the-dream-podcast-review.html>`__
370 where people who are being victimized by multilevel marketing frauds
371 gather to trade tips on how to improve their luck in peddling the
374 Sometimes, online deception involves replacing someone’s correct beliefs
375 with incorrect ones, as it does in the anti-vaccination movement, whose
376 victims are often people who start out believing in vaccines but are
377 convinced by seemingly plausible evidence that leads them into the false
378 belief that vaccines are harmful.
380 But it’s much more common for fraud to succeed when it doesn’t have to
381 displace a true belief. When my daughter contracted head lice at
382 daycare, one of the daycare workers told me I could get rid of them by
383 treating her hair and scalp with olive oil. I didn’t know anything about
384 head lice, and I assumed that the daycare worker did, so I tried it (it
385 didn’t work, and it doesn’t work). It’s easy to end up with false
386 beliefs when you simply don’t know any better and when those beliefs are
387 conveyed by someone who seems to know what they’re doing.
389 This is pernicious and difficult — and it’s also the kind of thing the
390 internet can help guard against by making true information available,
391 especially in a form that exposes the underlying deliberations among
392 parties with sharply divergent views, such as Wikipedia. But it’s not
393 brainwashing; it’s fraud. In the `majority of
394 cases <https://datasociety.net/library/data-voids/>`__, the victims of
395 these fraud campaigns have an informational void filled in the customary
396 way, by consulting a seemingly reliable source. If I look up the length
397 of the Brooklyn Bridge and learn that it is 5,800 feet long, but in
398 reality, it is 5,989 feet long, the underlying deception is a problem,
399 but it’s a problem with a simple remedy. It’s a very different problem
400 from the anti-vax issue in which someone’s true belief is displaced by a
401 false one by means of sophisticated persuasion.
406 Surveillance capitalism is the result of monopoly. Monopoly is the
407 cause, and surveillance capitalism and its negative outcomes are the
408 effects of monopoly. I’ll get into this in depth later, but for now,
409 suffice it to say that the tech industry has grown up with a radical
410 theory of antitrust that has allowed companies to grow by merging with
411 their rivals, buying up their nascent competitors, and expanding to
412 control whole market verticals.
414 One example of how monopolism aids in persuasion is through dominance:
415 Google makes editorial decisions about its algorithms that determine the
416 sort order of the responses to our queries. If a cabal of fraudsters
417 have set out to trick the world into thinking that the Brooklyn Bridge
418 is 5,800 feet long, and if Google gives a high search rank to this group
419 in response to queries like “How long is the Brooklyn Bridge?” then the
420 first eight or 10 screens’ worth of Google results could be wrong. And
421 since most people don’t go beyond the first couple of results — let
422 alone the first *page* of results — Google’s choice means that many
423 people will be deceived.
425 Google’s dominance over search — more than 86% of web searches are
426 performed through Google — means that the way it orders its search
427 results has an outsized effect on public beliefs. Ironically, Google
428 claims this is why it can’t afford to have any transparency in its
429 algorithm design: Google’s search dominance makes the results of its
430 sorting too important to risk telling the world how it arrives at those
431 results lest some bad actor discover a flaw in the ranking system and
432 exploit it to push its point of view to the top of the search results.
433 There’s an obvious remedy to a company that is too big to audit: break
434 it up into smaller pieces.
436 Zuboff calls surveillance capitalism a “rogue capitalism” whose
437 data-hoarding and machine-learning techniques rob us of our free will.
438 But influence campaigns that seek to displace existing, correct beliefs
439 with false ones have an effect that is small and temporary while
440 monopolistic dominance over informational systems has massive, enduring
441 effects. Controlling the results to the world’s search queries means
442 controlling access both to arguments and their rebuttals and, thus,
443 control over much of the world’s beliefs. If our concern is how
444 corporations are foreclosing on our ability to make up our own minds and
445 determine our own futures, the impact of dominance far exceeds the
446 impact of manipulation and should be central to our analysis and any
449 4. Bypassing our rational faculties
450 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
452 *This* is the good stuff: using machine learning, “dark patterns,”
453 engagement hacking, and other techniques to get us to do things that run
454 counter to our better judgment. This is mind control.
456 Some of these techniques have proven devastatingly effective (if only in
457 the short term). The use of countdown timers on a purchase completion
458 page can create a sense of urgency that causes you to ignore the nagging
459 internal voice suggesting that you should shop around or sleep on your
460 decision. The use of people from your social graph in ads can provide
461 “social proof” that a purchase is worth making. Even the auction system
462 pioneered by eBay is calculated to play on our cognitive blind spots,
463 letting us feel like we “own” something because we bid on it, thus
464 encouraging us to bid again when we are outbid to ensure that “our”
467 Games are extraordinarily good at this. “Free to play” games manipulate
468 us through many techniques, such as presenting players with a series of
469 smoothly escalating challenges that create a sense of mastery and
470 accomplishment but which sharply transition into a set of challenges
471 that are impossible to overcome without paid upgrades. Add some social
472 proof to the mix — a stream of notifications about how well your friends
473 are faring — and before you know it, you’re buying virtual power-ups to
474 get to the next level.
476 Companies have risen and fallen on these techniques, and the “fallen”
477 part is worth paying attention to. In general, living things adapt to
478 stimulus: Something that is very compelling or noteworthy when you first
479 encounter it fades with repetition until you stop noticing it
480 altogether. Consider the refrigerator hum that irritates you when it
481 starts up but disappears into the background so thoroughly that you only
482 notice it when it stops again.
484 That’s why behavioral conditioning uses “intermittent reinforcement
485 schedules.” Instead of giving you a steady drip of encouragement or
486 setbacks, games and gamified services scatter rewards on a randomized
487 schedule — often enough to keep you interested and random enough that
488 you can never quite find the pattern that would make it boring.
490 Intermittent reinforcement is a powerful behavioral tool, but it also
491 represents a collective action problem for surveillance capitalism. The
492 “engagement techniques” invented by the behaviorists of surveillance
493 capitalist companies are quickly copied across the whole sector so that
494 what starts as a mysteriously compelling fillip in the design of a
495 service—like “pull to refresh” or alerts when someone likes your posts
496 or side quests that your characters get invited to while in the midst of
497 main quests—quickly becomes dully ubiquitous. The
498 impossible-to-nail-down nonpattern of randomized drips from your phone
499 becomes a grey-noise wall of sound as every single app and site starts
500 to make use of whatever seems to be working at the time.
502 From the surveillance capitalist’s point of view, our adaptive capacity
503 is like a harmful bacterium that deprives it of its food source — our
504 attention — and novel techniques for snagging that attention are like
505 new antibiotics that can be used to breach our defenses and destroy our
506 self-determination. And there *are* techniques like that. Who can forget
507 the Great Zynga Epidemic, when all of our friends were caught in
508 *FarmVille*\ ’s endless, mindless dopamine loops? But every new
509 attention-commanding technique is jumped on by the whole industry and
510 used so indiscriminately that antibiotic resistance sets in. Given
511 enough repetition, almost all of us develop immunity to even the most
512 powerful techniques — by 2013, two years after Zynga’s peak, its user
515 Not everyone, of course. Some people never adapt to stimulus, just as
516 some people never stop hearing the hum of the refrigerator. This is why
517 most people who are exposed to slot machines play them for a while and
518 then move on while a small and tragic minority liquidate their kids’
519 college funds, buy adult diapers, and position themselves in front of a
520 machine until they collapse.
522 But surveillance capitalism’s margins on behavioral modification suck.
523 Tripling the rate at which someone buys a widget sounds great `unless
524 the base rate is way less than
525 1% <https://www.forbes.com/sites/priceonomics/2018/03/09/the-advertising-conversion-rates-for-every-major-tech-platform/#2f6a67485957>`__
526 with an improved rate of… still less than 1%. Even penny slot machines
527 pull down pennies for every spin while surveillance capitalism rakes in
528 infinitesimal penny fractions.
530 Slot machines’ high returns mean that they can be profitable just by
531 draining the fortunes of the small rump of people who are pathologically
532 vulnerable to them and unable to adapt to their tricks. But surveillance
533 capitalism can’t survive on the fractional pennies it brings down from
534 that vulnerable sliver — that’s why, after the Great Zynga Epidemic had
535 finally burned itself out, the small number of still-addicted players
536 left behind couldn’t sustain it as a global phenomenon. And new powerful
537 attention weapons aren’t easy to find, as is evidenced by the long years
538 since the last time Zynga had a hit. Despite the hundreds of millions of
539 dollars that Zynga has to spend on developing new tools to blast through
540 our adaptation, it has never managed to repeat the lucky accident that
541 let it snag so much of our attention for a brief moment in 2009.
542 Powerhouses like Supercell have fared a little better, but they are rare
543 and throw away many failures for every success.
545 The vulnerability of small segments of the population to dramatic,
546 efficient corporate manipulation is a real concern that’s worthy of our
547 attention and energy. But it’s not an existential threat to society.
549 If data is the new oil, then surveillance capitalism’s engine has a leak
550 -------------------------------------------------------------------------
552 This adaptation problem offers an explanation for one of surveillance
553 capitalism’s most alarming traits: its relentless hunger for data and
554 its endless expansion of data-gathering capabilities through the spread
555 of sensors, online surveillance, and acquisition of data streams from
558 Zuboff observes this phenomenon and concludes that data must be very
559 valuable if surveillance capitalism is so hungry for it. (In her words:
560 “Just as industrial capitalism was driven to the continuous
561 intensification of the means of production, so surveillance capitalists
562 and their market players are now locked into the continuous
563 intensification of the means of behavioral modification and the
564 gathering might of instrumentarian power.”) But what if the voracious
565 appetite is because data has such a short half-life — because people
566 become inured so quickly to new, data-driven persuasion techniques —
567 that the companies are locked in an arms race with our limbic system?
568 What if it’s all a Red Queen’s race where they have to run ever faster —
569 collect ever-more data — just to stay in the same spot?
571 Of course, all of Big Tech’s persuasion techniques work in concert with
572 one another, and collecting data is useful beyond mere behavioral
575 If someone wants to recruit you to buy a refrigerator or join a pogrom,
576 they might use profiling and targeting to send messages to people they
577 judge to be good sales prospects. The messages themselves may be
578 deceptive, making claims about things you’re not very knowledgeable
579 about (food safety and energy efficiency or eugenics and historical
580 claims about racial superiority). They might use search engine
581 optimization and/or armies of fake reviewers and commenters and/or paid
582 placement to dominate the discourse so that any search for further
583 information takes you back to their messages. And finally, they may
584 refine the different pitches using machine learning and other techniques
585 to figure out what kind of pitch works best on someone like you.
587 Each phase of this process benefits from surveillance: The more data
588 they have, the more precisely they can profile you and target you with
589 specific messages. Think of how you’d sell a fridge if you knew that the
590 warranty on your prospect’s fridge just expired and that they were
591 expecting a tax rebate in April.
593 Also, the more data they have, the better they can craft deceptive
594 messages — if I know that you’re into genealogy, I might not try to feed
595 you pseudoscience about genetic differences between “races,” sticking
596 instead to conspiratorial secret histories of “demographic replacement”
599 Facebook also helps you locate people who have the same odious or
600 antisocial views as you. It makes it possible to find other people who
601 want to carry tiki torches through the streets of Charlottesville in
602 Confederate cosplay. It can help you find other people who want to join
603 your militia and go to the border to look for undocumented migrants to
604 terrorize. It can help you find people who share your belief that
605 vaccines are poison and that the Earth is flat.
607 There is one way in which targeted advertising uniquely benefits those
608 advocating for socially unacceptable causes: It is invisible. Racism is
609 widely geographically dispersed, and there are few places where racists
610 — and only racists — gather. This is similar to the problem of selling
611 refrigerators in that potential refrigerator purchasers are
612 geographically dispersed and there are few places where you can buy an
613 ad that will be primarily seen by refrigerator customers. But buying a
614 refrigerator is socially acceptable while being a Nazi is not, so you
615 can buy a billboard or advertise in the newspaper sports section for
616 your refrigerator business, and the only potential downside is that your
617 ad will be seen by a lot of people who don’t want refrigerators,
618 resulting in a lot of wasted expense.
620 But even if you wanted to advertise your Nazi movement on a billboard or
621 prime-time TV or the sports section, you would struggle to find anyone
622 willing to sell you the space for your ad partly because they disagree
623 with your views and partly because they fear censure (boycott,
624 reputational damage, etc.) from other people who disagree with your
627 Targeted ads solve this problem: On the internet, every ad unit can be
628 different for every person, meaning that you can buy ads that are only
629 shown to people who appear to be Nazis and not to people who hate Nazis.
630 When there’s spillover — when someone who hates racism is shown a racist
631 recruiting ad — there is some fallout; the platform or publication might
632 get an angry public or private denunciation. But the nature of the risk
633 assumed by an online ad buyer is different than the risks to a
634 traditional publisher or billboard owner who might want to run a Nazi
637 Online ads are placed by algorithms that broker between a diverse
638 ecosystem of self-serve ad platforms that anyone can buy an ad through,
639 so the Nazi ad that slips onto your favorite online publication isn’t
640 seen as their moral failing but rather as a failure in some distant,
641 upstream ad supplier. When a publication gets a complaint about an
642 offensive ad that’s appearing in one of its units, it can take some
643 steps to block that ad, but the Nazi might buy a slightly different ad
644 from a different broker serving the same unit. And in any event,
645 internet users increasingly understand that when they see an ad, it’s
646 likely that the advertiser did not choose that publication and that the
647 publication has no idea who its advertisers are.
649 These layers of indirection between advertisers and publishers serve as
650 moral buffers: Today’s moral consensus is largely that publishers
651 shouldn’t be held responsible for the ads that appear on their pages
652 because they’re not actively choosing to put those ads there. Because of
653 this, Nazis are able to overcome significant barriers to organizing
656 Data has a complex relationship with domination. Being able to spy on
657 your customers can alert you to their preferences for your rivals and
658 allow you to head off your rivals at the pass.
660 More importantly, if you can dominate the information space while also
661 gathering data, then you make other deceptive tactics stronger because
662 it’s harder to break out of the web of deceit you’re spinning.
663 Domination — that is, ultimately becoming a monopoly — and not the data
664 itself is the supercharger that makes every tactic worth pursuing
665 because monopolistic domination deprives your target of an escape route.
667 If you’re a Nazi who wants to ensure that your prospects primarily see
668 deceptive, confirming information when they search for more, you can
669 improve your odds by seeding the search terms they use through your
670 initial communications. You don’t need to own the top 10 results for
671 “voter suppression” if you can convince your marks to confine their
672 search terms to “voter fraud,” which throws up a very different set of
675 Surveillance capitalists are like stage mentalists who claim that their
676 extraordinary insights into human behavior let them guess the word that
677 you wrote down and folded up in your pocket but who really use shills,
678 hidden cameras, sleight of hand, and brute-force memorization to amaze
681 Or perhaps they’re more like pick-up artists, the misogynistic cult that
682 promises to help awkward men have sex with women by teaching them
683 “neurolinguistic programming” phrases, body language techniques, and
684 psychological manipulation tactics like “negging” — offering unsolicited
685 negative feedback to women to lower their self-esteem and prick their
688 Some pick-up artists eventually manage to convince women to go home with
689 them, but it’s not because these men have figured out how to bypass
690 women’s critical faculties. Rather, pick-up artists’ “success” stories
691 are a mix of women who were incapable of giving consent, women who were
692 coerced, women who were intoxicated, self-destructive women, and a few
693 women who were sober and in command of their faculties but who didn’t
694 realize straightaway that they were with terrible men but rectified the
695 error as soon as they could.
697 Pick-up artists *believe* they have figured out a secret back door that
698 bypasses women’s critical faculties, but they haven’t. Many of the
699 tactics they deploy, like negging, became the butt of jokes (just like
700 people joke about bad ad targeting), and there’s a good chance that
701 anyone they try these tactics on will immediately recognize them and
702 dismiss the men who use them as irredeemable losers.
704 Pick-up artists are proof that people can believe they have developed a
705 system of mind control *even when it doesn’t work*. Pick-up artists
706 simply exploit the fact that one-in-a-million chances can come through
707 for you if you make a million attempts, and then they assume that the
708 other 999,999 times, they simply performed the technique incorrectly and
709 commit themselves to doing better next time. There’s only one group of
710 people who find pick-up artist lore reliably convincing: other would-be
711 pick-up artists whose anxiety and insecurity make them vulnerable to
712 scammers and delusional men who convince them that if they pay for
713 tutelage and follow instructions, then they will someday succeed.
714 Pick-up artists assume they fail to entice women because they are bad at
715 being pick-up artists, not because pick-up artistry is bullshit. Pick-up
716 artists are bad at selling themselves to women, but they’re much better
717 at selling themselves to men who pay to learn the secrets of pick-up
720 Department store pioneer John Wanamaker is said to have lamented, “Half
721 the money I spend on advertising is wasted; the trouble is I don’t know
722 which half.” The fact that Wanamaker thought that only half of his
723 advertising spending was wasted is a tribute to the persuasiveness of
724 advertising executives, who are *much* better at convincing potential
725 clients to buy their services than they are at convincing the general
726 public to buy their clients’ wares.
731 Facebook is heralded as the origin of all of our modern plagues, and
732 it’s not hard to see why. Some tech companies want to lock their users
733 in but make their money by monopolizing access to the market for apps
734 for their devices and gouging them on prices rather than by spying on
735 them (like Apple). Some companies don’t care about locking in users
736 because they’ve figured out how to spy on them no matter where they are
737 and what they’re doing and can turn that surveillance into money
738 (Google). Facebook alone among the Western tech giants has built a
739 business based on locking in its users *and* spying on them all the
742 Facebook’s surveillance regime is really without parallel in the Western
743 world. Though Facebook tries to prevent itself from being visible on the
744 public web, hiding most of what goes on there from people unless they’re
745 logged into Facebook, the company has nevertheless booby-trapped the
746 entire web with surveillance tools in the form of Facebook “Like”
747 buttons that web publishers include on their sites to boost their
748 Facebook profiles. Facebook also makes various libraries and other
749 useful code snippets available to web publishers that act as
750 surveillance tendrils on the sites where they’re used, funneling
751 information about visitors to the site — newspapers, dating sites,
752 message boards — to Facebook.
754 Big Tech is able to practice surveillance not just because it is tech
755 but because it is *big*.
757 Facebook offers similar tools to app developers, so the apps — games,
758 fart machines, business review services, apps for keeping abreast of
759 your kid’s schooling — you use will send information about your
760 activities to Facebook even if you don’t have a Facebook account and
761 even if you don’t download or use Facebook apps. On top of all that,
762 Facebook buys data from third-party brokers on shopping habits, physical
763 location, use of “loyalty” programs, financial transactions, etc., and
764 cross-references that with the dossiers it develops on activity on
765 Facebook and with apps and the public web.
767 Though it’s easy to integrate the web with Facebook — linking to news
768 stories and such — Facebook products are generally not available to be
769 integrated back into the web itself. You can embed a tweet in a Facebook
770 post, but if you embed a Facebook post in a tweet, you just get a link
771 back to Facebook and must log in before you can see it. Facebook has
772 used extreme technological and legal countermeasures to prevent rivals
773 from allowing their users to embed Facebook snippets in competing
774 services or to create alternative interfaces to Facebook that merge your
775 Facebook inbox with those of other services that you use.
777 And Facebook is incredibly popular, with 2.3 billion claimed users
778 (though many believe this figure to be inflated). Facebook has been used
779 to organize genocidal pogroms, racist riots, anti-vaccination movements,
780 flat Earth cults, and the political lives of some of the world’s
781 ugliest, most brutal autocrats. There are some really alarming things
782 going on in the world, and Facebook is implicated in many of them, so
783 it’s easy to conclude that these bad things are the result of Facebook’s
784 mind-control system, which it rents out to anyone with a few bucks to
787 To understand what role Facebook plays in the formulation and
788 mobilization of antisocial movements, we need to understand the dual
791 Because it has a lot of users and a lot of data about those users,
792 Facebook is a very efficient tool for locating people with hard-to-find
793 traits, the kinds of traits that are widely diffused in the population
794 such that advertisers have historically struggled to find a
795 cost-effective way to reach them. Think back to refrigerators: Most of
796 us only replace our major appliances a few times in our entire lives. If
797 you’re a refrigerator manufacturer or retailer, you have these brief
798 windows in the life of a consumer during which they are pondering a
799 purchase, and you have to somehow reach them. Anyone who’s ever
800 registered a title change after buying a house can attest that appliance
801 manufacturers are incredibly desperate to reach anyone who has even the
802 slenderest chance of being in the market for a new fridge.
804 Facebook makes finding people shopping for refrigerators a *lot* easier.
805 It can target ads to people who’ve registered a new home purchase, to
806 people who’ve searched for refrigerator buying advice, to people who
807 have complained about their fridge dying, or any combination thereof. It
808 can even target people who’ve recently bought *other* kitchen appliances
809 on the theory that someone who’s just replaced their stove and
810 dishwasher might be in a fridge-buying kind of mood. The vast majority
811 of people who are reached by these ads will not be in the market for a
812 new fridge, but — crucially — the percentage of people who *are* looking
813 for fridges that these ads reach is *much* larger than it is than for
814 any group that might be subjected to traditional, offline targeted
815 refrigerator marketing.
817 Facebook also makes it a lot easier to find people who have the same
818 rare disease as you, which might have been impossible in earlier eras —
819 the closest fellow sufferer might otherwise be hundreds of miles away.
820 It makes it easier to find people who went to the same high school as
821 you even though decades have passed and your former classmates have all
822 been scattered to the four corners of the Earth.
824 Facebook also makes it much easier to find people who hold the same rare
825 political beliefs as you. If you’ve always harbored a secret affinity
826 for socialism but never dared utter this aloud lest you be demonized by
827 your neighbors, Facebook can help you discover other people who feel the
828 same way (and it might just demonstrate to you that your affinity is
829 more widespread than you ever suspected). It can make it easier to find
830 people who share your sexual identity. And again, it can help you to
831 understand that what you thought was a shameful secret that affected
832 only you was really a widely shared trait, giving you both comfort and
833 the courage to come out to the people in your life.
835 All of this presents a dilemma for Facebook: Targeting makes the
836 company’s ads more effective than traditional ads, but it also lets
837 advertisers see just how effective their ads are. While advertisers are
838 pleased to learn that Facebook ads are more effective than ads on
839 systems with less sophisticated targeting, advertisers can also see that
840 in nearly every case, the people who see their ads ignore them. Or, at
841 best, the ads work on a subconscious level, creating nebulous
842 unmeasurables like “brand recognition.” This means that the price per ad
843 is very low in nearly every case.
845 To make things worse, many Facebook groups spark precious little
846 discussion. Your little-league soccer team, the people with the same
847 rare disease as you, and the people you share a political affinity with
848 may exchange the odd flurry of messages at critical junctures, but on a
849 daily basis, there’s not much to say to your old high school chums or
850 other hockey-card collectors.
852 With nothing but “organic” discussion, Facebook would not generate
853 enough traffic to sell enough ads to make the money it needs to
854 continually expand by buying up its competitors while returning handsome
855 sums to its investors.
857 So Facebook has to gin up traffic by sidetracking its own forums: Every
858 time Facebook’s algorithm injects controversial materials — inflammatory
859 political articles, conspiracy theories, outrage stories — into a group,
860 it can hijack that group’s nominal purpose with its desultory
861 discussions and supercharge those discussions by turning them into
862 bitter, unproductive arguments that drag on and on. Facebook is
863 optimized for engagement, not happiness, and it turns out that automated
864 systems are pretty good at figuring out things that people will get
867 Facebook *can* modify our behavior but only in a couple of trivial ways.
868 First, it can lock in all your friends and family members so that you
869 check and check and check with Facebook to find out what they are up to;
870 and second, it can make you angry and anxious. It can force you to
871 choose between being interrupted constantly by updates — a process that
872 breaks your concentration and makes it hard to be introspective — and
873 staying in touch with your friends. This is a very limited form of mind
874 control, and it can only really make us miserable, angry, and anxious.
876 This is why Facebook’s targeting systems — both the ones it shows to
877 advertisers and the ones that let users find people who share their
878 interests — are so next-gen and smooth and easy to use as well as why
879 its message boards have a toolset that seems like it hasn’t changed
880 since the mid-2000s. If Facebook delivered an equally flexible,
881 sophisticated message-reading system to its users, those users could
882 defend themselves against being nonconsensually eyeball-fucked with
883 Donald Trump headlines.
885 The more time you spend on Facebook, the more ads it gets to show you.
886 The solution to Facebook’s ads only working one in a thousand times is
887 for the company to try to increase how much time you spend on Facebook
888 by a factor of a thousand. Rather than thinking of Facebook as a company
889 that has figured out how to show you exactly the right ad in exactly the
890 right way to get you to do what its advertisers want, think of it as a
891 company that has figured out how to make you slog through an endless
892 torrent of arguments even though they make you miserable, spending so
893 much time on the site that it eventually shows you at least one ad that
896 Monopoly and the right to the future tense
897 ------------------------------------------
899 Zuboff and her cohort are particularly alarmed at the extent to which
900 surveillance allows corporations to influence our decisions, taking away
901 something she poetically calls “the right to the future tense” — that
902 is, the right to decide for yourself what you will do in the future.
904 It’s true that advertising can tip the scales one way or another: When
905 you’re thinking of buying a fridge, a timely fridge ad might end the
906 search on the spot. But Zuboff puts enormous and undue weight on the
907 persuasive power of surveillance-based influence techniques. Most of
908 these don’t work very well, and the ones that do won’t work for very
909 long. The makers of these influence tools are confident they will
910 someday refine them into systems of total control, but they are hardly
911 unbiased observers, and the risks from their dreams coming true are very
914 By contrast, Zuboff is rather sanguine about 40 years of lax antitrust
915 practice that has allowed a handful of companies to dominate the
916 internet, ushering in an information age with, `as one person on Twitter
917 noted <https://twitter.com/tveastman/status/1069674780826071040>`__,
918 five giant websites each filled with screenshots of the other four.
920 However, if we are to be alarmed that we might lose the right to choose
921 for ourselves what our future will hold, then monopoly’s nonspeculative,
922 concrete, here-and-now harms should be front and center in our debate
925 Start with “digital rights management.” In 1998, Bill Clinton signed the
926 Digital Millennium Copyright Act (DMCA) into law. It’s a complex piece
927 of legislation with many controversial clauses but none more so than
928 Section 1201, the “anti-circumvention” rule.
930 This is a blanket ban on tampering with systems that restrict access to
931 copyrighted works. The ban is so thoroughgoing that it prohibits
932 removing a copyright lock even when no copyright infringement takes
933 place. This is by design: The activities that the DMCA’s Section 1201
934 sets out to ban are not copyright infringements; rather, they are legal
935 activities that frustrate manufacturers’ commercial plans.
937 For example, Section 1201’s first major application was on DVD players
938 as a means of enforcing the region coding built into those devices.
939 DVD-CCA, the body that standardized DVDs and DVD players, divided the
940 world into six regions and specified that DVD players must check each
941 disc to determine which regions it was authorized to be played in. DVD
942 players would have their own corresponding region (a DVD player bought
943 in the U.S. would be region 1 while one bought in India would be region
944 5). If the player and the disc’s region matched, the player would play
945 the disc; otherwise, it would reject it.
947 However, watching a lawfully produced disc in a country other than the
948 one where you purchased it is not copyright infringement — it’s the
949 opposite. Copyright law imposes this duty on customers for a movie: You
950 must go into a store, find a licensed disc, and pay the asking price. Do
951 that — and *nothing else* — and you and copyright are square with one
954 The fact that a movie studio wants to charge Indians less than Americans
955 or release in Australia later than it releases in the U.K. has no
956 bearing on copyright law. Once you lawfully acquire a DVD, it is no
957 copyright infringement to watch it no matter where you happen to be.
959 So DVD and DVD player manufacturers would not be able to use accusations
960 of abetting copyright infringement to punish manufacturers who made
961 noncompliant players that would play discs from any region or repair
962 shops that modified players to let you watch out-of-region discs or
963 software programmers who created programs to let you do this.
965 That’s where Section 1201 of the DMCA comes in: By banning tampering
966 with an “access control,” the rule gave manufacturers and rights holders
967 standing to sue competitors who released superior products with lawful
968 features that the market demanded (in this case, region-free players).
970 This is an odious scam against consumers, but as time went by, Section
971 1201 grew to encompass a rapidly expanding constellation of devices and
972 services as canny manufacturers have realized certain things:
974 - Any device with software in it contains a “copyrighted work” — i.e.,
976 - A device can be designed so that reconfiguring the software requires
977 bypassing an “access control for copyrighted works,” which is a
978 potential felony under Section 1201.
979 - Thus, companies can control their customers’ behavior after they take
980 home their purchases by designing products so that all unpermitted
981 uses require modifications that fall afoul of Section 1201.
983 Section 1201 then becomes a means for manufacturers of all descriptions
984 to force their customers to arrange their affairs to benefit the
985 manufacturers’ shareholders instead of themselves.
987 This manifests in many ways: from a new generation of inkjet printers
988 that use countermeasures to prevent third-party ink that cannot be
989 bypassed without legal risks to similar systems in tractors that prevent
990 third-party technicians from swapping in the manufacturer’s own parts
991 that are not recognized by the tractor’s control system until it is
992 supplied with a manufacturer’s unlock code.
994 Closer to home, Apple’s iPhones use these measures to prevent both
995 third-party service and third-party software installation. This allows
996 Apple to decide when an iPhone is beyond repair and must be shredded and
997 landfilled as opposed to the iPhone’s purchaser. (Apple is notorious for
998 its environmentally catastrophic policy of destroying old electronics
999 rather than permitting them to be cannibalized for parts.) This is a
1000 very useful power to wield, especially in light of CEO Tim Cook’s
1001 January 2019 warning to investors that the company’s profits are
1002 endangered by customers choosing to hold onto their phones for longer
1003 rather than replacing them.
1005 Apple’s use of copyright locks also allows it to establish a monopoly
1006 over how its customers acquire software for their mobile devices. The
1007 App Store’s commercial terms guarantee Apple a share of all revenues
1008 generated by the apps sold there, meaning that Apple gets paid when you
1009 buy an app from its store and then continues to get paid every time you
1010 buy something using that app. This comes out of the bottom line of
1011 software developers, who must either charge more or accept lower profits
1014 Crucially, Apple’s use of copyright locks gives it the power to make
1015 editorial decisions about which apps you may and may not install on your
1016 own device. Apple has used this power to `reject
1017 dictionaries <https://www.telegraph.co.uk/technology/apple/5982243/Apple-bans-dictionary-from-App-Store-over-swear-words.html>`__
1018 for containing obscene words; to `limit political
1019 speech <https://www.vice.com/en_us/article/538kan/apple-just-banned-the-app-that-tracks-us-drone-strikes-again>`__,
1020 especially from apps that make sensitive political commentary such as an
1021 app that notifies you every time a U.S. drone kills someone somewhere in
1022 the world; and to `object to a
1023 game <https://www.eurogamer.net/articles/2016-05-19-palestinian-indie-game-must-not-be-called-a-game-apple-says>`__
1024 that commented on the Israel-Palestine conflict.
1026 Apple often justifies monopoly power over software installation in the
1027 name of security, arguing that its vetting of apps for its store means
1028 that it can guard its users against apps that contain surveillance code.
1029 But this cuts both ways. In China, the government `ordered Apple to
1030 prohibit the sale of privacy
1031 tools <https://www.ft.com/content/ad42e536-cf36-11e7-b781-794ce08b24dc>`__
1032 like VPNs with the exception of VPNs that had deliberately introduced
1033 flaws designed to let the Chinese state eavesdrop on users. Because
1034 Apple uses technological countermeasures — with legal backstops — to
1035 block customers from installing unauthorized apps, Chinese iPhone owners
1036 cannot readily (or legally) acquire VPNs that would protect them from
1037 Chinese state snooping.
1039 Zuboff calls surveillance capitalism a “rogue capitalism.” Theoreticians
1040 of capitalism claim that its virtue is that it `aggregates information
1041 in the form of consumers’
1042 decisions <https://en.wikipedia.org/wiki/Price_signal>`__, producing
1043 efficient markets. Surveillance capitalism’s supposed power to rob its
1044 victims of their free will through computationally supercharged
1045 influence campaigns means that our markets no longer aggregate
1046 customers’ decisions because we customers no longer decide — we are
1047 given orders by surveillance capitalism’s mind-control rays.
1049 If our concern is that markets cease to function when consumers can no
1050 longer make choices, then copyright locks should concern us at *least*
1051 as much as influence campaigns. An influence campaign might nudge you to
1052 buy a certain brand of phone; but the copyright locks on that phone
1053 absolutely determine where you get it serviced, which apps can run on
1054 it, and when you have to throw it away rather than fixing it.
1056 Search order and the right to the future tense
1057 ----------------------------------------------
1059 Markets are posed as a kind of magic: By discovering otherwise hidden
1060 information conveyed by the free choices of consumers, those consumers’
1061 local knowledge is integrated into a self-correcting system that makes
1062 efficient allocations—more efficient than any computer could calculate.
1063 But monopolies are incompatible with that notion. When you only have one
1064 app store, the owner of the store — not the consumer — decides on the
1065 range of choices. As Boss Tweed once said, “I don’t care who does the
1066 electing, so long as I get to do the nominating.” A monopolized market
1067 is an election whose candidates are chosen by the monopolist.
1069 This ballot rigging is made more pernicious by the existence of
1070 monopolies over search order. Google’s search market share is about 90%.
1071 When Google’s ranking algorithm puts a result for a popular search term
1072 in its top 10, that helps determine the behavior of millions of people.
1073 If Google’s answer to “Are vaccines dangerous?” is a page that rebuts
1074 anti-vax conspiracy theories, then a sizable portion of the public will
1075 learn that vaccines are safe. If, on the other hand, Google sends those
1076 people to a site affirming the anti-vax conspiracies, a sizable portion
1077 of those millions will come away convinced that vaccines are dangerous.
1079 Google’s algorithm is often tricked into serving disinformation as a
1080 prominent search result. But in these cases, Google isn’t persuading
1081 people to change their minds; it’s just presenting something untrue as
1082 fact when the user has no cause to doubt it.
1084 This is true whether the search is for “Are vaccines dangerous?” or
1085 “best restaurants near me.” Most users will never look past the first
1086 page of search results, and when the overwhelming majority of people all
1087 use the same search engine, the ranking algorithm deployed by that
1088 search engine will determine myriad outcomes (whether to adopt a child,
1089 whether to have cancer surgery, where to eat dinner, where to move,
1090 where to apply for a job) to a degree that vastly outstrips any
1091 behavioral outcomes dictated by algorithmic persuasion techniques.
1093 Many of the questions we ask search engines have no empirically correct
1094 answers: “Where should I eat dinner?” is not an objective question. Even
1095 questions that do have correct answers (“Are vaccines dangerous?”) don’t
1096 have one empirically superior source for that answer. Many pages affirm
1097 the safety of vaccines, so which one goes first? Under conditions of
1098 competition, consumers can choose from many search engines and stick
1099 with the one whose algorithmic judgment suits them best, but under
1100 conditions of monopoly, we all get our answers from the same place.
1102 Google’s search dominance isn’t a matter of pure merit: The company has
1103 leveraged many tactics that would have been prohibited under classical,
1104 pre-Ronald-Reagan antitrust enforcement standards to attain its
1105 dominance. After all, this is a company that has developed two major
1106 products: a really good search engine and a pretty good Hotmail clone.
1107 Every other major success it’s had — Android, YouTube, Google Maps, etc.
1108 — has come through an acquisition of a nascent competitor. Many of the
1109 company’s key divisions, such as the advertising technology of
1110 DoubleClick, violate the historical antitrust principle of structural
1111 separation, which forbade firms from owning subsidiaries that competed
1112 with their customers. Railroads, for example, were barred from owning
1113 freight companies that competed with the shippers whose freight they
1116 If we’re worried about giant companies subverting markets by stripping
1117 consumers of their ability to make free choices, then vigorous antitrust
1118 enforcement seems like an excellent remedy. If we’d denied Google the
1119 right to effect its many mergers, we would also have probably denied it
1120 its total search dominance. Without that dominance, the pet theories,
1121 biases, errors (and good judgment, too) of Google search engineers and
1122 product managers would not have such an outsized effect on consumer
1125 This goes for many other companies. Amazon, a classic surveillance
1126 capitalist, is obviously the dominant tool for searching Amazon — though
1127 many people find their way to Amazon through Google searches and
1128 Facebook posts — and obviously, Amazon controls Amazon search. That
1129 means that Amazon’s own self-serving editorial choices—like promoting
1130 its own house brands over rival goods from its sellers as well as its
1131 own pet theories, biases, and errors— determine much of what we buy on
1132 Amazon. And since Amazon is the dominant e-commerce retailer outside of
1133 China and since it attained that dominance by buying up both large
1134 rivals and nascent competitors in defiance of historical antitrust
1135 rules, we can blame the monopoly for stripping consumers of their right
1136 to the future tense and the ability to shape markets by making informed
1139 Not every monopolist is a surveillance capitalist, but that doesn’t mean
1140 they’re not able to shape consumer choices in wide-ranging ways. Zuboff
1141 lauds Apple for its App Store and iTunes Store, insisting that adding
1142 price tags to the features on its platforms has been the secret to
1143 resisting surveillance and thus creating markets. But Apple is the only
1144 retailer allowed to sell on its platforms, and it’s the second-largest
1145 mobile device vendor in the world. The independent software vendors that
1146 sell through Apple’s marketplace accuse the company of the same
1147 surveillance sins as Amazon and other big retailers: spying on its
1148 customers to find lucrative new products to launch, effectively using
1149 independent software vendors as free-market researchers, then forcing
1150 them out of any markets they discover.
1152 Because of its use of copyright locks, Apple’s mobile customers are not
1153 legally allowed to switch to a rival retailer for its apps if they want
1154 to do so on an iPhone. Apple, obviously, is the only entity that gets to
1155 decide how it ranks the results of search queries in its stores. These
1156 decisions ensure that some apps are often installed (because they appear
1157 on page one) and others are never installed (because they appear on page
1158 one million). Apple’s search-ranking design decisions have a vastly more
1159 significant effect on consumer behaviors than influence campaigns
1160 delivered by surveillance capitalism’s ad-serving bots.
1162 Monopolists can afford sleeping pills for watchdogs
1163 ---------------------------------------------------
1165 Only the most extreme market ideologues think that markets can
1166 self-regulate without state oversight. Markets need watchdogs —
1167 regulators, lawmakers, and other elements of democratic control — to
1168 keep them honest. When these watchdogs sleep on the job, then markets
1169 cease to aggregate consumer choices because those choices are
1170 constrained by illegitimate and deceptive activities that companies are
1171 able to get away with because no one is holding them to account.
1173 But this kind of regulatory capture doesn’t come cheap. In competitive
1174 sectors, where rivals are constantly eroding one another’s margins,
1175 individual firms lack the surplus capital to effectively lobby for laws
1176 and regulations that serve their ends.
1178 Many of the harms of surveillance capitalism are the result of weak or
1179 nonexistent regulation. Those regulatory vacuums spring from the power
1180 of monopolists to resist stronger regulation and to tailor what
1181 regulation exists to permit their existing businesses.
1183 Here’s an example: When firms over-collect and over-retain our data,
1184 they are at increased risk of suffering a breach — you can’t leak data
1185 you never collected, and once you delete all copies of that data, you
1186 can no longer leak it. For more than a decade, we’ve lived through an
1187 endless parade of ever-worsening data breaches, each one uniquely
1188 horrible in the scale of data breached and the sensitivity of that data.
1190 But still, firms continue to over-collect and over-retain our data for
1193 **1. They are locked in the aforementioned limbic arms race with our
1194 capacity to shore up our attentional defense systems to resist their new
1195 persuasion techniques.** They’re also locked in an arms race with their
1196 competitors to find new ways to target people for sales pitches. As soon
1197 as they discover a soft spot in our attentional defenses (a
1198 counterintuitive, unobvious way to target potential refrigerator
1199 buyers), the public begins to wise up to the tactic, and their
1200 competitors leap on it, hastening the day in which all potential
1201 refrigerator buyers have been inured to the pitch.
1203 **2. They believe the surveillance capitalism story.** Data is cheap to
1204 aggregate and store, and both proponents and opponents of surveillance
1205 capitalism have assured managers and product designers that if you
1206 collect enough data, you will be able to perform sorcerous acts of mind
1207 control, thus supercharging your sales. Even if you never figure out how
1208 to profit from the data, someone else will eventually offer to buy it
1209 from you to give it a try. This is the hallmark of all economic bubbles:
1210 acquiring an asset on the assumption that someone else will buy it from
1211 you for more than you paid for it, often to sell to someone else at an
1214 **3. The penalties for leaking data are negligible.** Most countries
1215 limit these penalties to actual damages, meaning that consumers who’ve
1216 had their data breached have to show actual monetary harms to get a
1217 reward. In 2014, Home Depot disclosed that it had lost credit-card data
1218 for 53 million of its customers, but it settled the matter by paying
1219 those customers about $0.34 each — and a third of that $0.34 wasn’t even
1220 paid in cash. It took the form of a credit to procure a largely
1221 ineffectual credit-monitoring service.
1223 But the harms from breaches are much more extensive than these
1224 actual-damages rules capture. Identity thieves and fraudsters are wily
1225 and endlessly inventive. All the vast breaches of our century are being
1226 continuously recombined, the data sets merged and mined for new ways to
1227 victimize the people whose data was present in them. Any reasonable,
1228 evidence-based theory of deterrence and compensation for breaches would
1229 not confine damages to actual damages but rather would allow users to
1230 claim these future harms.
1232 However, even the most ambitious privacy rules, such as the EU General
1233 Data Protection Regulation, fall far short of capturing the negative
1234 externalities of the platforms’ negligent over-collection and
1235 over-retention, and what penalties they do provide are not aggressively
1236 pursued by regulators.
1238 This tolerance of — or indifference to — data over-collection and
1239 over-retention can be ascribed in part to the sheer lobbying muscle of
1240 the platforms. They are so profitable that they can handily afford to
1241 divert gigantic sums to fight any real change — that is, change that
1242 would force them to internalize the costs of their surveillance
1245 And then there’s state surveillance, which the surveillance capitalism
1246 story dismisses as a relic of another era when the big worry was being
1247 jailed for your dissident speech, not having your free will stripped
1248 away with machine learning.
1250 But state surveillance and private surveillance are intimately related.
1251 As we saw when Apple was conscripted by the Chinese government as a
1252 vital collaborator in state surveillance, the only really affordable and
1253 tractable way to conduct mass surveillance on the scale practiced by
1254 modern states — both “free” and autocratic states — is to suborn
1255 commercial services.
1257 Whether it’s Google being used as a location tracking tool by local law
1258 enforcement across the U.S. or the use of social media tracking by the
1259 Department of Homeland Security to build dossiers on participants in
1260 protests against Immigration and Customs Enforcement’s family separation
1261 practices, any hard limits on surveillance capitalism would hamstring
1262 the state’s own surveillance capability. Without Palantir, Amazon,
1263 Google, and other major tech contractors, U.S. cops would not be able to
1264 spy on Black people, ICE would not be able to manage the caging of
1265 children at the U.S. border, and state welfare systems would not be able
1266 to purge their rolls by dressing up cruelty as empiricism and claiming
1267 that poor and vulnerable people are ineligible for assistance. At least
1268 some of the states’ unwillingness to take meaningful action to curb
1269 surveillance should be attributed to this symbiotic relationship. There
1270 is no mass state surveillance without mass commercial surveillance.
1272 Monopolism is key to the project of mass state surveillance. It’s true
1273 that smaller tech firms are apt to be less well-defended than Big Tech,
1274 whose security experts are drawn from the tops of their field and who
1275 are given enormous resources to secure and monitor their systems against
1276 intruders. But smaller firms also have less to protect: fewer users
1277 whose data is more fragmented across more systems and have to be
1278 suborned one at a time by state actors.
1280 A concentrated tech sector that works with authorities is a much more
1281 powerful ally in the project of mass state surveillance than a
1282 fragmented one composed of smaller actors. The U.S. tech sector is small
1283 enough that all of its top executives fit around a single boardroom
1284 table in Trump Tower in 2017, shortly after Trump’s inauguration. Most
1285 of its biggest players bid to win JEDI, the Pentagon’s $10 billion Joint
1286 Enterprise Defense Infrastructure cloud contract. Like other highly
1287 concentrated industries, Big Tech rotates its key employees in and out
1288 of government service, sending them to serve in the Department of
1289 Defense and the White House, then hiring ex-Pentagon and ex-DOD top
1290 staffers and officers to work in their own government relations
1293 They can even make a good case for doing this: After all, when there are
1294 only four or five big companies in an industry, everyone qualified to
1295 regulate those companies has served as an executive in at least a couple
1296 of them — because, likewise, when there are only five companies in an
1297 industry, everyone qualified for a senior role at any of them is by
1298 definition working at one of the other ones.
1300 While surveillance doesn’t cause monopolies, monopolies certainly
1303 Industries that are competitive are fragmented — composed of companies
1304 that are at each other’s throats all the time and eroding one another’s
1305 margins in bids to steal their best customers. This leaves them with
1306 much more limited capital to use to lobby for favorable rules and a much
1307 harder job of getting everyone to agree to pool their resources to
1308 benefit the industry as a whole.
1310 Surveillance combined with machine learning is supposed to be an
1311 existential crisis, a species-defining moment at which our free will is
1312 just a few more advances in the field from being stripped away. I am
1313 skeptical of this claim, but I *do* think that tech poses an existential
1314 threat to our society and possibly our species.
1316 But that threat grows out of monopoly.
1318 One of the consequences of tech’s regulatory capture is that it can
1319 shift liability for poor security decisions onto its customers and the
1320 wider society. It is absolutely normal in tech for companies to
1321 obfuscate the workings of their products, to make them deliberately hard
1322 to understand, and to threaten security researchers who seek to
1323 independently audit those products.
1325 IT is the only field in which this is practiced: No one builds a bridge
1326 or a hospital and keeps the composition of the steel or the equations
1327 used to calculate load stresses a secret. It is a frankly bizarre
1328 practice that leads, time and again, to grotesque security defects on
1329 farcical scales, with whole classes of devices being revealed as
1330 vulnerable long after they are deployed in the field and put into
1333 The monopoly power that keeps any meaningful consequences for breaches
1334 at bay means that tech companies continue to build terrible products
1335 that are insecure by design and that end up integrated into our lives,
1336 in possession of our data, and connected to our physical world. For
1337 years, Boeing has struggled with the aftermath of a series of bad
1338 technology decisions that made its 737 fleet a global pariah, a rare
1339 instance in which bad tech decisions have been seriously punished in the
1342 These bad security decisions are compounded yet again by the use of
1343 copyright locks to enforce business-model decisions against consumers.
1344 Recall that these locks have become the go-to means for shaping consumer
1345 behavior, making it technically impossible to use third-party ink,
1346 insulin, apps, or service depots in connection with your lawfully
1349 Recall also that these copyright locks are backstopped by legislation
1350 (such as Section 1201 of the DMCA or Article 6 of the 2001 EU Copyright
1351 Directive) that ban tampering with (“circumventing”) them, and these
1352 statutes have been used to threaten security researchers who make
1353 disclosures about vulnerabilities without permission from manufacturers.
1355 This amounts to a manufacturer’s veto over safety warnings and
1356 criticism. While this is far from the legislative intent of the DMCA and
1357 its sister statutes around the world, Congress has not intervened to
1358 clarify the statute nor will it because to do so would run counter to
1359 the interests of powerful, large firms whose lobbying muscle is
1362 Copyright locks are a double whammy: They create bad security decisions
1363 that can’t be freely investigated or discussed. If markets are supposed
1364 to be machines for aggregating information (and if surveillance
1365 capitalism’s notional mind-control rays are what make it a “rogue
1366 capitalism” because it denies consumers the power to make decisions),
1367 then a program of legally enforced ignorance of the risks of products
1368 makes monopolism even more of a “rogue capitalism” than surveillance
1369 capitalism’s influence campaigns.
1371 And unlike mind-control rays, enforced silence over security is an
1372 immediate, documented problem, and it *does* constitute an existential
1373 threat to our civilization and possibly our species. The proliferation
1374 of insecure devices — especially devices that spy on us and especially
1375 when those devices also can manipulate the physical world by, say,
1376 steering your car or flipping a breaker at a power station — is a kind
1379 In software design, “technology debt” refers to old, baked-in decisions
1380 that turn out to be bad ones in hindsight. Perhaps a long-ago developer
1381 decided to incorporate a networking protocol made by a vendor that has
1382 since stopped supporting it. But everything in the product still relies
1383 on that superannuated protocol, and so, with each revision, the product
1384 team has to work around this obsolete core, adding compatibility layers,
1385 surrounding it with security checks that try to shore up its defenses,
1386 and so on. These Band-Aid measures compound the debt because every
1387 subsequent revision has to make allowances for *them*, too, like
1388 interest mounting on a predatory subprime loan. And like a subprime
1389 loan, the interest mounts faster than you can hope to pay it off: The
1390 product team has to put so much energy into maintaining this complex,
1391 brittle system that they don’t have any time left over to refactor the
1392 product from the ground up and “pay off the debt” once and for all.
1394 Typically, technology debt results in a technological bankruptcy: The
1395 product gets so brittle and unsustainable that it fails
1396 catastrophically. Think of the antiquated COBOL-based banking and
1397 accounting systems that fell over at the start of the pandemic emergency
1398 when confronted with surges of unemployment claims. Sometimes that ends
1399 the product; sometimes it takes the company down with it. Being caught
1400 in the default of a technology debt is scary and traumatic, just like
1401 losing your house due to bankruptcy is scary and traumatic.
1403 But the technology debt created by copyright locks isn’t individual
1404 debt; it’s systemic. Everyone in the world is exposed to this
1405 over-leverage, as was the case with the 2008 financial crisis. When that
1406 debt comes due — when we face a cascade of security breaches that
1407 threaten global shipping and logistics, the food supply, pharmaceutical
1408 production pipelines, emergency communications, and other critical
1409 systems that are accumulating technology debt in part due to the
1410 presence of deliberately insecure and deliberately unauditable copyright
1411 locks — it will indeed pose an existential risk.
1413 Privacy and monopoly
1414 --------------------
1416 Many tech companies are gripped by an orthodoxy that holds that if they
1417 just gather enough data on enough of our activities, everything else is
1418 possible — the mind control and endless profits. This is an
1419 unfalsifiable hypothesis: If data gives a tech company even a tiny
1420 improvement in behavior prediction and modification, the company
1421 declares that it has taken the first step toward global domination with
1422 no end in sight. If a company *fails* to attain any improvements from
1423 gathering and analyzing data, it declares success to be just around the
1424 corner, attainable once more data is in hand.
1426 Surveillance tech is far from the first industry to embrace a
1427 nonsensical, self-serving belief that harms the rest of the world, and
1428 it is not the first industry to profit handsomely from such a delusion.
1429 Long before hedge-fund managers were claiming (falsely) that they could
1430 beat the S&P 500, there were plenty of other “respectable” industries
1431 that have been revealed as quacks in hindsight. From the makers of
1432 radium suppositories (a real thing!) to the cruel sociopaths who claimed
1433 they could “cure” gay people, history is littered with the formerly
1434 respectable titans of discredited industries.
1436 This is not to say that there’s nothing wrong with Big Tech and its
1437 ideological addiction to data. While surveillance’s benefits are mostly
1438 overstated, its harms are, if anything, *understated*.
1440 There’s real irony here. The belief in surveillance capitalism as a
1441 “rogue capitalism” is driven by the belief that markets wouldn’t
1442 tolerate firms that are gripped by false beliefs. An oil company that
1443 has false beliefs about where the oil is will eventually go broke
1444 digging dry wells after all.
1446 But monopolists get to do terrible things for a long time before they
1447 pay the price. Think of how concentration in the finance sector allowed
1448 the subprime crisis to fester as bond-rating agencies, regulators,
1449 investors, and critics all fell under the sway of a false belief that
1450 complex mathematics could construct “fully hedged” debt instruments that
1451 could not possibly default. A small bank that engaged in this kind of
1452 malfeasance would simply go broke rather than outrunning the inevitable
1453 crisis, perhaps growing so big that it averted it altogether. But large
1454 banks were able to continue to attract investors, and when they finally
1455 *did* come a-cropper, the world’s governments bailed them out. The worst
1456 offenders of the subprime crisis are bigger than they were in 2008,
1457 bringing home more profits and paying their execs even larger sums.
1459 Big Tech is able to practice surveillance not just because it is tech
1460 but because it is *big*. The reason every web publisher embeds a
1461 Facebook “Like” button is that Facebook dominates the internet’s social
1462 media referrals — and every one of those “Like” buttons spies on
1463 everyone who lands on a page that contains them (see also: Google
1464 Analytics embeds, Twitter buttons, etc.).
1466 The reason the world’s governments have been slow to create meaningful
1467 penalties for privacy breaches is that Big Tech’s concentration produces
1468 huge profits that can be used to lobby against those penalties — and Big
1469 Tech’s concentration means that the companies involved are able to
1470 arrive at a unified negotiating position that supercharges the lobbying.
1472 The reason that the smartest engineers in the world want to work for Big
1473 Tech is that Big Tech commands the lion’s share of tech industry jobs.
1475 The reason people who are aghast at Facebook’s and Google’s and Amazon’s
1476 data-handling practices continue to use these services is that all their
1477 friends are on Facebook; Google dominates search; and Amazon has put all
1478 the local merchants out of business.
1480 Competitive markets would weaken the companies’ lobbying muscle by
1481 reducing their profits and pitting them against each other in regulatory
1482 forums. It would give customers other places to go to get their online
1483 services. It would make the companies small enough to regulate and pave
1484 the way to meaningful penalties for breaches. It would let engineers
1485 with ideas that challenged the surveillance orthodoxy raise capital to
1486 compete with the incumbents. It would give web publishers multiple ways
1487 to reach audiences and make the case against Facebook and Google and
1490 In other words, while surveillance doesn’t cause monopolies, monopolies
1491 certainly abet surveillance.
1493 Ronald Reagan, pioneer of tech monopolism
1494 -----------------------------------------
1496 Technology exceptionalism is a sin, whether it’s practiced by
1497 technology’s blind proponents or by its critics. Both of these camps are
1498 prone to explaining away monopolistic concentration by citing some
1499 special characteristic of the tech industry, like network effects or
1500 first-mover advantage. The only real difference between these two groups
1501 is that the tech apologists say monopoly is inevitable so we should just
1502 let tech get away with its abuses while competition regulators in the
1503 U.S. and the EU say monopoly is inevitable so we should punish tech for
1504 its abuses but not try to break up the monopolies.
1506 To understand how tech became so monopolistic, it’s useful to look at
1507 the dawn of the consumer tech industry: 1979, the year the Apple II Plus
1508 launched and became the first successful home computer. That also
1509 happens to be the year that Ronald Reagan hit the campaign trail for the
1510 1980 presidential race — a race he won, leading to a radical shift in
1511 the way that antitrust concerns are handled in America. Reagan’s cohort
1512 of politicians — including Margaret Thatcher in the U.K., Brian Mulroney
1513 in Canada, Helmut Kohl in Germany, and Augusto Pinochet in Chile — went
1514 on to enact similar reforms that eventually spread around the world.
1516 Antitrust’s story began nearly a century before all that with laws like
1517 the Sherman Act, which took aim at monopolists on the grounds that
1518 monopolies were bad in and of themselves — squeezing out competitors,
1519 creating “diseconomies of scale” (when a company is so big that its
1520 constituent parts go awry and it is seemingly helpless to address the
1521 problems), and capturing their regulators to such a degree that they can
1522 get away with a host of evils.
1524 Then came a fabulist named Robert Bork, a former solicitor general who
1525 Reagan appointed to the powerful U.S. Court of Appeals for the D.C.
1526 Circuit and who had created an alternate legislative history of the
1527 Sherman Act and its successors out of whole cloth. Bork insisted that
1528 these statutes were never targeted at monopolies (despite a wealth of
1529 evidence to the contrary, including the transcribed speeches of the
1530 acts’ authors) but, rather, that they were intended to prevent “consumer
1531 harm” — in the form of higher prices.
1533 Bork was a crank, but he was a crank with a theory that rich people
1534 really liked. Monopolies are a great way to make rich people richer by
1535 allowing them to receive “monopoly rents” (that is, bigger profits) and
1536 capture regulators, leading to a weaker, more favorable regulatory
1537 environment with fewer protections for customers, suppliers, the
1538 environment, and workers.
1540 Bork’s theories were especially palatable to the same power brokers who
1541 backed Reagan, and Reagan’s Department of Justice and other agencies
1542 began to incorporate Bork’s antitrust doctrine into their enforcement
1543 decisions (Reagan even put Bork up for a Supreme Court seat, but Bork
1544 flunked the Senate confirmation hearing so badly that, 40 years later,
1545 D.C. insiders use the term “borked” to refer to any catastrophically bad
1546 political performance).
1548 Little by little, Bork’s theories entered the mainstream, and their
1549 backers began to infiltrate the legal education field, even putting on
1550 junkets where members of the judiciary were treated to lavish meals, fun
1551 outdoor activities, and seminars where they were indoctrinated into the
1552 consumer harm theory of antitrust. The more Bork’s theories took hold,
1553 the more money the monopolists were making — and the more surplus
1554 capital they had at their disposal to lobby for even more Borkian
1555 antitrust influence campaigns.
1557 The history of Bork’s antitrust theories is a really good example of the
1558 kind of covertly engineered shifts in public opinion that Zuboff warns
1559 us against, where fringe ideas become mainstream orthodoxy. But Bork
1560 didn’t change the world overnight. He played a very long game, for over
1561 a generation, and he had a tailwind because the same forces that backed
1562 oligarchic antitrust theories also backed many other oligarchic shifts
1563 in public opinion. For example, the idea that taxation is theft, that
1564 wealth is a sign of virtue, and so on — all of these theories meshed to
1565 form a coherent ideology that elevated inequality to a virtue.
1567 Today, many fear that machine learning allows surveillance capitalism to
1568 sell “Bork-as-a-Service,” at internet speeds, so that you can contract a
1569 machine-learning company to engineer *rapid* shifts in public sentiment
1570 without needing the capital to sustain a multipronged, multigenerational
1571 project working at the local, state, national, and global levels in
1572 business, law, and philosophy. I do not believe that such a project is
1573 plausible, though I agree that this is basically what the platforms
1574 claim to be selling. They’re just lying about it. Big Tech lies all the
1575 time, *including* in their sales literature.
1577 The idea that tech forms “natural monopolies” (monopolies that are the
1578 inevitable result of the realities of an industry, such as the
1579 monopolies that accrue the first company to run long-haul phone lines or
1580 rail lines) is belied by tech’s own history: In the absence of
1581 anti-competitive tactics, Google was able to unseat AltaVista and Yahoo;
1582 Facebook was able to head off Myspace. There are some advantages to
1583 gathering mountains of data, but those mountains of data also have
1584 disadvantages: liability (from leaking), diminishing returns (from old
1585 data), and institutional inertia (big companies, like science, progress
1586 one funeral at a time).
1588 Indeed, the birth of the web saw a mass-extinction event for the
1589 existing giant, wildly profitable proprietary technologies that had
1590 capital, network effects, and walls and moats surrounding their
1591 businesses. The web showed that when a new industry is built around a
1592 protocol, rather than a product, the combined might of everyone who uses
1593 the protocol to reach their customers or users or communities outweighs
1594 even the most massive products. CompuServe, AOL, MSN, and a host of
1595 other proprietary walled gardens learned this lesson the hard way: Each
1596 believed it could stay separate from the web, offering “curation” and a
1597 guarantee of consistency and quality instead of the chaos of an open
1598 system. Each was wrong and ended up being absorbed into the public web.
1600 Yes, tech is heavily monopolized and is now closely associated with
1601 industry concentration, but this has more to do with a matter of timing
1602 than its intrinsically monopolistic tendencies. Tech was born at the
1603 moment that antitrust enforcement was being dismantled, and tech fell
1604 into exactly the same pathologies that antitrust was supposed to guard
1605 against. To a first approximation, it is reasonable to assume that
1606 tech’s monopolies are the result of a lack of anti-monopoly action and
1607 not the much-touted unique characteristics of tech, such as network
1608 effects, first-mover advantage, and so on.
1610 In support of this thesis, I offer the concentration that every *other*
1611 industry has undergone over the same period. From professional wrestling
1612 to consumer packaged goods to commercial property leasing to banking to
1613 sea freight to oil to record labels to newspaper ownership to theme
1614 parks, *every* industry has undergone a massive shift toward
1615 concentration. There’s no obvious network effects or first-mover
1616 advantage at play in these industries. However, in every case, these
1617 industries attained their concentrated status through tactics that were
1618 prohibited before Bork’s triumph: merging with major competitors, buying
1619 out innovative new market entrants, horizontal and vertical integration,
1620 and a suite of anti-competitive tactics that were once illegal but are
1623 Again: When you change the laws intended to prevent monopolies and then
1624 monopolies form in exactly the way the law was supposed to prevent, it
1625 is reasonable to suppose that these facts are related. Tech’s
1626 concentration can be readily explained without recourse to radical
1627 theories of network effects — but only if you’re willing to indict
1628 unregulated markets as tending toward monopoly. Just as a lifelong
1629 smoker can give you a hundred reasons why their smoking didn’t cause
1630 their cancer (“It was the environmental toxins”), true believers in
1631 unregulated markets have a whole suite of unconvincing explanations for
1632 monopoly in tech that leave capitalism intact.
1634 Steering with the windshield wipers
1635 -----------------------------------
1637 It’s been 40 years since Bork’s project to rehabilitate monopolies
1638 achieved liftoff, and that is a generation and a half, which is plenty
1639 of time to take a common idea and make it seem outlandish and vice
1640 versa. Before the 1940s, affluent Americans dressed their baby boys in
1641 pink while baby girls wore blue (a “delicate and dainty” color). While
1642 gendered colors are obviously totally arbitrary, many still greet this
1643 news with amazement and find it hard to imagine a time when pink
1644 connoted masculinity.
1646 After 40 years of studiously ignoring antitrust analysis and
1647 enforcement, it’s not surprising that we’ve all but forgotten that
1648 antitrust exists, that in living memory, growth through mergers and
1649 acquisitions were largely prohibited under law, that market-cornering
1650 strategies like vertical integration could land a company in court.
1652 Antitrust is a market society’s steering wheel, the control of first
1653 resort to keep would-be masters of the universe in their lanes. But Bork
1654 and his cohort ripped out our steering wheel 40 years ago. The car is
1655 still barreling along, and so we’re yanking as hard as we can on all the
1656 *other* controls in the car as well as desperately flapping the doors
1657 and rolling the windows up and down in the hopes that one of these other
1658 controls can be repurposed to let us choose where we’re heading before
1659 we careen off a cliff.
1661 It’s like a 1960s science-fiction plot come to life: People stuck in a
1662 “generation ship,” plying its way across the stars, a ship once piloted
1663 by their ancestors; and now, after a great cataclysm, the ship’s crew
1664 have forgotten that they’re in a ship at all and no longer remember
1665 where the control room is. Adrift, the ship is racing toward its
1666 extinction, and unless we can seize the controls and execute emergency
1667 course correction, we’re all headed for a fiery death in the heart of a
1670 Surveillance still matters
1671 --------------------------
1673 None of this is to minimize the problems with surveillance. Surveillance
1674 matters, and Big Tech’s use of surveillance *is* an existential risk to
1675 our species, but that’s not because surveillance and machine learning
1676 rob us of our free will.
1678 Surveillance has become *much* more efficient thanks to Big Tech. In
1679 1989, the Stasi — the East German secret police — had the whole country
1680 under surveillance, a massive undertaking that recruited one out of
1681 every 60 people to serve as an informant or intelligence operative.
1683 Today, we know that the NSA is spying on a significant fraction of the
1684 entire world’s population, and its ratio of surveillance operatives to
1685 the surveilled is more like 1:10,000 (that’s probably on the low side
1686 since it assumes that every American with top-secret clearance is
1687 working for the NSA on this project — we don’t know how many of those
1688 cleared people are involved in NSA spying, but it’s definitely not all
1691 How did the ratio of surveillable citizens expand from 1:60 to 1:10,000
1692 in less than 30 years? It’s thanks to Big Tech. Our devices and services
1693 gather most of the data that the NSA mines for its surveillance project.
1694 We pay for these devices and the services they connect to, and then we
1695 painstakingly perform the data-entry tasks associated with logging facts
1696 about our lives, opinions, and preferences. This mass surveillance
1697 project has been largely useless for fighting terrorism: The NSA can
1698 `only point to a single minor success
1699 story <https://www.washingtonpost.com/world/national-security/nsa-cites-case-as-success-of-phone-data-collection-program/2013/08/08/fc915e5a-feda-11e2-96a8-d3b921c0924a_story.html>`__
1700 in which it used its data collection program to foil an attempt by a
1701 U.S. resident to wire a few thousand dollars to an overseas terror
1702 group. It’s ineffective for much the same reason that commercial
1703 surveillance projects are largely ineffective at targeting advertising:
1704 The people who want to commit acts of terror, like people who want to
1705 buy a refrigerator, are extremely rare. If you’re trying to detect a
1706 phenomenon whose base rate is one in a million with an instrument whose
1707 accuracy is only 99%, then every true positive will come at the cost of
1708 9,999 false positives.
1710 Let me explain that again: If one in a million people is a terrorist,
1711 then there will only be about one terrorist in a random sample of one
1712 million people. If your test for detecting terrorists is 99% accurate,
1713 it will identify 10,000 terrorists in your million-person sample (1% of
1714 one million is 10,000). For every true positive, you’ll get 9,999 false
1717 In reality, the accuracy of algorithmic terrorism detection falls far
1718 short of the 99% mark, as does refrigerator ad targeting. The difference
1719 is that being falsely accused of wanting to buy a fridge is a minor
1720 nuisance while being falsely accused of planning a terror attack can
1721 destroy your life and the lives of everyone you love.
1723 Mass state surveillance is only feasible because of surveillance
1724 capitalism and its extremely low-yield ad-targeting systems, which
1725 require a constant feed of personal data to remain barely viable.
1726 Surveillance capitalism’s primary failure mode is mistargeted ads while
1727 mass state surveillance’s primary failure mode is grotesque human rights
1728 abuses, tending toward totalitarianism.
1730 State surveillance is no mere parasite on Big Tech, sucking up its data
1731 and giving nothing in return. In truth, the two are symbiotes: Big Tech
1732 sucks up our data for spy agencies, and spy agencies ensure that
1733 governments don’t limit Big Tech’s activities so severely that it would
1734 no longer serve the spy agencies’ needs. There is no firm distinction
1735 between state surveillance and surveillance capitalism; they are
1736 dependent on one another.
1738 To see this at work today, look no further than Amazon’s home
1739 surveillance device, the Ring doorbell, and its associated app,
1740 Neighbors. Ring — a product that Amazon acquired and did not develop in
1741 house — makes a camera-enabled doorbell that streams footage from your
1742 front door to your mobile device. The Neighbors app allows you to form a
1743 neighborhood-wide surveillance grid with your fellow Ring owners through
1744 which you can share clips of “suspicious characters.” If you’re thinking
1745 that this sounds like a recipe for letting curtain-twitching racists
1746 supercharge their suspicions of people with brown skin who walk down
1747 their blocks, `you’re
1748 right <https://www.eff.org/deeplinks/2020/07/amazons-ring-enables-over-policing-efforts-some-americas-deadliest-law-enforcement>`__.
1749 Ring has become a *de facto,* off-the-books arm of the police without
1750 any of the pesky oversight or rules.
1752 In mid-2019, a series of public records requests revealed that Amazon
1753 had struck confidential deals with more than 400 local law enforcement
1754 agencies through which the agencies would promote Ring and Neighbors and
1755 in exchange get access to footage from Ring cameras. In theory, cops
1756 would need to request this footage through Amazon (and internal
1757 documents reveal that Amazon devotes substantial resources to coaching
1758 cops on how to spin a convincing story when doing so), but in practice,
1759 when a Ring customer turns down a police request, Amazon only requires
1760 the agency to formally request the footage from the company, which it
1763 Ring and law enforcement have found many ways to intertwine their
1764 activities. Ring strikes secret deals to acquire real-time access to 911
1765 dispatch and then streams alarming crime reports to Neighbors users,
1766 which serve as convincers for anyone who’s contemplating a surveillance
1767 doorbell but isn’t sure whether their neighborhood is dangerous enough
1770 The more the cops buzz-market the surveillance capitalist Ring, the more
1771 surveillance capability the state gets. Cops who rely on private
1772 entities for law-enforcement roles then brief against any controls on
1773 the deployment of that technology while the companies return the favor
1774 by lobbying against rules requiring public oversight of police
1775 surveillance technology. The more the cops rely on Ring and Neighbors,
1776 the harder it will be to pass laws to curb them. The fewer laws there
1777 are against them, the more the cops will rely on them.
1779 Dignity and sanctuary
1780 ---------------------
1782 But even if we could exercise democratic control over our states and
1783 force them to stop raiding surveillance capitalism’s reservoirs of
1784 behavioral data, surveillance capitalism would still harm us.
1786 This is an area where Zuboff shines. Her chapter on “sanctuary” — the
1787 feeling of being unobserved — is a beautiful hymn to introspection,
1788 calmness, mindfulness, and tranquility.
1790 When you are watched, something changes. Anyone who has ever raised a
1791 child knows this. You might look up from your book (or more
1792 realistically, from your phone) and catch your child in a moment of
1793 profound realization and growth, a moment where they are learning
1794 something that is right at the edge of their abilities, requiring their
1795 entire ferocious concentration. For a moment, you’re transfixed,
1796 watching that rare and beautiful moment of focus playing out before your
1797 eyes, and then your child looks up and sees you seeing them, and the
1798 moment collapses. To grow, you need to be and expose your authentic
1799 self, and in that moment, you are vulnerable like a hermit crab
1800 scuttling from one shell to the next. The tender, unprotected tissues
1801 you expose in that moment are too delicate to reveal in the presence of
1802 another, even someone you trust as implicitly as a child trusts their
1805 In the digital age, our authentic selves are inextricably tied to our
1806 digital lives. Your search history is a running ledger of the questions
1807 you’ve pondered. Your location history is a record of the places you’ve
1808 sought out and the experiences you’ve had there. Your social graph
1809 reveals the different facets of your identity, the people you’ve
1812 To be observed in these activities is to lose the sanctuary of your
1815 There’s another way in which surveillance capitalism robs us of our
1816 capacity to be our authentic selves: by making us anxious. Surveillance
1817 capitalism isn’t really a mind-control ray, but you don’t need a
1818 mind-control ray to make someone anxious. After all, another word for
1819 anxiety is agitation, and to make someone experience agitation, you need
1820 merely to agitate them. To poke them and prod them and beep at them and
1821 buzz at them and bombard them on an intermittent schedule that is just
1822 random enough that our limbic systems never quite become inured to it.
1824 Our devices and services are “general purpose” in that they can connect
1825 anything or anyone to anything or anyone else and that they can run any
1826 program that can be written. This means that the distraction rectangles
1827 in our pockets hold our most precious moments with our most beloved
1828 people and their most urgent or time-sensitive communications (from
1829 “running late can you get the kid?” to “doctor gave me bad news and I
1830 need to talk to you RIGHT NOW”) as well as ads for refrigerators and
1831 recruiting messages from Nazis.
1833 All day and all night, our pockets buzz, shattering our concentration
1834 and tearing apart the fragile webs of connection we spin as we think
1835 through difficult ideas. If you locked someone in a cell and agitated
1836 them like this, we’d call it “sleep deprivation torture,” and it would
1837 be `a war crime under the Geneva
1838 Conventions <https://www.youtube.com/watch?v=1SKpRbvnx6g>`__.
1840 Afflicting the afflicted
1841 ------------------------
1843 The effects of surveillance on our ability to be our authentic selves
1844 are not equal for all people. Some of us are lucky enough to live in a
1845 time and place in which all the most important facts of our lives are
1846 widely and roundly socially acceptable and can be publicly displayed
1847 without the risk of social consequence.
1849 But for many of us, this is not true. Recall that in living memory, many
1850 of the ways of being that we think of as socially acceptable today were
1851 once cause for dire social sanction or even imprisonment. If you are 65
1852 years old, you have lived through a time in which people living in “free
1853 societies” could be imprisoned or sanctioned for engaging in homosexual
1854 activity, for falling in love with a person whose skin was a different
1855 color than their own, or for smoking weed.
1857 Today, these activities aren’t just decriminalized in much of the world,
1858 they’re considered normal, and the fallen prohibitions are viewed as
1859 shameful, regrettable relics of the past.
1861 How did we get from prohibition to normalization? Through private,
1862 personal activity: People who were secretly gay or secret pot-smokers or
1863 who secretly loved someone with a different skin color were vulnerable
1864 to retaliation if they made their true selves known and were limited in
1865 how much they could advocate for their own right to exist in the world
1866 and be true to themselves. But because there was a private sphere, these
1867 people could form alliances with their friends and loved ones who did
1868 not share their disfavored traits by having private conversations in
1869 which they came out, disclosing their true selves to the people around
1870 them and bringing them to their cause one conversation at a time.
1872 The right to choose the time and manner of these conversations was key
1873 to their success. It’s one thing to come out to your dad while you’re on
1874 a fishing trip away from the world and another thing entirely to blurt
1875 it out over the Christmas dinner table while your racist Facebook uncle
1876 is there to make a scene.
1878 Without a private sphere, there’s a chance that none of these changes
1879 would have come to pass and that the people who benefited from these
1880 changes would have either faced social sanction for coming out to a
1881 hostile world or would have never been able to reveal their true selves
1882 to the people they love.
1884 The corollary is that, unless you think that our society has attained
1885 social perfection — that your grandchildren in 50 years will ask you to
1886 tell them the story of how, in 2020, every injustice had been righted
1887 and no further change had to be made — then you should expect that right
1888 now, at this minute, there are people you love, whose happiness is key
1889 to your own, who have a secret in their hearts that stops them from ever
1890 being their authentic selves with you. These people are sorrowing and
1891 will go to their graves with that secret sorrow in their hearts, and the
1892 source of that sorrow will be the falsity of their relationship to you.
1894 A private realm is necessary for human progress.
1896 Any data you collect and retain will eventually leak
1897 ----------------------------------------------------
1899 The lack of a private life can rob vulnerable people of the chance to be
1900 their authentic selves and constrain our actions by depriving us of
1901 sanctuary, but there is another risk that is borne by everyone, not just
1902 people with a secret: crime.
1904 Personally identifying information is of very limited use for the
1905 purpose of controlling peoples’ minds, but identity theft — really a
1906 catchall term for a whole constellation of terrible criminal activities
1907 that can destroy your finances, compromise your personal integrity, ruin
1908 your reputation, or even expose you to physical danger — thrives on it.
1910 Attackers are not limited to using data from one breached source,
1911 either. Multiple services have suffered breaches that exposed names,
1912 addresses, phone numbers, passwords, sexual tastes, school grades, work
1913 performance, brushes with the criminal justice system, family details,
1914 genetic information, fingerprints and other biometrics, reading habits,
1915 search histories, literary tastes, pseudonymous identities, and other
1916 sensitive information. Attackers can merge data from these different
1917 breaches to build up extremely detailed dossiers on random subjects and
1918 then use different parts of the data for different criminal purposes.
1920 For example, attackers can use leaked username and password combinations
1921 to hijack whole fleets of commercial vehicles that `have been fitted
1922 with anti-theft GPS trackers and
1923 immobilizers <https://www.vice.com/en_us/article/zmpx4x/hacker-monitor-cars-kill-engine-gps-tracking-apps>`__
1924 or to hijack baby monitors in order to `terrorize toddlers with the
1926 pornography <https://www.washingtonpost.com/technology/2019/04/23/how-nest-designed-keep-intruders-out-peoples-homes-effectively-allowed-hackers-get/?utm_term=.15220e98c550>`__.
1927 Attackers use leaked data to trick phone companies into giving them your
1928 phone number, then they intercept SMS-based two-factor authentication
1929 codes in order to take over your email, bank account, and/or
1930 cryptocurrency wallets.
1932 Attackers are endlessly inventive in the pursuit of creative ways to
1933 weaponize leaked data. One common use of leaked data is to penetrate
1934 companies in order to access *more* data.
1936 Like spies, online fraudsters are totally dependent on companies
1937 over-collecting and over-retaining our data. Spy agencies sometimes pay
1938 companies for access to their data or intimidate them into giving it up,
1939 but sometimes they work just like criminals do — by `sneaking data out
1941 databases <https://www.bbc.com/news/world-us-canada-24751821>`__.
1943 The over-collection of data has a host of terrible social consequences,
1944 from the erosion of our authentic selves to the undermining of social
1945 progress, from state surveillance to an epidemic of online crime.
1946 Commercial surveillance is also a boon to people running influence
1947 campaigns, but that’s the least of our troubles.
1949 Critical tech exceptionalism is still tech exceptionalism
1950 ---------------------------------------------------------
1952 Big Tech has long practiced technology exceptionalism: the idea that it
1953 should not be subject to the mundane laws and norms of “meatspace.”
1954 Mottoes like Facebook’s “move fast and break things” attracted
1955 justifiable scorn of the companies’ self-serving rhetoric.
1957 Tech exceptionalism got us all into a lot of trouble, so it’s ironic and
1958 distressing to see Big Tech’s critics committing the same sin.
1960 Big Tech is not a “rogue capitalism” that cannot be cured through the
1961 traditional anti-monopoly remedies of trustbusting (forcing companies to
1962 divest of competitors they have acquired) and bans on mergers to
1963 monopoly and other anti-competitive tactics. Big Tech does not have the
1964 power to use machine learning to influence our behavior so thoroughly
1965 that markets lose the ability to punish bad actors and reward superior
1966 competitors. Big Tech has no rule-writing mind-control ray that
1967 necessitates ditching our old toolbox.
1969 The thing is, people have been claiming to have perfected mind-control
1970 rays for centuries, and every time, it turned out to be a con — though
1971 sometimes the con artists were also conning themselves.
1973 For generations, the advertising industry has been steadily improving
1974 its ability to sell advertising services to businesses while only making
1975 marginal gains in selling those businesses’ products to prospective
1976 customers. John Wanamaker’s lament that “50% of my advertising budget is
1977 wasted, I just don’t know which 50%” is a testament to the triumph of
1978 *ad executives*, who successfully convinced Wanamaker that only half of
1979 the money he spent went to waste.
1981 The tech industry has made enormous improvements in the science of
1982 convincing businesses that they’re good at advertising while their
1983 actual improvements to advertising — as opposed to targeting — have been
1984 pretty ho-hum. The vogue for machine learning — and the mystical
1985 invocation of “artificial intelligence” as a synonym for straightforward
1986 statistical inference techniques — has greatly boosted the efficacy of
1987 Big Tech’s sales pitch as marketers have exploited potential customers’
1988 lack of technical sophistication to get away with breathtaking acts of
1989 overpromising and underdelivering.
1991 It’s tempting to think that if businesses are willing to pour billions
1992 into a venture that the venture must be a good one. Yet there are plenty
1993 of times when this rule of thumb has led us astray. For example, it’s
1994 virtually unheard of for managed investment funds to outperform simple
1995 index funds, and investors who put their money into the hands of expert
1996 money managers overwhelmingly fare worse than those who entrust their
1997 savings to index funds. But managed funds still account for the majority
1998 of the money invested in the markets, and they are patronized by some of
1999 the richest, most sophisticated investors in the world. Their vote of
2000 confidence in an underperforming sector is a parable about the role of
2001 luck in wealth accumulation, not a sign that managed funds are a good
2004 The claims of Big Tech’s mind-control system are full of tells that the
2005 enterprise is a con. For example, `the reliance on the “Big Five”
2007 traits <https://www.frontiersin.org/articles/10.3389/fpsyg.2020.01415/full>`__
2008 as a primary means of influencing people even though the “Big Five”
2009 theory is unsupported by any large-scale, peer-reviewed studies and is
2010 `mostly the realm of marketing hucksters and pop
2011 psych <https://www.wired.com/story/the-noisy-fallacies-of-psychographic-targeting/>`__.
2013 Big Tech’s promotional materials also claim that their algorithms can
2014 accurately perform “sentiment analysis” or detect peoples’ moods based
2015 on their “microexpressions,” but `these are marketing claims, not
2017 ones <https://www.npr.org/2018/09/12/647040758/advertising-on-facebook-is-it-worth-it>`__.
2018 These methods are largely untested by independent scientific experts,
2019 and where they have been tested, they’ve been found sorely wanting.
2020 Microexpressions are particularly suspect as the companies that
2021 specialize in training people to detect them `have been
2022 shown <https://theintercept.com/2017/02/08/tsas-own-files-show-doubtful-science-behind-its-behavior-screening-program/>`__
2023 to underperform relative to random chance.
2025 Big Tech has been so good at marketing its own supposed superpowers that
2026 it’s easy to believe that they can market everything else with similar
2027 acumen, but it’s a mistake to believe the hype. Any statement a company
2028 makes about the quality of its products is clearly not impartial. The
2029 fact that we distrust all the things that Big Tech says about its data
2030 handling, compliance with privacy laws, etc., is only reasonable — but
2031 why on Earth would we treat Big Tech’s marketing literature as the
2032 gospel truth? Big Tech lies about just about *everything*, including how
2033 well its machine-learning fueled persuasion systems work.
2035 That skepticism should infuse all of our evaluations of Big Tech and its
2036 supposed abilities, including our perusal of its patents. Zuboff vests
2037 these patents with enormous significance, pointing out that Google
2038 claimed extensive new persuasion capabilities in `its patent
2039 filings <https://patents.google.com/patent/US20050131762A1/en>`__. These
2040 claims are doubly suspect: first, because they are so self-serving, and
2041 second, because the patent itself is so notoriously an invitation to
2044 Patent applications take the form of a series of claims and range from
2045 broad to narrow. A typical patent starts out by claiming that its
2046 authors have invented a method or system for doing every conceivable
2047 thing that anyone might do, ever, with any tool or device. Then it
2048 narrows that claim in successive stages until we get to the actual
2049 “invention” that is the true subject of the patent. The hope is that the
2050 patent examiner — who is almost certainly overworked and underinformed —
2051 will miss the fact that some or all of these claims are ridiculous, or
2052 at least suspect, and grant the patent’s broader claims. Patents for
2053 unpatentable things are still incredibly useful because they can be
2054 wielded against competitors who might license that patent or steer clear
2055 of its claims rather than endure the lengthy, expensive process of
2058 What’s more, software patents are routinely granted even though the
2059 filer doesn’t have any evidence that they can do the thing claimed by
2060 the patent. That is, you can patent an “invention” that you haven’t
2061 actually made and that you don’t know how to make.
2063 With these considerations in hand, it becomes obvious that the fact that
2064 a Big Tech company has patented what it *says* is an effective
2065 mind-control ray is largely irrelevant to whether Big Tech can in fact
2068 Big Tech collects our data for many reasons, including the diminishing
2069 returns on existing stores of data. But many tech companies also collect
2070 data out of a mistaken tech exceptionalist belief in the network effects
2071 of data. Network effects occur when each new user in a system increases
2072 its value. The classic example is fax machines: A single fax machine is
2073 of no use, two fax machines are of limited use, but every new fax
2074 machine that’s put to use after the first doubles the number of possible
2077 Data mined for predictive systems doesn’t necessarily produce these
2078 dividends. Think of Netflix: The predictive value of the data mined from
2079 a million English-speaking Netflix viewers is hardly improved by the
2080 addition of one more user’s viewing data. Most of the data Netflix
2081 acquires after that first minimum viable sample duplicates existing data
2082 and produces only minimal gains. Meanwhile, retraining models with new
2083 data gets progressively more expensive as the number of data points
2084 increases, and manual tasks like labeling and validating data do not get
2087 Businesses pursue fads to the detriment of their profits all the time,
2088 especially when the businesses and their investors are not motivated by
2089 the prospect of becoming profitable but rather by the prospect of being
2090 acquired by a Big Tech giant or by having an IPO. For these firms,
2091 ticking faddish boxes like “collects as much data as possible” might
2092 realize a bigger return on investment than “collects a
2093 business-appropriate quantity of data.”
2095 This is another harm of tech exceptionalism: The belief that more data
2096 always produces more profits in the form of more insights that can be
2097 translated into better mind-control rays drives firms to over-collect
2098 and over-retain data beyond all rationality. And since the firms are
2099 behaving irrationally, a good number of them will go out of business and
2100 become ghost ships whose cargo holds are stuffed full of data that can
2101 harm people in myriad ways — but which no one is responsible for antey
2102 longer. Even if the companies don’t go under, the data they collect is
2103 maintained behind the minimum viable security — just enough security to
2104 keep the company viable while it waits to get bought out by a tech
2105 giant, an amount calculated to spend not one penny more than is
2106 necessary on protecting data.
2108 How monopolies, not mind control, drive surveillance capitalism: The Snapchat story
2109 -----------------------------------------------------------------------------------
2111 For the first decade of its existence, Facebook competed with the social
2112 media giants of the day (Myspace, Orkut, etc.) by presenting itself as
2113 the pro-privacy alternative. Indeed, Facebook justified its walled
2114 garden — which let users bring in data from the web but blocked web
2115 services like Google Search from indexing and caching Facebook pages —
2116 as a pro-privacy measure that protected users from the
2117 surveillance-happy winners of the social media wars like Myspace.
2119 Despite frequent promises that it would never collect or analyze its
2120 users’ data, Facebook periodically created initiatives that did just
2121 that, like the creepy, ham-fisted Beacon tool, which spied on you as you
2122 moved around the web and then added your online activities to your
2123 public timeline, allowing your friends to monitor your browsing habits.
2124 Beacon sparked a user revolt. Every time, Facebook backed off from its
2125 surveillance initiative, but not all the way; inevitably, the new
2126 Facebook would be more surveilling than the old Facebook, though not
2127 quite as surveilling as the intermediate Facebook following the launch
2128 of the new product or service.
2130 The pace at which Facebook ramped up its surveillance efforts seems to
2131 have been set by Facebook’s competitive landscape. The more competitors
2132 Facebook had, the better it behaved. Every time a major competitor
2133 foundered, Facebook’s behavior `got markedly
2134 worse <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247362>`__.
2136 All the while, Facebook was prodigiously acquiring companies, including
2137 a company called Onavo. Nominally, Onavo made a battery-monitoring
2138 mobile app. But the permissions that Onavo required were so expansive
2139 that the app was able to gather fine-grained telemetry on everything
2140 users did with their phones, including which apps they used and how they
2143 Through Onavo, Facebook discovered that it was losing market share to
2144 Snapchat, an app that — like Facebook a decade before — billed itself as
2145 the pro-privacy alternative to the status quo. Through Onavo, Facebook
2146 was able to mine data from the devices of Snapchat users, including both
2147 current and former Snapchat users. This spurred Facebook to acquire
2148 Instagram — some features of which competed with Snapchat — and then
2149 allowed Facebook to fine-tune Instagram’s features and sales pitch to
2150 erode Snapchat’s gains and ensure that Facebook would not have to face
2151 the kinds of competitive pressures it had earlier inflicted on Myspace
2154 The story of how Facebook crushed Snapchat reveals the relationship
2155 between monopoly and surveillance capitalism. Facebook combined
2156 surveillance with lax antitrust enforcement to spot the competitive
2157 threat of Snapchat on its horizon and then take decisive action against
2158 it. Facebook’s surveillance capitalism let it avert competitive pressure
2159 with anti-competitive tactics. Facebook users still want privacy —
2160 Facebook hasn’t used surveillance to brainwash them out of it — but they
2161 can’t get it because Facebook’s surveillance lets it destroy any hope of
2162 a rival service emerging that competes on privacy features.
2164 A monopoly over your friends
2165 ----------------------------
2167 A decentralization movement has tried to erode the dominance of Facebook
2168 and other Big Tech companies by fielding “indieweb” alternatives —
2169 Mastodon as a Twitter alternative, Diaspora as a Facebook alternative,
2170 etc. — but these efforts have failed to attain any kind of liftoff.
2172 Fundamentally, each of these services is hamstrung by the same problem:
2173 Every potential user for a Facebook or Twitter alternative has to
2174 convince all their friends to follow them to a decentralized web
2175 alternative in order to continue to realize the benefit of social media.
2176 For many of us, the only reason to have a Facebook account is that our
2177 friends have Facebook accounts, and the reason they have Facebook
2178 accounts is that *we* have Facebook accounts.
2180 All of this has conspired to make Facebook — and other dominant
2181 platforms — into “kill zones” that investors will not fund new entrants
2184 And yet, all of today’s tech giants came into existence despite the
2185 entrenched advantage of the companies that came before them. To
2186 understand how that happened, you have to understand both
2187 interoperability and adversarial interoperability.
2189 The hard problem of our species is coordination.
2191 “Interoperability” is the ability of two technologies to work with one
2192 another: Anyone can make an LP that will play on any record player,
2193 anyone can make a filter you can install in your stove’s extractor fan,
2194 anyone can make gasoline for your car, anyone can make a USB phone
2195 charger that fits in your car’s cigarette lighter receptacle, anyone can
2196 make a light bulb that works in your light socket, anyone can make bread
2197 that will toast in your toaster.
2199 Interoperability is often a source of innovation and consumer benefit:
2200 Apple made the first commercially successful PC, but millions of
2201 independent software vendors made interoperable programs that ran on the
2202 Apple II Plus. The simple analog antenna inputs on the back of TVs first
2203 allowed cable operators to connect directly to TVs, then they allowed
2204 game console companies and then personal computer companies to use
2205 standard televisions as displays. Standard RJ-11 telephone jacks allowed
2206 for the production of phones from a variety of vendors in a variety of
2207 forms, from the free football-shaped phone that came with a *Sports
2208 Illustrated* subscription to business phones with speakers, hold
2209 functions, and so on and then answering machines and finally modems,
2210 paving the way for the internet revolution.
2212 “Interoperability” is often used interchangeably with “standardization,”
2213 which is the process when manufacturers and other stakeholders hammer
2214 out a set of agreed-upon rules for implementing a technology, such as
2215 the electrical plug on your wall, the CAN bus used by your car’s
2216 computer systems, or the HTML instructions that your browser interprets.
2218 But interoperability doesn’t require standardization — indeed,
2219 standardization often proceeds from the chaos of ad hoc interoperability
2220 measures. The inventor of the cigarette-lighter USB charger didn’t need
2221 to get permission from car manufacturers or even the manufacturers of
2222 the dashboard lighter subcomponent. The automakers didn’t take any
2223 countermeasures to prevent the use of these aftermarket accessories by
2224 their customers, but they also didn’t do anything to make life easier
2225 for the chargers’ manufacturers. This is a kind of “neutral
2228 Beyond neutral interoperability, there is “adversarial
2229 interoperability.” That’s when a manufacturer makes a product that
2230 interoperates with another manufacturer’s product *despite the second
2231 manufacturer’s objections* and *even if that means bypassing a security
2232 system designed to prevent interoperability*.
2234 Probably the most familiar form of adversarial interoperability is
2235 third-party printer ink. Printer manufacturers claim that they sell
2236 printers below cost and that the only way they can recoup the losses
2237 they incur is by charging high markups on ink. To prevent the owners of
2238 printers from buying ink elsewhere, the printer companies deploy a suite
2239 of anti-customer security systems that detect and reject both refilled
2240 and third-party cartridges.
2242 Owners of printers take the position that HP and Epson and Brother are
2243 not charities and that customers for their wares have no obligation to
2244 help them survive, and so if the companies choose to sell their products
2245 at a loss, that’s their foolish choice and their consequences to live
2246 with. Likewise, competitors who make ink or refill kits observe that
2247 they don’t owe printer companies anything, and their erosion of printer
2248 companies’ margins are the printer companies’ problems, not their
2249 competitors’. After all, the printer companies shed no tears when they
2250 drive a refiller out of business, so why should the refillers concern
2251 themselves with the economic fortunes of the printer companies?
2253 Adversarial interoperability has played an outsized role in the history
2254 of the tech industry: from the founding of the “alt.*” Usenet hierarchy
2255 (which was started against the wishes of Usenet’s maintainers and which
2256 grew to be bigger than all of Usenet combined) to the browser wars (when
2257 Netscape and Microsoft devoted massive engineering efforts to making
2258 their browsers incompatible with the other’s special commands and
2259 peccadilloes) to Facebook (whose success was built in part by helping
2260 its new users stay in touch with friends they’d left behind on Myspace
2261 because Facebook supplied them with a tool that scraped waiting messages
2262 from Myspace and imported them into Facebook, effectively creating an
2263 Facebook-based Myspace reader).
2265 Today, incumbency is seen as an unassailable advantage. Facebook is
2266 where all of your friends are, so no one can start a Facebook
2267 competitor. But adversarial compatibility reverses the competitive
2268 advantage: If you were allowed to compete with Facebook by providing a
2269 tool that imported all your users’ waiting Facebook messages into an
2270 environment that competed on lines that Facebook couldn’t cross, like
2271 eliminating surveillance and ads, then Facebook would be at a huge
2272 disadvantage. It would have assembled all possible ex-Facebook users
2273 into a single, easy-to-find service; it would have educated them on how
2274 a Facebook-like service worked and what its potential benefits were; and
2275 it would have provided an easy means for disgruntled Facebook users to
2276 tell their friends where they might expect better treatment.
2278 Adversarial interoperability was once the norm and a key contributor to
2279 the dynamic, vibrant tech scene, but now it is stuck behind a thicket of
2280 laws and regulations that add legal risks to the tried-and-true tactics
2281 of adversarial interoperability. New rules and new interpretations of
2282 existing rules mean that a would-be adversarial interoperator needs to
2283 steer clear of claims under copyright, terms of service, trade secrecy,
2284 tortious interference, and patent.
2286 In the absence of a competitive market, lawmakers have resorted to
2287 assigning expensive, state-like duties to Big Tech firms, such as
2288 automatically filtering user contributions for copyright infringement or
2289 terrorist and extremist content or detecting and preventing harassment
2290 in real time or controlling access to sexual material.
2292 These measures put a floor under how small we can make Big Tech because
2293 only the very largest companies can afford the humans and automated
2294 filters needed to perform these duties.
2296 But that’s not the only way in which making platforms responsible for
2297 policing their users undermines competition. A platform that is expected
2298 to police its users’ conduct must prevent many vital adversarial
2299 interoperability techniques lest these subvert its policing measures.
2300 For example, if someone using a Twitter replacement like Mastodon is
2301 able to push messages into Twitter and read messages out of Twitter,
2302 they could avoid being caught by automated systems that detect and
2303 prevent harassment (such as systems that use the timing of messages or
2304 IP-based rules to make guesses about whether someone is a harasser).
2306 To the extent that we are willing to let Big Tech police itself — rather
2307 than making Big Tech small enough that users can leave bad platforms for
2308 better ones and small enough that a regulation that simply puts a
2309 platform out of business will not destroy billions of users’ access to
2310 their communities and data — we build the case that Big Tech should be
2311 able to block its competitors and make it easier for Big Tech to demand
2312 legal enforcement tools to ban and punish attempts at adversarial
2315 Ultimately, we can try to fix Big Tech by making it responsible for bad
2316 acts by its users, or we can try to fix the internet by cutting Big Tech
2317 down to size. But we can’t do both. To replace today’s giant products
2318 with pluralistic protocols, we need to clear the legal thicket that
2319 prevents adversarial interoperability so that tomorrow’s nimble,
2320 personal, small-scale products can federate themselves with giants like
2321 Facebook, allowing the users who’ve left to continue to communicate with
2322 users who haven’t left yet, reaching tendrils over Facebook’s garden
2323 wall that Facebook’s trapped users can use to scale the walls and escape
2324 to the global, open web.
2326 Fake news is an epistemological crisis
2327 --------------------------------------
2329 Tech is not the only industry that has undergone massive concentration
2330 since the Reagan era. Virtually every major industry — from oil to
2331 newspapers to meatpacking to sea freight to eyewear to online
2332 pornography — has become a clubby oligarchy that just a few players
2335 At the same time, every industry has become something of a tech industry
2336 as general-purpose computers and general-purpose networks and the
2337 promise of efficiencies through data-driven analysis infuse every
2338 device, process, and firm with tech.
2340 This phenomenon of industrial concentration is part of a wider story
2341 about wealth concentration overall as a smaller and smaller number of
2342 people own more and more of our world. This concentration of both wealth
2343 and industries means that our political outcomes are increasingly
2344 beholden to the parochial interests of the people and companies with all
2347 That means that whenever a regulator asks a question with an obvious,
2348 empirical answer (“Are humans causing climate change?” or “Should we let
2349 companies conduct commercial mass surveillance?” or “Does society
2350 benefit from allowing network neutrality violations?”), the answer that
2351 comes out is only correct if that correctness meets with the approval of
2352 rich people and the industries that made them so wealthy.
2354 Rich people have always played an outsized role in politics and more so
2355 since the Supreme Court’s *Citizens United* decision eliminated key
2356 controls over political spending. Widening inequality and wealth
2357 concentration means that the very richest people are now a lot richer
2358 and can afford to spend a lot more money on political projects than ever
2359 before. Think of the Koch brothers or George Soros or Bill Gates.
2361 But the policy distortions of rich individuals pale in comparison to the
2362 policy distortions that concentrated industries are capable of. The
2363 companies in highly concentrated industries are much more profitable
2364 than companies in competitive industries — no competition means not
2365 having to reduce prices or improve quality to win customers — leaving
2366 them with bigger capital surpluses to spend on lobbying.
2368 Concentrated industries also find it easier to collaborate on policy
2369 objectives than competitive ones. When all the top execs from your
2370 industry can fit around a single boardroom table, they often do. And
2371 *when* they do, they can forge a consensus position on regulation.
2373 Rising through the ranks in a concentrated industry generally means
2374 working at two or three of the big companies. When there are only
2375 relatively few companies in a given industry, each company has a more
2376 ossified executive rank, leaving ambitious execs with fewer paths to
2377 higher positions unless they are recruited to a rival. This means that
2378 the top execs in concentrated industries are likely to have been
2379 colleagues at some point and socialize in the same circles — connected
2380 through social ties or, say, serving as trustees for each others’
2381 estates. These tight social bonds foster a collegial, rather than
2382 competitive, attitude.
2384 Highly concentrated industries also present a regulatory conundrum. When
2385 an industry is dominated by just four or five companies, the only people
2386 who are likely to truly understand the industry’s practices are its
2387 veteran executives. This means that top regulators are often former
2388 execs of the companies they are supposed to be regulating. These turns
2389 in government are often tacitly understood to be leaves of absence from
2390 industry, with former employers welcoming their erstwhile watchdogs back
2391 into their executive ranks once their terms have expired.
2393 All this is to say that the tight social bonds, small number of firms,
2394 and regulatory capture of concentrated industries give the companies
2395 that comprise them the power to dictate many, if not all, of the
2396 regulations that bind them.
2398 This is increasingly obvious. Whether it’s payday lenders `winning the
2399 right to practice predatory
2400 lending <https://www.washingtonpost.com/business/2019/02/25/how-payday-lending-industry-insider-tilted-academic-research-its-favor/>`__
2401 or Apple `winning the right to decide who can fix your
2402 phone <https://www.vice.com/en_us/article/mgxayp/source-apple-will-fight-right-to-repair-legislation>`__
2403 or Google and Facebook winning the right to breach your private data
2404 without suffering meaningful consequences or victories for pipeline
2405 companies or impunity for opioid manufacturers or massive tax subsidies
2406 for incredibly profitable dominant businesses, it’s increasingly
2407 apparent that many of our official, evidence-based truth-seeking
2408 processes are, in fact, auctions for sale to the highest bidder.
2410 It’s really impossible to overstate what a terrifying prospect this is.
2411 We live in an incredibly high-tech society, and none of us could acquire
2412 the expertise to evaluate every technological proposition that stands
2413 between us and our untimely, horrible deaths. You might devote your life
2414 to acquiring the media literacy to distinguish good scientific journals
2415 from corrupt pay-for-play lookalikes and the statistical literacy to
2416 evaluate the quality of the analysis in the journals as well as the
2417 microbiology and epidemiology knowledge to determine whether you can
2418 trust claims about the safety of vaccines — but that would still leave
2419 you unqualified to judge whether the wiring in your home will give you a
2420 lethal shock *and* whether your car’s brakes’ software will cause them
2421 to fail unpredictably *and* whether the hygiene standards at your
2422 butcher are sufficient to keep you from dying after you finish your
2425 In a world as complex as this one, we have to defer to authorities, and
2426 we keep them honest by making those authorities accountable to us and
2427 binding them with rules to prevent conflicts of interest. We can’t
2428 possibly acquire the expertise to adjudicate conflicting claims about
2429 the best way to make the world safe and prosperous, but we *can*
2430 determine whether the adjudication process itself is trustworthy.
2432 Right now, it’s obviously not.
2434 The past 40 years of rising inequality and industry concentration,
2435 together with increasingly weak accountability and transparency for
2436 expert agencies, has created an increasingly urgent sense of impending
2437 doom, the sense that there are vast conspiracies afoot that operate with
2438 tacit official approval despite the likelihood they are working to
2439 better themselves by ruining the rest of us.
2441 For example, it’s been decades since Exxon’s own scientists concluded
2442 that its products would render the Earth uninhabitable by humans. And
2443 yet those decades were lost to us, in large part because Exxon lobbied
2444 governments and sowed doubt about the dangers of its products and did so
2445 with the cooperation of many public officials. When the survival of you
2446 and everyone you love is threatened by conspiracies, it’s not
2447 unreasonable to start questioning the things you think you know in an
2448 attempt to determine whether they, too, are the outcome of another
2451 The collapse of the credibility of our systems for divining and
2452 upholding truths has left us in a state of epistemological chaos. Once,
2453 most of us might have assumed that the system was working and that our
2454 regulations reflected our best understanding of the empirical truths of
2455 the world as they were best understood — now we have to find our own
2456 experts to help us sort the true from the false.
2458 If you’re like me, you probably believe that vaccines are safe, but you
2459 (like me) probably also can’t explain the microbiology or statistics.
2460 Few of us have the math skills to review the literature on vaccine
2461 safety and describe why their statistical reasoning is sound. Likewise,
2462 few of us can review the stats in the (now discredited) literature on
2463 opioid safety and explain how those stats were manipulated. Both
2464 vaccines and opioids were embraced by medical authorities, after all,
2465 and one is safe while the other could ruin your life. You’re left with a
2466 kind of inchoate constellation of rules of thumb about which experts you
2467 trust to fact-check controversial claims and then to explain how all
2468 those respectable doctors with their peer-reviewed research on opioid
2469 safety *were* an aberration and then how you know that the doctors
2470 writing about vaccine safety are *not* an aberration.
2472 I’m 100% certain that vaccinating is safe and effective, but I’m also at
2473 something of a loss to explain exactly, *precisely,* why I believe this,
2474 given all the corruption I know about and the many times the stamp of
2475 certainty has turned out to be a parochial lie told to further enrich
2478 Fake news — conspiracy theories, racist ideologies, scientific denialism
2479 — has always been with us. What’s changed today is not the mix of ideas
2480 in the public discourse but the popularity of the worst ideas in that
2481 mix. Conspiracy and denial have skyrocketed in lockstep with the growth
2482 of Big Inequality, which has also tracked the rise of Big Tech and Big
2483 Pharma and Big Wrestling and Big Car and Big Movie Theater and Big
2486 No one can say for certain why this has happened, but the two dominant
2487 camps are idealism (the belief that the people who argue for these
2488 conspiracies have gotten better at explaining them, maybe with the help
2489 of machine-learning tools) or materialism (the ideas have become more
2490 attractive because of material conditions in the world).
2492 I’m a materialist. I’ve been exposed to the arguments of conspiracy
2493 theorists all my life, and I have not experienced any qualitative leap
2494 in the quality of those arguments.
2496 The major difference is in the world, not the arguments. In a time where
2497 actual conspiracies are commonplace, conspiracy theories acquire a ring
2500 We have always had disagreements about what’s true, but today, we have a
2501 disagreement over how we know whether something is true. This is an
2502 epistemological crisis, not a crisis over belief. It’s a crisis over the
2503 credibility of our truth-seeking exercises, from scientific journals (in
2504 an era where the biggest journal publishers have been caught producing
2505 pay-to-play journals for junk science) to regulations (in an era where
2506 regulators are routinely cycling in and out of business) to education
2507 (in an era where universities are dependent on corporate donations to
2508 keep their lights on).
2510 Targeting — surveillance capitalism — makes it easier to find people who
2511 are undergoing this epistemological crisis, but it doesn’t create the
2512 crisis. For that, you need to look to corruption.
2514 And, conveniently enough, it’s corruption that allows surveillance
2515 capitalism to grow by dismantling monopoly protections, by permitting
2516 reckless collection and retention of personal data, by allowing ads to
2517 be targeted in secret, and by foreclosing on the possibility of going
2518 somewhere else where you might continue to enjoy your friends without
2519 subjecting yourself to commercial surveillance.
2524 I reject both iterations of technological exceptionalism. I reject the
2525 idea that tech is uniquely terrible and led by people who are greedier
2526 or worse than the leaders of other industries, and I reject the idea
2527 that tech is so good — or so intrinsically prone to concentration — that
2528 it can’t be blamed for its present-day monopolistic status.
2530 I think tech is just another industry, albeit one that grew up in the
2531 absence of real monopoly constraints. It may have been first, but it
2532 isn’t the worst nor will it be the last.
2534 But there’s one way in which I *am* a tech exceptionalist. I believe
2535 that online tools are the key to overcoming problems that are much more
2536 urgent than tech monopolization: climate change, inequality, misogyny,
2537 and discrimination on the basis of race, gender identity, and other
2538 factors. The internet is how we will recruit people to fight those
2539 fights, and how we will coordinate their labor. Tech is not a substitute
2540 for democratic accountability, the rule of law, fairness, or stability —
2541 but it’s a means to achieve these things.
2543 The hard problem of our species is coordination. Everything from climate
2544 change to social change to running a business to making a family work
2545 can be viewed as a collective action problem.
2547 The internet makes it easier than at any time before to find people who
2548 want to work on a project with you — hence the success of free and
2549 open-source software, crowdfunding, and racist terror groups — and
2550 easier than ever to coordinate the work you do.
2552 The internet and the computers we connect to it also possess an
2553 exceptional quality: general-purposeness. The internet is designed to
2554 allow any two parties to communicate any data, using any protocol,
2555 without permission from anyone else. The only production design we have
2556 for computers is the general-purpose, “Turing complete” computer that
2557 can run every program we can express in symbolic logic.
2559 This means that every time someone with a special communications need
2560 invests in infrastructure and techniques to make the internet faster,
2561 cheaper, and more robust, this benefit redounds to everyone else who is
2562 using the internet to communicate. And this also means that every time
2563 someone with a special computing need invests to make computers faster,
2564 cheaper, and more robust, every other computing application is a
2565 potential beneficiary of this work.
2567 For these reasons, every type of communication is gradually absorbed
2568 into the internet, and every type of device — from airplanes to
2569 pacemakers — eventually becomes a computer in a fancy case.
2571 While these considerations don’t preclude regulating networks and
2572 computers, they do call for gravitas and caution when doing so because
2573 changes to regulatory frameworks could ripple out to have unintended
2574 consequences in many, many other domains.
2576 The upshot of this is that our best hope of solving the big coordination
2577 problems — climate change, inequality, etc. — is with free, fair, and
2578 open tech. Our best hope of keeping tech free, fair, and open is to
2579 exercise caution in how we regulate tech and to attend closely to the
2580 ways in which interventions to solve one problem might create problems
2586 Big Tech has a funny relationship with information. When you’re
2587 generating information — anything from the location data streaming off
2588 your mobile device to the private messages you send to friends on a
2589 social network — it claims the rights to make unlimited use of that
2592 But when you have the audacity to turn the tables — to use a tool that
2593 blocks ads or slurps your waiting updates out of a social network and
2594 puts them in another app that lets you set your own priorities and
2595 suggestions or crawls their system to allow you to start a rival
2596 business — they claim that you’re stealing from them.
2598 The thing is, information is a very bad fit for any kind of private
2599 property regime. Property rights are useful for establishing markets
2600 that can lead to the effective development of fallow assets. These
2601 markets depend on clear titles to ensure that the things being bought
2602 and sold in them can, in fact, be bought and sold.
2604 Information rarely has such a clear title. Take phone numbers: There’s
2605 clearly something going wrong when Facebook slurps up millions of users’
2606 address books and uses the phone numbers it finds in them to plot out
2607 social graphs and fill in missing information about other users.
2609 But the phone numbers Facebook nonconsensually acquires in this
2610 transaction are not the “property” of the users they’re taken from nor
2611 do they belong to the people whose phones ring when you dial those
2612 numbers. The numbers are mere integers, 10 digits in the U.S. and
2613 Canada, and they appear in millions of places, including somewhere deep
2614 in pi as well as numerous other contexts. Giving people ownership titles
2615 to integers is an obviously terrible idea.
2617 Likewise for the facts that Facebook and other commercial surveillance
2618 operators acquire about us, like that we are the children of our parents
2619 or the parents to our children or that we had a conversation with
2620 someone else or went to a public place. These data points can’t be
2621 property in the sense that your house or your shirt is your property
2622 because the title to them is intrinsically muddy: Does your mom own the
2623 fact that she is your mother? Do you? Do both of you? What about your
2624 dad — does he own this fact too, or does he have to license the fact
2625 from you (or your mom or both of you) in order to use this fact? What
2626 about the hundreds or thousands of other people who know these facts?
2628 If you go to a Black Lives Matter demonstration, do the other
2629 demonstrators need your permission to post their photos from the event?
2630 The online fights over `when and how to post photos from
2631 demonstrations <https://www.wired.com/story/how-to-take-photos-at-protests/>`__
2632 reveal a nuanced, complex issue that cannot be easily hand-waved away by
2633 giving one party a property right that everyone else in the mix has to
2636 The fact that information isn’t a good fit with property and markets
2637 doesn’t mean that it’s not valuable. Babies aren’t property, but they’re
2638 inarguably valuable. In fact, we have a whole set of rules just for
2639 babies as well as a subset of those rules that apply to humans more
2640 generally. Someone who argues that babies won’t be truly valuable until
2641 they can be bought and sold like loaves of bread would be instantly and
2642 rightfully condemned as a monster.
2644 It’s tempting to reach for the property hammer when Big Tech treats your
2645 information like a nail — not least because Big Tech are such prolific
2646 abusers of property hammers when it comes to *their* information. But
2647 this is a mistake. If we allow markets to dictate the use of our
2648 information, then we’ll find that we’re sellers in a buyers’ market
2649 where the Big Tech monopolies set a price for our data that is so low as
2650 to be insignificant or, more likely, set at a nonnegotiable price of
2651 zero in a click-through agreement that you don’t have the opportunity to
2654 Meanwhile, establishing property rights over information will create
2655 insurmountable barriers to independent data processing. Imagine that we
2656 require a license to be negotiated when a translated document is
2657 compared with its original, something Google has done and continues to
2658 do billions of times to train its automated language translation tools.
2659 Google can afford this, but independent third parties cannot. Google can
2660 staff a clearances department to negotiate one-time payments to the
2661 likes of the EU (one of the major repositories of translated documents)
2662 while independent watchdogs wanting to verify that the translations are
2663 well-prepared, or to root out bias in translations, will find themselves
2664 needing a staffed-up legal department and millions for licenses before
2665 they can even get started.
2667 The same goes for things like search indexes of the web or photos of
2668 peoples’ houses, which have become contentious thanks to Google’s Street
2669 View project. Whatever problems may exist with Google’s photographing of
2670 street scenes, resolving them by letting people decide who can take
2671 pictures of the facades of their homes from a public street will surely
2672 create even worse ones. Think of how street photography is important for
2673 newsgathering — including informal newsgathering, like photographing
2674 abuses of authority — and how being able to document housing and street
2675 life are important for contesting eminent domain, advocating for social
2676 aid, reporting planning and zoning violations, documenting
2677 discriminatory and unequal living conditions, and more.
2679 The ownership of facts is antithetical to many kinds of human progress.
2680 It’s hard to imagine a rule that limits Big Tech’s exploitation of our
2681 collective labors without inadvertently banning people from gathering
2682 data on online harassment or compiling indexes of changes in language or
2683 simply investigating how the platforms are shaping our discourse — all
2684 of which require scraping data that other people have created and
2685 subjecting it to scrutiny and analysis.
2687 Persuasion works… slowly
2688 -------------------------
2690 The platforms may oversell their ability to persuade people, but
2691 obviously, persuasion works sometimes. Whether it’s the private realm
2692 that LGBTQ people used to recruit allies and normalize sexual diversity
2693 or the decadeslong project to convince people that markets are the only
2694 efficient way to solve complicated resource allocation problems, it’s
2695 clear that our societal attitudes *can* change.
2697 The project of shifting societal attitudes is a game of inches and
2698 years. For centuries, svengalis have purported to be able to accelerate
2699 this process, but even the most brutal forms of propaganda have
2700 struggled to make permanent changes. Joseph Goebbels was able to subject
2701 Germans to daily, mandatory, hourslong radio broadcasts, to round up and
2702 torture and murder dissidents, and to seize full control over their
2703 children’s education while banning any literature, broadcasts, or films
2704 that did not comport with his worldview.
2706 Yet, after 12 years of terror, once the war ended, Nazi ideology was
2707 largely discredited in both East and West Germany, and a program of
2708 national truth and reconciliation was put in its place. Racism and
2709 authoritarianism were never fully abolished in Germany, but neither were
2710 the majority of Germans irrevocably convinced of Nazism — and the rise
2711 of racist authoritarianism in Germany today tells us that the liberal
2712 attitudes that replaced Nazism were no more permanent than Nazism
2715 Racism and authoritarianism have also always been with us. Anyone who’s
2716 reviewed the kind of messages and arguments that racists put forward
2717 today would be hard-pressed to say that they have gotten better at
2718 presenting their ideas. The same pseudoscience, appeals to fear, and
2719 circular logic that racists presented in the 1980s, when the cause of
2720 white supremacy was on the wane, are to be found in the communications
2721 of leading white nationalists today.
2723 If racists haven’t gotten more convincing in the past decade, then how
2724 is it that more people were convinced to be openly racist at that time?
2725 I believe that the answer lies in the material world, not the world of
2726 ideas. The ideas haven’t gotten more convincing, but people have become
2727 more afraid. Afraid that the state can’t be trusted to act as an honest
2728 broker in life-or-death decisions, from those regarding the management
2729 of the economy to the regulation of painkillers to the rules for
2730 handling private information. Afraid that the world has become a game of
2731 musical chairs in which the chairs are being taken away at a
2732 never-before-seen rate. Afraid that justice for others will come at
2733 their expense. Monopolism isn’t the cause of these fears, but the
2734 inequality and material desperation and policy malpractice that
2735 monopolism contributes to is a significant contributor to these
2736 conditions. Inequality creates the conditions for both conspiracies and
2737 violent racist ideologies, and then surveillance capitalism lets
2738 opportunists target the fearful and the conspiracy-minded.
2743 As the old saw goes, “If you’re not paying for the product, you’re the
2746 It’s a commonplace belief today that the advent of free, ad-supported
2747 media was the original sin of surveillance capitalism. The reasoning is
2748 that the companies that charged for access couldn’t “compete with free”
2749 and so they were driven out of business. Their ad-supported competitors,
2750 meanwhile, declared open season on their users’ data in a bid to improve
2751 their ad targeting and make more money and then resorted to the most
2752 sensationalist tactics to generate clicks on those ads. If only we’d pay
2753 for media again, we’d have a better, more responsible, more sober
2754 discourse that would be better for democracy.
2756 But the degradation of news products long precedes the advent of
2757 ad-supported online news. Long before newspapers were online, lax
2758 antitrust enforcement had opened the door for unprecedented waves of
2759 consolidation and roll-ups in newsrooms. Rival newspapers were merged,
2760 reporters and ad sales staff were laid off, physical plants were sold
2761 and leased back, leaving the companies loaded up with debt through
2762 leveraged buyouts and subsequent profit-taking by the new owners. In
2763 other words, it wasn’t merely shifts in the classified advertising
2764 market, which was long held to be the primary driver in the decline of
2765 the traditional newsroom, that made news companies unable to adapt to
2766 the internet — it was monopolism.
2768 Then, as news companies *did* come online, the ad revenues they
2769 commanded dropped even as the number of internet users (and thus
2770 potential online readers) increased. That shift was a function of
2771 consolidation in the ad sales market, with Google and Facebook emerging
2772 as duopolists who made more money every year from advertising while
2773 paying less and less of it to the publishers whose work the ads appeared
2774 alongside. Monopolism created a buyer’s market for ad inventory with
2775 Facebook and Google acting as gatekeepers.
2777 Paid services continue to exist alongside free ones, and often it is
2778 these paid services — anxious to prevent people from bypassing their
2779 paywalls or sharing paid media with freeloaders — that exert the most
2780 control over their customers. Apple’s iTunes and App Stores are paid
2781 services, but to maximize their profitability, Apple has to lock its
2782 platforms so that third parties can’t make compatible software without
2783 permission. These locks allow the company to exercise both editorial
2784 control (enabling it to exclude `controversial political
2785 material <https://ncac.org/news/blog/does-apples-strict-app-store-content-policy-limit-freedom-of-expression>`__)
2786 and technological control, including control over who can repair the
2787 devices it makes. If we’re worried that ad-supported products deprive
2788 people of their right to self-determination by using persuasion
2789 techniques to nudge their purchase decisions a few degrees in one
2790 direction or the other, then the near-total control a single company
2791 holds over the decision of who gets to sell you software, parts, and
2792 service for your iPhone should have us very worried indeed.
2794 We shouldn’t just be concerned about payment and control: The idea that
2795 paying will improve discourse is also dangerously wrong. The poor
2796 success rate of targeted advertising means that the platforms have to
2797 incentivize you to “engage” with posts at extremely high levels to
2798 generate enough pageviews to safeguard their profits. As discussed
2799 earlier, to increase engagement, platforms like Facebook use machine
2800 learning to guess which messages will be most inflammatory and make a
2801 point of shoving those into your eyeballs at every turn so that you will
2802 hate-click and argue with people.
2804 Perhaps paying would fix this, the reasoning goes. If platforms could be
2805 economically viable even if you stopped clicking on them once your
2806 intellectual and social curiosity had been slaked, then they would have
2807 no reason to algorithmically enrage you to get more clicks out of you,
2810 There may be something to that argument, but it still ignores the wider
2811 economic and political context of the platforms and the world that
2812 allowed them to grow so dominant.
2814 Platforms are world-spanning and all-encompassing because they are
2815 monopolies, and they are monopolies because we have gutted our most
2816 important and reliable anti-monopoly rules. Antitrust was neutered as a
2817 key part of the project to make the wealthy wealthier, and that project
2818 has worked. The vast majority of people on Earth have a negative net
2819 worth, and even the dwindling middle class is in a precarious state,
2820 undersaved for retirement, underinsured for medical disasters, and
2821 undersecured against climate and technology shocks.
2823 In this wildly unequal world, paying doesn’t improve the discourse; it
2824 simply prices discourse out of the range of the majority of people.
2825 Paying for the product is dandy, if you can afford it.
2827 If you think today’s filter bubbles are a problem for our discourse,
2828 imagine what they’d be like if rich people inhabited free-flowing
2829 Athenian marketplaces of ideas where you have to pay for admission while
2830 everyone else lives in online spaces that are subsidized by wealthy
2831 benefactors who relish the chance to establish conversational spaces
2832 where the “house rules” forbid questioning the status quo. That is,
2833 imagine if the rich seceded from Facebook, and then, instead of running
2834 ads that made money for shareholders, Facebook became a billionaire’s
2835 vanity project that also happened to ensure that nobody talked about
2836 whether it was fair that only billionaires could afford to hang out in
2837 the rarified corners of the internet.
2839 Behind the idea of paying for access is a belief that free markets will
2840 address Big Tech’s dysfunction. After all, to the extent that people
2841 have a view of surveillance at all, it is generally an unfavorable one,
2842 and the longer and more thoroughly one is surveilled, the less one tends
2843 to like it. Same goes for lock-in: If HP’s ink or Apple’s App Store were
2844 really obviously fantastic, they wouldn’t need technical measures to
2845 prevent users from choosing a rival’s product. The only reason these
2846 technical countermeasures exist is that the companies don’t believe
2847 their customers would *voluntarily* submit to their terms, and they want
2848 to deprive them of the choice to take their business elsewhere.
2850 Advocates for markets laud their ability to aggregate the diffused
2851 knowledge of buyers and sellers across a whole society through demand
2852 signals, price signals, and so on. The argument for surveillance
2853 capitalism being a “rogue capitalism” is that machine-learning-driven
2854 persuasion techniques distort decision-making by consumers, leading to
2855 incorrect signals — consumers don’t buy what they prefer, they buy what
2856 they’re tricked into preferring. It follows that the monopolistic
2857 practices of lock-in, which do far more to constrain consumers’ free
2858 choices, are even more of a “rogue capitalism.”
2860 The profitability of any business is constrained by the possibility that
2861 its customers will take their business elsewhere. Both surveillance and
2862 lock-in are anti-features that no customer wants. But monopolies can
2863 capture their regulators, crush their competitors, insert themselves
2864 into their customers’ lives, and corral people into “choosing” their
2865 services regardless of whether they want them — it’s fine to be terrible
2866 when there is no alternative.
2868 Ultimately, surveillance and lock-in are both simply business strategies
2869 that monopolists can choose. Surveillance companies like Google are
2870 perfectly capable of deploying lock-in technologies — just look at the
2871 onerous Android licensing terms that require device-makers to bundle in
2872 Google’s suite of applications. And lock-in companies like Apple are
2873 perfectly capable of subjecting their users to surveillance if it means
2874 keeping the Chinese government happy and preserving ongoing access to
2875 Chinese markets. Monopolies may be made up of good, ethical people, but
2876 as institutions, they are not your friend — they will do whatever they
2877 can get away with to maximize their profits, and the more monopolistic
2878 they are, the more they *can* get away with.
2880 An “ecology” moment for trustbusting
2881 ---------------------------------------
2883 If we’re going to break Big Tech’s death grip on our digital lives,
2884 we’re going to have to fight monopolies. That may sound pretty mundane
2885 and old-fashioned, something out of the New Deal era, while ending the
2886 use of automated behavioral modification feels like the plotline of a
2887 really cool cyberpunk novel.
2889 Meanwhile, breaking up monopolies is something we seem to have forgotten
2890 how to do. There is a bipartisan, trans-Atlantic consensus that breaking
2891 up companies is a fool’s errand at best — liable to mire your federal
2892 prosecutors in decades of litigation — and counterproductive at worst,
2893 eroding the “consumer benefits” of large companies with massive
2894 efficiencies of scale.
2896 But trustbusters once strode the nation, brandishing law books,
2897 terrorizing robber barons, and shattering the illusion of monopolies’
2898 all-powerful grip on our society. The trustbusting era could not begin
2899 until we found the political will — until the people convinced
2900 politicians they’d have their backs when they went up against the
2901 richest, most powerful men in the world.
2903 Could we find that political will again?
2905 Copyright scholar James Boyle has described how the term “ecology”
2906 marked a turning point in environmental activism. Prior to the adoption
2907 of this term, people who wanted to preserve whale populations didn’t
2908 necessarily see themselves as fighting the same battle as people who
2909 wanted to protect the ozone layer or fight freshwater pollution or beat
2910 back smog or acid rain.
2912 But the term “ecology” welded these disparate causes together into a
2913 single movement, and the members of this movement found solidarity with
2914 one another. The people who cared about smog signed petitions circulated
2915 by the people who wanted to end whaling, and the anti-whalers marched
2916 alongside the people demanding action on acid rain. This uniting behind
2917 a common cause completely changed the dynamics of environmentalism,
2918 setting the stage for today’s climate activism and the sense that
2919 preserving the habitability of the planet Earth is a shared duty among
2922 I believe we are on the verge of a new “ecology” moment dedicated to
2923 combating monopolies. After all, tech isn’t the only concentrated
2924 industry nor is it even the *most* concentrated of industries.
2926 You can find partisans for trustbusting in every sector of the economy.
2927 Everywhere you look, you can find people who’ve been wronged by
2928 monopolists who’ve trashed their finances, their health, their privacy,
2929 their educations, and the lives of people they love. Those people have
2930 the same cause as the people who want to break up Big Tech and the same
2931 enemies. When most of the world’s wealth is in the hands of a very few,
2932 it follows that nearly every large company will have overlapping
2935 That’s the good news: With a little bit of work and a little bit of
2936 coalition building, we have more than enough political will to break up
2937 Big Tech and every other concentrated industry besides. First we take
2938 Facebook, then we take AT&T/WarnerMedia.
2940 But here’s the bad news: Much of what we’re doing to tame Big Tech
2941 *instead* of breaking up the big companies also forecloses on the
2942 possibility of breaking them up later.
2944 Big Tech’s concentration currently means that their inaction on
2945 harassment, for example, leaves users with an impossible choice: absent
2946 themselves from public discourse by, say, quitting Twitter or endure
2947 vile, constant abuse. Big Tech’s over-collection and over-retention of
2948 data results in horrific identity theft. And their inaction on extremist
2949 recruitment means that white supremacists who livestream their shooting
2950 rampages can reach an audience of billions. The combination of tech
2951 concentration and media concentration means that artists’ incomes are
2952 falling even as the revenue generated by their creations are increasing.
2954 Yet governments confronting all of these problems all inevitably
2955 converge on the same solution: deputize the Big Tech giants to police
2956 their users and render them liable for their users’ bad actions. The
2957 drive to force Big Tech to use automated filters to block everything
2958 from copyright infringement to sex-trafficking to violent extremism
2959 means that tech companies will have to allocate hundreds of millions to
2960 run these compliance systems.
2962 These rules — the EU’s new Directive on Copyright, Australia’s new
2963 terror regulation, America’s FOSTA/SESTA sex-trafficking law and more —
2964 are not just death warrants for small, upstart competitors that might
2965 challenge Big Tech’s dominance but who lack the deep pockets of
2966 established incumbents to pay for all these automated systems. Worse
2967 still, these rules put a floor under how small we can hope to make Big
2970 That’s because any move to break up Big Tech and cut it down to size
2971 will have to cope with the hard limit of not making these companies so
2972 small that they can no longer afford to perform these duties — and it’s
2973 *expensive* to invest in those automated filters and outsource content
2974 moderation. It’s already going to be hard to unwind these deeply
2975 concentrated, chimeric behemoths that have been welded together in the
2976 pursuit of monopoly profits. Doing so while simultaneously finding some
2977 way to fill the regulatory void that will be left behind if these
2978 self-policing rulers were forced to suddenly abdicate will be much, much
2981 Allowing the platforms to grow to their present size has given them a
2982 dominance that is nearly insurmountable — deputizing them with public
2983 duties to redress the pathologies created by their size makes it
2984 virtually impossible to reduce that size. Lather, rinse, repeat: If the
2985 platforms don’t get smaller, they will get larger, and as they get
2986 larger, they will create more problems, which will give rise to more
2987 public duties for the companies, which will make them bigger still.
2989 We can work to fix the internet by breaking up Big Tech and depriving
2990 them of monopoly profits, or we can work to fix Big Tech by making them
2991 spend their monopoly profits on governance. But we can’t do both. We
2992 have to choose between a vibrant, open internet or a dominated,
2993 monopolized internet commanded by Big Tech giants that we struggle with
2994 constantly to get them to behave themselves.
2996 Make Big Tech small again
2997 -------------------------
2999 Trustbusting is hard. Breaking big companies into smaller ones is
3000 expensive and time-consuming. So time-consuming that by the time you’re
3001 done, the world has often moved on and rendered years of litigation
3002 irrelevant. From 1969 to 1982, the U.S. government pursued an antitrust
3003 case against IBM over its dominance of mainframe computing — but the
3004 case collapsed in 1982 because mainframes were being speedily replaced
3007 A future U.S. president could simply direct their attorney general to
3008 enforce the law as it was written.
3010 It’s far easier to prevent concentration than to fix it, and reinstating
3011 the traditional contours of U.S. antitrust enforcement will, at the very
3012 least, prevent further concentration. That means bans on mergers between
3013 large companies, on big companies acquiring nascent competitors, and on
3014 platform companies competing directly with the companies that rely on
3017 These powers are all in the plain language of U.S. antitrust laws, so in
3018 theory, a future U.S. president could simply direct their attorney
3019 general to enforce the law as it was written. But after decades of
3020 judicial “education” in the benefits of monopolies, after multiple
3021 administrations that have packed the federal courts with
3022 lifetime-appointed monopoly cheerleaders, it’s not clear that mere
3023 administrative action would do the trick.
3025 If the courts frustrate the Justice Department and the president, the
3026 next stop would be Congress, which could eliminate any doubt about how
3027 antitrust law should be enforced in the U.S. by passing new laws that
3028 boil down to saying, “Knock it off. We all know what the Sherman Act
3029 says. Robert Bork was a deranged fantasist. For avoidance of doubt,
3030 *fuck that guy*.” In other words, the problem with monopolies is
3031 *monopolism* — the concentration of power into too few hands, which
3032 erodes our right to self-determination. If there is a monopoly, the law
3033 wants it gone, period. Sure, get rid of monopolies that create “consumer
3034 harm” in the form of higher prices, but also, *get rid of other
3037 But this only prevents things from getting worse. To help them get
3038 better, we will have to build coalitions with other activists in the
3039 anti-monopoly ecology movement — a pluralism movement or a
3040 self-determination movement — and target existing monopolies in every
3041 industry for breakup and structural separation rules that prevent, for
3042 example, the giant eyewear monopolist Luxottica from dominating both the
3043 sale and the manufacture of spectacles.
3045 In an important sense, it doesn’t matter which industry the breakups
3046 begin in. Once they start, shareholders in *every* industry will start
3047 to eye their investments in monopolists skeptically. As trustbusters
3048 ride into town and start making lives miserable for monopolists, the
3049 debate around every corporate boardroom’s table will shift. People
3050 within corporations who’ve always felt uneasy about monopolism will gain
3051 a powerful new argument to fend off their evil rivals in the corporate
3052 hierarchy: “If we do it my way, we make less money; if we do it your
3053 way, a judge will fine us billions and expose us to ridicule and public
3054 disapprobation. So even though I get that it would be really cool to do
3055 that merger, lock out that competitor, or buy that little company and
3056 kill it before it can threaten it, we really shouldn’t — not if we don’t
3057 want to get tied to the DOJ’s bumper and get dragged up and down
3058 Trustbuster Road for the next 10 years.”
3063 Fixing Big Tech will require a lot of iteration. As cyber lawyer
3064 Lawrence Lessig wrote in his 1999 book, *Code and Other Laws of
3065 Cyberspace*, our lives are regulated by four forces: law (what’s legal),
3066 code (what’s technologically possible), norms (what’s socially
3067 acceptable), and markets (what’s profitable).
3069 If you could wave a wand and get Congress to pass a law that re-fanged
3070 the Sherman Act tomorrow, you could use the impending breakups to
3071 convince venture capitalists to fund competitors to Facebook, Google,
3072 Twitter, and Apple that would be waiting in the wings after they were
3075 But getting Congress to act will require a massive normative shift, a
3076 mass movement of people who care about monopolies — and pulling them
3079 Getting people to care about monopolies will take technological
3080 interventions that help them to see what a world free from Big Tech
3081 might look like. Imagine if someone could make a beloved (but
3082 unauthorized) third-party Facebook or Twitter client that dampens the
3083 anxiety-producing algorithmic drumbeat and still lets you talk to your
3084 friends without being spied upon — something that made social media more
3085 sociable and less toxic. Now imagine that it gets shut down in a brutal
3086 legal battle. It’s always easier to convince people that something must
3087 be done to save a thing they love than it is to excite them about
3088 something that doesn’t even exist yet.
3090 Neither tech nor law nor code nor markets are sufficient to reform Big
3091 Tech. But a profitable competitor to Big Tech could bankroll a
3092 legislative push; legal reform can embolden a toolsmith to make a better
3093 tool; the tool can create customers for a potential business who value
3094 the benefits of the internet but want them delivered without Big Tech;
3095 and that business can get funded and divert some of its profits to legal
3096 reform. 20 GOTO 10 (or lather, rinse, repeat). Do it again, but this
3097 time, get farther! After all, this time you’re starting with weaker Big
3098 Tech adversaries, a constituency that understands things can be better,
3099 Big Tech rivals who’ll help ensure their own future by bankrolling
3100 reform, and code that other programmers can build on to weaken Big Tech
3103 The surveillance capitalism hypothesis — that Big Tech’s products really
3104 work as well as they say they do and that’s why everything is so screwed
3105 up — is way too easy on surveillance and even easier on capitalism.
3106 Companies spy because they believe their own BS, and companies spy
3107 because governments let them, and companies spy because any advantage
3108 from spying is so short-lived and minor that they have to do more and
3109 more of it just to stay in place.
3111 As to why things are so screwed up? Capitalism. Specifically, the
3112 monopolism that creates inequality and the inequality that creates
3113 monopolism. It’s a form of capitalism that rewards sociopaths who
3114 destroy the real economy to inflate the bottom line, and they get away
3115 with it for the same reason companies get away with spying: because our
3116 governments are in thrall to both the ideology that says monopolies are
3117 actually just fine and in thrall to the ideology that says that in a
3118 monopolistic world, you’d better not piss off the monopolists.
3120 Surveillance doesn’t make capitalism rogue. Capitalism’s unchecked rule
3121 begets surveillance. Surveillance isn’t bad because it lets people
3122 manipulate us. It’s bad because it crushes our ability to be our
3123 authentic selves — and because it lets the rich and powerful figure out
3124 who might be thinking of building guillotines and what dirt they can use
3125 to discredit those embryonic guillotine-builders before they can even
3126 get to the lumberyard.
3131 With all the problems of Big Tech, it’s tempting to imagine solving the
3132 problem by returning to a world without tech at all. Resist that
3135 The only way out of our Big Tech problem is up and through. If our
3136 future is not reliant upon high tech, it will be because civilization
3137 has fallen. Big Tech wired together a planetary, species-wide nervous
3138 system that, with the proper reforms and course corrections, is capable
3139 of seeing us through the existential challenge of our species and
3140 planet. Now it’s up to us to seize the means of computation, putting
3141 that electronic nervous system under democratic, accountable control.
3143 I am, secretly, despite what I have said earlier, a tech exceptionalist.
3144 Not in the sense of thinking that tech should be given a free pass to
3145 monopolize because it has “economies of scale” or some other nebulous
3146 feature. I’m a tech exceptionalist because I believe that getting tech
3147 right matters and that getting it wrong will be an unmitigated
3148 catastrophe — and doing it right can give us the power to work together
3149 to save our civilization, our species, and our planet.