<html><head><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"><title>How to Destroy Surveillance Capitalism</title><meta name="generator" content="DocBook XSL Stylesheets V1.79.1"><meta name="description" content="Our devices and services gather most of the data that the NSA mines in its surveillance project. We pay for these devices and the services they connect to, and then we painstakingly perform the data-entry tasks that record facts about our lives, opinions, and preferences. Thanks to Big Tech, surveillance capitalism is everywhere. This is not because it is really good at manipulating our behavior, or a rogue abuse of corporate power. It is the result of unchecked monopolism and the abusive behavior it rewards. It is the system working as intended and expected. Cory Doctorow has written an extended critique of Shoshana Zuboff's book The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, with a sober, non-magical analysis of the problem that leads to a different proposal for solving it."><style type="text/css">
body { background-image: url('images/draft.png');
       background-repeat: no-repeat;
       background-position: top left;
       /* The following properties make the watermark "fixed" on the page. */
       /* I think that's just a bit too distracting for the reader... */
       /* background-attachment: fixed; */
       /* background-position: center center; */
}</style></head><body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF"><div lang="en" class="article"><div class="titlepage"><div><div><h2 class="title"><a name="index"></a>How to Destroy Surveillance Capitalism</h2></div><div><div class="authorgroup"><div class="author"><h3 class="author"><span class="firstname">Cory</span> <span class="surname">Doctorow</span></h3></div></div></div><div><p class="copyright">Copyright © 2020 Cory Doctorow</p></div><div><p class="copyright">Copyright © 2020 Petter Reinholdtsen</p></div><div><div class="legalnotice"><a name="idm18"></a><p>
How to Destroy Surveillance Capitalism, by Cory Doctorow.
</p><p>
Publisher: Petter Reinholdtsen.
</p><p>
ISBN 978-82-93828-XX-X (hard cover)
</p><p>
ISBN 978-82-93828-XX-X (paperback)
</p><p>
ISBN 978-82-93828-XX-X (ePub)
</p><p>
This book is for sale at <a class="ulink" href="https://www.lulu.com/" target="_top">https://www.lulu.com/</a>.
</p><p>
If you find a typo, an error, or have other suggestions for improving the text, please update it at <a class="ulink" href="https://hosted.weblate.org/projects/rms-personal-data-safe/how-to-destroy-surveillance-capitalism/nb_NO/" target="_top">https://hosted.weblate.org/projects/rms-personal-data-safe/how-to-destroy-surveillance-capitalism/nb_NO/</a>.
</p><p>
<span class="inlinemediaobject"><img src="images/cc-some-rights-reserved.png" align="middle" height="38" alt="Creative Commons, Some rights reserved"></span>
</p><p>
This book is licensed under a Creative Commons license. The license permits any use of this work, as long as attribution is given and no derivative material is distributed. For more information about the license, see <a class="ulink" href="https://creativecommons.org/licenses/by-nd/4.0/" target="_top">https://creativecommons.org/licenses/by-nd/4.0/</a>.
</p></div></div><div><div class="abstract"><p class="title"><b>Abstract</b></p><p>
Our devices and services gather most of the data that the NSA mines in its surveillance project. We pay for these devices and the services they connect to, and then we painstakingly perform the data-entry tasks that record facts about our lives, opinions, and preferences.
</p><p>
Thanks to Big Tech, surveillance capitalism is everywhere. This is not because it is really good at manipulating our behavior, or a rogue abuse of corporate power. It is the result of unchecked monopolism and the abusive behavior it rewards. It is the system working as intended and expected. Cory Doctorow has written an extended critique of Shoshana Zuboff's book The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, with a sober, non-magical analysis of the problem that leads to a different proposal for solving it.
48 </p></div></div></div><hr></div><div class="toc"><p><b>Table of Contents</b></p><dl class="toc"><dt><span class="sect1"><a href="#the-net-of-a-thousand-lies">Sieć tysięcy kłamstw</a></span></dt><dt><span class="sect1"><a href="#digital-rights-activism-a-quarter-century-on">Aktywizm praw cyfrowych, krótka historia 25 lat działalności</a></span></dt><dt><span class="sect1"><a href="#tech-exceptionalism-then-and-now">Wyjątkowość Technologii, dawniej i obecnie</a></span></dt><dt><span class="sect1"><a href="#dont-believe-the-hype">Nie wierz 'szumom' medialnym</a></span></dt><dt><span class="sect1"><a href="#what-is-persuasion">Co to jest przekonywanie?</a></span></dt><dd><dl><dt><span class="sect2"><a href="#segmenting">1. Segmentacja</a></span></dt><dt><span class="sect2"><a href="#deception">2. Podstęp</a></span></dt><dt><span class="sect2"><a href="#domination">3. Dominacja</a></span></dt><dt><span class="sect2"><a href="#bypassing-our-rational-faculties">4. Omijanie naszych racjonalnych zdolności</a></span></dt></dl></dd><dt><span class="sect1"><a href="#if-data-is-the-new-oil-then-surveillance-capitalisms-engine-has-a-leak">Jeśli dane są nowym paliwem, to silnik kapitalistycznych systemów nadzoru ma
49 wyciek</a></span></dt><dt><span class="sect1"><a href="#what-is-facebook">Co to jest Facebook?</a></span></dt><dt><span class="sect1"><a href="#monopoly-and-the-right-to-the-future-tense">Monopol i prawo do czasu przyszłego</a></span></dt><dt><span class="sect1"><a href="#search-order-and-the-right-to-the-future-tense">Porządek wyszukiwania i prawo do czasu przyszłego</a></span></dt><dt><span class="sect1"><a href="#monopolists-can-afford-sleeping-pills-for-watchdogs">Monopoliści mogą sobie pozwolić na proszki nasenne dla strażników</a></span></dt><dt><span class="sect1"><a href="#privacy-and-monopoly">Prywatność a monopol</a></span></dt><dt><span class="sect1"><a href="#ronald-reagan-pioneer-of-tech-monopolism">Ronald Reagan, pionier monopolizmu technologicznego</a></span></dt><dt><span class="sect1"><a href="#steering-with-the-windshield-wipers">Sterowanie za pomocą wycieraczek przedniej szyby</a></span></dt><dt><span class="sect1"><a href="#surveillance-still-matters">Systemy nadzoru mają ciągle znaczenie</a></span></dt><dt><span class="sect1"><a href="#dignity-and-sanctuary">Godność i sanktuarium</a></span></dt><dt><span class="sect1"><a href="#afflicting-the-afflicted">Dręczenie udręczonych</a></span></dt><dt><span class="sect1"><a href="#any-data-you-collect-and-retain-will-eventually-leak">Jakiekolwiek dane, które zbierasz i przetwarzasz, kiedyś w końcu wyciekną</a></span></dt><dt><span class="sect1"><a href="#critical-tech-exceptionalism-is-still-tech-exceptionalism">Przełomowa wyjątkowość technologiczna jest nadal technologiczną
50 wyjątkowością</a></span></dt><dt><span class="sect1"><a href="#how-monopolies-not-mind-control-drive-surveillance-capitalism-the-snapchat-story">Jak monopole, a nie kontrola umysłu, sterują kapitalizmen opartym na
51 systemach nadzoru: historia Snapchat</a></span></dt><dt><span class="sect1"><a href="#a-monopoly-over-your-friends">Monopol sprawowany nad twoimi przyjaciółmi</a></span></dt><dt><span class="sect1"><a href="#fake-news-is-an-epistemological-crisis">Fałszywe wiadomości to oznaka kryzysu epistemologicznego</a></span></dt><dt><span class="sect1"><a href="#tech-is-different">Technologia jest czymś odmiennym</a></span></dt><dt><span class="sect1"><a href="#ownership-of-facts">Własność faktów</a></span></dt><dt><span class="sect1"><a href="#persuasion-works-slowly">Przekonywanie działa… powoli</a></span></dt><dt><span class="sect1"><a href="#paying-wont-help">Płacenie nie pomoże</a></span></dt><dt><span class="sect1"><a href="#an-ecology-moment-for-trustbusting"><span class="quote"><span class="quote"> ekologia</span></span> chwila na zerwanie zaufania</a></span></dt><dt><span class="sect1"><a href="#make-big-tech-small-again">Spraw, aby 'Big Tech' stała się ponownie 'małą' technologią</a></span></dt><dt><span class="sect1"><a href="#goto-10">20 GOTO 10</a></span></dt><dt><span class="sect1"><a href="#up-and-through">W górę i na wylot</a></span></dt></dl></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="the-net-of-a-thousand-lies"></a>Sieć tysięcy kłamstw</h2></div></div></div><p>
The most surprising thing about the 21st-century rebirth of flat Earthers is just how widespread the evidence against them is. You can understand how, centuries ago, people who had never had the chance to see the Earth from orbit could arrive at the commonsense belief that the flat-seeming Earth was, in fact, flat.
</p><p>
But today, when elementary schools routinely tie GoPro cameras to balloons and loft them high enough to photograph the curve of the Earth (to say nothing of the unexceptional sight of the curved Earth from an airplane window), it takes a truly heroic effort to believe the Earth is flat.
</p><p>
Likewise with eugenics and white nationalism: In an age when you can become a computational genomics data point by swabbing your cheek and mailing it, along with a modest sum of money, to a gene-sequencing company, it has never been easier to refute the claims of <span class="quote"><span class="quote">race science</span></span>.
</p><p>
We are living in a golden age of both readily available facts and denial of those facts. Terrible ideas that lingered on the fringes for decades or even centuries have gone mainstream overnight.
</p><p>
When an obscure idea gains currency, there are only two things that can explain its ascendance: Either the person expressing the idea has gotten much better at stating their case, or the claim has become harder to deny in the face of mounting evidence. In other words, if we want people to take climate change seriously, we can get a bunch of Greta Thunbergs to make eloquent, passionate public arguments that win our hearts and minds, or we can wait for floods, fires, broiling sun, and pandemics to make the case for us. In practice, we will probably have to do both: The more we boil, burn, drown, and waste away, the easier it will be for Greta Thunberg to convince us.
</p><p>
The arguments for absurd conspiracy-based beliefs such as anti-vaccination, climate denial, flat Earth, and eugenics are no better than they were a generation ago. Indeed, they are worse, because they are being pitched to people who have at least a background awareness of the facts that refute them.
</p><p>
Anti-vaxxers have been around since the first vaccines were invented, but the early ones were people poorly equipped to understand even the most basic ideas of microbiology, and, moreover, they had not witnessed the mass carnage caused by murderous diseases like polio, smallpox, and measles. Today's anti-vaxxers are no more eloquent than their forebears, and they have a much harder job.
</p><p>
So can these sophisticated conspiracy theorists really be succeeding on the strength of better arguments?
</p><p>
Some people think so. Today there is a widespread belief that machine learning and commercial surveillance can turn even the most fumble-tongued conspiracy theorist into a svengali, someone who can warp your perceptions and win your trust by locating vulnerable people and then pitching them with refined AI arguments that bypass their rational faculties and turn everyday people into flat Earthers, anti-vaxxers, or even Nazis. When the RAND Corporation <a class="ulink" href="https://www.rand.org/content/dam/rand/pubs/research_reports/RR400/RR453/RAND_RR453.pdf" target="_top">blames Facebook for <span class="quote"><span class="quote">radicalization</span></span></a>, and when Facebook is blamed for spreading coronavirus misinformation <a class="ulink" href="https://secure.avaaz.org/campaign/en/facebook_threat_health/" target="_top">because of its algorithm</a>, the implicit message is that machine learning and surveillance are causing the changes in our consensus about what is true.
</p><p>
After all, in a world where sprawling and incoherent conspiracy theories like Pizzagate and its successor, QAnon, have widespread followings, <span class="emphasis"><em>something</em></span> must be going on.
</p><p>
But what if there’s another explanation? What if it’s the material circumstances, and not the arguments, that are making the difference for these conspiracy pitchmen? What if the trauma of living through <span class="emphasis"><em>real conspiracies</em></span> all around us — conspiracies among wealthy people, their lobbyists, and lawmakers to bury inconvenient facts and evidence of wrongdoing (these conspiracies are commonly known as <span class="quote"><span class="quote">corruption</span></span>) — is making people vulnerable to conspiracy theories?
</p><p>
If it’s trauma and not contagion — material conditions and not ideology — that is making the difference today and enabling a rise of repulsive misinformation in the face of easily observed facts, that doesn’t mean our computer networks are blameless. They’re still doing the heavy work of locating vulnerable people and guiding them through a series of ever-more-extreme ideas and communities.
</p><p>
Belief in conspiracy is a raging fire that has done real damage and poses real danger to our planet and species, from epidemics <a class="ulink" href="https://www.cdc.gov/measles/cases-outbreaks.html" target="_top">kicked off by vaccine denial</a> to genocides <a class="ulink" href="https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html" target="_top">kicked off by racist conspiracies</a> to planetary meltdown caused by denial-inspired climate inaction. Our world is on fire, and so we have to put the fires out — to figure out how to help people see the truth of the world through the conspiracies they’ve been confused by.
</p><p>
But firefighting is reactive. We need fire <span class="emphasis"><em>prevention</em></span>. We need to strike at the traumatic material conditions that make people vulnerable to the contagion of conspiracy. Here, too, tech has a role to play.
</p><p>
There’s no shortage of proposals to address this. From the EU’s <a class="ulink" href="https://edri.org/tag/terreg/" target="_top">Terrorist Content Regulation</a>, which requires platforms to police and remove <span class="quote"><span class="quote">extremist</span></span> content, to the U.S. proposals to <a class="ulink" href="https://www.eff.org/deeplinks/2020/03/earn-it-act-violates-constitution" target="_top">force tech companies to spy on their users</a> and hold them liable <a class="ulink" href="https://www.natlawreview.com/article/repeal-cda-section-230" target="_top">for their users’ bad speech</a>, there’s a lot of energy to force tech companies to solve the problems they created.
</p><p>
There’s a critical piece missing from the debate, though. All these solutions assume that tech companies are a fixture, that their dominance over the internet is a permanent fact. Proposals to replace Big Tech with a more diffused, pluralistic internet are nowhere to be found. Worse: The <span class="quote"><span class="quote">solutions</span></span> on the table today <span class="emphasis"><em>require</em></span> Big Tech to stay big because only the very largest companies can afford to implement the systems these laws demand.
</p><p>
Figuring out what we want our tech to look like is crucial if we’re going to get out of this mess. Today, we’re at a crossroads where we’re trying to figure out if we want to fix the Big Tech companies that dominate our internet or if we want to fix the internet itself by unshackling it from Big Tech’s stranglehold. We can’t do both, so we have to choose.
</p><p>
I want us to choose wisely. Taming Big Tech is integral to fixing the internet, and for that, we need digital rights activism.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="digital-rights-activism-a-quarter-century-on"></a>Digital rights activism, a quarter-century on</h2></div></div></div><p>
Digital rights activism is more than 30 years old now. The Electronic Frontier Foundation turned 30 this year; the Free Software Foundation launched in 1985. For most of the history of the movement, the most prominent criticism leveled against it was that it was irrelevant: The real activist causes were real-world causes (think of the skepticism when <a class="ulink" href="https://www.loc.gov/law/foreign-news/article/finland-legal-right-to-broadband-for-all-citizens/#:~:text=Global%20Legal%20Monitor,-Home%20%7C%20Search%20%7C%20Browse&amp;text=(July%206%2C%202010)%20On,connection%20100%20MBPS%20by%202015." target="_top">Finland declared broadband a human right in 2010</a>), and real-world activism was shoe-leather activism (think of Malcolm Gladwell’s <a class="ulink" href="https://www.newyorker.com/magazine/2010/10/04/small-change-malcolm-gladwell" target="_top">contempt for <span class="quote"><span class="quote">clicktivism</span></span></a>). But as tech has grown more central to our daily lives, these accusations of irrelevance have given way first to accusations of insincerity (<span class="quote"><span class="quote">You only care about tech because you’re <a class="ulink" href="https://www.ipwatchdog.com/2018/06/04/report-engine-eff-shills-google-patent-reform/id=98007/" target="_top">shilling for tech companies</a></span></span>) to accusations of negligence (<span class="quote"><span class="quote">Why didn’t you foresee that tech could be such a destructive force?</span></span>). But digital rights activism is right where it’s always been: looking out for the humans in a world where tech is inexorably taking over.
</p><p>
The latest version of this critique comes in the form of <span class="quote"><span class="quote">surveillance capitalism,</span></span> a term coined by business professor Shoshana Zuboff in her long and influential 2019 book, <span class="emphasis"><em>The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power</em></span>. Zuboff argues that <span class="quote"><span class="quote">surveillance capitalism</span></span> is a unique creature of the tech industry and that it is unlike any other abusive commercial practice in history, one that is <span class="quote"><span class="quote">constituted by unexpected and often illegible mechanisms of extraction, commodification, and control that effectively exile persons from their own behavior while producing new markets of behavioral prediction and modification. Surveillance capitalism challenges democratic norms and departs in key ways from the centuries-long evolution of market capitalism.</span></span> It is a new and deadly form of capitalism, a <span class="quote"><span class="quote">rogue capitalism,</span></span> and our lack of understanding of its unique capabilities and dangers represents an existential, species-wide threat. She’s right that capitalism today threatens our species, and she’s right that tech poses unique challenges to our species and civilization, but she’s really wrong about how tech is different and why it threatens our species.
</p><p>
What’s more, I think that her incorrect diagnosis will lead us down a path that ends up making Big Tech stronger, not weaker. We need to take down Big Tech, and to do that, we need to start by correctly identifying the problem.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="tech-exceptionalism-then-and-now"></a>Tech exceptionalism, then and now</h2></div></div></div><p>
Early critics of the digital rights movement — perhaps best represented by campaigning organizations like the Electronic Frontier Foundation, the Free Software Foundation, Public Knowledge, and others that focused on preserving and enhancing basic human rights in the digital realm — damned activists for practicing <span class="quote"><span class="quote">tech exceptionalism.</span></span> Around the turn of the millennium, serious people ridiculed any claim that tech policy mattered in the <span class="quote"><span class="quote">real world.</span></span> Claims that tech rules had implications for speech, association, privacy, search and seizure, and fundamental rights and equities were treated as ridiculous, an elevation of the concerns of sad nerds arguing about <span class="emphasis"><em>Star Trek</em></span> on bulletin board systems above the struggles of the Freedom Riders, Nelson Mandela, or the Warsaw ghetto uprising.
</p><p>
In the decades since, accusations of <span class="quote"><span class="quote">tech exceptionalism</span></span> have only sharpened as tech’s role in everyday life has expanded: Now that tech has infiltrated every corner of our life and our online lives have been monopolized by a handful of giants, defenders of digital freedoms are accused of carrying water for Big Tech, providing cover for its self-interested negligence (or worse, nefarious plots).
</p><p>
From my perspective, the digital rights movement has remained stationary while the rest of the world has moved. From the earliest days, the movement’s concern was users and the toolsmiths who provided the code they needed to realize their fundamental rights. Digital rights activists only cared about companies to the extent that companies were acting to uphold users’ rights (or, just as often, when companies were acting so foolishly that they threatened to bring down new rules that would also make it harder for good actors to help users).
</p><p>
The <span class="quote"><span class="quote">surveillance capitalism</span></span> critique recasts the digital rights movement in a new light again: not as alarmists who overestimate the importance of their shiny toys nor as shills for big tech but as serene deck-chair rearrangers whose long-standing activism is a liability because it makes them incapable of perceiving novel threats as they continue to fight the last century’s tech battles.
</p><p>
But tech exceptionalism is a sin no matter who practices it.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="dont-believe-the-hype"></a>Don’t believe the hype</h2></div></div></div><p>
You’ve probably heard that <span class="quote"><span class="quote">if you’re not paying for the product, you’re the product.</span></span> As we’ll see below, that’s true, if incomplete. But what is <span class="emphasis"><em>absolutely</em></span> true is that ad-driven Big Tech’s customers are advertisers, and what companies like Google and Facebook sell is their ability to convince <span class="emphasis"><em>you</em></span> to buy stuff. Big Tech’s product is persuasion. The services — social media, search engines, maps, messaging, and more — are delivery systems for persuasion.
</p><p>
The fear of surveillance capitalism starts from the (correct) presumption that everything Big Tech says about itself is probably a lie. But the surveillance capitalism critique makes an exception for the claims Big Tech makes in its sales literature — the breathless hype in the pitches to potential advertisers online and in ad-tech seminars about the efficacy of its products: It assumes that Big Tech is as good at influencing us as they claim they are when they’re selling influencing products to credulous customers. That’s a mistake because sales literature is not a reliable indicator of a product’s efficacy.
</p><p>
Surveillance capitalism assumes that because advertisers buy a lot of what Big Tech is selling, Big Tech must be selling something real. But Big Tech’s massive sales could just as easily be the result of a popular delusion or something even more pernicious: monopolistic control over our communications and commerce.
</p><p>
Being watched changes your behavior, and not for the better. It creates risks for our social progress. Zuboff’s book features beautifully wrought explanations of these phenomena. But Zuboff also claims that surveillance literally robs us of our free will — that when our personal data is mixed with machine learning, it creates a system of persuasion so devastating that we are helpless before it. That is, Facebook uses an algorithm to analyze the data it nonconsensually extracts from your daily life and uses it to customize your feed in ways that get you to buy stuff. It is a mind-control ray out of a 1950s comic book, wielded by mad scientists whose supercomputers guarantee them perpetual and total world domination.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="what-is-persuasion"></a>What is persuasion?</h2></div></div></div><p>
To understand why you shouldn’t worry about mind-control rays — but why you <span class="emphasis"><em>should</em></span> worry about surveillance <span class="emphasis"><em>and</em></span> Big Tech — we must start by unpacking what we mean by <span class="quote"><span class="quote">persuasion.</span></span>
</p><p>
Google, Facebook, and other surveillance capitalists promise their customers (the advertisers) that if they use machine-learning tools trained on unimaginably large data sets of nonconsensually harvested personal information, they will be able to uncover ways to bypass the rational faculties of the public and direct their behavior, creating a stream of purchases, votes, and other desired outcomes.
</p><div class="blockquote"><blockquote class="blockquote"><p>
The impact of dominance far exceeds the impact of manipulation and should be central to our analysis and any remedies we seek.
</p></blockquote></div><p>
But there’s little evidence that this is happening. Instead, the predictions that surveillance capitalism delivers to its customers are much less impressive. Rather than finding ways to bypass our rational faculties, surveillance capitalists like Mark Zuckerberg mostly do one or more of three things:
</p><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="segmenting"></a>1. Segmenting</h3></div></div></div><p>
If you’re selling diapers, you have better luck if you pitch them to people in maternity wards. Not everyone who enters or leaves a maternity ward just had a baby, and not everyone who just had a baby is in the market for diapers. But having a baby is a really reliable correlate of being in the market for diapers, and being in a maternity ward is highly correlated with having a baby. Hence diaper ads around maternity wards (and even pitchmen for baby products, who haunt maternity wards with baskets full of freebies).
</p><p>
Surveillance capitalism is segmenting times a billion. Diaper vendors can go way beyond people in maternity wards (though they can do that, too, with things like location-based mobile ads). They can target you based on whether you’re reading articles about child-rearing, diapers, or a host of other subjects, and data mining can suggest unobvious keywords to advertise against. They can target you based on the articles you’ve recently read. They can target you based on what you’ve recently purchased. They can target you based on whether you receive emails or private messages about these subjects — or even if you speak aloud about them (though Facebook and the like convincingly claim that’s not happening — yet).
</p><p>
This is seriously creepy.
</p><p>
But it’s not mind control.
</p><p>
It doesn’t deprive you of your free will. It doesn’t trick you.
</p><p>
Think of how surveillance capitalism works in politics. Surveillance capitalist companies sell political operatives the power to locate people who might be receptive to their pitch. Candidates campaigning on finance industry corruption seek people struggling with debt; candidates campaigning on xenophobia seek out racists. Political operatives have always targeted their message whether their intentions were honorable or not: Union organizers set up pitches at factory gates, and white supremacists hand out fliers at John Birch Society meetings.
</p><p>
But this is an inexact and thus wasteful practice. The union organizer can’t know which worker to approach on the way out of the factory gates and may waste their time on a covert John Birch Society member; the white supremacist doesn’t know which of the Birchers are so delusional that making it to a meeting is as much as they can manage and which ones might be convinced to cross the country to carry a tiki torch through the streets of Charlottesville, Virginia.
</p><p>
Because targeting improves the yields on political pitches, it can accelerate the pace of political upheaval by making it possible for everyone who has secretly wished for the toppling of an autocrat — or just an 11-term incumbent politician — to find everyone else who feels the same way at very low cost. This has been critical to the rapid crystallization of recent political movements including Black Lives Matter and Occupy Wall Street as well as less savory players like the far-right white nationalist movements that marched in Charlottesville.
</p><p>
It’s important to differentiate this kind of political organizing from influence campaigns; finding people who secretly agree with you isn’t the same as convincing people to agree with you. The rise of phenomena like nonbinary or otherwise nonconforming gender identities is often characterized by reactionaries as the result of online brainwashing campaigns that convince impressionable people that they have been secretly queer all along.
</p><p>
But the personal accounts of those who have come out tell a different story where people who long harbored a secret about their gender were emboldened by others coming forward and where people who knew that they were different but lacked a vocabulary for discussing that difference learned the right words from these low-cost means of finding people and learning about their ideas.
</p></div><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="deception"></a>2. Deception</h3></div></div></div><p>
Lies and fraud are pernicious, and surveillance capitalism supercharges them
through targeting. If you want to sell a fraudulent payday loan or subprime
mortgage, surveillance capitalism can help you find people who are both
desperate and unsophisticated and thus receptive to your pitch. This
accounts for the rise of many phenomena, like multilevel marketing schemes,
in which deceptive claims about potential earnings and the efficacy of sales
techniques are targeted at desperate people by advertising against search
queries that indicate, for example, someone struggling with ill-advised
loans.
</p><p>
Surveillance capitalism also abets fraud by making it easy to locate other
people who have been similarly deceived, forming a community of people who
reinforce one another’s false beliefs. Think of <a class="ulink" href="https://www.vulture.com/2020/01/the-dream-podcast-review.html" target="_top">the
forums</a> where people who are being victimized by multilevel marketing
frauds gather to trade tips on how to improve their luck in peddling the
product.
</p><p>
Sometimes, online deception involves replacing someone’s correct beliefs
with incorrect ones, as it does in the anti-vaccination movement, whose
victims are often people who start out believing in vaccines but are
convinced by seemingly plausible evidence that leads them into the false
belief that vaccines are harmful.
</p><p>
But it’s much more common for fraud to succeed when it doesn’t have to
displace a true belief. When my daughter contracted head lice at daycare,
one of the daycare workers told me I could get rid of them by treating her
hair and scalp with olive oil. I didn’t know anything about head lice, and I
assumed that the daycare worker did, so I tried it (it didn’t work, and it
doesn’t work). It’s easy to end up with false beliefs when you simply don’t
know any better and when those beliefs are conveyed by someone who seems to
know what they’re doing.
</p><p>
This is pernicious and difficult — and it’s also the kind of thing the
internet can help guard against by making true information available,
especially in a form that exposes the underlying deliberations among parties
with sharply divergent views, such as Wikipedia. But it’s not brainwashing;
it’s fraud. In the <a class="ulink" href="https://datasociety.net/library/data-voids/" target="_top">majority of cases</a>,
the victims of these fraud campaigns have an informational void filled in
the customary way, by consulting a seemingly reliable source. If I look up
the length of the Brooklyn Bridge and learn that it is 5,800 feet long, but
in reality, it is 5,989 feet long, the underlying deception is a problem,
but it’s a problem with a simple remedy. It’s a very different problem from
the anti-vax issue in which someone’s true belief is displaced by a false
one by means of sophisticated persuasion.
</p></div><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="domination"></a>3. Domination</h3></div></div></div><p>
Surveillance capitalism is the result of monopoly. Monopoly is the cause,
and surveillance capitalism and its negative outcomes are the effects of
monopoly. I’ll get into this in depth later, but for now, suffice it to say
that the tech industry has grown up with a radical theory of antitrust that
has allowed companies to grow by merging with their rivals, buying up their
nascent competitors, and expanding to control whole market verticals.
</p><p>
One example of how monopolism aids in persuasion is through dominance:
Google makes editorial decisions about its algorithms that determine the
sort order of the responses to our queries. If a cabal of fraudsters have
set out to trick the world into thinking that the Brooklyn Bridge is 5,800
feet long, and if Google gives a high search rank to this group in response
to queries like <span class="quote"><span class="quote">How long is the Brooklyn Bridge?</span></span> then the
first eight or 10 screens’ worth of Google results could be wrong. And since
most people don’t go beyond the first couple of results — let alone the
first <span class="emphasis"><em>page</em></span> of results — Google’s choice means that many
people will be deceived.
</p><p>
Google’s dominance over search — more than 86% of web searches are performed
through Google — means that the way it orders its search results has an
outsized effect on public beliefs. Ironically, Google claims this is why it
can’t afford to have any transparency in its algorithm design: Google’s
search dominance makes the results of its sorting too important to risk
telling the world how it arrives at those results lest some bad actor
discover a flaw in the ranking system and exploit it to push its point of
view to the top of the search results. There’s an obvious remedy to a
company that is too big to audit: break it up into smaller pieces.
</p><p>
Zuboff calls surveillance capitalism a <span class="quote"><span class="quote">rogue capitalism</span></span> whose
data-hoarding and machine-learning techniques rob us of our free will. But
influence campaigns that seek to displace existing, correct beliefs with
false ones have an effect that is small and temporary while monopolistic
dominance over informational systems has massive, enduring
effects. Controlling the results to the world’s search queries means
controlling access both to arguments and their rebuttals and, thus, control
over much of the world’s beliefs. If our concern is how corporations are
foreclosing on our ability to make up our own minds and determine our own
futures, the impact of dominance far exceeds the impact of manipulation and
should be central to our analysis and any remedies we seek.
</p></div><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="bypassing-our-rational-faculties"></a>4. Bypassing Our Rational Faculties</h3></div></div></div><p>
<span class="emphasis"><em>This</em></span> is the good stuff: using machine learning,
<span class="quote"><span class="quote">dark patterns,</span></span> engagement hacking, and other techniques to
get us to do things that run counter to our better judgment. This is mind
control.
</p><p>
Some of these techniques have proven devastatingly effective (if only in the
short term). The use of countdown timers on a purchase completion page can
create a sense of urgency that causes you to ignore the nagging internal
voice suggesting that you should shop around or sleep on your decision. The
use of people from your social graph in ads can provide <span class="quote"><span class="quote">social
proof</span></span> that a purchase is worth making. Even the auction system
pioneered by eBay is calculated to play on our cognitive blind spots,
letting us feel like we <span class="quote"><span class="quote">own</span></span> something because we bid on it,
thus encouraging us to bid again when we are outbid to ensure that
<span class="quote"><span class="quote">our</span></span> things stay ours.
</p><p>
Games are extraordinarily good at this. <span class="quote"><span class="quote">Free to play</span></span> games
manipulate us through many techniques, such as presenting players with a
series of smoothly escalating challenges that create a sense of mastery and
accomplishment but which sharply transition into a set of challenges that
are impossible to overcome without paid upgrades. Add some social proof to
the mix — a stream of notifications about how well your friends are faring —
and before you know it, you’re buying virtual power-ups to get to the next
level.
</p><p>
Companies have risen and fallen on these techniques, and the
<span class="quote"><span class="quote">fallen</span></span> part is worth paying attention to. In general, living
things adapt to stimulus: Something that is very compelling or noteworthy
when you first encounter it fades with repetition until you stop noticing it
altogether. Consider the refrigerator hum that irritates you when it starts
up but disappears into the background so thoroughly that you only notice it
when it stops again.
</p><p>
That’s why behavioral conditioning uses <span class="quote"><span class="quote">intermittent reinforcement
schedules.</span></span> Instead of giving you a steady drip of encouragement or
setbacks, games and gamified services scatter rewards on a randomized
schedule — often enough to keep you interested and random enough that you
can never quite find the pattern that would make it boring.
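The logic of such a schedule can be sketched in a few lines (a hypothetical simulation for illustration only; the function name and payout probability are assumptions, not anything drawn from a real service):

```python
import random

def variable_ratio_rewards(actions, p=0.25, seed=1):
    """Pay out with a fixed probability per action, so rewards arrive
    often enough to hold interest but in no learnable pattern."""
    rng = random.Random(seed)
    return [rng.random() < p for _ in range(actions)]

# Roughly a quarter of actions are rewarded, but the spacing between
# rewards never settles into a pattern the player could predict.
hits = variable_ratio_rewards(1000)
print(f"{sum(hits)} rewards over 1000 actions, unpredictably spaced")
```

The key design point is that the reward depends only on chance per action, never on a counter the player could learn to anticipate.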
</p><p>
Intermittent reinforcement is a powerful behavioral tool, but it also
represents a collective action problem for surveillance capitalism. The
<span class="quote"><span class="quote">engagement techniques</span></span> invented by the behaviorists of
surveillance capitalist companies are quickly copied across the whole sector
so that what starts as a mysteriously compelling fillip in the design of a
service — like <span class="quote"><span class="quote">pull to refresh</span></span> or alerts when someone likes
your posts or side quests that your characters get invited to while in the
midst of main quests — quickly becomes dully ubiquitous. The
impossible-to-nail-down nonpattern of randomized drips from your phone
becomes a grey-noise wall of sound as every single app and site starts to
make use of whatever seems to be working at the time.
</p><p>
From the surveillance capitalist’s point of view, our adaptive capacity is
like a harmful bacterium that deprives it of its food source — our attention
— and novel techniques for snagging that attention are like new antibiotics
that can be used to breach our defenses and destroy our
self-determination. And there <span class="emphasis"><em>are</em></span> techniques like
that. Who can forget the Great Zynga Epidemic, when all of our friends were
caught in <span class="emphasis"><em>FarmVille</em></span>’s endless, mindless dopamine loops?
But every new attention-commanding technique is jumped on by the whole
industry and used so indiscriminately that antibiotic resistance sets
in. Given enough repetition, almost all of us develop immunity to even the
most powerful techniques — by 2013, two years after Zynga’s peak, its user
base had halved.
</p><p>
Not everyone, of course. Some people never adapt to stimulus, just as some
people never stop hearing the hum of the refrigerator. This is why most
people who are exposed to slot machines play them for a while and then move
on while a small and tragic minority liquidate their kids’ college funds,
buy adult diapers, and position themselves in front of a machine until they
collapse.
</p><p>
But surveillance capitalism’s margins on behavioral modification
suck. Tripling the rate at which someone buys a widget sounds great <a class="ulink" href="https://www.forbes.com/sites/priceonomics/2018/03/09/the-advertising-conversion-rates-for-every-major-tech-platform/#2f6a67485957" target="_top">unless
the base rate is way less than 1%</a> with an improved rate of… still
less than 1%. Even penny slot machines pull down pennies for every spin
while surveillance capitalism rakes in infinitesimal penny fractions.
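The arithmetic behind that complaint is easy to make concrete with a back-of-the-envelope sketch (the specific numbers here are hypothetical, chosen only to illustrate the base-rate effect, not taken from any platform's actual figures):

```python
# Hypothetical conversion figures: "tripling" a tiny base rate
# still leaves the improved rate far below 1%.
base_rate = 0.002              # assume 0.2% of viewers buy anyway
improved_rate = 3 * base_rate  # the "tripled" rate is still only 0.6%

impressions = 1_000_000
extra_sales = impressions * (improved_rate - base_rate)
print(f"improved rate {improved_rate:.1%}, "
      f"{extra_sales:.0f} extra sales per million impressions")
```

Under these assumed numbers, a headline-grabbing 3x improvement buys only a fraction of a percent of viewers, which is the essay's point about infinitesimal margins.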
</p><p>
Slot machines’ high returns mean that they can be profitable just by
draining the fortunes of the small rump of people who are pathologically
vulnerable to them and unable to adapt to their tricks. But surveillance
capitalism can’t survive on the fractional pennies it brings down from that
vulnerable sliver — that’s why, after the Great Zynga Epidemic had finally
burned itself out, the small number of still-addicted players left behind
couldn’t sustain it as a global phenomenon. And new powerful attention
weapons aren’t easy to find, as is evidenced by the long years since the
last time Zynga had a hit. Despite the hundreds of millions of dollars that
Zynga has to spend on developing new tools to blast through our adaptation,
it has never managed to repeat the lucky accident that let it snag so much
of our attention for a brief moment in 2009. Powerhouses like Supercell have
fared a little better, but they are rare and throw away many failures for
every success.
</p><p>
The vulnerability of small segments of the population to dramatic, efficient
corporate manipulation is a real concern that’s worthy of our attention and
energy. But it’s not an existential threat to society.
</p></div></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="if-data-is-the-new-oil-then-surveillance-capitalisms-engine-has-a-leak"></a>If Data Is the New Oil, Then Surveillance
Capitalism’s Engine Has a Leak</h2></div></div></div><p>
This adaptation problem offers an explanation for one of surveillance
capitalism’s most alarming traits: its relentless hunger for data and its
endless expansion of data-gathering capabilities through the spread of
sensors, online surveillance, and acquisition of data streams from third
parties.
</p><p>
Zuboff observes this phenomenon and concludes that data must be very
valuable if surveillance capitalism is so hungry for it. (In her words:
<span class="quote"><span class="quote">Just as industrial capitalism was driven to the continuous
intensification of the means of production, so surveillance capitalists and
their market players are now locked into the continuous intensification of
the means of behavioral modification and the gathering might of
instrumentarian power.</span></span>) But what if the voracious appetite is
because data has such a short half-life — because people become inured so
quickly to new, data-driven persuasion techniques — that the companies are
locked in an arms race with our limbic system? What if it’s all a Red
Queen’s race where they have to run ever faster — collect ever-more data —
just to stay in the same spot?
</p><p>
Of course, all of Big Tech’s persuasion techniques work in concert with one
another, and collecting data is useful beyond mere behavioral trickery.
</p><p>
If someone wants to recruit you to buy a refrigerator or join a pogrom, they
might use profiling and targeting to send messages to people they judge to
be good sales prospects. The messages themselves may be deceptive, making
claims about things you’re not very knowledgeable about (food safety and
energy efficiency or eugenics and historical claims about racial
superiority). They might use search engine optimization and/or armies of
fake reviewers and commenters and/or paid placement to dominate the
discourse so that any search for further information takes you back to their
messages. And finally, they may refine the different pitches using machine
learning and other techniques to figure out what kind of pitch works best on
someone like you.
</p><p>
Each phase of this process benefits from surveillance: The more data they
have, the more precisely they can profile you and target you with specific
messages. Think of how you’d sell a fridge if you knew that the warranty on
your prospect’s fridge just expired and that they were expecting a tax
rebate in April.
</p><p>
Also, the more data they have, the better they can craft deceptive messages
— if I know that you’re into genealogy, I might not try to feed you
pseudoscience about genetic differences between <span class="quote"><span class="quote">races,</span></span>
sticking instead to conspiratorial secret histories of <span class="quote"><span class="quote">demographic
replacement</span></span> and the like.
</p><p>
Facebook also helps you locate people who have the same odious or antisocial
views as you. It makes it possible to find other people who want to carry
tiki torches through the streets of Charlottesville in Confederate
cosplay. It can help you find other people who want to join your militia and
go to the border to look for undocumented migrants to terrorize. It can help
you find people who share your belief that vaccines are poison and that the
Earth is flat.
</p><p>
There is one way in which targeted advertising uniquely benefits those
advocating for socially unacceptable causes: It is invisible. Racism is
widely geographically dispersed, and there are few places where racists —
and only racists — gather. This is similar to the problem of selling
refrigerators in that potential refrigerator purchasers are geographically
dispersed and there are few places where you can buy an ad that will be
primarily seen by refrigerator customers. But buying a refrigerator is
socially acceptable while being a Nazi is not, so you can buy a billboard or
advertise in the newspaper sports section for your refrigerator business,
and the only potential downside is that your ad will be seen by a lot of
people who don’t want refrigerators, resulting in a lot of wasted expense.
</p><p>
But even if you wanted to advertise your Nazi movement on a billboard or
prime-time TV or the sports section, you would struggle to find anyone
willing to sell you the space for your ad partly because they disagree with
your views and partly because they fear censure (boycott, reputational
damage, etc.) from other people who disagree with your views.
</p><p>
Targeted ads solve this problem: On the internet, every ad unit can be
different for every person, meaning that you can buy ads that are only shown
to people who appear to be Nazis and not to people who hate Nazis. When
there’s spillover — when someone who hates racism is shown a racist
recruiting ad — there is some fallout; the platform or publication might get
an angry public or private denunciation. But the nature of the risk assumed
by an online ad buyer is different than the risks to a traditional publisher
or billboard owner who might want to run a Nazi ad.
</p><p>
Online ads are placed by algorithms that broker between a diverse ecosystem
of self-serve ad platforms that anyone can buy an ad through, so the Nazi ad
that slips onto your favorite online publication isn’t seen as their moral
failing but rather as a failure in some distant, upstream ad supplier. When
a publication gets a complaint about an offensive ad that’s appearing in one
of its units, it can take some steps to block that ad, but the Nazi might
buy a slightly different ad from a different broker serving the same
unit. And in any event, internet users increasingly understand that when
they see an ad, it’s likely that the advertiser did not choose that
publication and that the publication has no idea who its advertisers are.
</p><p>
These layers of indirection between advertisers and publishers serve as
moral buffers: Today’s moral consensus is largely that publishers shouldn’t
be held responsible for the ads that appear on their pages because they’re
not actively choosing to put those ads there. Because of this, Nazis are
able to overcome significant barriers to organizing their movement.
</p><p>
Data has a complex relationship with domination. Being able to spy on your
customers can alert you to their preferences for your rivals and allow you
to head off your rivals at the pass.
</p><p>
More importantly, if you can dominate the information space while also
gathering data, then you make other deceptive tactics stronger because it’s
harder to break out of the web of deceit you’re spinning. Domination — that
is, ultimately becoming a monopoly — and not the data itself is the
supercharger that makes every tactic worth pursuing because monopolistic
domination deprives your target of an escape route.
</p><p>
If you’re a Nazi who wants to ensure that your prospects primarily see
deceptive, confirming information when they search for more, you can improve
your odds by seeding the search terms they use through your initial
communications. You don’t need to own the top 10 results for <span class="quote"><span class="quote">voter
suppression</span></span> if you can convince your marks to confine their search
terms to <span class="quote"><span class="quote">voter fraud,</span></span> which throws up a very different set of
search results.
</p><p>
Surveillance capitalists are like stage mentalists who claim that their
extraordinary insights into human behavior let them guess the word that you
wrote down and folded up in your pocket but who really use shills, hidden
cameras, sleight of hand, and brute-force memorization to amaze you.
</p><p>
Or perhaps they’re more like pick-up artists, the misogynistic cult that
promises to help awkward men have sex with women by teaching them
<span class="quote"><span class="quote">neurolinguistic programming</span></span> phrases, body language
techniques, and psychological manipulation tactics like
<span class="quote"><span class="quote">negging</span></span> — offering unsolicited negative feedback to women to
lower their self-esteem and prick their interest.
</p><p>
Some pick-up artists eventually manage to convince women to go home with
them, but it’s not because these men have figured out how to bypass women’s
critical faculties. Rather, pick-up artists’ <span class="quote"><span class="quote">success</span></span> stories
are a mix of women who were incapable of giving consent, women who were
coerced, women who were intoxicated, self-destructive women, and a few women
who were sober and in command of their faculties but who didn’t realize
straightaway that they were with terrible men but rectified the error as
soon as they could.
</p><p>
Pick-up artists <span class="emphasis"><em>believe</em></span> they have figured out a secret
back door that bypasses women’s critical faculties, but they haven’t. Many
of the tactics they deploy, like negging, became the butt of jokes (just
like people joke about bad ad targeting), and there’s a good chance that
anyone they try these tactics on will immediately recognize them and dismiss
the men who use them as irredeemable losers.
</p><p>
Pick-up artists are proof that people can believe they have developed a
system of mind control <span class="emphasis"><em>even when it doesn’t
work</em></span>. Pick-up artists simply exploit the fact that
one-in-a-million chances can come through for you if you make a million
attempts, and then they assume that the other 999,999 times, they simply
performed the technique incorrectly and commit themselves to doing better
next time. There’s only one group of people who find pick-up artist lore
reliably convincing: other would-be pick-up artists whose anxiety and
insecurity make them vulnerable to scammers and delusional men who convince
them that if they pay for tutelage and follow instructions, then they will
someday succeed. Pick-up artists assume they fail to entice women because
they are bad at being pick-up artists, not because pick-up artistry is
bullshit. Pick-up artists are bad at selling themselves to women, but
they’re much better at selling themselves to men who pay to learn the
secrets of pick-up artistry.
</p><p>
Department store pioneer John Wanamaker is said to have lamented,
<span class="quote"><span class="quote">Half the money I spend on advertising is wasted; the trouble is I
don’t know which half.</span></span> The fact that Wanamaker thought that only
half of his advertising spending was wasted is a tribute to the
persuasiveness of advertising executives, who are <span class="emphasis"><em>much</em></span>
better at convincing potential clients to buy their services than they are
at convincing the general public to buy their clients’ wares.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="what-is-facebook"></a>What Is Facebook?</h2></div></div></div><p>
Facebook is heralded as the origin of all of our modern plagues, and it’s
not hard to see why. Some tech companies want to lock their users in but
make their money by monopolizing access to the market for apps for their
devices and gouging them on prices rather than by spying on them (like
Apple). Some companies don’t care about locking in users because they’ve
figured out how to spy on them no matter where they are and what they’re
doing and can turn that surveillance into money (Google). Facebook alone
among the Western tech giants has built a business based on locking in its
users <span class="emphasis"><em>and</em></span> spying on them all the time.
</p><p>
Facebook’s surveillance regime is really without parallel in the Western
world. Though Facebook tries to prevent itself from being visible on the
public web, hiding most of what goes on there from people unless they’re
logged into Facebook, the company has nevertheless booby-trapped the entire
web with surveillance tools in the form of Facebook <span class="quote"><span class="quote">Like</span></span>
buttons that web publishers include on their sites to boost their Facebook
profiles. Facebook also makes various libraries and other useful code
snippets available to web publishers that act as surveillance tendrils on
the sites where they’re used, funneling information about visitors to the
site — newspapers, dating sites, message boards — to Facebook.
</p><div class="blockquote"><blockquote class="blockquote"><p>
Big Tech is able to practice surveillance not just because it is tech but
because it is <span class="emphasis"><em>big</em></span>.
</p></blockquote></div><p>
Facebook offers similar tools to app developers, so the apps — games, fart
machines, business review services, apps for keeping abreast of your kid’s
schooling — you use will send information about your activities to Facebook
even if you don’t have a Facebook account and even if you don’t download or
use Facebook apps. On top of all that, Facebook buys data from third-party
brokers on shopping habits, physical location, use of <span class="quote"><span class="quote">loyalty</span></span>
programs, financial transactions, etc., and cross-references that with the
dossiers it develops on activity on Facebook and with apps and the public
web.
</p><p>
Though it’s easy to integrate the web with Facebook — linking to news
stories and such — Facebook products are generally not available to be
integrated back into the web itself. You can embed a tweet in a Facebook
post, but if you embed a Facebook post in a tweet, you just get a link back
to Facebook and must log in before you can see it. Facebook has used extreme
technological and legal countermeasures to prevent rivals from allowing
their users to embed Facebook snippets in competing services or to create
alternative interfaces to Facebook that merge your Facebook inbox with those
of other services that you use.
</p><p>
And Facebook is incredibly popular, with 2.3 billion claimed users (though
many believe this figure to be inflated). Facebook has been used to organize
genocidal pogroms, racist riots, anti-vaccination movements, flat Earth
cults, and the political lives of some of the world’s ugliest, most brutal
autocrats. There are some really alarming things going on in the world, and
Facebook is implicated in many of them, so it’s easy to conclude that these
bad things are the result of Facebook’s mind-control system, which it rents
out to anyone with a few bucks to spend.
</p><p>
To understand what role Facebook plays in the formulation and mobilization
of antisocial movements, we need to understand the dual nature of Facebook.
</p><p>
Because it has a lot of users and a lot of data about those users, Facebook
is a very efficient tool for locating people with hard-to-find traits, the
kinds of traits that are widely diffused in the population such that
advertisers have historically struggled to find a cost-effective way to
reach them. Think back to refrigerators: Most of us only replace our major
appliances a few times in our entire lives. If you’re a refrigerator
manufacturer or retailer, you have these brief windows in the life of a
consumer during which they are pondering a purchase, and you have to somehow
reach them. Anyone who’s ever registered a title change after buying a house
can attest that appliance manufacturers are incredibly desperate to reach
anyone who has even the slenderest chance of being in the market for a new
fridge.
</p><p>
Facebook makes finding people shopping for refrigerators a
<span class="emphasis"><em>lot</em></span> easier. It can target ads to people who’ve
registered a new home purchase, to people who’ve searched for refrigerator
buying advice, to people who have complained about their fridge dying, or
any combination thereof. It can even target people who’ve recently bought
<span class="emphasis"><em>other</em></span> kitchen appliances on the theory that someone
who’s just replaced their stove and dishwasher might be in a fridge-buying
kind of mood. The vast majority of people who are reached by these ads will
not be in the market for a new fridge, but — crucially — the percentage of
people who <span class="emphasis"><em>are</em></span> looking for fridges that these ads reach
is <span class="emphasis"><em>much</em></span> larger than for any group that might
be subjected to traditional, offline targeted refrigerator marketing.
809 </p><p>
Facebook also makes it a lot easier to find people who have the same rare
disease as you, which might have been impossible in earlier eras — the
closest fellow sufferer might otherwise be hundreds of miles away. It makes
it easier to find people who went to the same high school as you even though
decades have passed and your former classmates have all been scattered to
the four corners of the Earth.
</p><p>
Facebook also makes it much easier to find people who hold the same rare
political beliefs as you. If you’ve always harbored a secret affinity for
socialism but never dared utter this aloud lest you be demonized by your
neighbors, Facebook can help you discover other people who feel the same way
(and it might just demonstrate to you that your affinity is more widespread
than you ever suspected). It can make it easier to find people who share
your sexual identity. And again, it can help you to understand that what
you thought was a shameful secret that affected only you was really a widely
shared trait, giving you both comfort and the courage to come out to the
people in your life.
</p><p>
All of this presents a dilemma for Facebook: Targeting makes the company’s
ads more effective than traditional ads, but it also lets advertisers see
just how effective their ads are. While advertisers are pleased to learn
that Facebook ads are more effective than ads on systems with less
sophisticated targeting, advertisers can also see that in nearly every case,
the people who see their ads ignore them. Or, at best, the ads work on a
subconscious level, creating nebulous unmeasurables like <span class="quote"><span class="quote">brand
recognition.</span></span> This means that the price per ad is very low in nearly
every case.
</p><p>
To make things worse, many Facebook groups spark precious little
discussion. Your little-league soccer team, the people with the same rare
disease as you, and the people you share a political affinity with may
exchange the odd flurry of messages at critical junctures, but on a daily
basis, there’s not much to say to your old high school chums or other
hockey-card collectors.
</p><p>
With nothing but <span class="quote"><span class="quote">organic</span></span> discussion, Facebook would not
generate enough traffic to sell enough ads to make the money it needs to
continually expand by buying up its competitors while returning handsome
sums to its investors.
</p><p>
So Facebook has to gin up traffic by sidetracking its own forums: Every time
Facebook’s algorithm injects controversial materials — inflammatory
political articles, conspiracy theories, outrage stories — into a group, it
can hijack that group’s nominal purpose with its desultory discussions and
supercharge those discussions by turning them into bitter, unproductive
arguments that drag on and on. Facebook is optimized for engagement, not
happiness, and it turns out that automated systems are pretty good at
figuring out things that people will get angry about.
</p><p>
Facebook <span class="emphasis"><em>can</em></span> modify our behavior but only in a couple
of trivial ways. First, it can lock in all your friends and family members
so that you check and check and check with Facebook to find out what they
are up to; and second, it can make you angry and anxious. It can force you
to choose between being interrupted constantly by updates — a process that
breaks your concentration and makes it hard to be introspective — and
staying in touch with your friends. This is a very limited form of mind
control, and it can only really make us miserable, angry, and anxious.
</p><p>
This is why Facebook’s targeting systems — both the ones it shows to
advertisers and the ones that let users find people who share their
interests — are so next-gen and smooth and easy to use as well as why its
message boards have a toolset that seems like it hasn’t changed since the
mid-2000s. If Facebook delivered an equally flexible, sophisticated
message-reading system to its users, those users could defend themselves
against being nonconsensually eyeball-fucked with Donald Trump headlines.
</p><p>
The more time you spend on Facebook, the more ads it gets to show you. The
solution to Facebook’s ads only working one in a thousand times is for the
company to try to increase how much time you spend on Facebook by a factor
of a thousand. Rather than thinking of Facebook as a company that has
figured out how to show you exactly the right ad in exactly the right way to
get you to do what its advertisers want, think of it as a company that has
figured out how to make you slog through an endless torrent of arguments
even though they make you miserable, spending so much time on the site that
it eventually shows you at least one ad that you respond to.
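The arithmetic here can be made explicit: if an ad draws a response roughly once per thousand impressions, the only lever left is a roughly thousandfold increase in impressions, i.e., in time spent on the site. A toy model (the rates are illustrative, taken from the text’s one-in-a-thousand figure, not measured data):

```python
# Toy model of the engagement arithmetic described above.

RESPONSE_RATE = 1 / 1000        # one in a thousand ads gets a response
IMPRESSIONS_PER_SESSION = 1     # illustrative simplification

def expected_responses(sessions):
    """Expected ad responses scale linearly with sessions (time on site)."""
    return sessions * IMPRESSIONS_PER_SESSION * RESPONSE_RATE

print(expected_responses(1))     # almost never: 0.001 expected responses
print(expected_responses(1000))  # a thousandfold more time yields ~1 response
```

The linearity is the whole argument: with a fixed, tiny response rate, total responses can only grow by growing exposure.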
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="monopoly-and-the-right-to-the-future-tense"></a>Monopoly and the right to the future tense</h2></div></div></div><p>
Zuboff and her cohort are particularly alarmed at the extent to which
surveillance allows corporations to influence our decisions, taking away
something she poetically calls <span class="quote"><span class="quote">the right to the future tense</span></span>
— that is, the right to decide for yourself what you will do in the future.
</p><p>
It’s true that advertising can tip the scales one way or another: When
you’re thinking of buying a fridge, a timely fridge ad might end the search
on the spot. But Zuboff puts enormous and undue weight on the persuasive
power of surveillance-based influence techniques. Most of these don’t work
very well, and the ones that do won’t work for very long. The makers of
these influence tools are confident they will someday refine them into
systems of total control, but they are hardly unbiased observers, and the
risks from their dreams coming true are very speculative.
</p><p>
By contrast, Zuboff is rather sanguine about 40 years of lax antitrust
practice that has allowed a handful of companies to dominate the internet,
ushering in an information age with, <a class="ulink" href="https://twitter.com/tveastman/status/1069674780826071040" target="_top">as one person
on Twitter noted</a>, five giant websites each filled with screenshots
of the other four.
</p><p>
However, if we are to be alarmed that we might lose the right to choose for
ourselves what our future will hold, then monopoly’s nonspeculative,
concrete, here-and-now harms should be front and center in our debate over
tech policy.
</p><p>
Start with <span class="quote"><span class="quote">digital rights management.</span></span> In 1998, Bill Clinton
signed the Digital Millennium Copyright Act (DMCA) into law. It’s a complex
piece of legislation with many controversial clauses but none more so than
Section 1201, the <span class="quote"><span class="quote">anti-circumvention</span></span> rule.
</p><p>
This is a blanket ban on tampering with systems that restrict access to
copyrighted works. The ban is so thoroughgoing that it prohibits removing a
copyright lock even when no copyright infringement takes place. This is by
design: The activities that the DMCA’s Section 1201 sets out to ban are not
copyright infringements; rather, they are legal activities that frustrate
manufacturers’ commercial plans.
</p><p>
For example, Section 1201’s first major application was on DVD players as a
means of enforcing the region coding built into those devices. DVD-CCA, the
body that standardized DVDs and DVD players, divided the world into six
regions and specified that DVD players must check each disc to determine
which regions it was authorized to be played in. DVD players would have
their own corresponding region (a DVD player bought in the U.S. would be
region 1 while one bought in India would be region 5). If the player and the
disc’s region matched, the player would play the disc; otherwise, it would
reject it.
</p><p>
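The region check described above is a trivially simple gate; its force comes entirely from the legal ban on removing it, not from any technical sophistication. A minimal sketch (region numbers from the text; real players implement this in firmware):

```python
# Sketch of the DVD-CCA region-coding check described above.
# Region numbers from the text: a U.S. player is region 1, an Indian one region 5.
US, INDIA = 1, 5

def will_play(player_region, disc_regions):
    """A compliant player plays a disc only if its own region is among
    the regions the disc is authorized for."""
    return player_region in disc_regions

print(will_play(US, {1}))  # True: region-1 disc in a region-1 player
print(will_play(US, {5}))  # False: an Indian disc is rejected by a U.S. player
```

A <span class="quote"><span class="quote">region-free</span></span> player is simply one where this check always returns true; Section 1201 is what makes shipping that one-line change legally dangerous.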
However, watching a lawfully produced disc in a country other than the one
where you purchased it is not copyright infringement — it’s the
opposite. Copyright law imposes this duty on customers for a movie: You must
go into a store, find a licensed disc, and pay the asking price. Do that —
and <span class="emphasis"><em>nothing else</em></span> — and you and copyright are square
with one another.
</p><p>
The fact that a movie studio wants to charge Indians less than Americans or
release in Australia later than it releases in the U.K. has no bearing on
copyright law. Once you lawfully acquire a DVD, it is no copyright
infringement to watch it no matter where you happen to be.
</p><p>
So DVD and DVD player manufacturers would not be able to use accusations of
abetting copyright infringement to punish manufacturers who made
noncompliant players that would play discs from any region or repair shops
that modified players to let you watch out-of-region discs or software
programmers who created programs to let you do this.
</p><p>
That’s where Section 1201 of the DMCA comes in: By banning tampering with an
<span class="quote"><span class="quote">access control,</span></span> the rule gave manufacturers and rights
holders standing to sue competitors who released superior products with
lawful features that the market demanded (in this case, region-free
players).
</p><p>
This is an odious scam against consumers, but as time went by, Section 1201
grew to encompass a rapidly expanding constellation of devices and services
as canny manufacturers realized certain things:
</p><div class="itemizedlist"><ul class="itemizedlist compact" style="list-style-type: disc; "><li class="listitem"><p>
Any device with software in it contains a <span class="quote"><span class="quote">copyrighted work</span></span>
— i.e., the software.
</p></li><li class="listitem"><p>
A device can be designed so that reconfiguring the software requires
bypassing an <span class="quote"><span class="quote">access control for copyrighted works,</span></span> which is a
potential felony under Section 1201.
</p></li><li class="listitem"><p>
Thus, companies can control their customers’ behavior after they take home
their purchases by designing products so that all unpermitted uses require
modifications that fall afoul of Section 1201.
</p></li></ul></div><p>
Section 1201 then becomes a means for manufacturers of all descriptions to
force their customers to arrange their affairs to benefit the manufacturers’
shareholders instead of themselves.
</p><p>
This manifests in many ways: from a new generation of inkjet printers that
use countermeasures to prevent third-party ink that cannot be bypassed
without legal risks to similar systems in tractors that prevent third-party
technicians from swapping in the manufacturer’s own parts that are not
recognized by the tractor’s control system until it is supplied with a
manufacturer’s unlock code.
</p><p>
Closer to home, Apple’s iPhones use these measures to prevent both
third-party service and third-party software installation. This allows Apple
— rather than the iPhone’s purchaser — to decide when an iPhone is beyond
repair and must be shredded and landfilled. (Apple is notorious for its
environmentally catastrophic policy of destroying old electronics rather
than permitting them to be cannibalized for parts.) This is a very useful
power to wield, especially in light of CEO Tim Cook’s January 2019 warning
to investors that the company’s profits are endangered by customers choosing
to hold onto their phones for longer rather than replacing them.
</p><p>
Apple’s use of copyright locks also allows it to establish a monopoly over
how its customers acquire software for their mobile devices. The App Store’s
commercial terms guarantee Apple a share of all revenues generated by the
apps sold there, meaning that Apple gets paid when you buy an app from its
store and then continues to get paid every time you buy something using that
app. This comes out of the bottom line of software developers, who must
either charge more or accept lower profits for their products.
</p><p>
Crucially, Apple’s use of copyright locks gives it the power to make
editorial decisions about which apps you may and may not install on your own
device. Apple has used this power to <a class="ulink" href="https://www.telegraph.co.uk/technology/apple/5982243/Apple-bans-dictionary-from-App-Store-over-swear-words.html" target="_top">reject
dictionaries</a> for containing obscene words; to <a class="ulink" href="https://www.vice.com/en_us/article/538kan/apple-just-banned-the-app-that-tracks-us-drone-strikes-again" target="_top">limit
political speech</a>, especially from apps that make sensitive political
commentary such as an app that notifies you every time a U.S. drone kills
someone somewhere in the world; and to <a class="ulink" href="https://www.eurogamer.net/articles/2016-05-19-palestinian-indie-game-must-not-be-called-a-game-apple-says" target="_top">object
to a game</a> that commented on the Israel-Palestine conflict.
</p><p>
Apple often justifies monopoly power over software installation in the name
of security, arguing that its vetting of apps for its store means that it
can guard its users against apps that contain surveillance code. But this
cuts both ways. In China, the government <a class="ulink" href="https://www.ft.com/content/ad42e536-cf36-11e7-b781-794ce08b24dc" target="_top">ordered
Apple to prohibit the sale of privacy tools</a> like VPNs with the
exception of VPNs that had deliberately introduced flaws designed to let the
Chinese state eavesdrop on users. Because Apple uses technological
countermeasures — with legal backstops — to block customers from installing
unauthorized apps, Chinese iPhone owners cannot readily (or legally) acquire
VPNs that would protect them from Chinese state snooping.
</p><p>
Zuboff calls surveillance capitalism a <span class="quote"><span class="quote">rogue capitalism.</span></span>
Theoreticians of capitalism claim that its virtue is that it <a class="ulink" href="https://en.wikipedia.org/wiki/Price_signal" target="_top">aggregates information in
the form of consumers’ decisions</a>, producing efficient
markets. Surveillance capitalism’s supposed power to rob its victims of
their free will through computationally supercharged influence campaigns
means that our markets no longer aggregate customers’ decisions because we
customers no longer decide — we are given orders by surveillance
capitalism’s mind-control rays.
</p><p>
If our concern is that markets cease to function when consumers can no
longer make choices, then copyright locks should concern us at
<span class="emphasis"><em>least</em></span> as much as influence campaigns. An influence
campaign might nudge you to buy a certain brand of phone; but the copyright
locks on that phone absolutely determine where you get it serviced, which
apps can run on it, and when you have to throw it away rather than fixing
it.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="search-order-and-the-right-to-the-future-tense"></a>Search order and the right to the future tense</h2></div></div></div><p>
Markets are posed as a kind of magic: By discovering otherwise hidden
information conveyed by the free choices of consumers, those consumers’
local knowledge is integrated into a self-correcting system that makes
efficient allocations — more efficient than any computer could calculate. But
monopolies are incompatible with that notion. When you only have one app
store, the owner of the store — not the consumer — decides on the range of
choices. As Boss Tweed once said, <span class="quote"><span class="quote">I don’t care who does the electing,
so long as I get to do the nominating.</span></span> A monopolized market is an
election whose candidates are chosen by the monopolist.
</p><p>
This ballot rigging is made more pernicious by the existence of monopolies
over search order. Google’s search market share is about 90%. When Google’s
ranking algorithm puts a result for a popular search term in its top 10,
that helps determine the behavior of millions of people. If Google’s answer
to <span class="quote"><span class="quote">Are vaccines dangerous?</span></span> is a page that rebuts anti-vax
conspiracy theories, then a sizable portion of the public will learn that
vaccines are safe. If, on the other hand, Google sends those people to a
site affirming the anti-vax conspiracies, a sizable portion of those
millions will come away convinced that vaccines are dangerous.
</p><p>
Google’s algorithm is often tricked into serving disinformation as a
prominent search result. But in these cases, Google isn’t persuading people
to change their minds; it’s just presenting something untrue as fact when
the user has no cause to doubt it.
</p><p>
This is true whether the search is for <span class="quote"><span class="quote">Are vaccines
dangerous?</span></span> or <span class="quote"><span class="quote">best restaurants near me.</span></span> Most users
will never look past the first page of search results, and when the
overwhelming majority of people all use the same search engine, the ranking
algorithm deployed by that search engine will determine myriad outcomes
(whether to adopt a child, whether to have cancer surgery, where to eat
dinner, where to move, where to apply for a job) to a degree that vastly
outstrips any behavioral outcomes dictated by algorithmic persuasion
techniques.
</p><p>
Many of the questions we ask search engines have no empirically correct
answers: <span class="quote"><span class="quote">Where should I eat dinner?</span></span> is not an objective
question. Even questions that do have correct answers (<span class="quote"><span class="quote">Are vaccines
dangerous?</span></span>) don’t have one empirically superior source for that
answer. Many pages affirm the safety of vaccines, so which one goes first?
Under conditions of competition, consumers can choose from many search
engines and stick with the one whose algorithmic judgment suits them best,
but under conditions of monopoly, we all get our answers from the same
place.
</p><p>
Google’s search dominance isn’t a matter of pure merit: The company has
leveraged many tactics that would have been prohibited under classical,
pre-Ronald-Reagan antitrust enforcement standards to attain its
dominance. After all, this is a company that has developed two major
products: a really good search engine and a pretty good Hotmail clone. Every
other major success it’s had — Android, YouTube, Google Maps, etc. — has
come through an acquisition of a nascent competitor. Many of the company’s
key divisions, such as the advertising technology of DoubleClick, violate
the historical antitrust principle of structural separation, which forbade
firms from owning subsidiaries that competed with their
customers. Railroads, for example, were barred from owning freight companies
that competed with the shippers whose freight they carried.
</p><p>
If we’re worried about giant companies subverting markets by stripping
consumers of their ability to make free choices, then vigorous antitrust
enforcement seems like an excellent remedy. If we’d denied Google the right
to effect its many mergers, we would also have probably denied it its total
search dominance. Without that dominance, the pet theories, biases, errors
(and good judgment, too) of Google search engineers and product managers
would not have such an outsized effect on consumer choice.
</p><p>
This goes for many other companies. Amazon, a classic surveillance
capitalist, is obviously the dominant tool for searching Amazon — though
many people find their way to Amazon through Google searches and Facebook
posts — and obviously, Amazon controls Amazon search. That means that
Amazon’s own self-serving editorial choices — like promoting its own house
brands over rival goods from its sellers as well as its own pet theories,
biases, and errors — determine much of what we buy on Amazon. And since
Amazon is the dominant e-commerce retailer outside of China and since it
attained that dominance by buying up both large rivals and nascent
competitors in defiance of historical antitrust rules, we can blame the
monopoly for stripping consumers of their right to the future tense and the
ability to shape markets by making informed choices.
</p><p>
Not every monopolist is a surveillance capitalist, but that doesn’t mean
they’re not able to shape consumer choices in wide-ranging ways. Zuboff
lauds Apple for its App Store and iTunes Store, insisting that adding price
tags to the features on its platforms has been the secret to resisting
surveillance and thus creating markets. But Apple is the only retailer
allowed to sell on its platforms, and it’s the second-largest mobile device
vendor in the world. The independent software vendors that sell through
Apple’s marketplace accuse the company of the same surveillance sins as
Amazon and other big retailers: spying on its customers to find lucrative
new products to launch, effectively using independent software vendors as
free-market researchers, then forcing them out of any markets they discover.
</p><p>
Because of its use of copyright locks, Apple’s mobile customers are not
legally allowed to switch to a rival retailer for its apps if they want to
do so on an iPhone. Apple, obviously, is the only entity that gets to decide
how it ranks the results of search queries in its stores. These decisions
ensure that some apps are often installed (because they appear on page one)
and others are never installed (because they appear on page one
million). Apple’s search-ranking design decisions have a vastly more
significant effect on consumer behaviors than influence campaigns delivered
by surveillance capitalism’s ad-serving bots.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="monopolists-can-afford-sleeping-pills-for-watchdogs"></a>Monopolists can afford sleeping pills for watchdogs</h2></div></div></div><p>
Only the most extreme market ideologues think that markets can self-regulate
without state oversight. Markets need watchdogs — regulators, lawmakers, and
other elements of democratic control — to keep them honest. When these
watchdogs sleep on the job, then markets cease to aggregate consumer choices
because those choices are constrained by illegitimate and deceptive
activities that companies are able to get away with because no one is
holding them to account.
</p><p>
But this kind of regulatory capture doesn’t come cheap. In competitive
sectors, where rivals are constantly eroding one another’s margins,
individual firms lack the surplus capital to effectively lobby for laws and
regulations that serve their ends.
</p><p>
Many of the harms of surveillance capitalism are the result of weak or
nonexistent regulation. Those regulatory vacuums spring from the power of
monopolists to resist stronger regulation and to tailor what regulation
exists to permit their existing businesses.
</p><p>
Here’s an example: When firms over-collect and over-retain our data, they
are at increased risk of suffering a breach — you can’t leak data you never
collected, and once you delete all copies of that data, you can no longer
leak it. For more than a decade, we’ve lived through an endless parade of
ever-worsening data breaches, each one uniquely horrible in the scale of
data breached and the sensitivity of that data.
</p><p>
But still, firms continue to over-collect and over-retain our data for three
reasons:
</p><p>
<span class="strong"><strong>1. They are locked in the aforementioned limbic arms
race with our capacity to shore up our attentional defense systems to resist
their new persuasion techniques.</strong></span> They’re also locked in an arms
race with their competitors to find new ways to target people for sales
pitches. As soon as they discover a soft spot in our attentional defenses (a
counterintuitive, unobvious way to target potential refrigerator buyers),
the public begins to wise up to the tactic, and their competitors leap on
it, hastening the day in which all potential refrigerator buyers have been
inured to the pitch.
</p><p>
<span class="strong"><strong>2. They believe the surveillance capitalism
story.</strong></span> Data is cheap to aggregate and store, and both proponents
and opponents of surveillance capitalism have assured managers and product
designers that if you collect enough data, you will be able to perform
sorcerous acts of mind control, thus supercharging your sales. Even if you
never figure out how to profit from the data, someone else will eventually
offer to buy it from you to give it a try. This is the hallmark of all
economic bubbles: acquiring an asset on the assumption that someone else
will buy it from you for more than you paid for it, often to sell to someone
else at an even greater price.
</p><p>
<span class="strong"><strong>3. The penalties for leaking data are
negligible.</strong></span> Most countries limit these penalties to actual
damages, meaning that consumers who’ve had their data breached have to show
actual monetary harms to get a reward. In 2014, Home Depot disclosed that it
had lost credit-card data for 53 million of its customers, but it settled
the matter by paying those customers about $0.34 each — and a third of that
$0.34 wasn’t even paid in cash. It took the form of a credit to procure a
largely ineffectual credit-monitoring service.
</p><p>
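A back-of-the-envelope check of the Home Depot settlement figures above makes the point about negligible penalties concrete (the amounts are from the text; the two-thirds cash split is an approximation of the <span class="quote"><span class="quote">a third of that wasn’t even paid in cash</span></span> claim):

```python
# Back-of-the-envelope check of the Home Depot settlement described above.
customers = 53_000_000
per_customer = 0.34            # dollars paid per affected customer

total = customers * per_customer
cash_portion = total * (2 / 3)  # roughly a third was credit-monitoring credit

print(f"${total:,.0f}")         # total settlement: about $18 million
print(f"${cash_portion:,.0f}")  # actually paid in cash: about $12 million
```

For a breach of 53 million credit-card records, a roughly $18 million settlement is the kind of penalty that a firm of Home Depot’s size can treat as a rounding error.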
1197 But the harms from breaches are much more extensive than these
1198 actual-damages rules capture. Identity thieves and fraudsters are wily and
1199 endlessly inventive. All the vast breaches of our century are being
1200 continuously recombined, the data sets merged and mined for new ways to
1201 victimize the people whose data was present in them. Any reasonable,
1202 evidence-based theory of deterrence and compensation for breaches would not
1203 confine damages to actual damages but rather would allow users to claim
1204 these future harms.
1205 </p><p>
1206 However, even the most ambitious privacy rules, such as the EU General Data
1207 Protection Regulation, fall far short of capturing the negative
1208 externalities of the platforms’ negligent over-collection and
1209 over-retention, and what penalties they do provide are not aggressively
1210 pursued by regulators.
1211 </p><p>
1212 This tolerance of — or indifference to — data over-collection and
1213 over-retention can be ascribed in part to the sheer lobbying muscle of the
1214 platforms. They are so profitable that they can handily afford to divert
1215 gigantic sums to fight any real change — that is, change that would force
1216 them to internalize the costs of their surveillance activities.
1217 </p><p>
1218 And then there’s state surveillance, which the surveillance capitalism story
1219 dismisses as a relic of another era when the big worry was being jailed for
1220 your dissident speech, not having your free will stripped away with machine
1221 learning.
1222 </p><p>
1223 But state surveillance and private surveillance are intimately related. As
1224 we saw when Apple was conscripted by the Chinese government as a vital
1225 collaborator in state surveillance, the only really affordable and tractable
1226 way to conduct mass surveillance on the scale practiced by modern states —
1227 both <span class="quote"><span class="quote">free</span></span> and autocratic states — is to suborn commercial
1228 services.
1229 </p><p>
1230 Whether it’s Google being used as a location tracking tool by local law
1231 enforcement across the U.S. or the use of social media tracking by the
1232 Department of Homeland Security to build dossiers on participants in
1233 protests against Immigration and Customs Enforcement’s family separation
1234 practices, any hard limits on surveillance capitalism would hamstring the
1235 state’s own surveillance capability. Without Palantir, Amazon, Google, and
1236 other major tech contractors, U.S. cops would not be able to spy on Black
1237 people, ICE would not be able to manage the caging of children at the U.S.
1238 border, and state welfare systems would not be able to purge their rolls by
1239 dressing up cruelty as empiricism and claiming that poor and vulnerable
1240 people are ineligible for assistance. At least some of the states’
1241 unwillingness to take meaningful action to curb surveillance should be
1242 attributed to this symbiotic relationship. There is no mass state
1243 surveillance without mass commercial surveillance.
1244 </p><p>
Monopolism is key to the project of mass state surveillance. It’s true that
smaller tech firms are apt to be less well-defended than Big Tech, whose
security experts are drawn from the tops of their field and who are given
enormous resources to secure and monitor their systems against
intruders. But smaller firms also have less to protect: fewer users, whose
data is more fragmented across more systems and has to be suborned one at a
time by state actors.
</p><p>
A concentrated tech sector that works with authorities is a much more
powerful ally in the project of mass state surveillance than a fragmented
one composed of smaller actors. The U.S. tech sector is small enough that
all of its top executives could fit around a single boardroom table, as they
did in Trump Tower in 2017, shortly after Trump’s inauguration. Most of its
biggest players bid to win JEDI, the Pentagon’s $10 billion Joint Enterprise
Defense Infrastructure cloud contract. Like other highly concentrated
industries, Big Tech rotates its key employees in and out of government
service, sending them to serve in the Department of Defense and the White
House, then hiring ex-Pentagon and ex-DOD top staffers and officers to work
in their own government relations departments.
</p><p>
They can even make a good case for doing this: After all, when there are
only four or five big companies in an industry, everyone qualified to
regulate those companies has served as an executive in at least a couple of
them — because, likewise, when there are only five companies in an industry,
everyone qualified for a senior role at any of them is by definition working
at one of the other ones.
</p><div class="blockquote"><blockquote class="blockquote"><p>
While surveillance doesn’t cause monopolies, monopolies certainly abet
surveillance.
</p></blockquote></div><p>
Industries that are competitive are fragmented — composed of companies that
are at each other’s throats all the time and eroding one another’s margins
in bids to steal their best customers. This leaves them with much more
limited capital to use to lobby for favorable rules and a much harder job of
getting everyone to agree to pool their resources to benefit the industry as
a whole.
</p><p>
Surveillance combined with machine learning is supposed to be an existential
crisis, a species-defining moment at which our free will is just a few more
advances in the field from being stripped away. I am skeptical of this
claim, but I <span class="emphasis"><em>do</em></span> think that tech poses an existential
threat to our society and possibly our species.
</p><p>
But that threat grows out of monopoly.
</p><p>
One of the consequences of tech’s regulatory capture is that it can shift
liability for poor security decisions onto its customers and the wider
society. It is absolutely normal in tech for companies to obfuscate the
workings of their products, to make them deliberately hard to understand,
and to threaten security researchers who seek to independently audit those
products.
</p><p>
IT is the only field in which this is practiced: No one builds a bridge or a
hospital and keeps the composition of the steel or the equations used to
calculate load stresses a secret. It is a frankly bizarre practice that
leads, time and again, to grotesque security defects on farcical scales,
with whole classes of devices being revealed as vulnerable long after they
are deployed in the field and put into sensitive places.
</p><p>
The monopoly power that keeps any meaningful consequences for breaches at
bay means that tech companies continue to build terrible products that are
insecure by design and that end up integrated into our lives, in possession
of our data, and connected to our physical world. For years, Boeing has
struggled with the aftermath of a series of bad technology decisions that
made its 737 fleet a global pariah, a rare instance in which bad tech
decisions have been seriously punished in the market.
</p><p>
These bad security decisions are compounded yet again by the use of
copyright locks to enforce business-model decisions against
consumers. Recall that these locks have become the go-to means for shaping
consumer behavior, making it technically impossible to use third-party ink,
insulin, apps, or service depots in connection with your lawfully acquired
property.
</p><p>
Recall also that these copyright locks are backstopped by legislation (such
as Section 1201 of the DMCA or Article 6 of the 2001 EU Copyright Directive)
that bans tampering with (<span class="quote"><span class="quote">circumventing</span></span>) them, and these
statutes have been used to threaten security researchers who make
disclosures about vulnerabilities without permission from manufacturers.
</p><p>
This amounts to a manufacturer’s veto over safety warnings and
criticism. While this is far from the legislative intent of the DMCA and its
sister statutes around the world, Congress has not intervened to clarify the
statute, nor will it, because to do so would run counter to the interests of
powerful, large firms whose lobbying muscle is unstoppable.
</p><p>
Copyright locks are a double whammy: They create bad security decisions that
can’t be freely investigated or discussed. If markets are supposed to be
machines for aggregating information (and if surveillance capitalism’s
notional mind-control rays are what make it a <span class="quote"><span class="quote">rogue
capitalism</span></span> because it denies consumers the power to make decisions),
then a program of legally enforced ignorance of the risks of products makes
monopolism even more of a <span class="quote"><span class="quote">rogue capitalism</span></span> than surveillance
capitalism’s influence campaigns.
</p><p>
And unlike mind-control rays, enforced silence over security is an
immediate, documented problem, and it <span class="emphasis"><em>does</em></span> constitute
an existential threat to our civilization and possibly our species. The
proliferation of insecure devices — especially devices that spy on us and
especially when those devices also can manipulate the physical world by,
say, steering your car or flipping a breaker at a power station — is a kind
of technology debt.
</p><p>
In software design, <span class="quote"><span class="quote">technology debt</span></span> refers to old, baked-in
decisions that turn out to be bad ones in hindsight. Perhaps a long-ago
developer decided to incorporate a networking protocol made by a vendor that
has since stopped supporting it. But everything in the product still relies
on that superannuated protocol, and so, with each revision, the product team
has to work around this obsolete core, adding compatibility layers,
surrounding it with security checks that try to shore up its defenses, and
so on. These Band-Aid measures compound the debt because every subsequent
revision has to make allowances for <span class="emphasis"><em>them</em></span>, too, like
interest mounting on a predatory subprime loan. And like a subprime loan,
the interest mounts faster than you can hope to pay it off: The product team
has to put so much energy into maintaining this complex, brittle system that
they don’t have any time left over to refactor the product from the ground
up and <span class="quote"><span class="quote">pay off the debt</span></span> once and for all.
</p><p>
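The compatibility-layer pattern described here can be sketched in a few lines; a hypothetical illustration in Python (the class names and the size check are invented for the example, not drawn from any real product):
</p>

```python
# Hypothetical sketch of technology debt: each revision wraps the obsolete
# core in another compatibility layer instead of replacing it.

class LegacyProtocol:
    """Superannuated vendor protocol the whole product still relies on."""
    def send(self, payload: bytes) -> bytes:
        return b"LEGACY:" + payload   # stand-in for the old wire format

class CompatibilityLayer:
    """Band-Aid wrapper: adds the security checks the old core never had."""
    def __init__(self, core: LegacyProtocol):
        self.core = core

    def send(self, payload: bytes) -> bytes:
        # Shore up the legacy core's defenses before delegating to it.
        if len(payload) > 1024:
            raise ValueError("payload too large for legacy core")
        return self.core.send(payload)

# Every subsequent revision must make allowances for the wrapper, too,
# which is how the "interest" on the debt compounds.
transport = CompatibilityLayer(LegacyProtocol())
print(transport.send(b"hello"))
```

<p>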
Typically, technology debt results in a technological bankruptcy: The
product gets so brittle and unsustainable that it fails
catastrophically. Think of the antiquated COBOL-based banking and accounting
systems that fell over at the start of the pandemic emergency when
confronted with surges of unemployment claims. Sometimes that ends the
product; sometimes it takes the company down with it. Being caught in the
default of a technology debt is scary and traumatic, just like losing your
house due to bankruptcy is scary and traumatic.
</p><p>
But the technology debt created by copyright locks isn’t individual debt;
it’s systemic. Everyone in the world is exposed to this over-leverage, as
was the case with the 2008 financial crisis. When that debt comes due — when
we face a cascade of security breaches that threaten global shipping and
logistics, the food supply, pharmaceutical production pipelines, emergency
communications, and other critical systems that are accumulating technology
debt in part due to the presence of deliberately insecure and deliberately
unauditable copyright locks — it will indeed pose an existential risk.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="privacy-and-monopoly"></a>Privacy and monopoly</h2></div></div></div><p>
Many tech companies are gripped by an orthodoxy that holds that if they just
gather enough data on enough of our activities, everything else is possible
— the mind control and endless profits. This is an unfalsifiable hypothesis:
If data gives a tech company even a tiny improvement in behavior prediction
and modification, the company declares that it has taken the first step
toward global domination with no end in sight. If a company
<span class="emphasis"><em>fails</em></span> to attain any improvements from gathering and
analyzing data, it declares success to be just around the corner, attainable
once more data is in hand.
</p><p>
Surveillance tech is far from the first industry to embrace a nonsensical,
self-serving belief that harms the rest of the world, and it is not the
first industry to profit handsomely from such a delusion. Long before
hedge-fund managers were claiming (falsely) that they could beat the
S&amp;P 500, there were plenty of other <span class="quote"><span class="quote">respectable</span></span>
industries that have since been revealed as quacks. From the makers
of radium suppositories (a real thing!) to the cruel sociopaths who claimed
they could <span class="quote"><span class="quote">cure</span></span> gay people, history is littered with the
formerly respectable titans of discredited industries.
</p><p>
This is not to say that there’s nothing wrong with Big Tech and its
ideological addiction to data. While surveillance’s benefits are mostly
overstated, its harms are, if anything, <span class="emphasis"><em>understated</em></span>.
</p><p>
There’s real irony here. The belief in surveillance capitalism as a
<span class="quote"><span class="quote">rogue capitalism</span></span> is driven by the belief that markets
wouldn’t tolerate firms that are gripped by false beliefs. An oil company
that has false beliefs about where the oil is will, after all, eventually go
broke digging dry wells.
</p><p>
But monopolists get to do terrible things for a long time before they pay
the price. Think of how concentration in the finance sector allowed the
subprime crisis to fester as bond-rating agencies, regulators, investors,
and critics all fell under the sway of a false belief that complex
mathematics could construct <span class="quote"><span class="quote">fully hedged</span></span> debt instruments
that could not possibly default. A small bank that engaged in this kind of
malfeasance would simply go broke rather than outrunning the inevitable
crisis, perhaps growing so big that it averted it altogether. But large
banks were able to continue to attract investors, and when they finally
<span class="emphasis"><em>did</em></span> come a-cropper, the world’s governments bailed them
out. The worst offenders of the subprime crisis are bigger than they were in
2008, bringing home more profits and paying their execs even larger sums.
</p><p>
Big Tech is able to practice surveillance not just because it is tech but
because it is <span class="emphasis"><em>big</em></span>. The reason every web publisher
embeds a Facebook <span class="quote"><span class="quote">Like</span></span> button is that Facebook dominates the
internet’s social media referrals — and every one of those
<span class="quote"><span class="quote">Like</span></span> buttons spies on everyone who lands on a page that
contains them (see also: Google Analytics embeds, Twitter buttons, etc.).
</p><p>
The reason the world’s governments have been slow to create meaningful
penalties for privacy breaches is that Big Tech’s concentration produces
huge profits that can be used to lobby against those penalties — and Big
Tech’s concentration means that the companies involved are able to arrive at
a unified negotiating position that supercharges the lobbying.
</p><p>
The reason that the smartest engineers in the world want to work for Big
Tech is that Big Tech commands the lion’s share of tech industry jobs.
</p><p>
The reason people who are aghast at Facebook’s and Google’s and Amazon’s
data-handling practices continue to use these services is that all their
friends are on Facebook; Google dominates search; and Amazon has put all the
local merchants out of business.
</p><p>
Competitive markets would weaken the companies’ lobbying muscle by reducing
their profits and pitting them against one another in regulatory forums. They
would give customers other places to go to get their online services. They
would make the companies small enough to regulate and pave the way to
meaningful penalties for breaches. They would let engineers with ideas that
challenged the surveillance orthodoxy raise capital to compete with the
incumbents. They would give web publishers multiple ways to reach audiences
and make the case against Facebook and Google and Twitter embeds.
</p><p>
In other words, while surveillance doesn’t cause monopolies, monopolies
certainly abet surveillance.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="ronald-reagan-pioneer-of-tech-monopolism"></a>Ronald Reagan, pioneer of tech monopolism</h2></div></div></div><p>
Technology exceptionalism is a sin, whether it’s practiced by technology’s
blind proponents or by its critics. Both of these camps are prone to
explaining away monopolistic concentration by citing some special
characteristic of the tech industry, like network effects or first-mover
advantage. The only real difference between these two groups is that the
tech apologists say monopoly is inevitable so we should just let tech get
away with its abuses while competition regulators in the U.S. and the EU say
monopoly is inevitable so we should punish tech for its abuses but not try
to break up the monopolies.
</p><p>
To understand how tech became so monopolistic, it’s useful to look at the
dawn of the consumer tech industry: 1979, the year the Apple II Plus
launched and became the first successful home computer. That also happens to
be the year that Ronald Reagan hit the campaign trail for the 1980
presidential race — a race he won, leading to a radical shift in the way
that antitrust concerns are handled in America. Reagan’s cohort of
politicians — including Margaret Thatcher in the U.K., Brian Mulroney in
Canada, Helmut Kohl in Germany, and Augusto Pinochet in Chile — went on to
enact similar reforms that eventually spread around the world.
</p><p>
Antitrust’s story began nearly a century before all that with laws like the
Sherman Act, which took aim at monopolists on the grounds that monopolies
were bad in and of themselves — squeezing out competitors, creating
<span class="quote"><span class="quote">diseconomies of scale</span></span> (when a company is so big that its
constituent parts go awry and it is seemingly helpless to address the
problems), and capturing their regulators to such a degree that they can get
away with a host of evils.
</p><p>
Then came a fabulist named Robert Bork, a former solicitor general whom
Reagan appointed to the powerful U.S. Court of Appeals for the D.C. Circuit
and who had created an alternate legislative history of the Sherman Act and
its successors out of whole cloth. Bork insisted that these statutes were
never targeted at monopolies (despite a wealth of evidence to the contrary,
including the transcribed speeches of the acts’ authors) but, rather, that
they were intended to prevent <span class="quote"><span class="quote">consumer harm</span></span> — in the form of
higher prices.
</p><p>
Bork was a crank, but he was a crank with a theory that rich people really
liked. Monopolies are a great way to make rich people richer by allowing
them to receive <span class="quote"><span class="quote">monopoly rents</span></span> (that is, bigger profits) and
capture regulators, leading to a weaker, more favorable regulatory
environment with fewer protections for customers, suppliers, the
environment, and workers.
</p><p>
Bork’s theories were especially palatable to the same power brokers who
backed Reagan, and Reagan’s Department of Justice and other agencies began
to incorporate Bork’s antitrust doctrine into their enforcement decisions
(Reagan even put Bork up for a Supreme Court seat, but Bork flunked the
Senate confirmation hearing so badly that, 40 years later, D.C. insiders use
the term <span class="quote"><span class="quote">borked</span></span> to refer to any catastrophically bad
political performance).
</p><p>
Little by little, Bork’s theories entered the mainstream, and their backers
began to infiltrate the legal education field, even putting on junkets where
members of the judiciary were treated to lavish meals, fun outdoor
activities, and seminars where they were indoctrinated into the consumer
harm theory of antitrust. The more Bork’s theories took hold, the more money
the monopolists were making — and the more surplus capital they had at their
disposal to lobby for even more Borkian antitrust influence campaigns.
</p><p>
The history of Bork’s antitrust theories is a really good example of the
kind of covertly engineered shifts in public opinion that Zuboff warns us
against, where fringe ideas become mainstream orthodoxy. But Bork didn’t
change the world overnight. He played a very long game, for over a
generation, and he had a tailwind because the same forces that backed
oligarchic antitrust theories also backed many other oligarchic shifts in
public opinion. For example, the idea that taxation is theft, that wealth is
a sign of virtue, and so on — all of these theories meshed to form a
coherent ideology that elevated inequality to a virtue.
</p><p>
Today, many fear that machine learning allows surveillance capitalism to
sell <span class="quote"><span class="quote">Bork-as-a-Service,</span></span> at internet speeds, so that you can
contract a machine-learning company to engineer <span class="emphasis"><em>rapid</em></span>
shifts in public sentiment without needing the capital to sustain a
multipronged, multigenerational project working at the local, state,
national, and global levels in business, law, and philosophy. I do not
believe that such a project is plausible, though I agree that this is
basically what the platforms claim to be selling. They’re just lying about
it. Big Tech lies all the time, <span class="emphasis"><em>including</em></span> in their
sales literature.
</p><p>
The idea that tech forms <span class="quote"><span class="quote">natural monopolies</span></span> (monopolies that
are the inevitable result of the realities of an industry, such as the
monopolies that accrue to the first company to run long-haul phone lines or
rail lines) is belied by tech’s own history: In the absence of
anti-competitive tactics, Google was able to unseat AltaVista and Yahoo;
Facebook was able to head off Myspace. There are some advantages to
gathering mountains of data, but those mountains of data also have
disadvantages: liability (from leaking), diminishing returns (from old
data), and institutional inertia (big companies, like science, progress one
funeral at a time).
</p><p>
Indeed, the birth of the web saw a mass-extinction event for the existing
giant, wildly profitable proprietary technologies that had capital, network
effects, and walls and moats surrounding their businesses. The web showed
that when a new industry is built around a protocol, rather than a product,
the combined might of everyone who uses the protocol to reach their
customers or users or communities outweighs even the most massive
products. CompuServe, AOL, MSN, and a host of other proprietary walled
gardens learned this lesson the hard way: Each believed it could stay
separate from the web, offering <span class="quote"><span class="quote">curation</span></span> and a guarantee of
consistency and quality instead of the chaos of an open system. Each was
wrong and ended up being absorbed into the public web.
</p><p>
Yes, tech is heavily monopolized and is now closely associated with industry
concentration, but this has more to do with timing than with any
intrinsically monopolistic tendency of tech. Tech was born at the moment that
antitrust enforcement was being dismantled, and tech fell into exactly the
same pathologies that antitrust was supposed to guard against. To a first
approximation, it is reasonable to assume that tech’s monopolies are the
result of a lack of anti-monopoly action and not the much-touted unique
characteristics of tech, such as network effects, first-mover advantage, and
so on.
</p><p>
In support of this thesis, I offer the concentration that every
<span class="emphasis"><em>other</em></span> industry has undergone over the same period. From
professional wrestling to consumer packaged goods to commercial property
leasing to banking to sea freight to oil to record labels to newspaper
ownership to theme parks, <span class="emphasis"><em>every</em></span> industry has undergone
a massive shift toward concentration. There are no obvious network effects or
first-mover advantages at play in these industries. However, in every case,
these industries attained their concentrated status through tactics that
were prohibited before Bork’s triumph: merging with major competitors,
buying out innovative new market entrants, horizontal and vertical
integration, and a suite of anti-competitive tactics that were once illegal
but no longer are.
</p><p>
Again: When you change the laws intended to prevent monopolies and then
monopolies form in exactly the way the law was supposed to prevent, it is
reasonable to suppose that these facts are related. Tech’s concentration
can be readily explained without recourse to radical theories of network
effects — but only if you’re willing to indict unregulated markets as
tending toward monopoly. Just as a lifelong smoker can give you a hundred
reasons why their smoking didn’t cause their cancer (<span class="quote"><span class="quote">It was the
environmental toxins</span></span>), true believers in unregulated markets have a
whole suite of unconvincing explanations for monopoly in tech that leave
capitalism intact.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="steering-with-the-windshield-wipers"></a>Steering with the windshield wipers</h2></div></div></div><p>
It’s been 40 years since Bork’s project to rehabilitate monopolies achieved
liftoff, and that is a generation and a half, which is plenty of time to
take a common idea and make it seem outlandish and vice versa. Before the
1940s, affluent Americans dressed their baby boys in pink while baby girls
wore blue (a <span class="quote"><span class="quote">delicate and dainty</span></span> color). While gendered
colors are obviously totally arbitrary, many still greet this news with
amazement and find it hard to imagine a time when pink connoted masculinity.
</p><p>
After 40 years of studiously ignoring antitrust analysis and enforcement,
it’s not surprising that we’ve all but forgotten that antitrust exists, that
in living memory, growth through mergers and acquisitions was largely
prohibited under law, and that market-cornering strategies like vertical
integration could land a company in court.
</p><p>
Antitrust is a market society’s steering wheel, the control of first resort
to keep would-be masters of the universe in their lanes. But Bork and his
cohort ripped out our steering wheel 40 years ago. The car is still
barreling along, and so we’re yanking as hard as we can on all the
<span class="emphasis"><em>other</em></span> controls in the car as well as desperately
flapping the doors and rolling the windows up and down in the hopes that one
of these other controls can be repurposed to let us choose where we’re
heading before we careen off a cliff.
</p><p>
It’s like a 1960s science-fiction plot come to life: People stuck in a
<span class="quote"><span class="quote">generation ship,</span></span> plying its way across the stars, a ship once
piloted by their ancestors; and now, after a great cataclysm, the ship’s
crew have forgotten that they’re in a ship at all and no longer remember
where the control room is. Adrift, the ship is racing toward its extinction,
and unless we can seize the controls and execute an emergency course
correction, we’re all headed for a fiery death in the heart of a sun.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="surveillance-still-matters"></a>Surveillance still matters</h2></div></div></div><p>
None of this is to minimize the problems with surveillance. Surveillance
matters, and Big Tech’s use of surveillance <span class="emphasis"><em>is</em></span> an
existential risk to our species, but that’s not because surveillance and
machine learning rob us of our free will.
</p><p>
Surveillance has become <span class="emphasis"><em>much</em></span> more efficient thanks to
Big Tech. In 1989, the Stasi — the East German secret police — had the whole
country under surveillance, a massive undertaking that recruited one out of
every 60 people to serve as an informant or intelligence operative.
</p><p>
Today, we know that the NSA is spying on a significant fraction of the
entire world’s population, and its ratio of surveillance operatives to the
surveilled is more like 1:10,000 (that’s probably on the low side since it
assumes that every American with top-secret clearance is working for the NSA
on this project — we don’t know how many of those cleared people are
involved in NSA spying, but it’s definitely not all of them).
</p><p>
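The jump between those two ratios can be quantified in a couple of lines; a minimal sketch in Python, using only the figures from the text:
</p>

```python
# Surveillance efficiency: citizens covered per surveillance operative.
stasi_citizens_per_operative = 60      # East Germany, 1989
nsa_citizens_per_operative = 10_000    # NSA estimate cited in the text

gain = nsa_citizens_per_operative / stasi_citizens_per_operative
print(f"Each operative covers roughly {gain:.0f}x more people than in 1989")
```

<p>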
How did the ratio of surveillable citizens expand from 1:60 to 1:10,000 in
less than 30 years? It’s thanks to Big Tech. Our devices and services gather
most of the data that the NSA mines for its surveillance project. We pay for
these devices and the services they connect to, and then we painstakingly
perform the data-entry tasks associated with logging facts about our lives,
opinions, and preferences. This mass surveillance project has been largely
useless for fighting terrorism: The NSA can <a class="ulink" href="https://www.washingtonpost.com/world/national-security/nsa-cites-case-as-success-of-phone-data-collection-program/2013/08/08/fc915e5a-feda-11e2-96a8-d3b921c0924a_story.html" target="_top">only
point to a single minor success story</a> in which it used its data
collection program to foil an attempt by a U.S. resident to wire a few
thousand dollars to an overseas terror group. It’s ineffective for much the
same reason that commercial surveillance projects are largely ineffective at
targeting advertising: The people who want to commit acts of terror, like
people who want to buy a refrigerator, are extremely rare. If you’re trying
to detect a phenomenon whose base rate is one in a million with an
instrument whose accuracy is only 99%, then every true positive will come at
the cost of 9,999 false positives.
</p><p>
Let me explain that again: If one in a million people is a terrorist, then
there will only be about one terrorist in a random sample of one million
people. If your test for detecting terrorists is 99% accurate, it will
identify 10,000 terrorists in your million-person sample (1% of one million
is 10,000). For every true positive, you’ll get 9,999 false positives.
</p><p>
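That arithmetic can be checked directly; a minimal sketch in Python, using only the figures given above:
</p>

```python
# Base-rate arithmetic: a 99%-accurate detector applied to a population
# in which one person per million is a genuine positive.
population = 1_000_000
true_positives = 1                 # base rate: one in a million
error_rate = 0.01                  # the detector is wrong 1% of the time

# The 1% error rate falsely flags ~1% of the innocent majority:
false_positives = int((population - true_positives) * error_rate)
print(false_positives)             # 9999 false alarms per true positive
```

<p>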
1666 In reality, the accuracy of algorithmic terrorism detection falls far short
1667 of the 99% mark, as does refrigerator ad targeting. The difference is that
1668 being falsely accused of wanting to buy a fridge is a minor nuisance while
1669 being falsely accused of planning a terror attack can destroy your life and
1670 the lives of everyone you love.
1671 </p><p>
1672 Mass state surveillance is only feasible because of surveillance capitalism
1673 and its extremely low-yield ad-targeting systems, which require a constant
1674 feed of personal data to remain barely viable. Surveillance capitalism’s
1675 primary failure mode is mistargeted ads while mass state surveillance’s
1676 primary failure mode is grotesque human rights abuses, tending toward
1677 totalitarianism.
1678 </p><p>
1679 State surveillance is no mere parasite on Big Tech, sucking up its data and
1680 giving nothing in return. In truth, the two are symbiotes: Big Tech sucks up
1681 our data for spy agencies, and spy agencies ensure that governments don’t
1682 limit Big Tech’s activities so severely that it would no longer serve the
1683 spy agencies’ needs. There is no firm distinction between state surveillance
1684 and surveillance capitalism; they are dependent on one another.
1685 </p><p>
1686 To see this at work today, look no further than Amazon’s home surveillance
1687 device, the Ring doorbell, and its associated app, Neighbors. Ring — a
1688 product that Amazon acquired and did not develop in house — makes a
1689 camera-enabled doorbell that streams footage from your front door to your
1690 mobile device. The Neighbors app allows you to form a neighborhood-wide
1691 surveillance grid with your fellow Ring owners through which you can share
1692 clips of <span class="quote"><span class="quote">suspicious characters.</span></span> If you’re thinking that this
1693 sounds like a recipe for letting curtain-twitching racists supercharge their
1694 suspicions of people with brown skin who walk down their blocks, <a class="ulink" href="https://www.eff.org/deeplinks/2020/07/amazons-ring-enables-over-policing-efforts-some-americas-deadliest-law-enforcement" target="_top">you’re
1695 right</a>. Ring has become a <span class="emphasis"><em>de facto,</em></span>
1696 off-the-books arm of the police without any of the pesky oversight or rules.
1697 </p><p>
1698 In mid-2019, a series of public records requests revealed that Amazon had
1699 struck confidential deals with more than 400 local law enforcement agencies
1700 through which the agencies would promote Ring and Neighbors and in exchange
1701 get access to footage from Ring cameras. In theory, cops would need to
1702 request this footage through Amazon (and internal documents reveal that
1703 Amazon devotes substantial resources to coaching cops on how to spin a
1704 convincing story when doing so), but in practice, when a Ring customer turns
1705 down a police request, Amazon only requires the agency to formally request
1706 the footage from the company, which it will then produce.
1707 </p><p>
1708 Ring and law enforcement have found many ways to intertwine their
1709 activities. Ring strikes secret deals to acquire real-time access to 911
1710 dispatch and then streams alarming crime reports to Neighbors users, which
1711 serve as convincers for anyone who’s contemplating a surveillance doorbell
1712 but isn’t sure whether their neighborhood is dangerous enough to warrant it.
1713 </p><p>
1714 The more the cops buzz-market the surveillance capitalist Ring, the more
1715 surveillance capability the state gets. Cops who rely on private entities
1716 for law-enforcement roles then brief against any controls on the deployment
1717 of that technology while the companies return the favor by lobbying against
1718 rules requiring public oversight of police surveillance technology. The more
1719 the cops rely on Ring and Neighbors, the harder it will be to pass laws to
1720 curb them. The fewer laws there are against them, the more the cops will
1721 rely on them.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="dignity-and-sanctuary"></a>Dignity and sanctuary</h2></div></div></div><p>
1723 But even if we could exercise democratic control over our states and force
1724 them to stop raiding surveillance capitalism’s reservoirs of behavioral
1725 data, surveillance capitalism would still harm us.
1726 </p><p>
1727 This is an area where Zuboff shines. Her chapter on <span class="quote"><span class="quote">sanctuary</span></span>
1728 — the feeling of being unobserved — is a beautiful hymn to introspection,
1729 calmness, mindfulness, and tranquility.
1730 </p><p>
1731 When you are watched, something changes. Anyone who has ever raised a child
1732 knows this. You might look up from your book (or more realistically, from
1733 your phone) and catch your child in a moment of profound realization and
1734 growth, a moment where they are learning something that is right at the edge
1735 of their abilities, requiring their entire ferocious concentration. For a
1736 moment, you’re transfixed, watching that rare and beautiful moment of focus
1737 playing out before your eyes, and then your child looks up and sees you
1738 seeing them, and the moment collapses. To grow, you need to be and expose
1739 your authentic self, and in that moment, you are vulnerable like a hermit
1740 crab scuttling from one shell to the next. The tender, unprotected tissues
1741 you expose in that moment are too delicate to reveal in the presence of
1742 another, even someone you trust as implicitly as a child trusts their
1743 parent.
1744 </p><p>
1745 In the digital age, our authentic selves are inextricably tied to our
1746 digital lives. Your search history is a running ledger of the questions
1747 you’ve pondered. Your location history is a record of the places you’ve
1748 sought out and the experiences you’ve had there. Your social graph reveals
1749 the different facets of your identity, the people you’ve connected with.
1750 </p><p>
1751 To be observed in these activities is to lose the sanctuary of your
1752 authentic self.
1753 </p><p>
1754 There’s another way in which surveillance capitalism robs us of our capacity
1755 to be our authentic selves: by making us anxious. Surveillance capitalism
1756 isn’t really a mind-control ray, but you don’t need a mind-control ray to
1757 make someone anxious. After all, another word for anxiety is agitation, and
1758 to make someone experience agitation, you need merely to agitate them. To
1759 poke them and prod them and beep at them and buzz at them and bombard them
1760 on an intermittent schedule that is just random enough that our limbic
1761 systems never quite become inured to it.
1762 </p><p>
1763 Our devices and services are <span class="quote"><span class="quote">general purpose</span></span> in that they can
1764 connect anything or anyone to anything or anyone else and that they can run
1765 any program that can be written. This means that the distraction rectangles
1766 in our pockets hold our most precious moments with our most beloved people
1767 and their most urgent or time-sensitive communications (from <span class="quote"><span class="quote">running
1768 late can you get the kid?</span></span> to <span class="quote"><span class="quote">doctor gave me bad news and I
1769 need to talk to you RIGHT NOW</span></span>) as well as ads for refrigerators and
1770 recruiting messages from Nazis.
1771 </p><p>
1772 All day and all night, our pockets buzz, shattering our concentration and
1773 tearing apart the fragile webs of connection we spin as we think through
1774 difficult ideas. If you locked someone in a cell and agitated them like
1775 this, we’d call it <span class="quote"><span class="quote">sleep deprivation torture,</span></span> and it would be
1776 <a class="ulink" href="https://www.youtube.com/watch?v=1SKpRbvnx6g" target="_top">a war crime under
1777 the Geneva Conventions</a>.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="afflicting-the-afflicted"></a>Afflicting the afflicted</h2></div></div></div><p>
1779 The effects of surveillance on our ability to be our authentic selves are
1780 not equal for all people. Some of us are lucky enough to live in a time and
1781 place in which all the most important facts of our lives are widely and
1782 roundly socially acceptable and can be publicly displayed without the risk
1783 of social consequence.
1784 </p><p>
1785 But for many of us, this is not true. Recall that in living memory, many of
1786 the ways of being that we think of as socially acceptable today were once
1787 cause for dire social sanction or even imprisonment. If you are 65 years
1788 old, you have lived through a time in which people living in <span class="quote"><span class="quote">free
1789 societies</span></span> could be imprisoned or sanctioned for engaging in
1790 homosexual activity, for falling in love with a person whose skin was a
1791 different color than their own, or for smoking weed.
1792 </p><p>
1793 Today, these activities aren’t just decriminalized in much of the world,
1794 they’re considered normal, and the fallen prohibitions are viewed as
1795 shameful, regrettable relics of the past.
1796 </p><p>
1797 How did we get from prohibition to normalization? Through private, personal
1798 activity: People who were secretly gay or secret pot-smokers or who secretly
1799 loved someone with a different skin color were vulnerable to retaliation if
1800 they made their true selves known and were limited in how much they could
1801 advocate for their own right to exist in the world and be true to
1802 themselves. But because there was a private sphere, these people could form
1803 alliances with their friends and loved ones who did not share their
1804 disfavored traits by having private conversations in which they came out,
1805 disclosing their true selves to the people around them and bringing them to
1806 their cause one conversation at a time.
1807 </p><p>
1808 The right to choose the time and manner of these conversations was key to
1809 their success. It’s one thing to come out to your dad while you’re on a
1810 fishing trip away from the world and another thing entirely to blurt it out
1811 over the Christmas dinner table while your racist Facebook uncle is there to
1812 make a scene.
1813 </p><p>
1814 Without a private sphere, there’s a chance that none of these changes would
1815 have come to pass and that the people who benefited from these changes would
1816 have either faced social sanction for coming out to a hostile world or would
1817 have never been able to reveal their true selves to the people they love.
1818 </p><p>
1819 The corollary is that, unless you think that our society has attained social
1820 perfection — that your grandchildren in 50 years will ask you to tell them
1821 the story of how, in 2020, every injustice had been righted and no further
1822 change had to be made — then you should expect that right now, at this
1823 minute, there are people you love, whose happiness is key to your own, who
1824 have a secret in their hearts that stops them from ever being their
1825 authentic selves with you. These people are sorrowing and will go to their
1826 graves with that secret sorrow in their hearts, and the source of that
1827 sorrow will be the falsity of their relationship to you.
1828 </p><p>
A private realm is necessary for human progress.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="any-data-you-collect-and-retain-will-eventually-leak"></a>Any data you collect and retain will eventually leak</h2></div></div></div><p>
1831 The lack of a private life can rob vulnerable people of the chance to be
1832 their authentic selves and constrain our actions by depriving us of
1833 sanctuary, but there is another risk that is borne by everyone, not just
1834 people with a secret: crime.
1835 </p><p>
1836 Personally identifying information is of very limited use for the purpose of
controlling people's minds, but identity theft — really a catchall term for
1838 a whole constellation of terrible criminal activities that can destroy your
1839 finances, compromise your personal integrity, ruin your reputation, or even
1840 expose you to physical danger — thrives on it.
1841 </p><p>
1842 Attackers are not limited to using data from one breached source,
1843 either. Multiple services have suffered breaches that exposed names,
1844 addresses, phone numbers, passwords, sexual tastes, school grades, work
1845 performance, brushes with the criminal justice system, family details,
1846 genetic information, fingerprints and other biometrics, reading habits,
1847 search histories, literary tastes, pseudonymous identities, and other
1848 sensitive information. Attackers can merge data from these different
1849 breaches to build up extremely detailed dossiers on random subjects and then
1850 use different parts of the data for different criminal purposes.
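Mechanically, the merging step described above is just a join on a shared identifier. A minimal, purely hypothetical sketch — every name, field, and value below is invented for illustration:

```python
# Hypothetical sketch: records leaked from separate breaches are
# joined on a shared key (here an email address) into one dossier.
# All data below is invented.
from collections import defaultdict

breach_a = [
    {"email": "alice@example.com", "phone": "555-0100"},
    {"email": "bob@example.com", "password": "hunter2"},
]
breach_b = [
    {"email": "alice@example.com", "address": "12 Elm St"},
]

# Fold every record into a per-subject dossier keyed by email.
dossiers: dict = defaultdict(dict)
for leak in (breach_a, breach_b):
    for record in leak:
        dossiers[record["email"]].update(record)

# One subject now carries fields drawn from both leaks.
print(dossiers["alice@example.com"])
```

The point of the sketch is how little work the combination takes: once two datasets share any stable identifier, each new leak enriches every dossier it touches.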
1851 </p><p>
1852 For example, attackers can use leaked username and password combinations to
1853 hijack whole fleets of commercial vehicles that <a class="ulink" href="https://www.vice.com/en_us/article/zmpx4x/hacker-monitor-cars-kill-engine-gps-tracking-apps" target="_top">have
1854 been fitted with anti-theft GPS trackers and immobilizers</a> or to
1855 hijack baby monitors in order to <a class="ulink" href="https://www.washingtonpost.com/technology/2019/04/23/how-nest-designed-keep-intruders-out-peoples-homes-effectively-allowed-hackers-get/?utm_term=.15220e98c550" target="_top">terrorize
1856 toddlers with the audio tracks from pornography</a>. Attackers use
1857 leaked data to trick phone companies into giving them your phone number,
1858 then they intercept SMS-based two-factor authentication codes in order to
1859 take over your email, bank account, and/or cryptocurrency wallets.
1860 </p><p>
1861 Attackers are endlessly inventive in the pursuit of creative ways to
1862 weaponize leaked data. One common use of leaked data is to penetrate
1863 companies in order to access <span class="emphasis"><em>more</em></span> data.
1864 </p><p>
1865 Like spies, online fraudsters are totally dependent on companies
1866 over-collecting and over-retaining our data. Spy agencies sometimes pay
1867 companies for access to their data or intimidate them into giving it up, but
1868 sometimes they work just like criminals do — by <a class="ulink" href="https://www.bbc.com/news/world-us-canada-24751821" target="_top">sneaking data out of
1869 companies’ databases</a>.
1870 </p><p>
1871 The over-collection of data has a host of terrible social consequences, from
1872 the erosion of our authentic selves to the undermining of social progress,
1873 from state surveillance to an epidemic of online crime. Commercial
1874 surveillance is also a boon to people running influence campaigns, but
1875 that’s the least of our troubles.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="critical-tech-exceptionalism-is-still-tech-exceptionalism"></a>Critical tech exceptionalism is still tech exceptionalism</h2></div></div></div><p>
1878 Big Tech has long practiced technology exceptionalism: the idea that it
1879 should not be subject to the mundane laws and norms of
1880 <span class="quote"><span class="quote">meatspace.</span></span> Mottoes like Facebook’s <span class="quote"><span class="quote">move fast and break
1881 things</span></span> attracted justifiable scorn of the companies’ self-serving
1882 rhetoric.
1883 </p><p>
1884 Tech exceptionalism got us all into a lot of trouble, so it’s ironic and
1885 distressing to see Big Tech’s critics committing the same sin.
1886 </p><p>
1887 Big Tech is not a <span class="quote"><span class="quote">rogue capitalism</span></span> that cannot be cured
1888 through the traditional anti-monopoly remedies of trustbusting (forcing
1889 companies to divest of competitors they have acquired) and bans on mergers
1890 to monopoly and other anti-competitive tactics. Big Tech does not have the
1891 power to use machine learning to influence our behavior so thoroughly that
1892 markets lose the ability to punish bad actors and reward superior
1893 competitors. Big Tech has no rule-writing mind-control ray that necessitates
1894 ditching our old toolbox.
1895 </p><p>
1896 The thing is, people have been claiming to have perfected mind-control rays
1897 for centuries, and every time, it turned out to be a con — though sometimes
1898 the con artists were also conning themselves.
1899 </p><p>
1900 For generations, the advertising industry has been steadily improving its
1901 ability to sell advertising services to businesses while only making
1902 marginal gains in selling those businesses’ products to prospective
1903 customers. John Wanamaker’s lament that <span class="quote"><span class="quote">50% of my advertising budget
1904 is wasted, I just don’t know which 50%</span></span> is a testament to the triumph
1905 of <span class="emphasis"><em>ad executives</em></span>, who successfully convinced Wanamaker
1906 that only half of the money he spent went to waste.
1907 </p><p>
1908 The tech industry has made enormous improvements in the science of
1909 convincing businesses that they’re good at advertising while their actual
1910 improvements to advertising — as opposed to targeting — have been pretty
1911 ho-hum. The vogue for machine learning — and the mystical invocation of
1912 <span class="quote"><span class="quote">artificial intelligence</span></span> as a synonym for straightforward
1913 statistical inference techniques — has greatly boosted the efficacy of Big
1914 Tech’s sales pitch as marketers have exploited potential customers’ lack of
1915 technical sophistication to get away with breathtaking acts of overpromising
1916 and underdelivering.
1917 </p><p>
1918 It’s tempting to think that if businesses are willing to pour billions into
1919 a venture that the venture must be a good one. Yet there are plenty of times
1920 when this rule of thumb has led us astray. For example, it’s virtually
1921 unheard of for managed investment funds to outperform simple index funds,
1922 and investors who put their money into the hands of expert money managers
1923 overwhelmingly fare worse than those who entrust their savings to index
1924 funds. But managed funds still account for the majority of the money
1925 invested in the markets, and they are patronized by some of the richest,
1926 most sophisticated investors in the world. Their vote of confidence in an
1927 underperforming sector is a parable about the role of luck in wealth
1928 accumulation, not a sign that managed funds are a good buy.
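Part of why managed funds lag is simple fee drag, which compounds year after year. A back-of-the-envelope sketch — the fee levels and return are illustrative assumptions, not figures from the text:

```python
# Illustrative only: compound the same 7% gross return under a
# typical cheap index-fund fee versus a typical active-management
# fee. The numbers are assumptions for demonstration.
def final_balance(principal: float, gross_return: float,
                  annual_fee: float, years: int) -> float:
    """Compound principal at (gross_return - annual_fee) per year."""
    return principal * (1 + gross_return - annual_fee) ** years

index = final_balance(10_000, 0.07, 0.001, 30)    # ~0.1% annual fee
managed = final_balance(10_000, 0.07, 0.010, 30)  # ~1% annual fee
print(f"index:   {index:,.0f}")
print(f"managed: {managed:,.0f}")
```

Even before any difference in stock-picking skill, a one-percentage-point fee gap compounds into a substantially smaller balance over a few decades.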
1929 </p><p>
1930 The claims of Big Tech’s mind-control system are full of tells that the
1931 enterprise is a con. For example, <a class="ulink" href="https://www.frontiersin.org/articles/10.3389/fpsyg.2020.01415/full" target="_top">the
1932 reliance on the <span class="quote"><span class="quote">Big Five</span></span> personality traits</a> as a
1933 primary means of influencing people even though the <span class="quote"><span class="quote">Big Five</span></span>
1934 theory is unsupported by any large-scale, peer-reviewed studies and is
1935 <a class="ulink" href="https://www.wired.com/story/the-noisy-fallacies-of-psychographic-targeting/" target="_top">mostly
1936 the realm of marketing hucksters and pop psych</a>.
1937 </p><p>
1938 Big Tech’s promotional materials also claim that their algorithms can
accurately perform <span class="quote"><span class="quote">sentiment analysis</span></span> or detect people's
1940 moods based on their <span class="quote"><span class="quote">microexpressions,</span></span> but <a class="ulink" href="https://www.npr.org/2018/09/12/647040758/advertising-on-facebook-is-it-worth-it" target="_top">these
1941 are marketing claims, not scientific ones</a>. These methods are largely
1942 untested by independent scientific experts, and where they have been tested,
1943 they’ve been found sorely wanting. Microexpressions are particularly
1944 suspect as the companies that specialize in training people to detect them
1945 <a class="ulink" href="https://theintercept.com/2017/02/08/tsas-own-files-show-doubtful-science-behind-its-behavior-screening-program/" target="_top">have
1946 been shown</a> to underperform relative to random chance.
1947 </p><p>
1948 Big Tech has been so good at marketing its own supposed superpowers that
1949 it’s easy to believe that they can market everything else with similar
1950 acumen, but it’s a mistake to believe the hype. Any statement a company
1951 makes about the quality of its products is clearly not impartial. The fact
1952 that we distrust all the things that Big Tech says about its data handling,
1953 compliance with privacy laws, etc., is only reasonable — but why on Earth
1954 would we treat Big Tech’s marketing literature as the gospel truth? Big Tech
1955 lies about just about <span class="emphasis"><em>everything</em></span>, including how well
1956 its machine-learning fueled persuasion systems work.
1957 </p><p>
1958 That skepticism should infuse all of our evaluations of Big Tech and its
1959 supposed abilities, including our perusal of its patents. Zuboff vests these
1960 patents with enormous significance, pointing out that Google claimed
1961 extensive new persuasion capabilities in <a class="ulink" href="https://patents.google.com/patent/US20050131762A1/en" target="_top">its patent
1962 filings</a>. These claims are doubly suspect: first, because they are so
1963 self-serving, and second, because the patent itself is so notoriously an
1964 invitation to exaggeration.
1965 </p><p>
1966 Patent applications take the form of a series of claims and range from broad
1967 to narrow. A typical patent starts out by claiming that its authors have
1968 invented a method or system for doing every conceivable thing that anyone
1969 might do, ever, with any tool or device. Then it narrows that claim in
1970 successive stages until we get to the actual <span class="quote"><span class="quote">invention</span></span> that
1971 is the true subject of the patent. The hope is that the patent examiner —
1972 who is almost certainly overworked and underinformed — will miss the fact
1973 that some or all of these claims are ridiculous, or at least suspect, and
1974 grant the patent’s broader claims. Patents for unpatentable things are still
1975 incredibly useful because they can be wielded against competitors who might
1976 license that patent or steer clear of its claims rather than endure the
1977 lengthy, expensive process of contesting it.
1978 </p><p>
1979 What’s more, software patents are routinely granted even though the filer
1980 doesn’t have any evidence that they can do the thing claimed by the
1981 patent. That is, you can patent an <span class="quote"><span class="quote">invention</span></span> that you haven’t
1982 actually made and that you don’t know how to make.
1983 </p><p>
1984 With these considerations in hand, it becomes obvious that the fact that a
1985 Big Tech company has patented what it <span class="emphasis"><em>says</em></span> is an
1986 effective mind-control ray is largely irrelevant to whether Big Tech can in
1987 fact control our minds.
1988 </p><p>
1989 Big Tech collects our data for many reasons, including the diminishing
1990 returns on existing stores of data. But many tech companies also collect
1991 data out of a mistaken tech exceptionalist belief in the network effects of
1992 data. Network effects occur when each new user in a system increases its
1993 value. The classic example is fax machines: A single fax machine is of no
use, two fax machines are of limited use, but every new fax machine that's
put to use adds as many new possible fax-to-fax links as there are machines
already in service.
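The arithmetic behind this classic example can be sketched directly: with n machines, any pair can connect, so there are n(n−1)/2 possible links, and the network's raw connectivity grows roughly with the square of the user count (a standard illustration, not a formula from the text):

```python
# Count distinct point-to-point links among n fax machines:
# each unordered pair of machines is one possible link.
def possible_links(n: int) -> int:
    """Number of distinct fax-to-fax links among n machines."""
    return n * (n - 1) // 2

for n in (1, 2, 3, 10, 100):
    print(n, possible_links(n))
```

One machine yields zero links, two yield one, and a hundred yield 4,950 — the quadratic growth that makes each additional user more valuable to everyone already on the network.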
1996 </p><p>
1997 Data mined for predictive systems doesn’t necessarily produce these
1998 dividends. Think of Netflix: The predictive value of the data mined from a
1999 million English-speaking Netflix viewers is hardly improved by the addition
2000 of one more user’s viewing data. Most of the data Netflix acquires after
2001 that first minimum viable sample duplicates existing data and produces only
2002 minimal gains. Meanwhile, retraining models with new data gets progressively
2003 more expensive as the number of data points increases, and manual tasks like
2004 labeling and validating data do not get cheaper at scale.
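One way to see these diminishing returns is the textbook behavior of estimation error, which shrinks only with the square root of the sample size — a generic statistical sketch, not a claim about Netflix's actual models:

```python
# Generic illustration: the error of a simple estimate falls like
# 1/sqrt(n), so each additional data point buys less accuracy than
# the one before it.
import math

def standard_error(n: int, sigma: float = 1.0) -> float:
    """Approximate error of a mean estimated from n samples."""
    return sigma / math.sqrt(n)

# Each tenfold increase in data buys a smaller absolute improvement.
for n in (1_000, 10_000, 100_000, 1_000_000):
    gain = standard_error(n) - standard_error(10 * n)
    print(f"n={n:>9,}  error={standard_error(n):.5f}  gain from 10x more={gain:.5f}")
```

Going from a thousand samples to ten thousand sharpens the estimate noticeably; going from a hundred thousand to a million barely moves it, while the cost of collecting, labeling, and retraining keeps climbing.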
2005 </p><p>
2006 Businesses pursue fads to the detriment of their profits all the time,
2007 especially when the businesses and their investors are not motivated by the
2008 prospect of becoming profitable but rather by the prospect of being acquired
2009 by a Big Tech giant or by having an IPO. For these firms, ticking faddish
2010 boxes like <span class="quote"><span class="quote">collects as much data as possible</span></span> might realize a
2011 bigger return on investment than <span class="quote"><span class="quote">collects a business-appropriate
2012 quantity of data.</span></span>
2013 </p><p>
2014 This is another harm of tech exceptionalism: The belief that more data
2015 always produces more profits in the form of more insights that can be
2016 translated into better mind-control rays drives firms to over-collect and
2017 over-retain data beyond all rationality. And since the firms are behaving
2018 irrationally, a good number of them will go out of business and become ghost
2019 ships whose cargo holds are stuffed full of data that can harm people in
myriad ways — but which no one is responsible for any longer. Even if the
2021 companies don’t go under, the data they collect is maintained behind the
2022 minimum viable security — just enough security to keep the company viable
2023 while it waits to get bought out by a tech giant, an amount calculated to
2024 spend not one penny more than is necessary on protecting data.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="how-monopolies-not-mind-control-drive-surveillance-capitalism-the-snapchat-story"></a>How monopolies, not mind control, drive surveillance capitalism: The Snapchat story</h2></div></div></div><p>
2027 For the first decade of its existence, Facebook competed with the social
2028 media giants of the day (Myspace, Orkut, etc.) by presenting itself as the
2029 pro-privacy alternative. Indeed, Facebook justified its walled garden —
2030 which let users bring in data from the web but blocked web services like
2031 Google Search from indexing and caching Facebook pages — as a pro-privacy
2032 measure that protected users from the surveillance-happy winners of the
2033 social media wars like Myspace.
2034 </p><p>
2035 Despite frequent promises that it would never collect or analyze its users’
2036 data, Facebook periodically created initiatives that did just that, like the
2037 creepy, ham-fisted Beacon tool, which spied on you as you moved around the
2038 web and then added your online activities to your public timeline, allowing
2039 your friends to monitor your browsing habits. Beacon sparked a user
2040 revolt. Every time, Facebook backed off from its surveillance initiative,
2041 but not all the way; inevitably, the new Facebook would be more surveilling
2042 than the old Facebook, though not quite as surveilling as the intermediate
2043 Facebook following the launch of the new product or service.
2044 </p><p>
2045 The pace at which Facebook ramped up its surveillance efforts seems to have
2046 been set by Facebook’s competitive landscape. The more competitors Facebook
2047 had, the better it behaved. Every time a major competitor foundered,
2048 Facebook’s behavior <a class="ulink" href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247362" target="_top">got
2049 markedly worse</a>.
2050 </p><p>
2051 All the while, Facebook was prodigiously acquiring companies, including a
2052 company called Onavo. Nominally, Onavo made a battery-monitoring mobile
2053 app. But the permissions that Onavo required were so expansive that the app
2054 was able to gather fine-grained telemetry on everything users did with their
2055 phones, including which apps they used and how they were using them.
2056 </p><p>
2057 Through Onavo, Facebook discovered that it was losing market share to
2058 Snapchat, an app that — like Facebook a decade before — billed itself as the
2059 pro-privacy alternative to the status quo. Through Onavo, Facebook was able
2060 to mine data from the devices of Snapchat users, including both current and
2061 former Snapchat users. This spurred Facebook to acquire Instagram — some
2062 features of which competed with Snapchat — and then allowed Facebook to
2063 fine-tune Instagram’s features and sales pitch to erode Snapchat’s gains and
2064 ensure that Facebook would not have to face the kinds of competitive
2065 pressures it had earlier inflicted on Myspace and Orkut.
2066 </p><p>
2067 The story of how Facebook crushed Snapchat reveals the relationship between
2068 monopoly and surveillance capitalism. Facebook combined surveillance with
2069 lax antitrust enforcement to spot the competitive threat of Snapchat on its
2070 horizon and then take decisive action against it. Facebook’s surveillance
2071 capitalism let it avert competitive pressure with anti-competitive
2072 tactics. Facebook users still want privacy — Facebook hasn’t used
2073 surveillance to brainwash them out of it — but they can’t get it because
2074 Facebook’s surveillance lets it destroy any hope of a rival service emerging
2075 that competes on privacy features.
</p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="a-monopoly-over-your-friends"></a>A monopoly over your friends</h2></div></div></div><p>
2077 A decentralization movement has tried to erode the dominance of Facebook and
2078 other Big Tech companies by fielding <span class="quote"><span class="quote">indieweb</span></span> alternatives —
2079 Mastodon as a Twitter alternative, Diaspora as a Facebook alternative,
2080 etc. — but these efforts have failed to attain any kind of liftoff.
2081 </p><p>
2082 Fundamentally, each of these services is hamstrung by the same problem:
2083 Every potential user for a Facebook or Twitter alternative has to convince
2084 all their friends to follow them to a decentralized web alternative in order
2085 to continue to realize the benefit of social media. For many of us, the only
2086 reason to have a Facebook account is that our friends have Facebook
2087 accounts, and the reason they have Facebook accounts is that
2088 <span class="emphasis"><em>we</em></span> have Facebook accounts.
2089 </p><p>
2090 All of this has conspired to make Facebook — and other dominant platforms —
2091 into <span class="quote"><span class="quote">kill zones</span></span> that investors will not fund new entrants
2092 for.
2093 </p><p>
2094 And yet, all of today’s tech giants came into existence despite the
2095 entrenched advantage of the companies that came before them. To understand
2096 how that happened, you have to understand both interoperability and
2097 adversarial interoperability.
2098 </p><div class="blockquote"><blockquote class="blockquote"><p>
The hard problem of our species is coordination.
2100 </p></blockquote></div><p>
2101 <span class="quote"><span class="quote">Interoperability</span></span> is the ability of two technologies to work
2102 with one another: Anyone can make an LP that will play on any record player,
2103 anyone can make a filter you can install in your stove’s extractor fan,
2104 anyone can make gasoline for your car, anyone can make a USB phone charger
2105 that fits in your car’s cigarette lighter receptacle, anyone can make a
2106 light bulb that works in your light socket, anyone can make bread that will
2107 toast in your toaster.
2108 </p><p>
2109 Interoperability is often a source of innovation and consumer benefit: Apple
2110 made the first commercially successful PC, but millions of independent
2111 software vendors made interoperable programs that ran on the Apple II
2112 Plus. The simple analog antenna inputs on the back of TVs first allowed
2113 cable operators to connect directly to TVs, then they allowed game console
2114 companies and then personal computer companies to use standard televisions
2115 as displays. Standard RJ-11 telephone jacks allowed for the production of
2116 phones from a variety of vendors in a variety of forms, from the free
2117 football-shaped phone that came with a <span class="emphasis"><em>Sports
2118 Illustrated</em></span> subscription to business phones with speakers, hold
2119 functions, and so on and then answering machines and finally modems, paving
2120 the way for the internet revolution.
2121 </p><p>
2122 <span class="quote"><span class="quote">Interoperability</span></span> is often used interchangeably with
2123 <span class="quote"><span class="quote">standardization,</span></span> which is the process when manufacturers and
2124 other stakeholders hammer out a set of agreed-upon rules for implementing a
2125 technology, such as the electrical plug on your wall, the CAN bus used by
2126 your car’s computer systems, or the HTML instructions that your browser
2127 interprets.
2128 </p><p>
2129 But interoperability doesn’t require standardization — indeed,
2130 standardization often proceeds from the chaos of ad hoc interoperability
2131 measures. The inventor of the cigarette-lighter USB charger didn’t need to
2132 get permission from car manufacturers or even the manufacturers of the
2133 dashboard lighter subcomponent. The automakers didn’t take any
2134 countermeasures to prevent the use of these aftermarket accessories by their
2135 customers, but they also didn’t do anything to make life easier for the
2136 chargers’ manufacturers. This is a kind of <span class="quote"><span class="quote">neutral
2137 interoperability.</span></span>
2138 </p><p>
2139 Beyond neutral interoperability, there is <span class="quote"><span class="quote">adversarial
2140 interoperability.</span></span> That’s when a manufacturer makes a product that
2141 interoperates with another manufacturer’s product <span class="emphasis"><em>despite the
2142 second manufacturer’s objections</em></span> and <span class="emphasis"><em>even if that means
2143 bypassing a security system designed to prevent interoperability</em></span>.
2144 </p><p>
2145 Probably the most familiar form of adversarial interoperability is
2146 third-party printer ink. Printer manufacturers claim that they sell printers
2147 below cost and that the only way they can recoup the losses they incur is by
2148 charging high markups on ink. To prevent the owners of printers from buying
2149 ink elsewhere, the printer companies deploy a suite of anti-customer
2150 security systems that detect and reject both refilled and third-party
2151 cartridges.
2152 </p><p>
2153 Owners of printers take the position that HP and Epson and Brother are not
2154 charities and that customers for their wares have no obligation to help them
2155 survive, and so if the companies choose to sell their products at a loss,
2156 that’s their foolish choice and their consequences to live with. Likewise,
2157 competitors who make ink or refill kits observe that they don’t owe printer
2158 companies anything, and their erosion of printer companies’ margins are the
2159 printer companies’ problems, not their competitors’. After all, the printer
2160 companies shed no tears when they drive a refiller out of business, so why
2161 should the refillers concern themselves with the economic fortunes of the
2162 printer companies?
2163 </p><p>
2164 Adversarial interoperability has played an outsized role in the history of
2165 the tech industry: from the founding of the <span class="quote"><span class="quote">alt.*</span></span> Usenet
2166 hierarchy (which was started against the wishes of Usenet’s maintainers and
2167 which grew to be bigger than all of Usenet combined) to the browser wars
2168 (when Netscape and Microsoft devoted massive engineering efforts to making
2169 their browsers incompatible with the other’s special commands and
2170 peccadilloes) to Facebook (whose success was built in part by helping its
2171 new users stay in touch with friends they’d left behind on Myspace because
2172 Facebook supplied them with a tool that scraped waiting messages from
2173 Myspace and imported them into Facebook, effectively creating a
2174 Facebook-based Myspace reader).
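Mechanically, a tool like that Myspace reader is just a logged-in scrape-and-repost loop: authenticate as the user on the old service, pull the waiting messages, and re-deliver them inside the new one. A minimal sketch of the pattern, with both services stubbed out as plain Python objects (all class and method names here are invented for illustration, not Facebook’s or Myspace’s real interfaces):

```python
# Adversarial-interoperability sketch: read waiting messages out of one
# service and re-deliver them inside another, without the first
# service's cooperation. Both "services" are stand-ins; the real tool
# logged in with the user's own credentials and scraped the web UI.

class LegacyService:
    """Stands in for the old network (e.g. Myspace) being scraped."""
    def __init__(self, inbox):
        self._inbox = inbox  # messages waiting for the user

    def fetch_inbox(self, username, password):
        # A real scraper would authenticate as the user and parse HTML here.
        return list(self._inbox)

class NewService:
    """Stands in for the new network importing the messages."""
    def __init__(self):
        self.imported = []

    def import_message(self, msg):
        self.imported.append({"from": msg["from"], "body": msg["body"],
                              "via": "legacy-import"})

def sync(legacy, new, username, password):
    """One pass of the importer: copy every waiting message across."""
    for msg in legacy.fetch_inbox(username, password):
        new.import_message(msg)
    return len(new.imported)

legacy = LegacyService([{"from": "alice", "body": "you still there?"}])
new = NewService()
count = sync(legacy, new, "bob", "hunter2")
print(count)                   # 1
print(new.imported[0]["via"])  # legacy-import
```

The point of the sketch is that nothing in the loop requires the legacy service’s permission; only its countermeasures (and the law) stand in the way.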
2175 </p><p>
2176 Today, incumbency is seen as an unassailable advantage. Facebook is where
2177 all of your friends are, so no one can start a Facebook competitor. But
2178 adversarial compatibility reverses the competitive advantage: If you were
2179 allowed to compete with Facebook by providing a tool that imported all your
2180 users’ waiting Facebook messages into an environment that competed on lines
2181 that Facebook couldn’t cross, like eliminating surveillance and ads, then
2182 Facebook would be at a huge disadvantage. It would have assembled all
2183 possible ex-Facebook users into a single, easy-to-find service; it would
2184 have educated them on how a Facebook-like service worked and what its
2185 potential benefits were; and it would have provided an easy means for
2186 disgruntled Facebook users to tell their friends where they might expect
2187 better treatment.
2188 </p><p>
2189 Adversarial interoperability was once the norm and a key contributor to the
2190 dynamic, vibrant tech scene, but now it is stuck behind a thicket of laws
2191 and regulations that add legal risks to the tried-and-true tactics of
2192 adversarial interoperability. New rules and new interpretations of existing
2193 rules mean that a would-be adversarial interoperator needs to steer clear of
2194 claims under copyright, terms of service, trade secrecy, tortious
2195 interference, and patent.
2196 </p><p>
2197 In the absence of a competitive market, lawmakers have resorted to assigning
2198 expensive, state-like duties to Big Tech firms, such as automatically
2199 filtering user contributions for copyright infringement or terrorist and
2200 extremist content or detecting and preventing harassment in real time or
2201 controlling access to sexual material.
2202 </p><p>
2203 These measures put a floor under how small we can make Big Tech because only
2204 the very largest companies can afford the humans and automated filters
2205 needed to perform these duties.
2206 </p><p>
2207 But that’s not the only way in which making platforms responsible for
2208 policing their users undermines competition. A platform that is expected to
2209 police its users’ conduct must prevent many vital adversarial
2210 interoperability techniques lest these subvert its policing measures. For
2211 example, if someone using a Twitter replacement like Mastodon is able to
2212 push messages into Twitter and read messages out of Twitter, they could
2213 avoid being caught by automated systems that detect and prevent harassment
2214 (such as systems that use the timing of messages or IP-based rules to make
2215 guesses about whether someone is a harasser).
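The kind of timing-based rule a bridged client could sidestep can be sketched in a few lines: flag an account that sends too many messages to the same target inside a short window. The thresholds and rule below are invented for illustration; no platform’s actual detection rules are public.

```python
# Toy version of a timing-based harassment heuristic: flag a sender
# who bursts messages at one target faster than a rate limit allows.
# WINDOW_SECONDS and MAX_PER_WINDOW are made-up illustrative values.

from collections import defaultdict

WINDOW_SECONDS = 60
MAX_PER_WINDOW = 5

def flag_bursts(events):
    """events: list of (timestamp, sender, target) tuples, sorted by time.
    Returns the set of senders who exceeded the rate limit."""
    recent = defaultdict(list)  # (sender, target) -> timestamps in window
    flagged = set()
    for ts, sender, target in events:
        key = (sender, target)
        recent[key] = [t for t in recent[key] if ts - t < WINDOW_SECONDS]
        recent[key].append(ts)
        if len(recent[key]) > MAX_PER_WINDOW:
            flagged.add(sender)
    return flagged

events = [(i, "troll", "victim") for i in range(10)]        # 10 msgs in 10s
events += [(i * 30, "friend", "victim") for i in range(4)]  # slow chatter
events.sort()
print(flag_bursts(events))  # {'troll'}
```

Messages relayed in from an outside service arrive without the timing and origin metadata this check depends on, which is why a platform held responsible for policing conduct has an incentive to block such relays.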
2216 </p><p>
2217 To the extent that we are willing to let Big Tech police itself — rather
2218 than making Big Tech small enough that users can leave bad platforms for
2219 better ones and small enough that a regulation that simply puts a platform
2220 out of business will not destroy billions of users’ access to their
2221 communities and data — we build the case that Big Tech should be able to
2222 block its competitors and make it easier for Big Tech to demand legal
2223 enforcement tools to ban and punish attempts at adversarial
2224 interoperability.
2225 </p><p>
2226 Ultimately, we can try to fix Big Tech by making it responsible for bad acts
2227 by its users, or we can try to fix the internet by cutting Big Tech down to
2228 size. But we can’t do both. To replace today’s giant products with
2229 pluralistic protocols, we need to clear the legal thicket that prevents
2230 adversarial interoperability so that tomorrow’s nimble, personal,
2231 small-scale products can federate themselves with giants like Facebook,
2232 allowing the users who’ve left to continue to communicate with users who
2233 haven’t left yet, reaching tendrils over Facebook’s garden wall that
2234 Facebook’s trapped users can use to scale the walls and escape to the
2235 global, open web.
2236 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="fake-news-is-an-epistemological-crisis"></a>Fake news is an epistemological crisis</h2></div></div></div><p>
2237 Tech is not the only industry that has undergone massive concentration since
2238 the Reagan era. Virtually every major industry — from oil to newspapers to
2239 meatpacking to sea freight to eyewear to online pornography — has become a
2240 clubby oligarchy that just a few players dominate.
2241 </p><p>
2242 At the same time, every industry has become something of a tech industry as
2243 general-purpose computers and general-purpose networks and the promise of
2244 efficiencies through data-driven analysis infuse every device, process, and
2245 firm with tech.
2246 </p><p>
2247 This phenomenon of industrial concentration is part of a wider story about
2248 wealth concentration overall as a smaller and smaller number of people own
2249 more and more of our world. This concentration of both wealth and industries
2250 means that our political outcomes are increasingly beholden to the parochial
2251 interests of the people and companies with all the money.
2252 </p><p>
2253 That means that whenever a regulator asks a question with an obvious,
2254 empirical answer (<span class="quote"><span class="quote">Are humans causing climate change?</span></span> or
2255 <span class="quote"><span class="quote">Should we let companies conduct commercial mass surveillance?</span></span>
2256 or <span class="quote"><span class="quote">Does society benefit from allowing network neutrality
2257 violations?</span></span>), the answer that comes out is only correct if that
2258 correctness meets with the approval of rich people and the industries that
2259 made them so wealthy.
2260 </p><p>
2261 Rich people have always played an outsized role in politics and more so
2262 since the Supreme Court’s <span class="emphasis"><em>Citizens United</em></span> decision
2263 eliminated key controls over political spending. Widening inequality and
2264 wealth concentration means that the very richest people are now a lot richer
2265 and can afford to spend a lot more money on political projects than ever
2266 before. Think of the Koch brothers or George Soros or Bill Gates.
2267 </p><p>
2268 But the policy distortions of rich individuals pale in comparison to the
2269 policy distortions that concentrated industries are capable of. The
2270 companies in highly concentrated industries are much more profitable than
2271 companies in competitive industries — no competition means not having to
2272 reduce prices or improve quality to win customers — leaving them with bigger
2273 capital surpluses to spend on lobbying.
2274 </p><p>
2275 Concentrated industries also find it easier to collaborate on policy
2276 objectives than competitive ones. When all the top execs from your industry
2277 can fit around a single boardroom table, they often do. And
2278 <span class="emphasis"><em>when</em></span> they do, they can forge a consensus position on
2279 regulation.
2280 </p><p>
2281 Rising through the ranks in a concentrated industry generally means working
2282 at two or three of the big companies. When there are only relatively few
2283 companies in a given industry, each company has a more ossified executive
2284 rank, leaving ambitious execs with fewer paths to higher positions unless
2285 they are recruited to a rival. This means that the top execs in concentrated
2286 industries are likely to have been colleagues at some point and socialize in
2287 the same circles — connected through social ties or, say, serving as
2288 trustees for each other’s estates. These tight social bonds foster a
2289 collegial, rather than competitive, attitude.
2290 </p><p>
2291 Highly concentrated industries also present a regulatory conundrum. When an
2292 industry is dominated by just four or five companies, the only people who
2293 are likely to truly understand the industry’s practices are its veteran
2294 executives. This means that top regulators are often former execs of the
2295 companies they are supposed to be regulating. These turns in government are
2296 often tacitly understood to be leaves of absence from industry, with former
2297 employers welcoming their erstwhile watchdogs back into their executive
2298 ranks once their terms have expired.
2299 </p><p>
2300 All this is to say that the tight social bonds, small number of firms, and
2301 regulatory capture of concentrated industries give the companies that
2302 comprise them the power to dictate many, if not all, of the regulations that
2303 bind them.
2304 </p><p>
2305 This is increasingly obvious. Whether it’s payday lenders <a class="ulink" href="https://www.washingtonpost.com/business/2019/02/25/how-payday-lending-industry-insider-tilted-academic-research-its-favor/" target="_top">winning
2306 the right to practice predatory lending</a> or Apple <a class="ulink" href="https://www.vice.com/en_us/article/mgxayp/source-apple-will-fight-right-to-repair-legislation" target="_top">winning
2307 the right to decide who can fix your phone</a> or Google and Facebook
2308 winning the right to breach your private data without suffering meaningful
2309 consequences or victories for pipeline companies or impunity for opioid
2310 manufacturers or massive tax subsidies for incredibly profitable dominant
2311 businesses, it’s increasingly apparent that many of our official,
2312 evidence-based truth-seeking processes are, in fact, auctions for sale to
2313 the highest bidder.
2314 </p><p>
2315 It’s really impossible to overstate what a terrifying prospect this is. We
2316 live in an incredibly high-tech society, and none of us could acquire the
2317 expertise to evaluate every technological proposition that stands between us
2318 and our untimely, horrible deaths. You might devote your life to acquiring
2319 the media literacy to distinguish good scientific journals from corrupt
2320 pay-for-play lookalikes and the statistical literacy to evaluate the quality
2321 of the analysis in the journals as well as the microbiology and epidemiology
2322 knowledge to determine whether you can trust claims about the safety of
2323 vaccines — but that would still leave you unqualified to judge whether the
2324 wiring in your home will give you a lethal shock <span class="emphasis"><em>and</em></span>
2325 whether your car’s brakes’ software will cause them to fail unpredictably
2326 <span class="emphasis"><em>and</em></span> whether the hygiene standards at your butcher are
2327 sufficient to keep you from dying after you finish your dinner.
2328 </p><p>
2329 In a world as complex as this one, we have to defer to authorities, and we
2330 keep them honest by making those authorities accountable to us and binding
2331 them with rules to prevent conflicts of interest. We can’t possibly acquire
2332 the expertise to adjudicate conflicting claims about the best way to make
2333 the world safe and prosperous, but we <span class="emphasis"><em>can</em></span> determine
2334 whether the adjudication process itself is trustworthy.
2335 </p><p>
2336 Right now, it’s obviously not.
2337 </p><p>
2338 The past 40 years of rising inequality and industry concentration, together
2339 with increasingly weak accountability and transparency for expert agencies,
2340 have created an increasingly urgent sense of impending doom, the sense that
2341 there are vast conspiracies afoot that operate with tacit official approval
2342 despite the likelihood they are working to better themselves by ruining the
2343 rest of us.
2344 </p><p>
2345 For example, it’s been decades since Exxon’s own scientists concluded that
2346 its products would render the Earth uninhabitable by humans. And yet those
2347 decades were lost to us, in large part because Exxon lobbied governments and
2348 sowed doubt about the dangers of its products and did so with the
2349 cooperation of many public officials. When the survival of you and everyone
2350 you love is threatened by conspiracies, it’s not unreasonable to start
2351 questioning the things you think you know in an attempt to determine whether
2352 they, too, are the outcome of another conspiracy.
2353 </p><p>
2354 The collapse of the credibility of our systems for divining and upholding
2355 truths has left us in a state of epistemological chaos. Once, most of us
2356 might have assumed that the system was working and that our regulations
2357 reflected our best understanding of the empirical truths of the world as
2358 they were best understood — now we have to find our own experts to help us
2359 sort the true from the false.
2360 </p><p>
2361 If you’re like me, you probably believe that vaccines are safe, but you
2362 (like me) probably also can’t explain the microbiology or statistics. Few of
2363 us have the math skills to review the literature on vaccine safety and
2364 describe why its statistical reasoning is sound. Likewise, few of us can
2365 review the stats in the (now discredited) literature on opioid safety and
2366 explain how those stats were manipulated. Both vaccines and opioids were
2367 embraced by medical authorities, after all, and one is safe while the other
2368 could ruin your life. You’re left with a kind of inchoate constellation of
2369 rules of thumb about which experts you trust to fact-check controversial
2370 claims and then to explain how all those respectable doctors with their
2371 peer-reviewed research on opioid safety <span class="emphasis"><em>were</em></span> an
2372 aberration and then how you know that the doctors writing about vaccine
2373 safety are <span class="emphasis"><em>not</em></span> an aberration.
2374 </p><p>
2375 I’m 100% certain that vaccinating is safe and effective, but I’m also at
2376 something of a loss to explain exactly, <span class="emphasis"><em>precisely,</em></span> why
2377 I believe this, given all the corruption I know about and the many times the
2378 stamp of certainty has turned out to be a parochial lie told to further
2379 enrich the super rich.
2380 </p><p>
2381 Fake news — conspiracy theories, racist ideologies, scientific denialism —
2382 has always been with us. What’s changed today is not the mix of ideas in the
2383 public discourse but the popularity of the worst ideas in that
2384 mix. Conspiracy and denial have skyrocketed in lockstep with the growth of
2385 Big Inequality, which has also tracked the rise of Big Tech and Big Pharma
2386 and Big Wrestling and Big Car and Big Movie Theater and Big Everything Else.
2387 </p><p>
2388 No one can say for certain why this has happened, but the two dominant camps
2389 are idealism (the belief that the people who argue for these conspiracies
2390 have gotten better at explaining them, maybe with the help of
2391 machine-learning tools) or materialism (the ideas have become more
2392 attractive because of material conditions in the world).
2393 </p><p>
2394 I’m a materialist. I’ve been exposed to the arguments of conspiracy
2395 theorists all my life, and I have not experienced any qualitative leap in
2396 the quality of those arguments.
2397 </p><p>
2398 The major difference is in the world, not the arguments. In a time where
2399 actual conspiracies are commonplace, conspiracy theories acquire a ring of
2400 plausibility.
2401 </p><p>
2402 We have always had disagreements about what’s true, but today, we have a
2403 disagreement over how we know whether something is true. This is an
2404 epistemological crisis, not a crisis over belief. It’s a crisis over the
2405 credibility of our truth-seeking exercises, from scientific journals (in an
2406 era where the biggest journal publishers have been caught producing
2407 pay-to-play journals for junk science) to regulations (in an era where
2408 regulators are routinely cycling in and out of business) to education (in an
2409 era where universities are dependent on corporate donations to keep their
2410 lights on).
2411 </p><p>
2412 Targeting — surveillance capitalism — makes it easier to find people who are
2413 undergoing this epistemological crisis, but it doesn’t create the
2414 crisis. For that, you need to look to corruption.
2415 </p><p>
2416 And, conveniently enough, it’s corruption that allows surveillance
2417 capitalism to grow by dismantling monopoly protections, by permitting
2418 reckless collection and retention of personal data, by allowing ads to be
2419 targeted in secret, and by foreclosing on the possibility of going somewhere
2420 else where you might continue to enjoy your friends without subjecting
2421 yourself to commercial surveillance.
2422 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="tech-is-different"></a>Tech is different</h2></div></div></div><p>
2423 I reject both iterations of technological exceptionalism. I reject the idea
2424 that tech is uniquely terrible and led by people who are greedier or worse
2425 than the leaders of other industries, and I reject the idea that tech is so
2426 good — or so intrinsically prone to concentration — that it can’t be blamed
2427 for its present-day monopolistic status.
2428 </p><p>
2429 I think tech is just another industry, albeit one that grew up in the
2430 absence of real monopoly constraints. It may have been first, but it isn’t
2431 the worst, nor will it be the last.
2432 </p><p>
2433 But there’s one way in which I <span class="emphasis"><em>am</em></span> a tech
2434 exceptionalist. I believe that online tools are the key to overcoming
2435 problems that are much more urgent than tech monopolization: climate change,
2436 inequality, misogyny, and discrimination on the basis of race, gender
2437 identity, and other factors. The internet is how we will recruit people to
2438 fight those fights, and how we will coordinate their labor. Tech is not a
2439 substitute for democratic accountability, the rule of law, fairness, or
2440 stability — but it’s a means to achieve these things.
2441 </p><p>
2442 The hard problem of our species is coordination. Everything from climate
2443 change to social change to running a business to making a family work can be
2444 viewed as a collective action problem.
2445 </p><p>
2446 The internet makes it easier than at any time before to find people who want
2447 to work on a project with you — hence the success of free and open-source
2448 software, crowdfunding, and racist terror groups — and easier than ever to
2449 coordinate the work you do.
2450 </p><p>
2451 The internet and the computers we connect to it also possess an exceptional
2452 quality: general-purposeness. The internet is designed to allow any two
2453 parties to communicate any data, using any protocol, without permission from
2454 anyone else. The only production design we have for computers is the
2455 general-purpose, <span class="quote"><span class="quote">Turing complete</span></span> computer that can run every
2456 program we can express in symbolic logic.
2457 </p><p>
2458 This means that every time someone with a special communications need
2459 invests in infrastructure and techniques to make the internet faster,
2460 cheaper, and more robust, this benefit redounds to everyone else who is
2461 using the internet to communicate. And this also means that every time
2462 someone with a special computing need invests to make computers faster,
2463 cheaper, and more robust, every other computing application is a potential
2464 beneficiary of this work.
2465 </p><p>
2466 For these reasons, every type of communication is gradually absorbed into
2467 the internet, and every type of device — from airplanes to pacemakers —
2468 eventually becomes a computer in a fancy case.
2469 </p><p>
2470 While these considerations don’t preclude regulating networks and computers,
2471 they do call for gravitas and caution when doing so because changes to
2472 regulatory frameworks could ripple out to have unintended consequences in
2473 many, many other domains.
2474 </p><p>
2475 The upshot of this is that our best hope of solving the big coordination
2476 problems — climate change, inequality, etc. — is with free, fair, and open
2477 tech. Our best hope of keeping tech free, fair, and open is to exercise
2478 caution in how we regulate tech and to attend closely to the ways in which
2479 interventions to solve one problem might create problems in other domains.
2480 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="ownership-of-facts"></a>Ownership of facts</h2></div></div></div><p>
2481 Big Tech has a funny relationship with information. When you’re generating
2482 information — anything from the location data streaming off your mobile
2483 device to the private messages you send to friends on a social network — it
2484 claims the rights to make unlimited use of that data.
2485 </p><p>
2486 But when you have the audacity to turn the tables — to use a tool that
2487 blocks ads or slurps your waiting updates out of a social network and puts
2488 them in another app that lets you set your own priorities and suggestions or
2489 crawls their system to allow you to start a rival business — they claim that
2490 you’re stealing from them.
2491 </p><p>
2492 The thing is, information is a very bad fit for any kind of private property
2493 regime. Property rights are useful for establishing markets that can lead to
2494 the effective development of fallow assets. These markets depend on clear
2495 titles to ensure that the things being bought and sold in them can, in fact,
2496 be bought and sold.
2497 </p><p>
2498 Information rarely has such a clear title. Take phone numbers: There’s
2499 clearly something going wrong when Facebook slurps up millions of users’
2500 address books and uses the phone numbers it finds in them to plot out social
2501 graphs and fill in missing information about other users.
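The matching step described here is simple to sketch: normalize each uploaded number to a canonical form, look it up in a directory of known accounts, and record an edge between the uploader and every match. The data, the naive U.S./Canada normalization rule, and the function names below are all invented for illustration; Facebook’s actual pipeline is not public.

```python
# Sketch of contact matching for social-graph inference: reduce
# uploaded phone numbers to a canonical digit string, then link the
# uploader to whichever known accounts those numbers resolve to.

import re

def normalize(number, default_country="1"):
    """Strip a phone number to digits, prepending a country code
    when a bare 10-digit U.S./Canada number is seen (naive rule)."""
    digits = re.sub(r"\D", "", number)
    if len(digits) == 10:
        digits = default_country + digits
    return digits

def build_edges(uploads, directory):
    """uploads: {uploader: [raw numbers]}; directory: {normalized: user}.
    Returns the inferred (uploader, known-user) social-graph edges."""
    edges = set()
    for uploader, numbers in uploads.items():
        for raw in numbers:
            user = directory.get(normalize(raw))
            if user and user != uploader:
                edges.add((uploader, user))
    return edges

directory = {"15551234567": "alice", "15559876543": "bob"}
uploads = {"carol": ["(555) 123-4567", "+1 555 987 6543", "555-000-0000"]}
print(sorted(build_edges(uploads, directory)))
# [('carol', 'alice'), ('carol', 'bob')]
```

Note that the people whose numbers are matched never act at all: the edges are inferred entirely from one user’s upload, which is what makes the question of who “owns” those numbers so muddy.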
2502 </p><p>
2503 But the phone numbers Facebook nonconsensually acquires in this transaction
2504 are not the <span class="quote"><span class="quote">property</span></span> of the users they’re taken from nor do
2505 they belong to the people whose phones ring when you dial those numbers. The
2506 numbers are mere integers, 10 digits in the U.S. and Canada, and they
2507 appear in millions of places, including somewhere deep in pi as well as
2508 numerous other contexts. Giving people ownership titles to integers is an
2509 obviously terrible idea.
2510 </p><p>
2511 Likewise for the facts that Facebook and other commercial surveillance
2512 operators acquire about us, like that we are the children of our parents or
2513 the parents to our children or that we had a conversation with someone else
2514 or went to a public place. These data points can’t be property in the sense
2515 that your house or your shirt is your property because the title to them is
2516 intrinsically muddy: Does your mom own the fact that she is your mother? Do
2517 you? Do both of you? What about your dad — does he own this fact too, or
2518 does he have to license the fact from you (or your mom or both of you) in
2519 order to use this fact? What about the hundreds or thousands of other people
2520 who know these facts?
2521 </p><p>
2522 If you go to a Black Lives Matter demonstration, do the other demonstrators
2523 need your permission to post their photos from the event? The online fights
2524 over <a class="ulink" href="https://www.wired.com/story/how-to-take-photos-at-protests/" target="_top">when and
2525 how to post photos from demonstrations</a> reveal a nuanced, complex
2526 issue that cannot be easily hand-waved away by giving one party a property
2527 right that everyone else in the mix has to respect.
2528 </p><p>
2529 The fact that information isn’t a good fit with property and markets doesn’t
2530 mean that it’s not valuable. Babies aren’t property, but they’re inarguably
2531 valuable. In fact, we have a whole set of rules just for babies as well as a
2532 subset of those rules that apply to humans more generally. Someone who
2533 argues that babies won’t be truly valuable until they can be bought and sold
2534 like loaves of bread would be instantly and rightfully condemned as a
2535 monster.
2536 </p><p>
2537 It’s tempting to reach for the property hammer when Big Tech treats your
2538 information like a nail — not least because Big Tech are such prolific
2539 abusers of property hammers when it comes to <span class="emphasis"><em>their</em></span>
2540 information. But this is a mistake. If we allow markets to dictate the use
2541 of our information, then we’ll find that we’re sellers in a buyers’ market
2542 where the Big Tech monopolies set a price for our data that is so low as to
2543 be insignificant or, more likely, set at a nonnegotiable price of zero in a
2544 click-through agreement that you don’t have the opportunity to modify.
2545 </p><p>
2546 Meanwhile, establishing property rights over information will create
2547 insurmountable barriers to independent data processing. Imagine that we
2548 require a license to be negotiated when a translated document is compared
2549 with its original, something Google has done and continues to do billions of
2550 times to train its automated language translation tools. Google can afford
2551 this, but independent third parties cannot. Google can staff a clearances
2552 department to negotiate one-time payments to the likes of the EU (one of the
2553 major repositories of translated documents) while independent watchdogs
2554 wanting to verify that the translations are well-prepared, or to root out
2555 bias in translations, will find themselves needing a staffed-up legal
2556 department and millions for licenses before they can even get started.
2557 </p><p>
2558 The same goes for things like search indexes of the web or photos of
2559 peoples’ houses, which have become contentious thanks to Google’s Street
2560 View project. Whatever problems may exist with Google’s photographing of
2561 street scenes, resolving them by letting people decide who can take pictures
2562 of the facades of their homes from a public street will surely create even
2563 worse ones. Think of how street photography is important for newsgathering —
2564 including informal newsgathering, like photographing abuses of authority —
2565 and how being able to document housing and street life are important for
2566 contesting eminent domain, advocating for social aid, reporting planning and
2567 zoning violations, documenting discriminatory and unequal living conditions,
2568 and more.
2569 </p><p>
2570 The ownership of facts is antithetical to many kinds of human progress. It’s
2571 hard to imagine a rule that limits Big Tech’s exploitation of our collective
2572 labors without inadvertently banning people from gathering data on online
2573 harassment or compiling indexes of changes in language or simply
2574 investigating how the platforms are shaping our discourse — all of which
2575 require scraping data that other people have created and subjecting it to
2576 scrutiny and analysis.
2577 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="persuasion-works-slowly"></a>Persuasion works… slowly</h2></div></div></div><p>
2578 The platforms may oversell their ability to persuade people, but obviously,
2579 persuasion works sometimes. Whether it’s the private realm that LGBTQ people
2580 used to recruit allies and normalize sexual diversity or the decadeslong
2581 project to convince people that markets are the only efficient way to solve
2582 complicated resource allocation problems, it’s clear that our societal
2583 attitudes <span class="emphasis"><em>can</em></span> change.
2584 </p><p>
2585 The project of shifting societal attitudes is a game of inches and
2586 years. For centuries, svengalis have purported to be able to accelerate this
2587 process, but even the most brutal forms of propaganda have struggled to make
2588 permanent changes. Joseph Goebbels was able to subject Germans to daily,
2589 mandatory, hourslong radio broadcasts, to round up and torture and murder
2590 dissidents, and to seize full control over their children’s education while
2591 banning any literature, broadcasts, or films that did not comport with his
2592 worldview.
2593 </p><p>
2594 Yet, after 12 years of terror, once the war ended, Nazi ideology was largely
2595 discredited in both East and West Germany, and a program of national truth
2596 and reconciliation was put in its place. Racism and authoritarianism were
2597 never fully abolished in Germany, but neither were the majority of Germans
2598 irrevocably convinced of Nazism — and the rise of racist authoritarianism in
2599 Germany today tells us that the liberal attitudes that replaced Nazism were
2600 no more permanent than Nazism itself.
2601 </p><p>
2602 Racism and authoritarianism have also always been with us. Anyone who’s
2603 reviewed the kind of messages and arguments that racists put forward today
2604 would be hard-pressed to say that they have gotten better at presenting
2605 their ideas. The same pseudoscience, appeals to fear, and circular logic
2606 that racists presented in the 1980s, when the cause of white supremacy was
2607 on the wane, are to be found in the communications of leading white
2608 nationalists today.
2609 </p><p>
2610 If racists haven’t gotten more convincing in the past decade, then how is it
2611 that more people were convinced to be openly racist during that same period? I believe
2612 that the answer lies in the material world, not the world of ideas. The
2613 ideas haven’t gotten more convincing, but people have become more
2614 afraid. Afraid that the state can’t be trusted to act as an honest broker in
2615 life-or-death decisions, from those regarding the management of the economy
2616 to the regulation of painkillers to the rules for handling private
2617 information. Afraid that the world has become a game of musical chairs in
2618 which the chairs are being taken away at a never-before-seen rate. Afraid
2619 that justice for others will come at their expense. Monopolism isn’t the
2620 cause of these fears, but the inequality, material desperation, and policy
2621 malpractice that monopolism fuels are significant contributors to
2622 these conditions. Inequality creates the conditions for both conspiracies
2623 and violent racist ideologies, and then surveillance capitalism lets
2624 opportunists target the fearful and the conspiracy-minded.
2625 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="paying-wont-help"></a>Paying won’t help</h2></div></div></div><p>
2626 As the old saw goes, <span class="quote"><span class="quote">If you’re not paying for the product, you’re the
2627 product.</span></span>
2628 </p><p>
2629 It’s a commonplace belief today that the advent of free, ad-supported media
2630 was the original sin of surveillance capitalism. The reasoning is that the
2631 companies that charged for access couldn’t <span class="quote"><span class="quote">compete with free</span></span>
2632 and so they were driven out of business. Their ad-supported competitors,
2633 meanwhile, declared open season on their users’ data in a bid to improve
2634 their ad targeting and make more money and then resorted to the most
2635 sensationalist tactics to generate clicks on those ads. If only we’d pay for
2636 media again, we’d have a better, more responsible, more sober discourse that
2637 would be better for democracy.
2638 </p><p>
2639 But the degradation of news products long precedes the advent of
2640 ad-supported online news. Long before newspapers were online, lax antitrust
2641 enforcement had opened the door for unprecedented waves of consolidation and
2642 roll-ups in newsrooms. Rival newspapers were merged, reporters and ad sales
2643 staff were laid off, physical plants were sold and leased back, leaving the
2644 companies loaded up with debt through leveraged buyouts and subsequent
2645 profit-taking by the new owners. In other words, it wasn’t merely shifts in
2646 the classified advertising market, which was long held to be the primary
2647 driver in the decline of the traditional newsroom, that made news companies
2648 unable to adapt to the internet — it was monopolism.
2649 </p><p>
2650 Then, as news companies <span class="emphasis"><em>did</em></span> come online, the ad
2651 revenues they commanded dropped even as the number of internet users (and
2652 thus potential online readers) increased. That shift was a function of
2653 consolidation in the ad sales market, with Google and Facebook emerging as
2654 duopolists who made more money every year from advertising while paying less
2655 and less of it to the publishers whose work the ads appeared
2656 alongside. Monopolism created a buyer’s market for ad inventory with
2657 Facebook and Google acting as gatekeepers.
2658 </p><p>
2659 Paid services continue to exist alongside free ones, and often it is these
2660 paid services — anxious to prevent people from bypassing their paywalls or
2661 sharing paid media with freeloaders — that exert the most control over their
2662 customers. Apple’s iTunes and App Stores are paid services, but to maximize
2663 their profitability, Apple has to lock its platforms so that third parties
2664 can’t make compatible software without permission. These locks allow the
2665 company to exercise both editorial control (enabling it to exclude <a class="ulink" href="https://ncac.org/news/blog/does-apples-strict-app-store-content-policy-limit-freedom-of-expression" target="_top">controversial
2666 political material</a>) and technological control, including control
2667 over who can repair the devices it makes. If we’re worried that ad-supported
2668 products deprive people of their right to self-determination by using
2669 persuasion techniques to nudge their purchase decisions a few degrees in one
2670 direction or the other, then the near-total control a single company holds
2671 over the decision of who gets to sell you software, parts, and service for
2672 your iPhone should have us very worried indeed.
2673 </p><p>
2674 We shouldn’t just be concerned about payment and control: The idea that
2675 paying will improve discourse is also dangerously wrong. The poor success
2676 rate of targeted advertising means that the platforms have to incentivize
2677 you to <span class="quote"><span class="quote">engage</span></span> with posts at extremely high levels to generate
2678 enough pageviews to safeguard their profits. As discussed earlier, to
2679 increase engagement, platforms like Facebook use machine learning to guess
2680 which messages will be most inflammatory and make a point of shoving those
2681 into your eyeballs at every turn so that you will hate-click and argue with
2682 people.
2683 </p><p>
2684 Perhaps paying would fix this, the reasoning goes. If platforms could be
2685 economically viable even if you stopped clicking on them once your
2686 intellectual and social curiosity had been slaked, then they would have no
2687 reason to algorithmically enrage you to get more clicks out of you, right?
2688 </p><p>
2689 There may be something to that argument, but it still ignores the wider
2690 economic and political context of the platforms and the world that allowed
2691 them to grow so dominant.
2692 </p><p>
2693 Platforms are world-spanning and all-encompassing because they are
2694 monopolies, and they are monopolies because we have gutted our most
2695 important and reliable anti-monopoly rules. Antitrust was neutered as a key
2696 part of the project to make the wealthy wealthier, and that project has
2697 worked. The vast majority of people on Earth have a negative net worth, and
2698 even the dwindling middle class is in a precarious state, undersaved for
2699 retirement, underinsured for medical disasters, and undersecured against
2700 climate and technology shocks.
2701 </p><p>
2702 In this wildly unequal world, paying doesn’t improve the discourse; it
2703 simply prices discourse out of the range of the majority of people. Paying
2704 for the product is dandy, if you can afford it.
2705 </p><p>
2706 If you think today’s filter bubbles are a problem for our discourse, imagine
2707 what they’d be like if rich people inhabited free-flowing Athenian
2708 marketplaces of ideas where you have to pay for admission while everyone
2709 else lives in online spaces that are subsidized by wealthy benefactors who
2710 relish the chance to establish conversational spaces where the <span class="quote"><span class="quote">house
2711 rules</span></span> forbid questioning the status quo. That is, imagine if the
2712 rich seceded from Facebook, and then, instead of running ads that made money
2713 for shareholders, Facebook became a billionaire’s vanity project that also
2714 happened to ensure that nobody talked about whether it was fair that only
2715 billionaires could afford to hang out in the rarefied corners of the
2716 internet.
2717 </p><p>
2718 Behind the idea of paying for access is a belief that free markets will
2719 address Big Tech’s dysfunction. After all, to the extent that people have a
2720 view of surveillance at all, it is generally an unfavorable one, and the
2721 longer and more thoroughly one is surveilled, the less one tends to like
2722 it. Same goes for lock-in: If HP’s ink or Apple’s App Store were really
2723 obviously fantastic, they wouldn’t need technical measures to prevent users
2724 from choosing a rival’s product. The only reason these technical
2725 countermeasures exist is that the companies don’t believe their customers
2726 would <span class="emphasis"><em>voluntarily</em></span> submit to their terms, and they want
2727 to deprive them of the choice to take their business elsewhere.
2728 </p><p>
2729 Advocates for markets laud their ability to aggregate the diffused knowledge
2730 of buyers and sellers across a whole society through demand signals, price
2731 signals, and so on. The argument for surveillance capitalism being a
2732 <span class="quote"><span class="quote">rogue capitalism</span></span> is that machine-learning-driven persuasion
2733 techniques distort decision-making by consumers, leading to incorrect
2734 signals — consumers don’t buy what they prefer, they buy what they’re
2735 tricked into preferring. It follows that the monopolistic practices of
2736 lock-in, which do far more to constrain consumers’ free choices, are even
2737 more of a <span class="quote"><span class="quote">rogue capitalism.</span></span>
2738 </p><p>
2739 The profitability of any business is constrained by the possibility that its
2740 customers will take their business elsewhere. Both surveillance and lock-in
2741 are anti-features that no customer wants. But monopolies can capture their
2742 regulators, crush their competitors, insert themselves into their customers’
2743 lives, and corral people into <span class="quote"><span class="quote">choosing</span></span> their services
2744 regardless of whether they want them — it’s fine to be terrible when there
2745 is no alternative.
2746 </p><p>
2747 Ultimately, surveillance and lock-in are both simply business strategies
2748 that monopolists can choose. Surveillance companies like Google are
2749 perfectly capable of deploying lock-in technologies — just look at the
2750 onerous Android licensing terms that require device-makers to bundle in
2751 Google’s suite of applications. And lock-in companies like Apple are
2752 perfectly capable of subjecting their users to surveillance if it means
2753 keeping the Chinese government happy and preserving ongoing access to
2754 Chinese markets. Monopolies may be made up of good, ethical people, but as
2755 institutions, they are not your friend — they will do whatever they can get
2756 away with to maximize their profits, and the more monopolistic they are, the
2757 more they <span class="emphasis"><em>can</em></span> get away with.
2758 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="an-ecology-moment-for-trustbusting"></a>An <span class="quote"><span class="quote">ecology</span></span> moment for trustbusting</h2></div></div></div><p>
2759 If we’re going to break Big Tech’s death grip on our digital lives, we’re
2760 going to have to fight monopolies. That may sound pretty mundane and
2761 old-fashioned, something out of the New Deal era, while ending the use of
2762 automated behavioral modification feels like the plotline of a really cool
2763 cyberpunk novel.
2764 </p><p>
2765 Meanwhile, breaking up monopolies is something we seem to have forgotten how
2766 to do. There is a bipartisan, trans-Atlantic consensus that breaking up
2767 companies is a fool’s errand at best — liable to mire your federal
2768 prosecutors in decades of litigation — and counterproductive at worst,
2769 eroding the <span class="quote"><span class="quote">consumer benefits</span></span> of large companies with massive
2770 efficiencies of scale.
2771 </p><p>
2772 But trustbusters once strode the nation, brandishing law books, terrorizing
2773 robber barons, and shattering the illusion of monopolies’ all-powerful grip
2774 on our society. The trustbusting era could not begin until we found the
2775 political will — until the people convinced politicians they’d have their
2776 backs when they went up against the richest, most powerful men in the world.
2777 </p><p>
2778 Could we find that political will again?
2779 </p><p>
2780 Copyright scholar James Boyle has described how the term
2781 <span class="quote"><span class="quote">ecology</span></span> marked a turning point in environmental
2782 activism. Prior to the adoption of this term, people who wanted to preserve
2783 whale populations didn’t necessarily see themselves as fighting the same
2784 battle as people who wanted to protect the ozone layer or fight freshwater
2785 pollution or beat back smog or acid rain.
2786 </p><p>
2787 But the term <span class="quote"><span class="quote">ecology</span></span> welded these disparate causes together
2788 into a single movement, and the members of this movement found solidarity
2789 with one another. The people who cared about smog signed petitions
2790 circulated by the people who wanted to end whaling, and the anti-whalers
2791 marched alongside the people demanding action on acid rain. This uniting
2792 behind a common cause completely changed the dynamics of environmentalism,
2793 setting the stage for today’s climate activism and the sense that preserving
2794 the habitability of the planet Earth is a shared duty among all people.
2795 </p><p>
2796 I believe we are on the verge of a new <span class="quote"><span class="quote">ecology</span></span> moment
2797 dedicated to combating monopolies. After all, tech isn’t the only
2798 concentrated industry nor is it even the <span class="emphasis"><em>most</em></span>
2799 concentrated of industries.
2800 </p><p>
2801 You can find partisans for trustbusting in every sector of the
2802 economy. Everywhere you look, you can find people who’ve been wronged by
2803 monopolists who’ve trashed their finances, their health, their privacy,
2804 their educations, and the lives of people they love. Those people have the
2805 same cause as the people who want to break up Big Tech and the same
2806 enemies. When most of the world’s wealth is in the hands of a very few, it
2807 follows that nearly every large company will have overlapping shareholders.
2808 </p><p>
2809 That’s the good news: With a little bit of work and a little bit of
2810 coalition building, we have more than enough political will to break up Big
2811 Tech and every other concentrated industry besides. First we take Facebook,
2812 then we take AT&amp;T/WarnerMedia.
2813 </p><p>
2814 But here’s the bad news: Much of what we’re doing to tame Big Tech
2815 <span class="emphasis"><em>instead</em></span> of breaking up the big companies also
2816 forecloses on the possibility of breaking them up later.
2817 </p><p>
2818 Big Tech’s concentration currently means that their inaction on harassment,
2819 for example, leaves users with an impossible choice: absent themselves from
2820 public discourse by, say, quitting Twitter, or endure vile, constant
2821 abuse. Big Tech’s over-collection and over-retention of data results in
2822 horrific identity theft. And their inaction on extremist recruitment means
2823 that white supremacists who livestream their shooting rampages can reach an
2824 audience of billions. The combination of tech concentration and media
2825 concentration means that artists’ incomes are falling even as the revenue
2826 generated by their creations is increasing.
2827 </p><p>
2828 Yet governments confronting all of these problems inevitably converge on
2829 the same solution: deputize the Big Tech giants to police their users and
2830 render them liable for their users’ bad actions. The drive to force Big Tech
2831 to use automated filters to block everything from copyright infringement to
2832 sex-trafficking to violent extremism means that tech companies will have to
2833 allocate hundreds of millions to run these compliance systems.
2834 </p><p>
2835 These rules — the EU’s new Directive on Copyright, Australia’s new terror
2836 regulation, America’s FOSTA/SESTA sex-trafficking law, and more — are not
2837 just death warrants for small, upstart competitors that might challenge Big
2838 Tech’s dominance but lack the deep pockets of established incumbents to
2839 pay for all these automated systems. Worse still, these rules put a floor
2840 under how small we can hope to make Big Tech.
2841 </p><p>
2842 That’s because any move to break up Big Tech and cut it down to size will
2843 have to cope with the hard limit of not making these companies so small that
2844 they can no longer afford to perform these duties — and it’s
2845 <span class="emphasis"><em>expensive</em></span> to invest in those automated filters and
2846 outsource content moderation. It’s already going to be hard to unwind these
2847 deeply concentrated, chimeric behemoths that have been welded together in
2848 the pursuit of monopoly profits. Doing so while simultaneously finding some
2849 way to fill the regulatory void that will be left behind if these
2850 self-policing rulers were forced to suddenly abdicate will be much, much
2851 harder.
2852 </p><p>
2853 Allowing the platforms to grow to their present size has given them a
2854 dominance that is nearly insurmountable — deputizing them with public duties
2855 to redress the pathologies created by their size makes it virtually
2856 impossible to reduce that size. Lather, rinse, repeat: If the platforms
2857 don’t get smaller, they will get larger, and as they get larger, they will
2858 create more problems, which will give rise to more public duties for the
2859 companies, which will make them bigger still.
2860 </p><p>
2861 We can work to fix the internet by breaking up Big Tech and depriving them
2862 of monopoly profits, or we can work to fix Big Tech by making them spend
2863 their monopoly profits on governance. But we can’t do both. We have to
2864 choose between a vibrant, open internet and a dominated, monopolized internet
2865 commanded by Big Tech giants that we struggle with constantly to get them to
2866 behave themselves.
2867 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="make-big-tech-small-again"></a>Make Big Tech small again</h2></div></div></div><p>
2868 Trustbusting is hard. Breaking big companies into smaller ones is expensive
2869 and time-consuming. So time-consuming that by the time you’re done, the
2870 world has often moved on and rendered years of litigation irrelevant. From
2871 1969 to 1982, the U.S. government pursued an antitrust case against IBM over
2872 its dominance of mainframe computing — but the case collapsed in 1982
2873 because mainframes were being speedily replaced by PCs.
2874 </p><div class="blockquote"><blockquote class="blockquote"><p>
2875 A future U.S. president could simply direct their attorney general to
2876 enforce the law as it was written.
2877 </p></blockquote></div><p>
2878 It’s far easier to prevent concentration than to fix it, and reinstating the
2879 traditional contours of U.S. antitrust enforcement will, at the very least,
2880 prevent further concentration. That means bans on mergers between large
2881 companies, on big companies acquiring nascent competitors, and on platform
2882 companies competing directly with the companies that rely on the platforms.
2883 </p><p>
2884 These powers are all in the plain language of U.S. antitrust laws, so in
2885 theory, a future U.S. president could simply direct their attorney general
2886 to enforce the law as it was written. But after decades of judicial
2887 <span class="quote"><span class="quote">education</span></span> in the benefits of monopolies, after multiple
2888 administrations that have packed the federal courts with lifetime-appointed
2889 monopoly cheerleaders, it’s not clear that mere administrative action would
2890 do the trick.
2891 </p><p>
2892 If the courts frustrate the Justice Department and the president, the next
2893 stop would be Congress, which could eliminate any doubt about how antitrust
2894 law should be enforced in the U.S. by passing new laws that boil down to
2895 saying, <span class="quote"><span class="quote">Knock it off. We all know what the Sherman Act says. Robert
2896 Bork was a deranged fantasist. For avoidance of doubt, <span class="emphasis"><em>fuck that
2897 guy</em></span>.</span></span> In other words, the problem with monopolies is
2898 <span class="emphasis"><em>monopolism</em></span> — the concentration of power into too few
2899 hands, which erodes our right to self-determination. If there is a monopoly,
2900 the law wants it gone, period. Sure, get rid of monopolies that create
2901 <span class="quote"><span class="quote">consumer harm</span></span> in the form of higher prices, but also,
2902 <span class="emphasis"><em>get rid of other monopolies, too</em></span>.
2903 </p><p>
2904 But this only prevents things from getting worse. To help them get better,
2905 we will have to build coalitions with other activists in the anti-monopoly
2906 ecology movement — a pluralism movement or a self-determination movement —
2907 and target existing monopolies in every industry for breakup and structural
2908 separation rules that prevent, for example, the giant eyewear monopolist
2909 Luxottica from dominating both the sale and the manufacture of spectacles.
2910 </p><p>
2911 In an important sense, it doesn’t matter which industry the breakups begin
2912 in. Once they start, shareholders in <span class="emphasis"><em>every</em></span> industry
2913 will start to eye their investments in monopolists skeptically. As
2914 trustbusters ride into town and start making lives miserable for
2915 monopolists, the debate around every corporate boardroom’s table will
2916 shift. People within corporations who’ve always felt uneasy about monopolism
2917 will gain a powerful new argument to fend off their evil rivals in the
2918 corporate hierarchy: <span class="quote"><span class="quote">If we do it my way, we make less money; if we do
2919 it your way, a judge will fine us billions and expose us to ridicule and
2920 public disapprobation. So even though I get that it would be really cool to
2921 do that merger, lock out that competitor, or buy that little company and
2922 kill it before it can threaten us, we really shouldn’t — not if we don’t
2923 want to get tied to the DOJ’s bumper and get dragged up and down Trustbuster
2924 Road for the next 10 years.</span></span>
2925 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="goto-10"></a>20 GOTO 10</h2></div></div></div><p>
2926 Fixing Big Tech will require a lot of iteration. As cyber lawyer Lawrence
2927 Lessig wrote in his 1999 book, <span class="emphasis"><em>Code and Other Laws of
2928 Cyberspace</em></span>, our lives are regulated by four forces: law (what’s
2929 legal), code (what’s technologically possible), norms (what’s socially
2930 acceptable), and markets (what’s profitable).
2931 </p><p>
2932 If you could wave a wand and get Congress to pass a law that re-fanged the
2933 Sherman Act tomorrow, you could use the impending breakups to convince
2934 venture capitalists to fund competitors to Facebook, Google, Twitter, and
2935 Apple that would be waiting in the wings after they were cut down to size.
2936 </p><p>
2937 But getting Congress to act will require a massive normative shift, a mass
2938 movement of people who care about monopolies — and pulling them apart.
2939 </p><p>
2940 Getting people to care about monopolies will take technological
2941 interventions that help them to see what a world free from Big Tech might
2942 look like. Imagine if someone could make a beloved (but unauthorized)
2943 third-party Facebook or Twitter client that dampens the anxiety-producing
2944 algorithmic drumbeat and still lets you talk to your friends without being
2945 spied upon — something that made social media more sociable and less
2946 toxic. Now imagine that it gets shut down in a brutal legal battle. It’s
2947 always easier to convince people that something must be done to save a thing
2948 they love than it is to excite them about something that doesn’t even exist
2949 yet.
2950 </p><p>
2951 Neither tech nor law nor code nor markets are sufficient to reform Big
2952 Tech. But a profitable competitor to Big Tech could bankroll a legislative
2953 push; legal reform can embolden a toolsmith to make a better tool; the tool
2954 can create customers for a potential business who value the benefits of the
2955 internet but want them delivered without Big Tech; and that business can get
2956 funded and divert some of its profits to legal reform. 20 GOTO 10 (or
2957 lather, rinse, repeat). Do it again, but this time, get farther! After all,
2958 this time you’re starting with weaker Big Tech adversaries, a constituency
2959 that understands things can be better, Big Tech rivals who’ll help ensure
2960 their own future by bankrolling reform, and code that other programmers can
2961 build on to weaken Big Tech even further.
2962 </p><p>
2963 The surveillance capitalism hypothesis — that Big Tech’s products really
2964 work as well as they say they do and that’s why everything is so screwed up
2965 — is way too easy on surveillance and even easier on capitalism. Companies
2966 spy because they believe their own BS, and companies spy because governments
2967 let them, and companies spy because any advantage from spying is so
2968 short-lived and minor that they have to do more and more of it just to stay
2969 in place.
2970 </p><p>
2971 As to why things are so screwed up? Capitalism. Specifically, the monopolism
2972 that creates inequality and the inequality that creates monopolism. It’s a
2973 form of capitalism that rewards sociopaths who destroy the real economy to
2974 inflate the bottom line, and they get away with it for the same reason
2975 companies get away with spying: because our governments are in thrall to
2976 both the ideology that says monopolies are actually just fine and in thrall
2977 to the ideology that says that in a monopolistic world, you’d better not
2978 piss off the monopolists.
2979 </p><p>
2980 Surveillance doesn’t make capitalism rogue. Capitalism’s unchecked rule
2981 begets surveillance. Surveillance isn’t bad because it lets people
2982 manipulate us. It’s bad because it crushes our ability to be our authentic
2983 selves — and because it lets the rich and powerful figure out who might be
2984 thinking of building guillotines and what dirt they can use to discredit
2985 those embryonic guillotine-builders before they can even get to the
2986 lumberyard.
2987 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="up-and-through"></a>Up and through</h2></div></div></div><p>
2988 With all the problems of Big Tech, it’s tempting to imagine solving the
2989 problem by returning to a world without tech at all. Resist that temptation.
2990 </p><p>
2991 The only way out of our Big Tech problem is up and through. If our future is
2992 not reliant upon high tech, it will be because civilization has fallen. Big
2993 Tech wired together a planetary, species-wide nervous system that, with the
2994 proper reforms and course corrections, is capable of seeing us through the
2995 existential challenge of our species and planet. Now it’s up to us to seize
2996 the means of computation, putting that electronic nervous system under
2997 democratic, accountable control.
2998 </p><p>
2999 I am, secretly, despite what I have said earlier, a tech exceptionalist. Not
3000 in the sense of thinking that tech should be given a free pass to monopolize
3001 because it has <span class="quote"><span class="quote">economies of scale</span></span> or some other nebulous
3002 feature. I’m a tech exceptionalist because I believe that getting tech right
3003 matters and that getting it wrong will be an unmitigated catastrophe — and
3004 doing it right can give us the power to work together to save our
3005 civilization, our species, and our planet.
3006 </p></div></div></body></html>