1 <html><head><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"><title>Wie man den Überwachungskapitalismus zerstört</title><meta name="generator" content="DocBook XSL Stylesheets V1.79.1"><meta name="description" content="Die von uns genutzten Geräte und Dienste sammeln den Großteil der Daten, welche die NSA für ihr Überwachungsprojekt nutzt. Wir bezahlen für diese Geräte und die damit verbundenen Dienste, und schließlich übernehmen wir auch noch die Lieferung der Daten, die über unsere Leben, Meinungen und Vorlieben erhoben werden. Dank Big Tech ist der Überwachungskapitalismus überall. Nicht weil er gut darin ist, unser Verhalten zu manipulieren, und nicht wegen schurkenhafter Ausnutzung der Macht der Großunternehmen. Er ist das Ergebnis ungehemmten Monopolismus und des missbräuchlichen Agierens, dem er Vorschub leistet. Es ist das System, das so funktioniert, wie es beabsichtigt und erwartet wurde. Cory Doctorow hat eine ausführliche Kritik zu Shoshana Zuboffs „Das Zeitalter des Überwachungskapitalismus“ verfasst, die eine unverblümte Analyse des Problems beinhaltet und zu einem alternativen Lösungsvorschlag führt."><style type="text/css">
2 body { background-image: url('images/draft.png');
3 background-repeat: no-repeat;
4 background-position: top left;
5 /* The following properties make the watermark "fixed" on the page. */
6 /* I think that's just a bit too distracting for the reader... */
7 /* background-attachment: fixed; */
8 /* background-position: center center; */
9 }</style></head><body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF"><div lang="de" class="article"><div class="titlepage"><div><div><h2 class="title"><a name="index"></a>Wie man den Überwachungskapitalismus zerstört</h2></div><div><div class="authorgroup"><div class="author"><h3 class="author"><span class="firstname">Cory</span> <span class="surname">Doctorow</span></h3></div></div></div><div><p class="copyright">Copyright © 2020 Cory Doctorow</p></div><div><p class="copyright">Copyright © 2020 Petter Reinholdtsen</p></div><div><div class="legalnotice"><a name="idm18"></a><p>
10 Wie man den Überwachungskapitalismus zerstört, von Cory Doctorow.
11 </p><p>
12 Herausgegeben von Petter Reinholdtsen.
13 </p><p>
14 ISBN 978-82-93828-XX-X (gebundenes Buch)
15 </p><p>
16 ISBN 978-82-93828-XX-X (Taschenbuch)
17 </p><p>
18 ISBN 978-82-93828-XX-X (ePub)
19 </p><p>
20 Dieses Buch kann unter <a class="ulink" href="https://www.lulu.com/" target="_top">https://www.lulu.com/</a> erworben werden.
21 </p><p>
22 Falls du Rechtschreibfehler oder sonstige Fehler findest oder
23 Verbesserungsvorschläge zur Übersetzung hast, trage diese auf
24 <a class="ulink" href="https://hosted.weblate.org/projects/rms-personal-data-safe/how-to-destroy-surveillance-capitalism/de/" target="_top">https://hosted.weblate.org/projects/rms-personal-data-safe/how-to-destroy-surveillance-capitalism/de/</a>
25 ein.
26 </p><p>
27 <span class="inlinemediaobject"><img src="images/cc-some-rights-reserved.png" align="middle" height="38" alt="Creative Commons, einige Rechte vorbehalten"></span>
28 </p><p>
29 Dieses Buch steht unter einer Creative-Commons-Lizenz. Diese Lizenz erlaubt
30 beliebige Nutzung dieses Werks, solange eine Namensnennung erfolgt und keine
31 Bearbeitungen vorgenommen werden. Weitere Informationen über diese Lizenz
32 findest du unter <a class="ulink" href="https://creativecommons.org/licenses/by-nd/4.0/" target="_top">https://creativecommons.org/licenses/by-nd/4.0/</a>.
33 </p></div></div><div><div class="abstract"><p class="title"><b>Zusammenfassung</b></p><p>
34 Die von uns genutzten Geräte und Dienste sammeln den Großteil der Daten,
35 welche die NSA für ihr Überwachungsprojekt nutzt. Wir bezahlen für diese
36 Geräte und die damit verbundenen Dienste, und schließlich übernehmen wir
37 auch noch die Lieferung der Daten, die über unsere Leben, Meinungen und
38 Vorlieben erhoben werden.
39 </p><p>
40 Dank Big Tech ist der Überwachungskapitalismus überall. Nicht weil er gut
41 darin ist, unser Verhalten zu manipulieren, und nicht wegen schurkenhafter
42 Ausnutzung der Macht der Großunternehmen. Er ist das Ergebnis ungehemmten
43 Monopolismus und des missbräuchlichen Agierens, dem er Vorschub leistet. Es
44 ist das System, das so funktioniert, wie es beabsichtigt und erwartet wurde. Cory
45 Doctorow hat eine ausführliche Kritik zu Shoshana Zuboffs „Das Zeitalter
46 des Überwachungskapitalismus“ verfasst, die eine unverblümte Analyse des
47 Problems beinhaltet und zu einem alternativen Lösungsvorschlag führt.
48 </p></div></div></div><hr></div><div class="toc"><p><b>Inhaltsverzeichnis</b></p><dl class="toc"><dt><span class="sect1"><a href="#the-net-of-a-thousand-lies">Das Netz aus tausend Lügen</a></span></dt><dt><span class="sect1"><a href="#digital-rights-activism-a-quarter-century-on">Digitaler-Rechte-Aktivismus, ein Vierteljahrhundert später</a></span></dt><dt><span class="sect1"><a href="#tech-exceptionalism-then-and-now">Tech-Exzeptionalismus, damals und heute</a></span></dt><dt><span class="sect1"><a href="#dont-believe-the-hype">Glaube nicht an den Hype</a></span></dt><dt><span class="sect1"><a href="#what-is-persuasion">Was ist Überzeugung?</a></span></dt><dd><dl><dt><span class="sect2"><a href="#segmenting">1. Aufteilung</a></span></dt><dt><span class="sect2"><a href="#deception">2. Deception</a></span></dt><dt><span class="sect2"><a href="#domination">3. Domination</a></span></dt><dt><span class="sect2"><a href="#bypassing-our-rational-faculties">4. Bypassing our rational faculties</a></span></dt></dl></dd><dt><span class="sect1"><a href="#if-data-is-the-new-oil-then-surveillance-capitalisms-engine-has-a-leak">If data is the new oil, then surveillance capitalism’s engine has a leak</a></span></dt><dt><span class="sect1"><a href="#what-is-facebook">What is Facebook?</a></span></dt><dt><span class="sect1"><a href="#monopoly-and-the-right-to-the-future-tense">Monopoly and the right to the future tense</a></span></dt><dt><span class="sect1"><a href="#search-order-and-the-right-to-the-future-tense">Search order and the right to the future tense</a></span></dt><dt><span class="sect1"><a href="#monopolists-can-afford-sleeping-pills-for-watchdogs">Monopolists can afford sleeping pills for watchdogs</a></span></dt><dt><span class="sect1"><a href="#privacy-and-monopoly">Privacy and monopoly</a></span></dt><dt><span class="sect1"><a href="#ronald-reagan-pioneer-of-tech-monopolism">Ronald Reagan, pioneer of tech monopolism</a></span></dt><dt><span class="sect1"><a href="#steering-with-the-windshield-wipers">Steering with the windshield wipers</a></span></dt><dt><span class="sect1"><a href="#surveillance-still-matters">Surveillance still matters</a></span></dt><dt><span class="sect1"><a href="#dignity-and-sanctuary">Dignity and sanctuary</a></span></dt><dt><span class="sect1"><a href="#afflicting-the-afflicted">Afflicting the afflicted</a></span></dt><dt><span class="sect1"><a href="#any-data-you-collect-and-retain-will-eventually-leak">Any data you collect and retain will eventually leak</a></span></dt><dt><span class="sect1"><a href="#critical-tech-exceptionalism-is-still-tech-exceptionalism">Critical tech exceptionalism is still tech exceptionalism</a></span></dt><dt><span class="sect1"><a href="#how-monopolies-not-mind-control-drive-surveillance-capitalism-the-snapchat-story">How monopolies, not mind control, drive surveillance capitalism: The
49 Snapchat story</a></span></dt><dt><span class="sect1"><a href="#a-monopoly-over-your-friends">A monopoly over your friends</a></span></dt><dt><span class="sect1"><a href="#fake-news-is-an-epistemological-crisis">Fake news is an epistemological crisis</a></span></dt><dt><span class="sect1"><a href="#tech-is-different">Tech is different</a></span></dt><dt><span class="sect1"><a href="#ownership-of-facts">Ownership of facts</a></span></dt><dt><span class="sect1"><a href="#persuasion-works-slowly">Persuasion works… slowly</a></span></dt><dt><span class="sect1"><a href="#paying-wont-help">Paying won’t help</a></span></dt><dt><span class="sect1"><a href="#an-ecology-moment-for-trustbusting">An <span class="quote"><span class="quote">ecology</span></span> moment for trustbusting</a></span></dt><dt><span class="sect1"><a href="#make-big-tech-small-again">Make Big Tech small again</a></span></dt><dt><span class="sect1"><a href="#goto-10">20 GOTO 10</a></span></dt><dt><span class="sect1"><a href="#up-and-through">Up and through</a></span></dt></dl></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="the-net-of-a-thousand-lies"></a>Das Netz aus tausend Lügen</h2></div></div></div><p>
50 Am meisten überrascht am Wiederaufkommen der „Flat Earther“ im
51 21. Jahrhundert, wie allgegenwärtig die Beweise gegen diese Theorie
52 sind. Man mag noch einsehen, dass vor hunderten von Jahren Leute
53 vernünftigerweise denken durften, dass die Erde flach sei, da sie keinen
54 ausreichend hohen Beobachtungspunkt erreichen konnten, von dem aus sie die
55 Erdkrümmung hätten sehen können.
56 </p><p>
57 Aber heutzutage braucht es schon einen außergewöhnlichen Glauben, um
58 weiterhin an die Theorie der Flachen Erde zu glauben – wo man doch bereits
59 in Grundschulen GoPro-Kameras an Ballons befestigt und sie hoch genug
60 aufsteigen lässt, um die Erdkrümmung zu fotografieren, vom gewöhnlichen
61 Ausblick aus einem Flugzeugfenster ganz zu schweigen.
62 </p><p>
63 Ähnlich verhält es sich mit Weißem Nationalismus und Eugenik: In einem
64 Zeitalter, in dem jeder, der einen Rachenabstrich und etwas Geld an eine
65 DNA-Sequenzierungs-Firma schickt, zu einem Genom-Datenpunkt werden
66 kann, war das Widerlegen von <span class="quote"><span class="quote">Rassentheorie</span></span> noch nie so
67 einfach.
68 </p><p>
69 Wir durchleben ein goldenes Zeitalter von sowohl sofort verfügbaren Fakten
70 als auch deren Leugnung. Furchtbare, randständige Vorstellungen, die
71 Jahrzehnte oder gar Jahrhunderte geschlummert haben, haben es
72 augenscheinlich über Nacht in den Mainstream geschafft.
73 </p><p>
74 Wenn eine obskure Idee an Auftrieb erlangt, gibt es nur zwei Erklärungen
75 dafür: Entweder ist die Person, die die Idee verbreitet, besser darin
76 geworden, ihre Ansicht zu vertreten, oder die Ansicht ist angesichts sich
77 anhäufender Beweise schwerer zu leugnen geworden. Anders gesagt: Wenn wir
78 möchten, dass die Leute den Klimawandel ernst nehmen, können wir einen
79 Haufen Greta Thunbergs wortgewandte, emotionale Reden auf Podien halten
80 lassen und damit unsere Herzen und unseren Verstand gewinnen, oder wir
81 können Fluten, Feuersbrünste, eine mörderische Sonne und Pandemien für uns
82 sprechen lassen. In der Praxis sollten wir wohl von beidem etwas tun: Je
83 mehr wir schmoren, brennen, ertrinken und dahinschwinden, umso einfacher
84 wird es für die Greta Thunbergs dieser Welt, uns zu überzeugen.
85 </p><p>
86 Die Argumente für den absurden Glauben an hasserfüllte Verschwörungen wie
87 Impfgegnerschaft, Klimaleugnung, eine flache Erde und Eugenik sind nicht
88 besser als vor einer Generation. Sie sind sogar schlechter, weil sie Leuten
89 schmackhaft gemacht werden, die wenigstens ein Gespür für die widerlegenden
90 Fakten haben.
91 </p><p>
92 Impfgegnerschaft gibt es bereits seit den ersten Impfstoffen, aber frühere
93 Impfgegner hatten es auf Leute abgesehen, die nicht einmal ein grundlegendes
94 Verständnis von Mikrobiologie hatten, und überdies hatten jene Leute die
95 Ausrottung massenmörderischer Krankheiten wie Polio, Pocken und Masern
96 nicht miterlebt. Impfgegner von heute sind nicht eloquenter als frühere Impfgegner
97 und haben es heute schwieriger.
98 </p><p>
99 Können diese Verschwörungstheoretiker wirklich aufgrund der Überzeugungskraft ihrer
100 Argumente erfolgreich sein?
101 </p><p>
102 Manche denken ja. Heutzutage gibt es den weitverbreiteten Glauben, dass
103 maschinelles Lernen und kommerzielle Überwachung sogar den schwurbeligsten
104 Verschwörungstheoretiker in einen Marionettenspieler verwandeln können, der
105 anfälligen Leuten mit K.I.-gestützten, das rationale Denken austricksenden
106 Argumenten die Wahrnehmung verbiegt und sie, normale Leute, schließlich in
107 Flacherdler, Impfgegner oder gar Nazis verwandelt. Wenn die
108 RAND-Corporation<a class="ulink" href="https://www.rand.org/content/dam/rand/pubs/research_reports/RR400/RR453/RAND_RR453.pdf" target="_top">
109 Facebook für <span class="quote"><span class="quote">Radikalisierung</span></span></a> verantwortlich macht und
110 wenn Facebook das Verbreiten von Falschinformationen in Bezug auf SARS-CoV-2
111 <a class="ulink" href="https://secure.avaaz.org/campaign/en/facebook_threat_health/" target="_top">seinen
112 Algorithmen in die Schuhe schiebt</a>, dann ist die verdeckte Botschaft,
113 dass maschinelles Lernen und Überwachung die Änderungen in unserem Konsens
114 darüber hervorrufen, was wahr ist.
115 </p><p>
116 Schließlich muss in einer Welt, in der wuchernde und inkohärente
117 Verschwörungstheorien wie Pizzagate und sein Nachfolger QAnon zahlreiche
118 Anhänger haben, <span class="emphasis"><em> einiges </em></span> im Gange sein.
119 </p><p>
120 Aber was, wenn es eine andere Erklärung gibt? Was, wenn es die materiellen
121 Umstände und nicht die Argumente sind, die diesen Verschwörungstheoretikern
122 Aufwind geben? Was, wenn die Traumata vom Durchleben <span class="emphasis"><em>echter
123 Verschwörungen</em></span> um uns herum - Verschwörungen zwischen Reichen,
124 ihren Lobbyisten und Gesetzgebern, um unangenehme Fakten und Beweise für
125 unlauteres Verhalten zu vertuschen (solche Verschwörungen nennt man
126 üblicherweise <span class="quote"><span class="quote">Korruption</span></span>) - Leute anfällig für
127 Verschwörungstheorien machen?
128 </p><p>
129 Wenn es das Trauma und keine ansteckende Krankheit ist - materielle Umstände
130 und nicht Ideologie -, das heutzutage den Unterschied macht und abstoßenden
131 Falschinformationen angesichts leicht beobachtbarer Fakten Auftrieb gibt,
132 heißt das nicht, dass unsere Computernetzwerke keine Schuld haben. Sie
133 tragen immer noch den Großteil dazu bei, indem sie anfällige Leute
134 identifizieren und sie nach und nach zu immer extremeren Ideen und
135 Communities führen.
136 </p><p>
137 Der Glaube an Verschwörungen ist ein wütendes Feuer, das realen Schaden
138 angerichtet hat und eine echte Bedrohung für unseren Planeten und unsere
139 Spezies ist, von Epidemien <a class="ulink" href="https://www.cdc.gov/measles/cases-outbreaks.html" target="_top">, die von Impfgegnern
140 ausgelöst wurden,</a> bis zu Massenmorden <a class="ulink" href="https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html" target="_top">,
141 ausgelöst von rassistischen Verschwörungstheorien,</a> bis zum Sterben
142 unseres Planeten, ausgelöst von Klimawandel-leugnerischer Passivität. Unsere
143 Welt brennt, und wir müssen diese Brände löschen - indem wir herausfinden,
144 wie wir die Leute die Wahrheit der Welt trotz der Verschwörungen, von denen
145 sie verwirrt wurden, erkennen lassen können.
146 </p><p>
147 Aber das Löschen von Bränden ist reaktiv. Wir müssen auf
148 <span class="emphasis"><em>Prävention</em></span> setzen. Wir müssen auf die traumatischen
149 realen Umstände abzielen, die Leute anfällig für die Pandemie von
150 Verschwörungstheorien machen. Auch darin spielt Technologie eine Rolle.
151 </p><p>
152 Vorschläge hierfür gibt es genug. Von der <a class="ulink" href="https://edri.org/tag/terreg/" target="_top">Terrorist Content Regulation</a> der
153 Europäischen Union, welche Plattformen zwingt, <span class="quote"><span class="quote">extremistische</span></span>
154 Inhalte zu überwachen und zu entfernen, über die Vorschläge der Vereinigten
155 Staaten, wonach <a class="ulink" href="https://www.eff.org/deeplinks/2020/03/earn-it-act-violates-constitution" target="_top">Tech-Firmen
156 ihre Nutzer ausspähen sollen</a> und <a class="ulink" href="https://www.natlawreview.com/article/repeal-cda-section-230" target="_top">für deren
157 „bad speech“</a> haftbar gemacht werden sollen, gibt es zahlreiche Anstrengungen, um
158 Tech-Firmen dazu zu zwingen, die Probleme zu lösen, die sie selbst
159 geschaffen haben.
160 </p><p>
161 Dennoch fehlt ein wesentlicher Aspekt in dieser Debatte. All diese Lösungen
162 setzen voraus, dass Tech-Firmen eine feste Größe sind, dass ihre Dominanz über das
163 Internet ein dauerhaftes Faktum ist. Vorschläge, „Big Tech”-Firmen durch ein
164 dezentraleres, pluralistischeres Internet zu ersetzen, finden sich
165 nirgendwo. Die <span class="quote"><span class="quote">Lösungen</span></span>, die heute zur Debatte stehen,
166 <span class="emphasis"><em>setzen voraus</em></span>, dass Big Tech „big“ bleibt, weil nur die
167 größten Unternehmen es sich leisten können, entsprechende gesetzeskonforme
168 Systeme zu etablieren.
169 </p><p>
170 Wir müssen herausfinden, wie unsere Technologie aussehen soll, wenn wir aus
171 diesem Schlamassel wieder herauskommen wollen. Wir stehen heute an einem
172 Scheideweg, wo wir uns entscheiden müssen, ob wir die „Big Tech“-Firmen
173 reparieren wollen, die das Internet kontrollieren, oder ob wir das Internet
174 reparieren wollen, indem wir es aus dem Klammergriff von „Big Tech“
175 befreien. Beides gleichzeitig geht nicht, so dass wir uns entscheiden
176 müssen.
177 </p><p>
178 Ich möchte, dass wir uns weise entscheiden. Die Zähmung von „Big Tech“ ist
179 essentiell für die Reparatur des Internets, und dafür brauchen wir
180 Digitaler-Rechte-Aktivismus.
181 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="digital-rights-activism-a-quarter-century-on"></a>Digitaler-Rechte-Aktivismus, ein Vierteljahrhundert später</h2></div></div></div><p>
182 Digitaler-Rechte-Aktivismus ist mehr als 30 Jahre alt. Die Electronic
183 Frontier Foundation ist in diesem Jahr 30 Jahre alt geworden; die Free
184 Software Foundation wurde 1985 gegründet. Das im Laufe der Geschichte der
185 Bewegung am häufigsten gegen sie vorgebrachte Argument war, dass sie
186 irrelevant sei: „Echte“ Aktivisten würden sich mit Problemen der
187 „echten Welt“ befassen (man denke an den Skeptizismus, als <a class="ulink" href="https://www.loc.gov/law/foreign-news/article/finland-legal-right-to-broadband-for-all-citizens/#:~:text=Global%20Legal%20Monitor,-Home%20%7C%20Search%20%7C%20Browse&amp;text=(July%206%2C%202010)%20On,connection%20100%20MBPS%20by%202015." target="_top">Finnland
188 im Jahr 2010 einen Breitbandinternetzugang zum Menschenrecht erklärte
189 </a>), und „Echte-Welt“-Aktivismus bedeute Stiefel-Aktivismus („shoe
190 leather activism”) (man denke an Malcolm Gladwells <a class="ulink" href="https://www.newyorker.com/magazine/2010/10/04/small-change-malcolm-gladwell" target="_top">Geringschätzung
191 für <span class="quote"><span class="quote">Clicktivism</span></span></a>). Aber je zentraler Technologien für
192 unseren Alltag wurden, desto mehr sind die Irrelevanz-Vorwürfe Vorwürfen von
193 Unehrlichkeit (<span class="quote"><span class="quote">Du sorgst dich nur um Tech, weil du <a class="ulink" href="https://www.ipwatchdog.com/2018/06/04/report-engine-eff-shills-google-patent-reform/id=98007/" target="_top">für
194 Technologie-Unternehmen Werbung machen
195 möchtest</a></span></span>) oder von Naivität (<span class="quote"><span class="quote">Wie konntest du nur nicht vorhersehen,
196 dass Tech solch eine zerstörerische Kraft sein kann?</span></span>) gewichen. Aber
197 Digitaler-Rechte-Aktivismus steht nach wie vor dafür, auf die Menschen in
198 einer Welt achtzugeben, die unausweichlich von Technologie übernommen wird.
199 </p><p>
200 Die neueste Form dieser Kritik tritt unter dem Begriff des
201 <span class="quote"><span class="quote">Überwachungskapitalismus</span></span> auf, der von der
202 Business-Professorin Shoshana Zuboff in ihrem langen und einflussreichen
203 Buch <span class="emphasis"><em>Das Zeitalter des Überwachungskapitalismus</em></span> geprägt
204 wurde, das 2019 erschienen ist. Zuboff argumentiert, dass
205 <span class="quote"><span class="quote">Überwachungskapitalismus</span></span> ein einzigartiges Geschöpf der
206 Tech-Industrie sei und dass es sich von allen anderen ausbeuterischen
207 kommerziellen Praktiken der Geschichte unterscheide; ein Geschöpf, das <span class="quote"><span class="quote">
208 sich aus unerwarteten und unverständlichen Mechanismen aus Extrahierung,
209 Kommodifizierung und Kontrolle zusammensetze, das Menschen schließlich von
210 ihrem eigenen Verhalten loslöse und dabei neue Märkte von
211 Verhaltensvorhersage und -manipulation schaffe.</span></span> Es handelt sich
212 dabei um eine neue tödliche Form von Kapitalismus, einen
213 <span class="quote"><span class="quote">schurkenhaften Kapitalismus</span></span>, und unsere Unfähigkeit, dessen
214 einzigartige Fähigkeiten und Gefahren zu verstehen, stellt eine
215 existenzielle und speziesweite Bedrohung dar. Sie hat insofern recht, als
216 Kapitalismus unsere Spezies heute bedroht, und sie hat auch recht insofern,
217 als Technologie unsere Spezies und Zivilisation vor einzigartige
218 Herausforderungen stellt, aber sie irrt sich darin, inwiefern Technologie
219 andersartig ist und warum sie unsere Spezies bedroht.
220 </p><p>
221 Genauer gesagt, denke ich, dass ihre falsche Diagnose uns einen Weg
222 hinabführt, der Big Tech stärker macht, nicht schwächer. Wir müssen Big Tech
223 zu Fall bringen, und um das zu tun, müssen wir zunächst das Problem korrekt
224 identifizieren.
225 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="tech-exceptionalism-then-and-now"></a>Tech-Exzeptionalismus, damals und heute</h2></div></div></div><p>
226 Frühe Kritiker der Digitalen-Rechte-Bewegung - die wohl am besten
227 durch Organisationen wie die Electronic Frontier Foundation, die Free
228 Software Foundation, Public Knowledge und andere vertreten wird, die ihren
229 Fokus auf die Bewahrung und Stärkung elementarer Menschenrechte in der
230 digitalen Welt legen - verurteilten Aktivisten für die Ausübung von
231 <span class="quote"><span class="quote">Tech-Exzeptionalismus</span></span>. Um die Jahrtausendwende machten
232 bedeutende Leute jegliche Behauptung, dass Tech-Regularien in der
233 <span class="quote"><span class="quote">echten Welt</span></span> eine Rolle spielten, lächerlich. Behauptungen,
234 wonach Tech-Regularien Folgen für Meinungsfreiheit, Vereinigungsfreiheit, Privatsphäre,
235 Durchsuchungen und Beschlagnahmungen sowie für grundlegende Rechte und
236 Gleichheit haben könnten, wurden verlacht - verlacht als Besorgnis, die von
237 traurigen Nerds, die sonst in Webforen über <span class="emphasis"><em>Star Trek</em></span>
238 diskutierten, geschürt und gar über die Freiheitskämpfe der Freedom Riders,
239 Nelson Mandela oder des Warschauer Ghetto-Aufstandes erhoben würde.
240 </p><p>
241 In den seitdem vergangenen Jahrzehnten wurden die Vorwürfe von
242 <span class="quote"><span class="quote">Tech-Exzeptionalismus</span></span> schärfer, zumal sich die Bedeutung von
243 Technologie im Alltag ausgeweitet hat: Jetzt, da Technologie jede Nische
244 unseres Lebens infiltriert hat und unsere Online-Leben von einer Handvoll
245 Giganten monopolisiert wurden, werden die Verteidiger der digitalen
246 Freiheiten beschuldigt, Wasserträger von „Big Tech“ zu sein und Deckung für
247 dessen interessengeleitete Fahrlässigkeit (oder schlimmer
248 noch: ruchlose Pläne) zu bieten.
249 </p><p>
250 Nach meiner Auffassung ist die Digitale-Rechte-Bewegung stehen geblieben,
251 während der Rest der Welt sich weiterbewegt hat. Von den frühesten Tagen an
252 war das Anliegen der Bewegung, dass Nutzer und Programmierer ihre
253 grundlegenden Rechte verwirklichen können. Digitale-Rechte-Aktivisten
254 kümmerten sich nur insoweit um Firmen, als diese die Rechte ihrer Nutzer
255 achteten (oder, ebenso oft, wenn sich Unternehmen so töricht verhielten,
256 dass sie neue Regularien zu provozieren drohten, die es auch guten Akteuren
257 schwerer gemacht hätten, Nutzern zu helfen).
258 </p><p>
259 Die Kritik des <span class="quote"><span class="quote">Überwachungskapitalismus</span></span> lässt die
260 Digitale-Rechte-Bewegung erneut in einem neuen Licht erscheinen: nicht als
261 Alarmisten, die die Wichtigkeit ihrer Spielzeuge überschätzen, oder als
262 Sprecher für Big Tech, sondern als gelassene Sessel-Aktivisten, deren
263 langjähriger Aktivismus zur Last geworden ist, weil er sie unfähig macht,
264 neuartige Bedrohungen zu erkennen, während sie weiterhin die Tech-Schlachten des
265 vorigen Jahrhunderts schlagen.
266 </p><p>
267 Aber Tech-Exzeptionalismus ist eine Sünde, unabhängig davon, wer ihn
268 betreibt.
269 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="dont-believe-the-hype"></a>Glaube nicht an den Hype</h2></div></div></div><p>
270 Du hast wahrscheinlich schon einmal gehört, dass <span class="quote"><span class="quote">du das Produkt bist,
271 wenn du nicht für das Produkt bezahlst</span></span>. Wie wir noch sehen werden,
272 ist diese Aussage im Grunde richtig, aber nicht vollständig. Aber es
273 stimmt <span class="emphasis"><em>definitiv</em></span>, dass die Kunden von Big Tech
274 Werbeunternehmen sind, und das Geschäftsmodell von Google und Facebook ist
275 letztlich ihre Fähigkeit, <span class="emphasis"><em>dich</em></span> zu Käufen zu
276 verleiten. Das Produkt von Big Tech ist die Überzeugungskunst. Die Dienste -
277 soziale Medien, Suchmaschinen, Karten- und Kurznachrichtendienste und
278 weitere - sind schlicht Vehikel, um deren Nutzer von etwas zu überzeugen
279 und zu etwas zu verleiten.
280 </p><p>
281 Die Angst vor Überwachungskapitalismus basiert zunächst auf der (korrekten)
282 Annahme, dass alles, was Big Tech über sich selbst sagt, wahrscheinlich eine
283 Lüge ist. Aber die Kritik des Überwachungskapitalismus macht hiervon eine
284 Ausnahme, soweit es um Big Techs eigene Behauptungen in seinen
285 Verkaufsprospekten geht - der atemlose Hype, der potentiellen
286 Werbeunternehmen online und in Werbetechnologie-Seminaren über die
287 Wirksamkeit seiner Produkte angedient wird: Dem Hype zufolge kann uns Big
288 Tech so gut beeinflussen, wie es selbst behauptet. Das ist jedoch falsch, weil
289 Verkaufsprospekte kein zuverlässiger Indikator für die Wirksamkeit eines
290 Produkts sind.
291 </p><p>
292 Überwachungskapitalismus geht davon aus, dass Big Tech etwas Reales
293 verkauft, weil Werbeunternehmen viel von dem kaufen, was Big Tech
294 verkauft. Aber die massiven Umsatzzahlen von Big Tech könnten einfach auch
295 nur das Produkt einer weit verbreiteten Täuschung sein, oder schlimmer noch:
296 einer monopolistischen Kontrolle über unser aller Kommunikation und Handel.
297 </p><p>
298 Überwachung führt zu Verhaltensveränderungen, und zwar nicht zu
299 positiven. Sie gefährdet unseren gesellschaftlichen Fortschritt. Zuboffs Buch
300 arbeitet Erklärungen dieser Phänomene eindrucksvoll heraus. Aber Zuboff
301 behauptet auch, dass Überwachung uns unseres freien Willens beraubt - dass,
302 wenn unsere persönlichen Daten mit maschinellem Lernen kombiniert werden,
303 ein System fataler Überzeugungskunst entsteht, in dessen Angesicht wir
304 hilflos sind. Sprich, Facebook nutzt einen Algorithmus, um die Daten zu
305 analysieren, welche ohne deine Zustimmung aus deinem Alltag extrahiert
306 werden, und nutzt diese, um deinen Feed so anzupassen, dass du Sachen
307 kaufst. Es handelt sich um einen Strahl zur Gedankensteuerung wie aus einem
308 Comic der 1950er Jahre, der von verrückten Wissenschaftlern bedient wird,
309 deren Supercomputer ihnen ewige und umfassende Weltherrschaft garantiert.
310 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="what-is-persuasion"></a>Was ist Überzeugung?</h2></div></div></div><p>
311 Um zu verstehen, weshalb du dich nicht um Strahlen zur Gedankenkontrolle sorgen
312 musst – weshalb du dich aber um Überwachung <span class="emphasis"><em>und</em></span> Big Tech sorgen
313 <span class="emphasis"><em>solltest</em></span> –, müssen wir einordnen, was wir mit
314 <span class="quote"><span class="quote">Überzeugung</span></span> meinen.
315 </p><p>
316 Google, Facebook und andere Überwachungskapitalisten versprechen ihren Kunden
317 (den Werbeunternehmen), dass sich diesen – durch Werkzeuge maschinellen
318 Lernens, die mit unvorstellbar großen Mengen an persönlichen Daten ohne
319 Zustimmung trainiert wurden – Wege eröffnen, um das rationale Denken der
320 Öffentlichkeit umgehen und ihr Verhalten lenken zu können, so dass ein
321 Strom an Käufen, Stimmen und anderen erwünschten Ergebnissen erzeugt wird.
322 </p><div class="blockquote"><blockquote class="blockquote"><p>
323 Die Auswirkungen von Vorherrschaft überwiegen die der Manipulation bei weitem
324 und sie sollen im Mittelpunkt unserer Analyse und etwaiger Gegenmittel
325 stehen, die wir zu finden suchen.
326 </p></blockquote></div><p>
327 Aber es gibt wenige Beweise dafür, dass dies geschieht. Stattdessen sind die
328 Vorhersagen, die Überwachungskapitalisten ihren Kunden liefern, viel weniger
329 beeindruckend. Anstatt Wege zu finden, die unser rationales Denken umgehen,
330 tun Überwachungskapitalisten meistens eines oder mehrere der folgenden drei
331 Dinge:
332 </p><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="segmenting"></a>1. Aufteilung</h3></div></div></div><p>
333 Falls du Windeln verkaufst, bist du besser beraten, diese Leuten auf
334 Entbindungsstationen anzubieten. Nicht jeder, der eine Entbindungsstation
335 betritt oder eine solche verlässt, hat gerade ein Kind entbunden, und nicht
336 jeder, der gerade ein Kind entbunden hat, ist im Windelmarkt vertreten. Aber
337 die Geburt eines Kindes ist ein sehr zuverlässiges Korrelat zur Teilnahme am
338 „Windelmarkt“, und der Aufenthalt in einer Entbindungsstation steht in hoher
339 Korrelation zur Geburt eines Kindes. Deshalb Windelwerbung im Bereich von
340 Entbindungsstationen (und sogar Promoter, die auf Entbindungsstationen mit
341 Körben voller Gratisproben herumstreifen).
342 </p><p>
343 Surveillance capitalism is segmenting times a billion. Diaper vendors can go
344 way beyond people in maternity wards (though they can do that, too, with
345 things like location-based mobile ads). They can target you based on
346 whether you’re reading articles about child-rearing, diapers, or a host of
347 other subjects, and data mining can suggest unobvious keywords to advertise
348 against. They can target you based on the articles you’ve recently
349 read. They can target you based on what you’ve recently purchased. They can
350 target you based on whether you receive emails or private messages about
351 these subjects — or even if you speak aloud about them (though Facebook and
352 the like convincingly claim that’s not happening — yet).
353 </p><p>
354 Das ist wirklich beängstigend.
355 </p><p>
356 Aber dies ist keine Gedankenkontrolle.
357 </p><p>
358 Es beraubt dich nicht deines freien Willens. Es führt dich nicht hinters
359 Licht.
360 </p><p>
361 Think of how surveillance capitalism works in politics. Surveillance
362 capitalist companies sell political operatives the power to locate people
363 who might be receptive to their pitch. Candidates campaigning on finance
364 industry corruption seek people struggling with debt; candidates campaigning
365 on xenophobia seek out racists. Political operatives have always targeted
366 their message whether their intentions were honorable or not: Union
367 organizers set up pitches at factory gates, and white supremacists hand out
368 fliers at John Birch Society meetings.
369 </p><p>
370 But this is an inexact and thus wasteful practice. The union organizer can’t
371 know which worker to approach on the way out of the factory gates and may
372 waste their time on a covert John Birch Society member; the white
373 supremacist doesn’t know which of the Birchers are so delusional that making
374 it to a meeting is as much as they can manage and which ones might be
375 convinced to cross the country to carry a tiki torch through the streets of
376 Charlottesville, Virginia.
377 </p><p>
378 Because targeting improves the yields on political pitches, it can
379 accelerate the pace of political upheaval by making it possible for everyone
380 who has secretly wished for the toppling of an autocrat — or just an 11-term
381 incumbent politician — to find everyone else who feels the same way at very
382 low cost. This has been critical to the rapid crystallization of recent
383 political movements including Black Lives Matter and Occupy Wall Street as
384 well as less savory players like the far-right white nationalist movements
385 that marched in Charlottesville.
386 </p><p>
387 It’s important to differentiate this kind of political organizing from
388 influence campaigns; finding people who secretly agree with you isn’t the
389 same as convincing people to agree with you. The rise of phenomena like
390 nonbinary or otherwise nonconforming gender identities is often
391 characterized by reactionaries as the result of online brainwashing
392 campaigns that convince impressionable people that they have been secretly
393 queer all along.
394 </p><p>
395 But the personal accounts of those who have come out tell a different story
396 where people who long harbored a secret about their gender were emboldened
397 by others coming forward and where people who knew that they were different
398 but lacked a vocabulary for discussing that difference learned the right
399 words from these low-cost means of finding people and learning about their
400 ideas.
401 </p></div><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="deception"></a>2. Deception</h3></div></div></div><p>
402 Lies and fraud are pernicious, and surveillance capitalism supercharges them
403 through targeting. If you want to sell a fraudulent payday loan or subprime
404 mortgage, surveillance capitalism can help you find people who are both
405 desperate and unsophisticated and thus receptive to your pitch. This
406 accounts for the rise of many phenomena, like multilevel marketing schemes,
407 in which deceptive claims about potential earnings and the efficacy of sales
408 techniques are targeted at desperate people by advertising against search
409 queries that indicate, for example, someone struggling with ill-advised
410 loans.
411 </p><p>
412 Surveillance capitalism also abets fraud by making it easy to locate other
413 people who have been similarly deceived, forming a community of people who
414 reinforce one another’s false beliefs. Think of <a class="ulink" href="https://www.vulture.com/2020/01/the-dream-podcast-review.html" target="_top">the
415 forums</a> where people who are being victimized by multilevel marketing
416 frauds gather to trade tips on how to improve their luck in peddling the
417 product.
418 </p><p>
419 Sometimes, online deception involves replacing someone’s correct beliefs
420 with incorrect ones, as it does in the anti-vaccination movement, whose
421 victims are often people who start out believing in vaccines but are
422 convinced by seemingly plausible evidence that leads them into the false
423 belief that vaccines are harmful.
424 </p><p>
425 But it’s much more common for fraud to succeed when it doesn’t have to
426 displace a true belief. When my daughter contracted head lice at daycare,
427 one of the daycare workers told me I could get rid of them by treating her
428 hair and scalp with olive oil. I didn’t know anything about head lice, and I
429 assumed that the daycare worker did, so I tried it (it didn’t work, and it
430 doesn’t work). It’s easy to end up with false beliefs when you simply don’t
431 know any better and when those beliefs are conveyed by someone who seems to
432 know what they’re doing.
433 </p><p>
434 This is pernicious and difficult — and it’s also the kind of thing the
435 internet can help guard against by making true information available,
436 especially in a form that exposes the underlying deliberations among parties
437 with sharply divergent views, such as Wikipedia. But it’s not brainwashing;
438 it’s fraud. In the <a class="ulink" href="https://datasociety.net/library/data-voids/" target="_top">majority of cases</a>,
439 the victims of these fraud campaigns have an informational void filled in
440 the customary way, by consulting a seemingly reliable source. If I look up
441 the length of the Brooklyn Bridge and learn that it is 5,800 feet long, but
442 in reality, it is 5,989 feet long, the underlying deception is a problem,
443 but it’s a problem with a simple remedy. It’s a very different problem from
444 the anti-vax issue in which someone’s true belief is displaced by a false
445 one by means of sophisticated persuasion.
446 </p></div><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="domination"></a>3. Domination</h3></div></div></div><p>
447 Surveillance capitalism is the result of monopoly. Monopoly is the cause,
448 and surveillance capitalism and its negative outcomes are the effects of
449 monopoly. I’ll get into this in depth later, but for now, suffice it to say
450 that the tech industry has grown up with a radical theory of antitrust that
451 has allowed companies to grow by merging with their rivals, buying up their
452 nascent competitors, and expanding to control whole market verticals.
453 </p><p>
454 One example of how monopolism aids in persuasion is through dominance:
455 Google makes editorial decisions about its algorithms that determine the
456 sort order of the responses to our queries. If a cabal of fraudsters have
457 set out to trick the world into thinking that the Brooklyn Bridge is 5,800
458 feet long, and if Google gives a high search rank to this group in response
459 to queries like <span class="quote"><span class="quote">How long is the Brooklyn Bridge?</span></span> then the
460 first eight or 10 screens’ worth of Google results could be wrong. And since
461 most people don’t go beyond the first couple of results — let alone the
462 first <span class="emphasis"><em>page</em></span> of results — Google’s choice means that many
463 people will be deceived.
464 </p><p>
465 Google’s dominance over search — more than 86% of web searches are performed
466 through Google — means that the way it orders its search results has an
467 outsized effect on public beliefs. Ironically, Google claims this is why it
468 can’t afford to have any transparency in its algorithm design: Google’s
469 search dominance makes the results of its sorting too important to risk
470 telling the world how it arrives at those results lest some bad actor
471 discover a flaw in the ranking system and exploit it to push its point of
472 view to the top of the search results. There’s an obvious remedy to a
473 company that is too big to audit: break it up into smaller pieces.
474 </p><p>
475 Zuboff calls surveillance capitalism a <span class="quote"><span class="quote">rogue capitalism</span></span> whose
476 data-hoarding and machine-learning techniques rob us of our free will. But
477 influence campaigns that seek to displace existing, correct beliefs with
478 false ones have an effect that is small and temporary while monopolistic
479 dominance over informational systems has massive, enduring
480 effects. Controlling the results to the world’s search queries means
481 controlling access both to arguments and their rebuttals and, thus, control
482 over much of the world’s beliefs. If our concern is how corporations are
483 foreclosing on our ability to make up our own minds and determine our own
484 futures, the impact of dominance far exceeds the impact of manipulation and
485 should be central to our analysis and any remedies we seek.
486 </p></div><div class="sect2"><div class="titlepage"><div><div><h3 class="title"><a name="bypassing-our-rational-faculties"></a>4. Bypassing our rational faculties</h3></div></div></div><p>
487 <span class="emphasis"><em>This</em></span> is the good stuff: using machine learning,
488 <span class="quote"><span class="quote">dark patterns,</span></span> engagement hacking, and other techniques to
489 get us to do things that run counter to our better judgment. This is mind
490 control.
491 </p><p>
492 Some of these techniques have proven devastatingly effective (if only in the
493 short term). The use of countdown timers on a purchase completion page can
494 create a sense of urgency that causes you to ignore the nagging internal
495 voice suggesting that you should shop around or sleep on your decision. The
496 use of people from your social graph in ads can provide <span class="quote"><span class="quote">social
497 proof</span></span> that a purchase is worth making. Even the auction system
498 pioneered by eBay is calculated to play on our cognitive blind spots,
499 letting us feel like we <span class="quote"><span class="quote">own</span></span> something because we bid on it,
500 thus encouraging us to bid again when we are outbid to ensure that
501 <span class="quote"><span class="quote">our</span></span> things stay ours.
502 </p><p>
503 Games are extraordinarily good at this. <span class="quote"><span class="quote">Free to play</span></span> games
504 manipulate us through many techniques, such as presenting players with a
505 series of smoothly escalating challenges that create a sense of mastery and
506 accomplishment but which sharply transition into a set of challenges that
507 are impossible to overcome without paid upgrades. Add some social proof to
508 the mix — a stream of notifications about how well your friends are faring —
509 and before you know it, you’re buying virtual power-ups to get to the next
510 level.
511 </p><p>
512 Companies have risen and fallen on these techniques, and the
513 <span class="quote"><span class="quote">fallen</span></span> part is worth paying attention to. In general, living
514 things adapt to stimulus: Something that is very compelling or noteworthy
515 when you first encounter it fades with repetition until you stop noticing it
516 altogether. Consider the refrigerator hum that irritates you when it starts
517 up but disappears into the background so thoroughly that you only notice it
518 when it stops again.
519 </p><p>
520 That’s why behavioral conditioning uses <span class="quote"><span class="quote">intermittent reinforcement
521 schedules.</span></span> Instead of giving you a steady drip of encouragement or
522 setbacks, games and gamified services scatter rewards on a randomized
523 schedule — often enough to keep you interested and random enough that you
524 can never quite find the pattern that would make it boring.
525 </p><p>
526 Intermittent reinforcement is a powerful behavioral tool, but it also
527 represents a collective action problem for surveillance capitalism. The
528 <span class="quote"><span class="quote">engagement techniques</span></span> invented by the behaviorists of
529 surveillance capitalist companies are quickly copied across the whole sector
530 so that what starts as a mysteriously compelling fillip in the design of a
531 service—like <span class="quote"><span class="quote">pull to refresh</span></span> or alerts when someone likes
532 your posts or side quests that your characters get invited to while in the
533 midst of main quests—quickly becomes dully ubiquitous. The
534 impossible-to-nail-down nonpattern of randomized drips from your phone
535 becomes a grey-noise wall of sound as every single app and site starts to
536 make use of whatever seems to be working at the time.
537 </p><p>
538 From the surveillance capitalist’s point of view, our adaptive capacity is
539 like a harmful bacterium that deprives it of its food source — our attention
540 — and novel techniques for snagging that attention are like new antibiotics
541 that can be used to breach our defenses and destroy our
542 self-determination. And there <span class="emphasis"><em>are</em></span> techniques like
543 that. Who can forget the Great Zynga Epidemic, when all of our friends were
544 caught in <span class="emphasis"><em>FarmVille</em></span>’s endless, mindless dopamine loops?
545 But every new attention-commanding technique is jumped on by the whole
546 industry and used so indiscriminately that antibiotic resistance sets
547 in. Given enough repetition, almost all of us develop immunity to even the
548 most powerful techniques — by 2013, two years after Zynga’s peak, its user
549 base had halved.
550 </p><p>
551 Not everyone, of course. Some people never adapt to stimulus, just as some
552 people never stop hearing the hum of the refrigerator. This is why most
553 people who are exposed to slot machines play them for a while and then move
554 on while a small and tragic minority liquidate their kids’ college funds,
555 buy adult diapers, and position themselves in front of a machine until they
556 collapse.
557 </p><p>
558 But surveillance capitalism’s margins on behavioral modification
559 suck. Tripling the rate at which someone buys a widget sounds great <a class="ulink" href="https://www.forbes.com/sites/priceonomics/2018/03/09/the-advertising-conversion-rates-for-every-major-tech-platform/#2f6a67485957" target="_top">unless
560 the base rate is way less than 1%</a> with an improved rate of… still
561 less than 1%. Even penny slot machines pull down pennies for every spin
562 while surveillance capitalism rakes in infinitesimal penny fractions.
563 </p><p>
564 Slot machines’ high returns mean that they can be profitable just by
565 draining the fortunes of the small rump of people who are pathologically
566 vulnerable to them and unable to adapt to their tricks. But surveillance
567 capitalism can’t survive on the fractional pennies it brings down from that
568 vulnerable sliver — that’s why, after the Great Zynga Epidemic had finally
569 burned itself out, the small number of still-addicted players left behind
570 couldn’t sustain it as a global phenomenon. And new powerful attention
571 weapons aren’t easy to find, as is evidenced by the long years since the
572 last time Zynga had a hit. Despite the hundreds of millions of dollars that
573 Zynga has to spend on developing new tools to blast through our adaptation,
574 it has never managed to repeat the lucky accident that let it snag so much
575 of our attention for a brief moment in 2009. Powerhouses like Supercell have
576 fared a little better, but they are rare and throw away many failures for
577 every success.
578 </p><p>
579 The vulnerability of small segments of the population to dramatic, efficient
580 corporate manipulation is a real concern that’s worthy of our attention and
581 energy. But it’s not an existential threat to society.
582 </p></div></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="if-data-is-the-new-oil-then-surveillance-capitalisms-engine-has-a-leak"></a>If data is the new oil, then surveillance capitalism’s engine has a leak</h2></div></div></div><p>
583 This adaptation problem offers an explanation for one of surveillance
584 capitalism’s most alarming traits: its relentless hunger for data and its
585 endless expansion of data-gathering capabilities through the spread of
586 sensors, online surveillance, and acquisition of data streams from third
587 parties.
588 </p><p>
589 Zuboff observes this phenomenon and concludes that data must be very
590 valuable if surveillance capitalism is so hungry for it. (In her words:
591 <span class="quote"><span class="quote">Just as industrial capitalism was driven to the continuous
592 intensification of the means of production, so surveillance capitalists and
593 their market players are now locked into the continuous intensification of
594 the means of behavioral modification and the gathering might of
595 instrumentarian power.</span></span>) But what if the voracious appetite is
596 because data has such a short half-life — because people become inured so
597 quickly to new, data-driven persuasion techniques — that the companies are
598 locked in an arms race with our limbic system? What if it’s all a Red
599 Queen’s race where they have to run ever faster — collect ever-more data —
600 just to stay in the same spot?
601 </p><p>
602 Of course, all of Big Tech’s persuasion techniques work in concert with one
603 another, and collecting data is useful beyond mere behavioral trickery.
604 </p><p>
605 If someone wants to recruit you to buy a refrigerator or join a pogrom, they
606 might use profiling and targeting to send messages to people they judge to
607 be good sales prospects. The messages themselves may be deceptive, making
608 claims about things you’re not very knowledgeable about (food safety and
609 energy efficiency or eugenics and historical claims about racial
610 superiority). They might use search engine optimization and/or armies of
611 fake reviewers and commenters and/or paid placement to dominate the
612 discourse so that any search for further information takes you back to their
613 messages. And finally, they may refine the different pitches using machine
614 learning and other techniques to figure out what kind of pitch works best on
615 someone like you.
616 </p><p>
617 Each phase of this process benefits from surveillance: The more data they
618 have, the more precisely they can profile you and target you with specific
619 messages. Think of how you’d sell a fridge if you knew that the warranty on
620 your prospect’s fridge just expired and that they were expecting a tax
621 rebate in April.
622 </p><p>
623 Also, the more data they have, the better they can craft deceptive messages
624 — if I know that you’re into genealogy, I might not try to feed you
625 pseudoscience about genetic differences between <span class="quote"><span class="quote">races,</span></span>
626 sticking instead to conspiratorial secret histories of <span class="quote"><span class="quote">demographic
627 replacement</span></span> and the like.
628 </p><p>
629 Facebook also helps you locate people who have the same odious or antisocial
630 views as you. It makes it possible to find other people who want to carry
631 tiki torches through the streets of Charlottesville in Confederate
632 cosplay. It can help you find other people who want to join your militia and
633 go to the border to look for undocumented migrants to terrorize. It can help
634 you find people who share your belief that vaccines are poison and that the
635 Earth is flat.
636 </p><p>
637 There is one way in which targeted advertising uniquely benefits those
638 advocating for socially unacceptable causes: It is invisible. Racism is
639 widely geographically dispersed, and there are few places where racists —
640 and only racists — gather. This is similar to the problem of selling
641 refrigerators in that potential refrigerator purchasers are geographically
642 dispersed and there are few places where you can buy an ad that will be
643 primarily seen by refrigerator customers. But buying a refrigerator is
644 socially acceptable while being a Nazi is not, so you can buy a billboard or
645 advertise in the newspaper sports section for your refrigerator business,
646 and the only potential downside is that your ad will be seen by a lot of
647 people who don’t want refrigerators, resulting in a lot of wasted expense.
648 </p><p>
649 But even if you wanted to advertise your Nazi movement on a billboard or
650 prime-time TV or the sports section, you would struggle to find anyone
651 willing to sell you the space for your ad partly because they disagree with
652 your views and partly because they fear censure (boycott, reputational
653 damage, etc.) from other people who disagree with your views.
654 </p><p>
655 Targeted ads solve this problem: On the internet, every ad unit can be
656 different for every person, meaning that you can buy ads that are only shown
657 to people who appear to be Nazis and not to people who hate Nazis. When
658 there’s spillover — when someone who hates racism is shown a racist
659 recruiting ad — there is some fallout; the platform or publication might get
660 an angry public or private denunciation. But the nature of the risk assumed
661 by an online ad buyer is different than the risks to a traditional publisher
662 or billboard owner who might want to run a Nazi ad.
663 </p><p>
664 Online ads are placed by algorithms that broker between a diverse ecosystem
665 of self-serve ad platforms that anyone can buy an ad through, so the Nazi ad
666 that slips onto your favorite online publication isn’t seen as their moral
667 failing but rather as a failure in some distant, upstream ad supplier. When
668 a publication gets a complaint about an offensive ad that’s appearing in one
669 of its units, it can take some steps to block that ad, but the Nazi might
670 buy a slightly different ad from a different broker serving the same
671 unit. And in any event, internet users increasingly understand that when
672 they see an ad, it’s likely that the advertiser did not choose that
673 publication and that the publication has no idea who its advertisers are.
674 </p><p>
675 These layers of indirection between advertisers and publishers serve as
676 moral buffers: Today’s moral consensus is largely that publishers shouldn’t
677 be held responsible for the ads that appear on their pages because they’re
678 not actively choosing to put those ads there. Because of this, Nazis are
679 able to overcome significant barriers to organizing their movement.
680 </p><p>
681 Data has a complex relationship with domination. Being able to spy on your
682 customers can alert you to their preferences for your rivals and allow you
683 to head off your rivals at the pass.
684 </p><p>
685 More importantly, if you can dominate the information space while also
686 gathering data, then you make other deceptive tactics stronger because it’s
687 harder to break out of the web of deceit you’re spinning. Domination — that
688 is, ultimately becoming a monopoly — and not the data itself is the
689 supercharger that makes every tactic worth pursuing because monopolistic
690 domination deprives your target of an escape route.
691 </p><p>
692 If you’re a Nazi who wants to ensure that your prospects primarily see
693 deceptive, confirming information when they search for more, you can improve
694 your odds by seeding the search terms they use through your initial
695 communications. You don’t need to own the top 10 results for <span class="quote"><span class="quote">voter
696 suppression</span></span> if you can convince your marks to confine their search
697 terms to <span class="quote"><span class="quote">voter fraud,</span></span> which throws up a very different set of
698 search results.
699 </p><p>
700 Surveillance capitalists are like stage mentalists who claim that their
701 extraordinary insights into human behavior let them guess the word that you
702 wrote down and folded up in your pocket but who really use shills, hidden
703 cameras, sleight of hand, and brute-force memorization to amaze you.
704 </p><p>
705 Or perhaps they’re more like pick-up artists, the misogynistic cult that
706 promises to help awkward men have sex with women by teaching them
707 <span class="quote"><span class="quote">neurolinguistic programming</span></span> phrases, body language
708 techniques, and psychological manipulation tactics like
709 <span class="quote"><span class="quote">negging</span></span> — offering unsolicited negative feedback to women to
710 lower their self-esteem and prick their interest.
711 </p><p>
712 Some pick-up artists eventually manage to convince women to go home with
713 them, but it’s not because these men have figured out how to bypass women’s
714 critical faculties. Rather, pick-up artists’ <span class="quote"><span class="quote">success</span></span> stories
715 are a mix of women who were incapable of giving consent, women who were
716 coerced, women who were intoxicated, self-destructive women, and a few women
717 who were sober and in command of their faculties but who didn’t realize
718 straightaway that they were with terrible men but rectified the error as
719 soon as they could.
720 </p><p>
721 Pick-up artists <span class="emphasis"><em>believe</em></span> they have figured out a secret
722 back door that bypasses women’s critical faculties, but they haven’t. Many
723 of the tactics they deploy, like negging, became the butt of jokes (just
724 like people joke about bad ad targeting), and there’s a good chance that
725 anyone they try these tactics on will immediately recognize them and dismiss
726 the men who use them as irredeemable losers.
727 </p><p>
728 Pick-up artists are proof that people can believe they have developed a
729 system of mind control <span class="emphasis"><em>even when it doesn’t
730 work</em></span>. Pick-up artists simply exploit the fact that
731 one-in-a-million chances can come through for you if you make a million
732 attempts, and then they assume that the other 999,999 times, they simply
733 performed the technique incorrectly and commit themselves to doing better
734 next time. There’s only one group of people who find pick-up artist lore
735 reliably convincing: other would-be pick-up artists whose anxiety and
736 insecurity make them vulnerable to scammers and delusional men who convince
737 them that if they pay for tutelage and follow instructions, then they will
738 someday succeed. Pick-up artists assume they fail to entice women because
739 they are bad at being pick-up artists, not because pick-up artistry is
740 bullshit. Pick-up artists are bad at selling themselves to women, but
741 they’re much better at selling themselves to men who pay to learn the
742 secrets of pick-up artistry.
743 </p><p>
744 Department store pioneer John Wanamaker is said to have lamented,
745 <span class="quote"><span class="quote">Half the money I spend on advertising is wasted; the trouble is I
746 don’t know which half.</span></span> The fact that Wanamaker thought that only
747 half of his advertising spending was wasted is a tribute to the
748 persuasiveness of advertising executives, who are <span class="emphasis"><em>much</em></span>
749 better at convincing potential clients to buy their services than they are
750 at convincing the general public to buy their clients’ wares.
751 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="what-is-facebook"></a>What is Facebook?</h2></div></div></div><p>
752 Facebook is heralded as the origin of all of our modern plagues, and it’s
753 not hard to see why. Some tech companies want to lock their users in but
754 make their money by monopolizing access to the market for apps for their
755 devices and gouging them on prices rather than by spying on them (like
756 Apple). Some companies don’t care about locking in users because they’ve
757 figured out how to spy on them no matter where they are and what they’re
758 doing and can turn that surveillance into money (Google). Facebook alone
759 among the Western tech giants has built a business based on locking in its
760 users <span class="emphasis"><em>and</em></span> spying on them all the time.
761 </p><p>
762 Facebook’s surveillance regime is really without parallel in the Western
763 world. Though Facebook tries to prevent itself from being visible on the
764 public web, hiding most of what goes on there from people unless they’re
765 logged into Facebook, the company has nevertheless booby-trapped the entire
766 web with surveillance tools in the form of Facebook <span class="quote"><span class="quote">Like</span></span>
767 buttons that web publishers include on their sites to boost their Facebook
768 profiles. Facebook also makes various libraries and other useful code
769 snippets available to web publishers that act as surveillance tendrils on
770 the sites where they’re used, funneling information about visitors to the
771 site — newspapers, dating sites, message boards — to Facebook.
772 </p><div class="blockquote"><blockquote class="blockquote"><p>
773 Big Tech is able to practice surveillance not just because it is tech but
774 because it is <span class="emphasis"><em>big</em></span>.
775 </p></blockquote></div><p>
776 Facebook offers similar tools to app developers, so the apps — games, fart
777 machines, business review services, apps for keeping abreast of your kid’s
778 schooling — you use will send information about your activities to Facebook
779 even if you don’t have a Facebook account and even if you don’t download or
780 use Facebook apps. On top of all that, Facebook buys data from third-party
781 brokers on shopping habits, physical location, use of <span class="quote"><span class="quote">loyalty</span></span>
782 programs, financial transactions, etc., and cross-references that with the
783 dossiers it develops on activity on Facebook and with apps and the public
784 web.
785 </p><p>
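These tendrils all work the same way: the visitor’s browser or app makes a request to Facebook’s servers for the button or library, and that request carries the address of the page being read, any Facebook cookie the browser holds, and other identifying details. Here is a minimal sketch, with invented field names rather than Facebook’s actual schema, of what the third party can log from each such request:
</p><pre class="programlisting">
# Hypothetical sketch: what an embed host can log each time a page loads its button.
# Field names are illustrative only, not Facebook's real schema.
def log_embed_request(headers):
    return {
        "page_visited": headers.get("Referer"),    # the article the visitor was reading
        "profile_cookie": headers.get("Cookie"),   # ties the visit to an existing dossier
        "browser": headers.get("User-Agent"),
    }

print(log_embed_request({
    "Referer": "https://example-newspaper.test/story-about-divorce",
    "Cookie": "uid=1234567890",
    "User-Agent": "Mozilla/5.0",
}))
</pre><p>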
786 Though it’s easy to integrate the web with Facebook — linking to news
787 stories and such — Facebook products are generally not available to be
788 integrated back into the web itself. You can embed a tweet in a Facebook
789 post, but if you embed a Facebook post in a tweet, you just get a link back
790 to Facebook and must log in before you can see it. Facebook has used extreme
791 technological and legal countermeasures to prevent rivals from allowing
792 their users to embed Facebook snippets in competing services or to create
793 alternative interfaces to Facebook that merge your Facebook inbox with those
794 of other services that you use.
795 </p><p>
796 And Facebook is incredibly popular, with 2.3 billion claimed users (though
797 many believe this figure to be inflated). Facebook has been used to organize
798 genocidal pogroms, racist riots, anti-vaccination movements, flat Earth
799 cults, and the political lives of some of the world’s ugliest, most brutal
800 autocrats. There are some really alarming things going on in the world, and
801 Facebook is implicated in many of them, so it’s easy to conclude that these
802 bad things are the result of Facebook’s mind-control system, which it rents
803 out to anyone with a few bucks to spend.
804 </p><p>
805 To understand what role Facebook plays in the formulation and mobilization
806 of antisocial movements, we need to understand the dual nature of Facebook.
807 </p><p>
808 Because it has a lot of users and a lot of data about those users, Facebook
809 is a very efficient tool for locating people with hard-to-find traits, the
810 kinds of traits that are widely diffused in the population such that
811 advertisers have historically struggled to find a cost-effective way to
812 reach them. Think back to refrigerators: Most of us only replace our major
813 appliances a few times in our entire lives. If you’re a refrigerator
814 manufacturer or retailer, you have these brief windows in the life of a
815 consumer during which they are pondering a purchase, and you have to somehow
816 reach them. Anyone who’s ever registered a title change after buying a house
817 can attest that appliance manufacturers are incredibly desperate to reach
818 anyone who has even the slenderest chance of being in the market for a new
819 fridge.
820 </p><p>
821 Facebook makes finding people shopping for refrigerators a
822 <span class="emphasis"><em>lot</em></span> easier. It can target ads to people who’ve
823 registered a new home purchase, to people who’ve searched for refrigerator
824 buying advice, to people who have complained about their fridge dying, or
825 any combination thereof. It can even target people who’ve recently bought
826 <span class="emphasis"><em>other</em></span> kitchen appliances on the theory that someone
827 who’s just replaced their stove and dishwasher might be in a fridge-buying
828 kind of mood. The vast majority of people who are reached by these ads will
829 not be in the market for a new fridge, but — crucially — the percentage of
830 people who <span class="emphasis"><em>are</em></span> looking for fridges that these ads reach
831 is <span class="emphasis"><em>much</em></span> larger than it is for any group that might
832 be subjected to traditional, offline targeted refrigerator marketing.
833 </p><p>
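As a rough sketch of why this kind of combinatorial targeting beats traditional channels, consider filtering a user base on any mix of those signals; the attribute names below are invented for illustration and are not Facebook’s real targeting categories:
</p><pre class="programlisting">
# Hypothetical sketch of combining targeting signals to find likely fridge shoppers.
users = [
    {"id": 1, "new_home": True,  "searched_fridge_advice": False, "complained_fridge_died": False},
    {"id": 2, "new_home": False, "searched_fridge_advice": True,  "complained_fridge_died": False},
    {"id": 3, "new_home": False, "searched_fridge_advice": False, "complained_fridge_died": False},
]

def likely_fridge_shopper(u):
    # Any one signal is enough here; an advertiser could also demand combinations.
    return u["new_home"] or u["searched_fridge_advice"] or u["complained_fridge_died"]

audience = [u["id"] for u in users if likely_fridge_shopper(u)]
print(audience)  # [1, 2]: a far denser audience than a billboard reaches
</pre><p>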
834 Facebook also makes it a lot easier to find people who have the same rare
835 disease as you, which might have been impossible in earlier eras — the
836 closest fellow sufferer might otherwise be hundreds of miles away. It makes
837 it easier to find people who went to the same high school as you even though
838 decades have passed and your former classmates have all been scattered to
839 the four corners of the Earth.
840 </p><p>
841 Facebook also makes it much easier to find people who hold the same rare
842 political beliefs as you. If you’ve always harbored a secret affinity for
843 socialism but never dared utter this aloud lest you be demonized by your
844 neighbors, Facebook can help you discover other people who feel the same way
845 (and it might just demonstrate to you that your affinity is more widespread
846 than you ever suspected). It can make it easier to find people who share
847 your sexual identity. And again, it can help you to understand that what
848 you thought was a shameful secret that affected only you was really a widely
849 shared trait, giving you both comfort and the courage to come out to the
850 people in your life.
851 </p><p>
852 All of this presents a dilemma for Facebook: Targeting makes the company’s
853 ads more effective than traditional ads, but it also lets advertisers see
854 just how effective their ads are. While advertisers are pleased to learn
855 that Facebook ads are more effective than ads on systems with less
856 sophisticated targeting, advertisers can also see that in nearly every case,
857 the people who see their ads ignore them. Or, at best, the ads work on a
858 subconscious level, creating nebulous unmeasurables like <span class="quote"><span class="quote">brand
859 recognition.</span></span> This means that the price per ad is very low in nearly
860 every case.
861 </p><p>
862 To make things worse, many Facebook groups spark precious little
863 discussion. Your little-league soccer team, the people with the same rare
864 disease as you, and the people you share a political affinity with may
865 exchange the odd flurry of messages at critical junctures, but on a daily
866 basis, there’s not much to say to your old high school chums or other
867 hockey-card collectors.
868 </p><p>
869 With nothing but <span class="quote"><span class="quote">organic</span></span> discussion, Facebook would not
870 generate enough traffic to sell enough ads to make the money it needs to
871 continually expand by buying up its competitors while returning handsome
872 sums to its investors.
873 </p><p>
874 So Facebook has to gin up traffic by sidetracking its own forums: Every time
875 Facebook’s algorithm injects controversial materials — inflammatory
876 political articles, conspiracy theories, outrage stories — into a group, it
877 can hijack that group’s nominal purpose with its desultory discussions and
878 supercharge those discussions by turning them into bitter, unproductive
879 arguments that drag on and on. Facebook is optimized for engagement, not
880 happiness, and it turns out that automated systems are pretty good at
881 figuring out things that people will get angry about.
882 </p><p>
883 Facebook <span class="emphasis"><em>can</em></span> modify our behavior but only in a couple
884 of trivial ways. First, it can lock in all your friends and family members
885 so that you check and check and check with Facebook to find out what they
886 are up to; and second, it can make you angry and anxious. It can force you
887 to choose between being interrupted constantly by updates — a process that
888 breaks your concentration and makes it hard to be introspective — and
889 staying in touch with your friends. This is a very limited form of mind
890 control, and it can only really make us miserable, angry, and anxious.
891 </p><p>
892 This is why Facebook’s targeting systems — both the ones it shows to
893 advertisers and the ones that let users find people who share their
894 interests — are so next-gen and smooth and easy to use as well as why its
895 message boards have a toolset that seems like it hasn’t changed since the
896 mid-2000s. If Facebook delivered an equally flexible, sophisticated
897 message-reading system to its users, those users could defend themselves
898 against being nonconsensually eyeball-fucked with Donald Trump headlines.
899 </p><p>
900 The more time you spend on Facebook, the more ads it gets to show you. The
901 solution to Facebook’s ads only working one in a thousand times is for the
902 company to try to increase how much time you spend on Facebook by a factor
903 of a thousand. Rather than thinking of Facebook as a company that has
904 figured out how to show you exactly the right ad in exactly the right way to
905 get you to do what its advertisers want, think of it as a company that has
906 figured out how to make you slog through an endless torrent of arguments
907 even though they make you miserable, spending so much time on the site that
908 it eventually shows you at least one ad that you respond to.
909 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="monopoly-and-the-right-to-the-future-tense"></a>Monopoly and the right to the future tense</h2></div></div></div><p>
910 Zuboff and her cohort are particularly alarmed at the extent to which
911 surveillance allows corporations to influence our decisions, taking away
912 something she poetically calls <span class="quote"><span class="quote">the right to the future tense</span></span>
913 — that is, the right to decide for yourself what you will do in the future.
914 </p><p>
915 It’s true that advertising can tip the scales one way or another: When
916 you’re thinking of buying a fridge, a timely fridge ad might end the search
917 on the spot. But Zuboff puts enormous and undue weight on the persuasive
918 power of surveillance-based influence techniques. Most of these don’t work
919 very well, and the ones that do won’t work for very long. The makers of
920 these influence tools are confident they will someday refine them into
921 systems of total control, but they are hardly unbiased observers, and the
922 risks from their dreams coming true are very speculative.
923 </p><p>
924 By contrast, Zuboff is rather sanguine about 40 years of lax antitrust
925 practice that has allowed a handful of companies to dominate the internet,
926 ushering in an information age with, <a class="ulink" href="https://twitter.com/tveastman/status/1069674780826071040" target="_top">as one person
927 on Twitter noted</a>, five giant websites each filled with screenshots
928 of the other four.
929 </p><p>
930 However, if we are to be alarmed that we might lose the right to choose for
931 ourselves what our future will hold, then monopoly’s nonspeculative,
932 concrete, here-and-now harms should be front and center in our debate over
933 tech policy.
934 </p><p>
935 Start with <span class="quote"><span class="quote">digital rights management.</span></span> In 1998, Bill Clinton
936 signed the Digital Millennium Copyright Act (DMCA) into law. It’s a complex
937 piece of legislation with many controversial clauses but none more so than
938 Section 1201, the <span class="quote"><span class="quote">anti-circumvention</span></span> rule.
939 </p><p>
940 This is a blanket ban on tampering with systems that restrict access to
941 copyrighted works. The ban is so thoroughgoing that it prohibits removing a
942 copyright lock even when no copyright infringement takes place. This is by
943 design: The activities that the DMCA’s Section 1201 sets out to ban are not
944 copyright infringements; rather, they are legal activities that frustrate
945 manufacturers’ commercial plans.
946 </p><p>
947 For example, Section 1201’s first major application was on DVD players as a
948 means of enforcing the region coding built into those devices. DVD-CCA, the
949 body that standardized DVDs and DVD players, divided the world into six
950 regions and specified that DVD players must check each disc to determine
951 which regions it was authorized to be played in. DVD players would have
952 their own corresponding region (a DVD player bought in the U.S. would be
953 region 1 while one bought in India would be region 5). If the player and the
954 disc’s region matched, the player would play the disc; otherwise, it would
955 reject it.
956 </p><p>
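The enforcement logic itself amounts to a trivial comparison; here is a minimal sketch, assuming (as is conventional) that discs marked for all regions report region 0:
</p><pre class="programlisting">
# Hypothetical sketch of the region check a compliant DVD player performs.
PLAYER_REGION = 1  # e.g., a player sold in the U.S.

def will_play(disc_regions):
    # Region-0 ("all regions") discs play anywhere; otherwise the disc
    # must be authorized for the player's own region.
    return 0 in disc_regions or PLAYER_REGION in disc_regions

print(will_play({1}))  # True: a U.S. disc in a U.S. player
print(will_play({5}))  # False: a disc bought in India is rejected
</pre><p>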
957 However, watching a lawfully produced disc in a country other than the one
958 where you purchased it is not copyright infringement — it’s the
959 opposite. Copyright law imposes this duty on customers for a movie: You must
960 go into a store, find a licensed disc, and pay the asking price. Do that —
961 and <span class="emphasis"><em>nothing else</em></span> — and you and copyright are square
962 with one another.
963 </p><p>
964 The fact that a movie studio wants to charge Indians less than Americans or
965 release in Australia later than it releases in the U.K. has no bearing on
966 copyright law. Once you lawfully acquire a DVD, it is no copyright
967 infringement to watch it no matter where you happen to be.
968 </p><p>
969 So DVD and DVD player manufacturers would not be able to use accusations of
970 abetting copyright infringement to punish manufacturers who made
971 noncompliant players that would play discs from any region or repair shops
972 that modified players to let you watch out-of-region discs or software
973 programmers who created programs to let you do this.
974 </p><p>
975 That’s where Section 1201 of the DMCA comes in: By banning tampering with an
976 <span class="quote"><span class="quote">access control,</span></span> the rule gave manufacturers and rights
977 holders standing to sue competitors who released superior products with
978 lawful features that the market demanded (in this case, region-free
979 players).
980 </p><p>
981 This is an odious scam against consumers, but as time went by, Section 1201
982 grew to encompass a rapidly expanding constellation of devices and services
983 as canny manufacturers have realized certain things:
984 </p><div class="itemizedlist"><ul class="itemizedlist compact" style="list-style-type: disc; "><li class="listitem"><p>
985 Any device with software in it contains a <span class="quote"><span class="quote">copyrighted work</span></span>,
986 i.e., the software.
987 </p></li><li class="listitem"><p>
988 A device can be designed so that reconfiguring the software requires
989 bypassing an <span class="quote"><span class="quote">access control for copyrighted works,</span></span> which is a
990 potential felony under Section 1201.
991 </p></li><li class="listitem"><p>
992 Thus, companies can control their customers’ behavior after they take home
993 their purchases by designing products so that all unpermitted uses require
994 modifications that fall afoul of Section 1201.
995 </p></li></ul></div><p>
996 Section 1201 then becomes a means for manufacturers of all descriptions to
997 force their customers to arrange their affairs to benefit the manufacturers’
998 shareholders instead of themselves.
999 </p><p>
1000 This manifests in many ways: from a new generation of inkjet printers that
1001 use countermeasures, which cannot be bypassed without legal risk, to block
1002 third-party ink, to similar systems in tractors that prevent third-party
1003 technicians from swapping in even the manufacturer’s own parts, which are not
1004 recognized by the tractor’s control system until it is supplied with a
1005 manufacturer’s unlock code.
1006 </p><p>
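The pattern behind these systems is the same: the device’s firmware refuses to recognize a newly installed component until it receives an authorization code that only the manufacturer can issue. A minimal sketch of that parts-pairing check, with invented part numbers and codes:
</p><pre class="programlisting">
# Hypothetical sketch of "parts pairing": the controller ignores a replacement
# part until the manufacturer's unlock code for that serial number is supplied.
AUTHORIZED_CODES = {"PART-789": "code-from-authorized-dealer"}  # issued only by the manufacturer

def accept_part(serial, unlock_code):
    return AUTHORIZED_CODES.get(serial) == unlock_code

print(accept_part("PART-789", "code-from-authorized-dealer"))  # True: dealer-supplied code
print(accept_part("PART-789", None))                           # False: independent repair is locked out
</pre><p>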
1007 Closer to home, Apple’s iPhones use these measures to prevent both
1008 third-party service and third-party software installation. This allows Apple,
1009 rather than the iPhone’s purchaser, to decide when an iPhone is beyond repair
1010 and must be shredded and landfilled. (Apple is notorious for its
1011 environmentally catastrophic policy of destroying old electronics rather
1012 than permitting them to be cannibalized for parts.) This is a very useful
1013 power to wield, especially in light of CEO Tim Cook’s January 2019 warning
1014 to investors that the company’s profits are endangered by customers choosing
1015 to hold onto their phones for longer rather than replacing them.
1016 </p><p>
1017 Apple’s use of copyright locks also allows it to establish a monopoly over
1018 how its customers acquire software for their mobile devices. The App Store’s
1019 commercial terms guarantee Apple a share of all revenues generated by the
1020 apps sold there, meaning that Apple gets paid when you buy an app from its
1021 store and then continues to get paid every time you buy something using that
1022 app. This comes out of the bottom line of software developers, who must
1023 either charge more or accept lower profits for their products.
1024 </p><p>
1025 Crucially, Apple’s use of copyright locks gives it the power to make
1026 editorial decisions about which apps you may and may not install on your own
1027 device. Apple has used this power to <a class="ulink" href="https://www.telegraph.co.uk/technology/apple/5982243/Apple-bans-dictionary-from-App-Store-over-swear-words.html" target="_top">reject
1028 dictionaries</a> for containing obscene words; to <a class="ulink" href="https://www.vice.com/en_us/article/538kan/apple-just-banned-the-app-that-tracks-us-drone-strikes-again" target="_top">limit
1029 political speech</a>, especially from apps that make sensitive political
1030 commentary such as an app that notifies you every time a U.S. drone kills
1031 someone somewhere in the world; and to <a class="ulink" href="https://www.eurogamer.net/articles/2016-05-19-palestinian-indie-game-must-not-be-called-a-game-apple-says" target="_top">object
1032 to a game</a> that commented on the Israel-Palestine conflict.
1033 </p><p>
1034 Apple often justifies monopoly power over software installation in the name
1035 of security, arguing that its vetting of apps for its store means that it
1036 can guard its users against apps that contain surveillance code. But this
1037 cuts both ways. In China, the government <a class="ulink" href="https://www.ft.com/content/ad42e536-cf36-11e7-b781-794ce08b24dc" target="_top">ordered
1038 Apple to prohibit the sale of privacy tools</a> like VPNs with the
1039 exception of VPNs that had deliberately introduced flaws designed to let the
1040 Chinese state eavesdrop on users. Because Apple uses technological
1041 countermeasures — with legal backstops — to block customers from installing
1042 unauthorized apps, Chinese iPhone owners cannot readily (or legally) acquire
1043 VPNs that would protect them from Chinese state snooping.
1044 </p><p>
1045 Zuboff calls surveillance capitalism a <span class="quote"><span class="quote">rogue capitalism.</span></span>
1046 Theoreticians of capitalism claim that its virtue is that it <a class="ulink" href="https://en.wikipedia.org/wiki/Price_signal" target="_top">aggregates information in
1047 the form of consumers’ decisions</a>, producing efficient
1048 markets. Surveillance capitalism’s supposed power to rob its victims of
1049 their free will through computationally supercharged influence campaigns
1050 means that our markets no longer aggregate customers’ decisions because we
1051 customers no longer decide — we are given orders by surveillance
1052 capitalism’s mind-control rays.
1053 </p><p>
1054 If our concern is that markets cease to function when consumers can no
1055 longer make choices, then copyright locks should concern us at
1056 <span class="emphasis"><em>least</em></span> as much as influence campaigns. An influence
1057 campaign might nudge you to buy a certain brand of phone; but the copyright
1058 locks on that phone absolutely determine where you get it serviced, which
1059 apps can run on it, and when you have to throw it away rather than fixing
1060 it.
1061 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="search-order-and-the-right-to-the-future-tense"></a>Search order and the right to the future tense</h2></div></div></div><p>
1062 Markets are posed as a kind of magic: By discovering otherwise hidden
1063 information conveyed by the free choices of consumers, those consumers’
1064 local knowledge is integrated into a self-correcting system that makes
1065 efficient allocations—more efficient than any computer could calculate. But
1066 monopolies are incompatible with that notion. When you only have one app
1067 store, the owner of the store — not the consumer — decides on the range of
1068 choices. As Boss Tweed once said, <span class="quote"><span class="quote">I don’t care who does the electing,
1069 so long as I get to do the nominating.</span></span> A monopolized market is an
1070 election whose candidates are chosen by the monopolist.
1071 </p><p>
1072 This ballot rigging is made more pernicious by the existence of monopolies
1073 over search order. Google’s search market share is about 90%. When Google’s
1074 ranking algorithm puts a result for a popular search term in its top 10,
1075 that helps determine the behavior of millions of people. If Google’s answer
1076 to <span class="quote"><span class="quote">Are vaccines dangerous?</span></span> is a page that rebuts anti-vax
1077 conspiracy theories, then a sizable portion of the public will learn that
1078 vaccines are safe. If, on the other hand, Google sends those people to a
1079 site affirming the anti-vax conspiracies, a sizable portion of those
1080 millions will come away convinced that vaccines are dangerous.
1081 </p><p>
1082 Google’s algorithm is often tricked into serving disinformation as a
1083 prominent search result. But in these cases, Google isn’t persuading people
1084 to change their minds; it’s just presenting something untrue as fact when
1085 the user has no cause to doubt it.
1086 </p><p>
1087 This is true whether the search is for <span class="quote"><span class="quote">Are vaccines
1088 dangerous?</span></span> or <span class="quote"><span class="quote">best restaurants near me.</span></span> Most users
1089 will never look past the first page of search results, and when the
1090 overwhelming majority of people all use the same search engine, the ranking
1091 algorithm deployed by that search engine will determine myriad outcomes
1092 (whether to adopt a child, whether to have cancer surgery, where to eat
1093 dinner, where to move, where to apply for a job) to a degree that vastly
1094 outstrips any behavioral outcomes dictated by algorithmic persuasion
1095 techniques.
1096 </p><p>
1097 Many of the questions we ask search engines have no empirically correct
1098 answers: <span class="quote"><span class="quote">Where should I eat dinner?</span></span> is not an objective
1099 question. Even questions that do have correct answers (<span class="quote"><span class="quote">Are vaccines
1100 dangerous?</span></span>) don’t have one empirically superior source for that
1101 answer. Many pages affirm the safety of vaccines, so which one goes first?
1102 Under conditions of competition, consumers can choose from many search
1103 engines and stick with the one whose algorithmic judgment suits them best,
1104 but under conditions of monopoly, we all get our answers from the same
1105 place.
1106 </p><p>
1107 Google’s search dominance isn’t a matter of pure merit: The company has
1108 leveraged many tactics that would have been prohibited under classical,
1109 pre-Ronald-Reagan antitrust enforcement standards to attain its
1110 dominance. After all, this is a company that has developed two major
1111 products: a really good search engine and a pretty good Hotmail clone. Every
1112 other major success it’s had — Android, YouTube, Google Maps, etc. — has
1113 come through an acquisition of a nascent competitor. Many of the company’s
1114 key divisions, such as the advertising technology of DoubleClick, violate
1115 the historical antitrust principle of structural separation, which forbade
1116 firms from owning subsidiaries that competed with their
1117 customers. Railroads, for example, were barred from owning freight companies
1118 that competed with the shippers whose freight they carried.
1119 </p><p>
1120 If we’re worried about giant companies subverting markets by stripping
1121 consumers of their ability to make free choices, then vigorous antitrust
1122 enforcement seems like an excellent remedy. If we’d denied Google the right
1123 to effect its many mergers, we would also have probably denied it its total
1124 search dominance. Without that dominance, the pet theories, biases, errors
1125 (and good judgment, too) of Google search engineers and product managers
1126 would not have such an outsized effect on consumer choice.
1127 </p><p>
1128 This goes for many other companies. Amazon, a classic surveillance
1129 capitalist, is obviously the dominant tool for searching Amazon — though
1130 many people find their way to Amazon through Google searches and Facebook
1131 posts — and obviously, Amazon controls Amazon search. That means that
1132 Amazon’s own self-serving editorial choices — like promoting its own house
1133 brands over rival goods from its sellers as well as its own pet theories,
1134 biases, and errors — determine much of what we buy on Amazon. And since
1135 Amazon is the dominant e-commerce retailer outside of China and since it
1136 attained that dominance by buying up both large rivals and nascent
1137 competitors in defiance of historical antitrust rules, we can blame the
1138 monopoly for stripping consumers of their right to the future tense and the
1139 ability to shape markets by making informed choices.
1140 </p><p>
1141 Not every monopolist is a surveillance capitalist, but that doesn’t mean
1142 they’re not able to shape consumer choices in wide-ranging ways. Zuboff
1143 lauds Apple for its App Store and iTunes Store, insisting that adding price
1144 tags to the features on its platforms has been the secret to resisting
1145 surveillance and thus creating markets. But Apple is the only retailer
1146 allowed to sell on its platforms, and it’s the second-largest mobile device
1147 vendor in the world. The independent software vendors that sell through
1148 Apple’s marketplace accuse the company of the same surveillance sins as
1149 Amazon and other big retailers: spying on its customers to find lucrative
1150 new products to launch, effectively using independent software vendors as
1151 free-market researchers, then forcing them out of any markets they discover.
1152 </p><p>
1153 Because of its use of copyright locks, Apple’s mobile customers are not
1154 legally allowed to switch to a rival retailer for their apps if they want to
1155 do so on an iPhone. Apple, obviously, is the only entity that gets to decide
1156 how it ranks the results of search queries in its stores. These decisions
1157 ensure that some apps are often installed (because they appear on page one)
1158 and others are never installed (because they appear on page one
1159 million). Apple’s search-ranking design decisions have a vastly more
1160 significant effect on consumer behaviors than influence campaigns delivered
1161 by surveillance capitalism’s ad-serving bots.
1162 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="monopolists-can-afford-sleeping-pills-for-watchdogs"></a>Monopolists can afford sleeping pills for watchdogs</h2></div></div></div><p>
1163 Only the most extreme market ideologues think that markets can self-regulate
1164 without state oversight. Markets need watchdogs — regulators, lawmakers, and
1165 other elements of democratic control — to keep them honest. When these
1166 watchdogs sleep on the job, then markets cease to aggregate consumer choices
1167 because those choices are constrained by illegitimate and deceptive
1168 activities that companies are able to get away with because no one is
1169 holding them to account.
1170 </p><p>
1171 But this kind of regulatory capture doesn’t come cheap. In competitive
1172 sectors, where rivals are constantly eroding one another’s margins,
1173 individual firms lack the surplus capital to effectively lobby for laws and
1174 regulations that serve their ends.
1175 </p><p>
1176 Many of the harms of surveillance capitalism are the result of weak or
1177 nonexistent regulation. Those regulatory vacuums spring from the power of
1178 monopolists to resist stronger regulation and to tailor what regulation
1179 exists to permit their existing businesses.
1180 </p><p>
1181 Here’s an example: When firms over-collect and over-retain our data, they
1182 are at increased risk of suffering a breach — you can’t leak data you never
1183 collected, and once you delete all copies of that data, you can no longer
1184 leak it. For more than a decade, we’ve lived through an endless parade of
1185 ever-worsening data breaches, each one uniquely horrible in the scale of
1186 data breached and the sensitivity of that data.
1187 </p><p>
1188 But still, firms continue to over-collect and over-retain our data for three
1189 reasons:
1190 </p><p>
1191 <span class="strong"><strong>1. They are locked in the aforementioned limbic arms
1192 race with our capacity to shore up our attentional defense systems to resist
1193 their new persuasion techniques.</strong></span> They’re also locked in an arms
1194 race with their competitors to find new ways to target people for sales
1195 pitches. As soon as they discover a soft spot in our attentional defenses (a
1196 counterintuitive, unobvious way to target potential refrigerator buyers),
1197 the public begins to wise up to the tactic, and their competitors leap on
1198 it, hastening the day in which all potential refrigerator buyers have been
1199 inured to the pitch.
1200 </p><p>
1201 <span class="strong"><strong>2. They believe the surveillance capitalism
1202 story.</strong></span> Data is cheap to aggregate and store, and both proponents
1203 and opponents of surveillance capitalism have assured managers and product
1204 designers that if you collect enough data, you will be able to perform
1205 sorcerous acts of mind control, thus supercharging your sales. Even if you
1206 never figure out how to profit from the data, someone else will eventually
1207 offer to buy it from you to give it a try. This is the hallmark of all
1208 economic bubbles: acquiring an asset on the assumption that someone else
1209 will buy it from you for more than you paid for it, often to sell to someone
1210 else at an even greater price.
1211 </p><p>
1212 <span class="strong"><strong>3. The penalties for leaking data are
1213 negligible.</strong></span> Most countries limit these penalties to actual
1214 damages, meaning that consumers who’ve had their data breached have to show
1215 actual monetary harms to receive compensation. In 2014, Home Depot disclosed that it
1216 had lost credit-card data for 53 million of its customers, but it settled
1217 the matter by paying those customers about $0.34 each — and a third of that
1218 $0.34 wasn’t even paid in cash. It took the form of a credit to procure a
1219 largely ineffectual credit-monitoring service.
1220 </p><p>
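A back-of-the-envelope calculation from those figures shows how small the penalty was relative to the scale of the breach (approximate, for illustration only):
</p><pre class="programlisting">
# Rough arithmetic from the figures above (approximate, for scale only).
customers = 53_000_000
per_customer = 0.34
total = customers * per_customer
cash_portion = total * (2 / 3)  # about a third was credit-monitoring credit, not cash
print(f"total settlement: ${total:,.0f}")   # roughly $18 million
print(f"cash portion:     ${cash_portion:,.0f}")  # roughly $12 million
</pre><p>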
1221 But the harms from breaches are much more extensive than these
1222 actual-damages rules capture. Identity thieves and fraudsters are wily and
1223 endlessly inventive. All the vast breaches of our century are being
1224 continuously recombined, the data sets merged and mined for new ways to
1225 victimize the people whose data was present in them. Any reasonable,
1226 evidence-based theory of deterrence and compensation for breaches would not
1227 confine damages to actual damages but rather would allow users to claim
1228 these future harms.
1229 </p><p>
1230 However, even the most ambitious privacy rules, such as the EU General Data
1231 Protection Regulation, fall far short of capturing the negative
1232 externalities of the platforms’ negligent over-collection and
1233 over-retention, and what penalties they do provide are not aggressively
1234 pursued by regulators.
1235 </p><p>
1236 This tolerance of — or indifference to — data over-collection and
1237 over-retention can be ascribed in part to the sheer lobbying muscle of the
1238 platforms. They are so profitable that they can handily afford to divert
1239 gigantic sums to fight any real change — that is, change that would force
1240 them to internalize the costs of their surveillance activities.
1241 </p><p>
1242 And then there’s state surveillance, which the surveillance capitalism story
1243 dismisses as a relic of another era when the big worry was being jailed for
1244 your dissident speech, not having your free will stripped away with machine
1245 learning.
1246 </p><p>
1247 But state surveillance and private surveillance are intimately related. As
1248 we saw when Apple was conscripted by the Chinese government as a vital
1249 collaborator in state surveillance, the only really affordable and tractable
1250 way to conduct mass surveillance on the scale practiced by modern states —
1251 both <span class="quote"><span class="quote">free</span></span> and autocratic states — is to suborn commercial
1252 services.
1253 </p><p>
1254 Whether it’s Google being used as a location tracking tool by local law
1255 enforcement across the U.S. or the use of social media tracking by the
1256 Department of Homeland Security to build dossiers on participants in
1257 protests against Immigration and Customs Enforcement’s family separation
1258 practices, any hard limits on surveillance capitalism would hamstring the
1259 state’s own surveillance capability. Without Palantir, Amazon, Google, and
1260 other major tech contractors, U.S. cops would not be able to spy on Black
1261 people, ICE would not be able to manage the caging of children at the U.S.
1262 border, and state welfare systems would not be able to purge their rolls by
1263 dressing up cruelty as empiricism and claiming that poor and vulnerable
1264 people are ineligible for assistance. At least some of the states’
1265 unwillingness to take meaningful action to curb surveillance should be
1266 attributed to this symbiotic relationship. There is no mass state
1267 surveillance without mass commercial surveillance.
1268 </p><p>
1269 Monopolism is key to the project of mass state surveillance. It’s true that
1270 smaller tech firms are apt to be less well-defended than Big Tech, whose
1271 security experts are drawn from the tops of their field and who are given
1272 enormous resources to secure and monitor their systems against
1273 intruders. But smaller firms also have less to protect: fewer users, whose
1274 data is fragmented across more systems that each have to be suborned one at a
1275 time by state actors.
1276 </p><p>
1277 A concentrated tech sector that works with authorities is a much more
1278 powerful ally in the project of mass state surveillance than a fragmented
1279 one composed of smaller actors. The U.S. tech sector is small enough that
1280 all of its top executives fit around a single boardroom table in Trump Tower
1281 in 2017, shortly after Trump’s inauguration. Most of its biggest players bid
1282 to win JEDI, the Pentagon’s $10 billion Joint Enterprise Defense
1283 Infrastructure cloud contract. Like other highly concentrated industries,
1284 Big Tech rotates its key employees in and out of government service, sending
1285 them to serve in the Department of Defense and the White House, then hiring
1286 ex-Pentagon and ex-DOD top staffers and officers to work in their own
1287 government relations departments.
1288 </p><p>
1289 They can even make a good case for doing this: After all, when there are
1290 only four or five big companies in an industry, everyone qualified to
1291 regulate those companies has served as an executive in at least a couple of
1292 them — because, likewise, when there are only five companies in an industry,
1293 everyone qualified for a senior role at any of them is by definition working
1294 at one of the other ones.
1295 </p><div class="blockquote"><blockquote class="blockquote"><p>
1296 While surveillance doesn’t cause monopolies, monopolies certainly abet
1297 surveillance.
1298 </p></blockquote></div><p>
1299 Industries that are competitive are fragmented — composed of companies that
1300 are at each other’s throats all the time and eroding one another’s margins
1301 in bids to steal their best customers. This leaves them with much more
1302 limited capital to use to lobby for favorable rules and a much harder job of
1303 getting everyone to agree to pool their resources to benefit the industry as
1304 a whole.
1305 </p><p>
1306 Surveillance combined with machine learning is supposed to be an existential
1307 crisis, a species-defining moment at which our free will is just a few more
1308 advances in the field from being stripped away. I am skeptical of this
1309 claim, but I <span class="emphasis"><em>do</em></span> think that tech poses an existential
1310 threat to our society and possibly our species.
1311 </p><p>
1312 But that threat grows out of monopoly.
1313 </p><p>
1314 One of the consequences of tech’s regulatory capture is that it can shift
1315 liability for poor security decisions onto its customers and the wider
1316 society. It is absolutely normal in tech for companies to obfuscate the
1317 workings of their products, to make them deliberately hard to understand,
1318 and to threaten security researchers who seek to independently audit those
1319 products.
1320 </p><p>
1321 IT is the only field in which this is practiced: No one builds a bridge or a
1322 hospital and keeps the composition of the steel or the equations used to
1323 calculate load stresses a secret. It is a frankly bizarre practice that
1324 leads, time and again, to grotesque security defects on farcical scales,
1325 with whole classes of devices being revealed as vulnerable long after they
1326 are deployed in the field and put into sensitive places.
1327 </p><p>
1328 The monopoly power that keeps any meaningful consequences for breaches at
1329 bay means that tech companies continue to build terrible products that are
1330 insecure by design and that end up integrated into our lives, in possession
1331 of our data, and connected to our physical world. For years, Boeing has
1332 struggled with the aftermath of a series of bad technology decisions that
1333 made its 737 fleet a global pariah, a rare instance in which bad tech
1334 decisions have been seriously punished in the market.
1335 </p><p>
1336 These bad security decisions are compounded yet again by the use of
1337 copyright locks to enforce business-model decisions against
1338 consumers. Recall that these locks have become the go-to means for shaping
1339 consumer behavior, making it technically impossible to use third-party ink,
1340 insulin, apps, or service depots in connection with your lawfully acquired
1341 property.
1342 </p><p>
1343 Recall also that these copyright locks are backstopped by legislation (such
1344 as Section 1201 of the DMCA or Article 6 of the 2001 EU Copyright Directive)
1345 that bans tampering with (<span class="quote"><span class="quote">circumventing</span></span>) them, and these
1346 statutes have been used to threaten security researchers who make
1347 disclosures about vulnerabilities without permission from manufacturers.
1348 </p><p>
1349 This amounts to a manufacturer’s veto over safety warnings and
1350 criticism. While this is far from the legislative intent of the DMCA and its
1351 sister statutes around the world, Congress has not intervened to clarify the
1352 statute nor will it because to do so would run counter to the interests of
1353 powerful, large firms whose lobbying muscle is unstoppable.
1354 </p><p>
1355 Copyright locks are a double whammy: They create bad security decisions that
1356 can’t be freely investigated or discussed. If markets are supposed to be
1357 machines for aggregating information (and if surveillance capitalism’s
1358 notional mind-control rays are what make it a <span class="quote"><span class="quote">rogue
1359 capitalism</span></span> because it denies consumers the power to make decisions),
1360 then a program of legally enforced ignorance of the risks of products makes
1361 monopolism even more of a <span class="quote"><span class="quote">rogue capitalism</span></span> than surveillance
1362 capitalism’s influence campaigns.
1363 </p><p>
1364 And unlike mind-control rays, enforced silence over security is an
1365 immediate, documented problem, and it <span class="emphasis"><em>does</em></span> constitute
1366 an existential threat to our civilization and possibly our species. The
1367 proliferation of insecure devices — especially devices that spy on us and
1368 especially when those devices also can manipulate the physical world by,
1369 say, steering your car or flipping a breaker at a power station — is a kind
1370 of technology debt.
1371 </p><p>
1372 In software design, <span class="quote"><span class="quote">technology debt</span></span> refers to old, baked-in
1373 decisions that turn out to be bad ones in hindsight. Perhaps a long-ago
1374 developer decided to incorporate a networking protocol made by a vendor that
1375 has since stopped supporting it. But everything in the product still relies
1376 on that superannuated protocol, and so, with each revision, the product team
1377 has to work around this obsolete core, adding compatibility layers,
1378 surrounding it with security checks that try to shore up its defenses, and
1379 so on. These Band-Aid measures compound the debt because every subsequent
1380 revision has to make allowances for <span class="emphasis"><em>them</em></span>, too, like
1381 interest mounting on a predatory subprime loan. And like a subprime loan,
1382 the interest mounts faster than you can hope to pay it off: The product team
1383 has to put so much energy into maintaining this complex, brittle system that
1384 they don’t have any time left over to refactor the product from the ground
1385 up and <span class="quote"><span class="quote">pay off the debt</span></span> once and for all.
1386 </p><p>
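A tiny illustration of how those compatibility layers pile up; the protocol and shims here are invented for the sketch:
</p><pre class="programlisting">
# Hypothetical sketch of technology debt: every new feature wraps the old core
# in another compatibility shim instead of replacing it.
def legacy_send(payload):
    # The long-abandoned vendor protocol the whole product still depends on.
    return "LEGACY:" + payload

def shim_v2(payload):
    # Added years later to bolt on a length check the old protocol never had.
    if len(payload) > 512:
        raise ValueError("legacy protocol cannot carry this")
    return legacy_send(payload)

def shim_v3(payload):
    # Added later still to paper over an encoding quirk; the debt keeps compounding.
    return shim_v2(payload.replace("\n", " "))

print(shim_v3("hello\nworld"))
</pre><p>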
1387 Typically, technology debt results in a technological bankruptcy: The
1388 product gets so brittle and unsustainable that it fails
1389 catastrophically. Think of the antiquated COBOL-based banking and accounting
1390 systems that fell over at the start of the pandemic emergency when
1391 confronted with surges of unemployment claims. Sometimes that ends the
1392 product; sometimes it takes the company down with it. Being caught in the
1393 default of a technology debt is scary and traumatic, just like losing your
1394 house due to bankruptcy is scary and traumatic.
1395 </p><p>
1396 But the technology debt created by copyright locks isn’t individual debt;
1397 it’s systemic. Everyone in the world is exposed to this over-leverage, as
1398 was the case with the 2008 financial crisis. When that debt comes due — when
1399 we face a cascade of security breaches that threaten global shipping and
1400 logistics, the food supply, pharmaceutical production pipelines, emergency
1401 communications, and other critical systems that are accumulating technology
1402 debt in part due to the presence of deliberately insecure and deliberately
1403 unauditable copyright locks — it will indeed pose an existential risk.
1404 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="privacy-and-monopoly"></a>Privacy and monopoly</h2></div></div></div><p>
1405 Many tech companies are gripped by an orthodoxy that holds that if they just
1406 gather enough data on enough of our activities, everything else is possible
1407 — the mind control and endless profits. This is an unfalsifiable hypothesis:
1408 If data gives a tech company even a tiny improvement in behavior prediction
1409 and modification, the company declares that it has taken the first step
1410 toward global domination with no end in sight. If a company
1411 <span class="emphasis"><em>fails</em></span> to attain any improvements from gathering and
1412 analyzing data, it declares success to be just around the corner, attainable
1413 once more data is in hand.
1414 </p><p>
1415 Surveillance tech is far from the first industry to embrace a nonsensical,
1416 self-serving belief that harms the rest of the world, and it is not the
1417 first industry to profit handsomely from such a delusion. Long before
1418 hedge-fund managers were claiming (falsely) that they could beat the
1419 S&amp;P 500, there were plenty of other <span class="quote"><span class="quote">respectable</span></span>
1420 industries that have been revealed as quacks in hindsight. From the makers
1421 of radium suppositories (a real thing!) to the cruel sociopaths who claimed
1422 they could <span class="quote"><span class="quote">cure</span></span> gay people, history is littered with the
1423 formerly respectable titans of discredited industries.
1424 </p><p>
1425 This is not to say that there’s nothing wrong with Big Tech and its
1426 ideological addiction to data. While surveillance’s benefits are mostly
1427 overstated, its harms are, if anything, <span class="emphasis"><em>understated</em></span>.
1428 </p><p>
1429 There’s real irony here. The belief in surveillance capitalism as a
1430 <span class="quote"><span class="quote">rogue capitalism</span></span> is driven by the belief that markets
1431 wouldn’t tolerate firms that are gripped by false beliefs. An oil company
1432 that has false beliefs about where the oil is will eventually go broke
1433 digging dry wells after all.
1434 </p><p>
1435 But monopolists get to do terrible things for a long time before they pay
1436 the price. Think of how concentration in the finance sector allowed the
1437 subprime crisis to fester as bond-rating agencies, regulators, investors,
1438 and critics all fell under the sway of a false belief that complex
1439 mathematics could construct <span class="quote"><span class="quote">fully hedged</span></span> debt instruments
1440 that could not possibly default. A small bank that engaged in this kind of
1441 malfeasance would simply go broke rather than outrunning the inevitable
1442 crisis, perhaps growing so big that it averted it altogether. But large
1443 banks were able to continue to attract investors, and when they finally
1444 <span class="emphasis"><em>did</em></span> come a-cropper, the world’s governments bailed them
1445 out. The worst offenders of the subprime crisis are bigger than they were in
1446 2008, bringing home more profits and paying their execs even larger sums.
1447 </p><p>
1448 Big Tech is able to practice surveillance not just because it is tech but
1449 because it is <span class="emphasis"><em>big</em></span>. The reason every web publisher
1450 embeds a Facebook <span class="quote"><span class="quote">Like</span></span> button is that Facebook dominates the
1451 internet’s social media referrals — and every one of those
1452 <span class="quote"><span class="quote">Like</span></span> buttons spies on everyone who lands on a page that
1453 contains them (see also: Google Analytics embeds, Twitter buttons, etc.).
1454 </p><p>
1455 The reason the world’s governments have been slow to create meaningful
1456 penalties for privacy breaches is that Big Tech’s concentration produces
1457 huge profits that can be used to lobby against those penalties — and Big
1458 Tech’s concentration means that the companies involved are able to arrive at
1459 a unified negotiating position that supercharges the lobbying.
1460 </p><p>
1461 The reason that the smartest engineers in the world want to work for Big
1462 Tech is that Big Tech commands the lion’s share of tech industry jobs.
1463 </p><p>
1464 The reason people who are aghast at Facebook’s and Google’s and Amazon’s
1465 data-handling practices continue to use these services is that all their
1466 friends are on Facebook; Google dominates search; and Amazon has put all the
1467 local merchants out of business.
1468 </p><p>
1469 Competitive markets would weaken the companies’ lobbying muscle by reducing
1470 their profits and pitting them against each other in regulatory forums. It
1471 would give customers other places to go to get their online services. It
1472 would make the companies small enough to regulate and pave the way to
1473 meaningful penalties for breaches. It would let engineers with ideas that
1474 challenged the surveillance orthodoxy raise capital to compete with the
1475 incumbents. It would give web publishers multiple ways to reach audiences
1476 and make the case against Facebook and Google and Twitter embeds.
1477 </p><p>
1478 In other words, while surveillance doesn’t cause monopolies, monopolies
1479 certainly abet surveillance.
1480 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="ronald-reagan-pioneer-of-tech-monopolism"></a>Ronald Reagan, pioneer of tech monopolism</h2></div></div></div><p>
1481 Technology exceptionalism is a sin, whether it’s practiced by technology’s
1482 blind proponents or by its critics. Both of these camps are prone to
1483 explaining away monopolistic concentration by citing some special
1484 characteristic of the tech industry, like network effects or first-mover
1485 advantage. The only real difference between these two groups is that the
1486 tech apologists say monopoly is inevitable so we should just let tech get
1487 away with its abuses while competition regulators in the U.S. and the EU say
1488 monopoly is inevitable so we should punish tech for its abuses but not try
1489 to break up the monopolies.
1490 </p><p>
1491 To understand how tech became so monopolistic, it’s useful to look at the
1492 dawn of the consumer tech industry: 1979, the year the Apple II Plus
1493 launched and became the first successful home computer. That also happens to
1494 be the year that Ronald Reagan hit the campaign trail for the 1980
1495 presidential race — a race he won, leading to a radical shift in the way
1496 that antitrust concerns are handled in America. Reagan’s cohort of
1497 politicians — including Margaret Thatcher in the U.K., Brian Mulroney in
1498 Canada, Helmut Kohl in Germany, and Augusto Pinochet in Chile — went on to
1499 enact similar reforms that eventually spread around the world.
1500 </p><p>
1501 Antitrust’s story began nearly a century before all that with laws like the
1502 Sherman Act, which took aim at monopolists on the grounds that monopolies
1503 were bad in and of themselves — squeezing out competitors, creating
1504 <span class="quote"><span class="quote">diseconomies of scale</span></span> (when a company is so big that its
1505 constituent parts go awry and it is seemingly helpless to address the
1506 problems), and capturing their regulators to such a degree that they can get
1507 away with a host of evils.
1508 </p><p>
1509 Then came a fabulist named Robert Bork, a former solicitor general who
1510 Reagan appointed to the powerful U.S. Court of Appeals for the D.C. Circuit
1511 and who had created an alternate legislative history of the Sherman Act and
1512 its successors out of whole cloth. Bork insisted that these statutes were
1513 never targeted at monopolies (despite a wealth of evidence to the contrary,
1514 including the transcribed speeches of the acts’ authors) but, rather, that
1515 they were intended to prevent <span class="quote"><span class="quote">consumer harm</span></span> — in the form of
1516 higher prices.
1517 </p><p>
1518 Bork was a crank, but he was a crank with a theory that rich people really
1519 liked. Monopolies are a great way to make rich people richer by allowing
1520 them to receive <span class="quote"><span class="quote">monopoly rents</span></span> (that is, bigger profits) and
1521 capture regulators, leading to a weaker, more favorable regulatory
1522 environment with fewer protections for customers, suppliers, the
1523 environment, and workers.
1524 </p><p>
1525 Bork’s theories were especially palatable to the same power brokers who
1526 backed Reagan, and Reagan’s Department of Justice and other agencies began
1527 to incorporate Bork’s antitrust doctrine into their enforcement decisions
1528 (Reagan even put Bork up for a Supreme Court seat, but Bork flunked the
1529 Senate confirmation hearing so badly that, 40 years later, D.C. insiders use
1530 the term <span class="quote"><span class="quote">borked</span></span> to refer to any catastrophically bad
1531 political performance).
1532 </p><p>
1533 Little by little, Bork’s theories entered the mainstream, and their backers
1534 began to infiltrate the legal education field, even putting on junkets where
1535 members of the judiciary were treated to lavish meals, fun outdoor
1536 activities, and seminars where they were indoctrinated into the consumer
1537 harm theory of antitrust. The more Bork’s theories took hold, the more money
1538 the monopolists were making — and the more surplus capital they had at their
1539 disposal to lobby for even more Borkian antitrust influence campaigns.
1540 </p><p>
1541 The history of Bork’s antitrust theories is a really good example of the
1542 kind of covertly engineered shifts in public opinion that Zuboff warns us
1543 against, where fringe ideas become mainstream orthodoxy. But Bork didn’t
1544 change the world overnight. He played a very long game, for over a
1545 generation, and he had a tailwind because the same forces that backed
1546 oligarchic antitrust theories also backed many other oligarchic shifts in
1547 public opinion. For example, the idea that taxation is theft, that wealth is
1548 a sign of virtue, and so on — all of these theories meshed to form a
1549 coherent ideology that elevated inequality to a virtue.
1550 </p><p>
1551 Today, many fear that machine learning allows surveillance capitalism to
1552 sell <span class="quote"><span class="quote">Bork-as-a-Service,</span></span> at internet speeds, so that you can
1553 contract a machine-learning company to engineer <span class="emphasis"><em>rapid</em></span>
1554 shifts in public sentiment without needing the capital to sustain a
1555 multipronged, multigenerational project working at the local, state,
1556 national, and global levels in business, law, and philosophy. I do not
1557 believe that such a project is plausible, though I agree that this is
1558 basically what the platforms claim to be selling. They’re just lying about
1559 it. Big Tech lies all the time, <span class="emphasis"><em>including</em></span> in their
1560 sales literature.
1561 </p><p>
1562 The idea that tech forms <span class="quote"><span class="quote">natural monopolies</span></span> (monopolies that
1563 are the inevitable result of the realities of an industry, such as the
1564 monopolies that accrue to the first company to run long-haul phone lines or
1565 rail lines) is belied by tech’s own history: In the absence of
1566 anti-competitive tactics, Google was able to unseat AltaVista and Yahoo;
1567 Facebook was able to head off Myspace. There are some advantages to
1568 gathering mountains of data, but those mountains of data also have
1569 disadvantages: liability (from leaking), diminishing returns (from old
1570 data), and institutional inertia (big companies, like science, progress one
1571 funeral at a time).
1572 </p><p>
1573 Indeed, the birth of the web saw a mass-extinction event for the existing
1574 giant, wildly profitable proprietary technologies that had capital, network
1575 effects, and walls and moats surrounding their businesses. The web showed
1576 that when a new industry is built around a protocol, rather than a product,
1577 the combined might of everyone who uses the protocol to reach their
1578 customers or users or communities outweighs even the most massive
1579 products. CompuServe, AOL, MSN, and a host of other proprietary walled
1580 gardens learned this lesson the hard way: Each believed it could stay
1581 separate from the web, offering <span class="quote"><span class="quote">curation</span></span> and a guarantee of
1582 consistency and quality instead of the chaos of an open system. Each was
1583 wrong and ended up being absorbed into the public web.
1584 </p><p>
1585 Yes, tech is heavily monopolized and is now closely associated with industry
1586 concentration, but this has more to do with timing than with its
1587 intrinsically monopolistic tendencies. Tech was born at the moment that
1588 antitrust enforcement was being dismantled, and tech fell into exactly the
1589 same pathologies that antitrust was supposed to guard against. To a first
1590 approximation, it is reasonable to assume that tech’s monopolies are the
1591 result of a lack of anti-monopoly action and not the much-touted unique
1592 characteristics of tech, such as network effects, first-mover advantage, and
1593 so on.
1594 </p><p>
1595 In support of this thesis, I offer the concentration that every
1596 <span class="emphasis"><em>other</em></span> industry has undergone over the same period. From
1597 professional wrestling to consumer packaged goods to commercial property
1598 leasing to banking to sea freight to oil to record labels to newspaper
1599 ownership to theme parks, <span class="emphasis"><em>every</em></span> industry has undergone
1600 a massive shift toward concentration. There are no obvious network effects or
1601 first-mover advantage at play in these industries. However, in every case,
1602 these industries attained their concentrated status through tactics that
1603 were prohibited before Bork’s triumph: merging with major competitors,
1604 buying out innovative new market entrants, horizontal and vertical
1605 integration, and a suite of anti-competitive tactics that were once illegal
1606 but are not any longer.
1607 </p><p>
1608 Again: When you change the laws intended to prevent monopolies and then
1609 monopolies form in exactly the way the law was supposed to prevent, it is
1610 reasonable to suppose that these facts are related. Tech’s concentration
1611 can be readily explained without recourse to radical theories of network
1612 effects — but only if you’re willing to indict unregulated markets as
1613 tending toward monopoly. Just as a lifelong smoker can give you a hundred
1614 reasons why their smoking didn’t cause their cancer (<span class="quote"><span class="quote">It was the
1615 environmental toxins</span></span>), true believers in unregulated markets have a
1616 whole suite of unconvincing explanations for monopoly in tech that leave
1617 capitalism intact.
1618 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="steering-with-the-windshield-wipers"></a>Steering with the windshield wipers</h2></div></div></div><p>
1619 It’s been 40 years since Bork’s project to rehabilitate monopolies achieved
1620 liftoff, and that is a generation and a half, which is plenty of time to
1621 take a common idea and make it seem outlandish and vice versa. Before the
1622 1940s, affluent Americans dressed their baby boys in pink while baby girls
1623 wore blue (a <span class="quote"><span class="quote">delicate and dainty</span></span> color). While gendered
1624 colors are obviously totally arbitrary, many still greet this news with
1625 amazement and find it hard to imagine a time when pink connoted masculinity.
1626 </p><p>
1627 After 40 years of studiously ignoring antitrust analysis and enforcement,
1628 it’s not surprising that we’ve all but forgotten that antitrust exists, that
1629 in living memory, growth through mergers and acquisitions was largely
1630 prohibited under law, that market-cornering strategies like vertical
1631 integration could land a company in court.
1632 </p><p>
1633 Antitrust is a market society’s steering wheel, the control of first resort
1634 to keep would-be masters of the universe in their lanes. But Bork and his
1635 cohort ripped out our steering wheel 40 years ago. The car is still
1636 barreling along, and so we’re yanking as hard as we can on all the
1637 <span class="emphasis"><em>other</em></span> controls in the car as well as desperately
1638 flapping the doors and rolling the windows up and down in the hopes that one
1639 of these other controls can be repurposed to let us choose where we’re
1640 heading before we careen off a cliff.
1641 </p><p>
1642 It’s like a 1960s science-fiction plot come to life: People stuck in a
1643 <span class="quote"><span class="quote">generation ship,</span></span> plying its way across the stars, a ship once
1644 piloted by their ancestors; and now, after a great cataclysm, the ship’s
1645 crew have forgotten that they’re in a ship at all and no longer remember
1646 where the control room is. Adrift, the ship is racing toward its extinction,
1647 and unless we can seize the controls and execute an emergency course
1648 correction, we’re all headed for a fiery death in the heart of a sun.
1649 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="surveillance-still-matters"></a>Surveillance still matters</h2></div></div></div><p>
1650 None of this is to minimize the problems with surveillance. Surveillance
1651 matters, and Big Tech’s use of surveillance <span class="emphasis"><em>is</em></span> an
1652 existential risk to our species, but that’s not because surveillance and
1653 machine learning rob us of our free will.
1654 </p><p>
1655 Surveillance has become <span class="emphasis"><em>much</em></span> more efficient thanks to
1656 Big Tech. In 1989, the Stasi — the East German secret police — had the whole
1657 country under surveillance, a massive undertaking that recruited one out of
1658 every 60 people to serve as an informant or intelligence operative.
1659 </p><p>
1660 Today, we know that the NSA is spying on a significant fraction of the
1661 entire world’s population, and its ratio of surveillance operatives to the
1662 surveilled is more like 1:10,000 (that’s probably on the low side since it
1663 assumes that every American with top-secret clearance is working for the NSA
1664 on this project — we don’t know how many of those cleared people are
1665 involved in NSA spying, but it’s definitely not all of them).
1666 </p><p>
1667 How did the ratio of surveillance operatives to the surveilled expand from 1:60 to
1668 1:10,000 in less than 30 years? It's thanks to Big Tech. Our devices and services gather
1669 most of the data that the NSA mines for its surveillance project. We pay for
1670 these devices and the services they connect to, and then we painstakingly
1671 perform the data-entry tasks associated with logging facts about our lives,
1672 opinions, and preferences. This mass surveillance project has been largely
1673 useless for fighting terrorism: The NSA can <a class="ulink" href="https://www.washingtonpost.com/world/national-security/nsa-cites-case-as-success-of-phone-data-collection-program/2013/08/08/fc915e5a-feda-11e2-96a8-d3b921c0924a_story.html" target="_top">only
1674 point to a single minor success story</a> in which it used its data
1675 collection program to foil an attempt by a U.S. resident to wire a few
1676 thousand dollars to an overseas terror group. It’s ineffective for much the
1677 same reason that commercial surveillance projects are largely ineffective at
1678 targeting advertising: The people who want to commit acts of terror, like
1679 people who want to buy a refrigerator, are extremely rare. If you’re trying
1680 to detect a phenomenon whose base rate is one in a million with an
1681 instrument whose accuracy is only 99%, then every true positive will come at
1682 the cost of 9,999 false positives.
1683 </p><p>
1684 Let me explain that again: If one in a million people is a terrorist, then
1685 there will only be about one terrorist in a random sample of one million
1686 people. If your test for detecting terrorists is 99% accurate, it will
1687 flag about 10,000 people in your million-person sample as terrorists (1% of one million
1688 is 10,000). For every true positive, you’ll get 9,999 false positives.
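To make the arithmetic concrete, here is a minimal sketch (in Python; the
figures are the ones used in the example above, not empirical data):
</p><pre class="programlisting">
# Base-rate arithmetic for a rare phenomenon screened by an imperfect test.
# The numbers mirror the example in the text: a one-in-a-million base rate
# and a "99% accurate" test, read here as a 1% false-positive rate.
population = 1_000_000
base_rate = 1 / 1_000_000
false_positive_rate = 0.01

true_positives = population * base_rate  # about 1 real terrorist
false_positives = (population - true_positives) * false_positive_rate  # about 10,000 innocents flagged

print(f"True positives:  {true_positives:.0f}")
print(f"False positives: {false_positives:.0f}")
print("Chance a flagged person is actually a terrorist: "
      f"{true_positives / (true_positives + false_positives):.4%}")
</pre><p>
Run with these numbers, the sketch reports roughly one true positive against
roughly 10,000 false positives, so a flagged person has about a one-in-10,000
chance of actually being a terrorist.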
1689 </p><p>
1690 In reality, the accuracy of algorithmic terrorism detection falls far short
1691 of the 99% mark, as does refrigerator ad targeting. The difference is that
1692 being falsely accused of wanting to buy a fridge is a minor nuisance while
1693 being falsely accused of planning a terror attack can destroy your life and
1694 the lives of everyone you love.
1695 </p><p>
1696 Mass state surveillance is only feasible because of surveillance capitalism
1697 and its extremely low-yield ad-targeting systems, which require a constant
1698 feed of personal data to remain barely viable. Surveillance capitalism’s
1699 primary failure mode is mistargeted ads while mass state surveillance’s
1700 primary failure mode is grotesque human rights abuses, tending toward
1701 totalitarianism.
1702 </p><p>
1703 State surveillance is no mere parasite on Big Tech, sucking up its data and
1704 giving nothing in return. In truth, the two are symbiotes: Big Tech sucks up
1705 our data for spy agencies, and spy agencies ensure that governments don’t
1706 limit Big Tech’s activities so severely that it would no longer serve the
1707 spy agencies’ needs. There is no firm distinction between state surveillance
1708 and surveillance capitalism; they are dependent on one another.
1709 </p><p>
1710 To see this at work today, look no further than Amazon’s home surveillance
1711 device, the Ring doorbell, and its associated app, Neighbors. Ring — a
1712 product that Amazon acquired and did not develop in house — makes a
1713 camera-enabled doorbell that streams footage from your front door to your
1714 mobile device. The Neighbors app allows you to form a neighborhood-wide
1715 surveillance grid with your fellow Ring owners through which you can share
1716 clips of <span class="quote"><span class="quote">suspicious characters.</span></span> If you’re thinking that this
1717 sounds like a recipe for letting curtain-twitching racists supercharge their
1718 suspicions of people with brown skin who walk down their blocks, <a class="ulink" href="https://www.eff.org/deeplinks/2020/07/amazons-ring-enables-over-policing-efforts-some-americas-deadliest-law-enforcement" target="_top">you’re
1719 right</a>. Ring has become a <span class="emphasis"><em>de facto,</em></span>
1720 off-the-books arm of the police without any of the pesky oversight or rules.
1721 </p><p>
1722 In mid-2019, a series of public records requests revealed that Amazon had
1723 struck confidential deals with more than 400 local law enforcement agencies
1724 through which the agencies would promote Ring and Neighbors and in exchange
1725 get access to footage from Ring cameras. In theory, cops would need to
1726 request this footage through Amazon (and internal documents reveal that
1727 Amazon devotes substantial resources to coaching cops on how to spin a
1728 convincing story when doing so), but in practice, when a Ring customer turns
1729 down a police request, Amazon only requires the agency to formally request
1730 the footage from the company, which it will then produce.
1731 </p><p>
1732 Ring and law enforcement have found many ways to intertwine their
1733 activities. Ring strikes secret deals to acquire real-time access to 911
1734 dispatch and then streams alarming crime reports to Neighbors users, which
1735 serve as convincers for anyone who’s contemplating a surveillance doorbell
1736 but isn’t sure whether their neighborhood is dangerous enough to warrant it.
1737 </p><p>
1738 The more the cops buzz-market the surveillance capitalist Ring, the more
1739 surveillance capability the state gets. Cops who rely on private entities
1740 for law-enforcement roles then brief against any controls on the deployment
1741 of that technology while the companies return the favor by lobbying against
1742 rules requiring public oversight of police surveillance technology. The more
1743 the cops rely on Ring and Neighbors, the harder it will be to pass laws to
1744 curb them. The fewer laws there are against them, the more the cops will
1745 rely on them.
1746 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="dignity-and-sanctuary"></a>Dignity and sanctuary</h2></div></div></div><p>
1747 But even if we could exercise democratic control over our states and force
1748 them to stop raiding surveillance capitalism’s reservoirs of behavioral
1749 data, surveillance capitalism would still harm us.
1750 </p><p>
1751 This is an area where Zuboff shines. Her chapter on <span class="quote"><span class="quote">sanctuary</span></span>
1752 — the feeling of being unobserved — is a beautiful hymn to introspection,
1753 calmness, mindfulness, and tranquility.
1754 </p><p>
1755 When you are watched, something changes. Anyone who has ever raised a child
1756 knows this. You might look up from your book (or more realistically, from
1757 your phone) and catch your child in a moment of profound realization and
1758 growth, a moment where they are learning something that is right at the edge
1759 of their abilities, requiring their entire ferocious concentration. For a
1760 moment, you’re transfixed, watching that rare and beautiful moment of focus
1761 playing out before your eyes, and then your child looks up and sees you
1762 seeing them, and the moment collapses. To grow, you need to be and expose
1763 your authentic self, and in that moment, you are vulnerable like a hermit
1764 crab scuttling from one shell to the next. The tender, unprotected tissues
1765 you expose in that moment are too delicate to reveal in the presence of
1766 another, even someone you trust as implicitly as a child trusts their
1767 parent.
1768 </p><p>
1769 In the digital age, our authentic selves are inextricably tied to our
1770 digital lives. Your search history is a running ledger of the questions
1771 you’ve pondered. Your location history is a record of the places you’ve
1772 sought out and the experiences you’ve had there. Your social graph reveals
1773 the different facets of your identity, the people you’ve connected with.
1774 </p><p>
1775 To be observed in these activities is to lose the sanctuary of your
1776 authentic self.
1777 </p><p>
1778 There’s another way in which surveillance capitalism robs us of our capacity
1779 to be our authentic selves: by making us anxious. Surveillance capitalism
1780 isn’t really a mind-control ray, but you don’t need a mind-control ray to
1781 make someone anxious. After all, another word for anxiety is agitation, and
1782 to make someone experience agitation, you need merely to agitate them. To
1783 poke them and prod them and beep at them and buzz at them and bombard them
1784 on an intermittent schedule that is just random enough that our limbic
1785 systems never quite become inured to it.
1786 </p><p>
1787 Our devices and services are <span class="quote"><span class="quote">general purpose</span></span> in that they can
1788 connect anything or anyone to anything or anyone else and that they can run
1789 any program that can be written. This means that the distraction rectangles
1790 in our pockets hold our most precious moments with our most beloved people
1791 and their most urgent or time-sensitive communications (from <span class="quote"><span class="quote">running
1792 late can you get the kid?</span></span> to <span class="quote"><span class="quote">doctor gave me bad news and I
1793 need to talk to you RIGHT NOW</span></span>) as well as ads for refrigerators and
1794 recruiting messages from Nazis.
1795 </p><p>
1796 All day and all night, our pockets buzz, shattering our concentration and
1797 tearing apart the fragile webs of connection we spin as we think through
1798 difficult ideas. If you locked someone in a cell and agitated them like
1799 this, we’d call it <span class="quote"><span class="quote">sleep deprivation torture,</span></span> and it would be
1800 <a class="ulink" href="https://www.youtube.com/watch?v=1SKpRbvnx6g" target="_top">a war crime under
1801 the Geneva Conventions</a>.
1802 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="afflicting-the-afflicted"></a>Afflicting the afflicted</h2></div></div></div><p>
1803 The effects of surveillance on our ability to be our authentic selves are
1804 not equal for all people. Some of us are lucky enough to live in a time and
1805 place in which all the most important facts of our lives are widely and
1806 roundly socially acceptable and can be publicly displayed without the risk
1807 of social consequence.
1808 </p><p>
1809 But for many of us, this is not true. Recall that in living memory, many of
1810 the ways of being that we think of as socially acceptable today were once
1811 cause for dire social sanction or even imprisonment. If you are 65 years
1812 old, you have lived through a time in which people living in <span class="quote"><span class="quote">free
1813 societies</span></span> could be imprisoned or sanctioned for engaging in
1814 homosexual activity, for falling in love with a person whose skin was a
1815 different color than their own, or for smoking weed.
1816 </p><p>
1817 Today, these activities aren’t just decriminalized in much of the world,
1818 they’re considered normal, and the fallen prohibitions are viewed as
1819 shameful, regrettable relics of the past.
1820 </p><p>
1821 How did we get from prohibition to normalization? Through private, personal
1822 activity: People who were secretly gay or secret pot-smokers or who secretly
1823 loved someone with a different skin color were vulnerable to retaliation if
1824 they made their true selves known and were limited in how much they could
1825 advocate for their own right to exist in the world and be true to
1826 themselves. But because there was a private sphere, these people could form
1827 alliances with their friends and loved ones who did not share their
1828 disfavored traits by having private conversations in which they came out,
1829 disclosing their true selves to the people around them and bringing them to
1830 their cause one conversation at a time.
1831 </p><p>
1832 The right to choose the time and manner of these conversations was key to
1833 their success. It’s one thing to come out to your dad while you’re on a
1834 fishing trip away from the world and another thing entirely to blurt it out
1835 over the Christmas dinner table while your racist Facebook uncle is there to
1836 make a scene.
1837 </p><p>
1838 Without a private sphere, there’s a chance that none of these changes would
1839 have come to pass and that the people who benefited from these changes would
1840 have either faced social sanction for coming out to a hostile world or would
1841 have never been able to reveal their true selves to the people they love.
1842 </p><p>
1843 The corollary is that, unless you think that our society has attained social
1844 perfection — that your grandchildren in 50 years will ask you to tell them
1845 the story of how, in 2020, every injustice had been righted and no further
1846 change had to be made — then you should expect that right now, at this
1847 minute, there are people you love, whose happiness is key to your own, who
1848 have a secret in their hearts that stops them from ever being their
1849 authentic selves with you. These people are sorrowing and will go to their
1850 graves with that secret sorrow in their hearts, and the source of that
1851 sorrow will be the falsity of their relationship to you.
1852 </p><p>
1853 A private realm is necessary for human progress.
1854 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="any-data-you-collect-and-retain-will-eventually-leak"></a>Any data you collect and retain will eventually leak</h2></div></div></div><p>
1855 The lack of a private life can rob vulnerable people of the chance to be
1856 their authentic selves and constrain our actions by depriving us of
1857 sanctuary, but there is another risk that is borne by everyone, not just
1858 people with a secret: crime.
1859 </p><p>
1860 Personally identifying information is of very limited use for the purpose of
1861 controlling peoples’ minds, but identity theft — really a catchall term for
1862 a whole constellation of terrible criminal activities that can destroy your
1863 finances, compromise your personal integrity, ruin your reputation, or even
1864 expose you to physical danger — thrives on it.
1865 </p><p>
1866 Attackers are not limited to using data from one breached source,
1867 either. Multiple services have suffered breaches that exposed names,
1868 addresses, phone numbers, passwords, sexual tastes, school grades, work
1869 performance, brushes with the criminal justice system, family details,
1870 genetic information, fingerprints and other biometrics, reading habits,
1871 search histories, literary tastes, pseudonymous identities, and other
1872 sensitive information. Attackers can merge data from these different
1873 breaches to build up extremely detailed dossiers on random subjects and then
1874 use different parts of the data for different criminal purposes.
1875 </p><p>
1876 For example, attackers can use leaked username and password combinations to
1877 hijack whole fleets of commercial vehicles that <a class="ulink" href="https://www.vice.com/en_us/article/zmpx4x/hacker-monitor-cars-kill-engine-gps-tracking-apps" target="_top">have
1878 been fitted with anti-theft GPS trackers and immobilizers</a> or to
1879 hijack baby monitors in order to <a class="ulink" href="https://www.washingtonpost.com/technology/2019/04/23/how-nest-designed-keep-intruders-out-peoples-homes-effectively-allowed-hackers-get/?utm_term=.15220e98c550" target="_top">terrorize
1880 toddlers with the audio tracks from pornography</a>. Attackers use
1881 leaked data to trick phone companies into giving them your phone number,
1882 then they intercept SMS-based two-factor authentication codes in order to
1883 take over your email, bank account, and/or cryptocurrency wallets.
1884 </p><p>
1885 Attackers are endlessly inventive in the pursuit of creative ways to
1886 weaponize leaked data. One common use of leaked data is to penetrate
1887 companies in order to access <span class="emphasis"><em>more</em></span> data.
1888 </p><p>
1889 Like spies, online fraudsters are totally dependent on companies
1890 over-collecting and over-retaining our data. Spy agencies sometimes pay
1891 companies for access to their data or intimidate them into giving it up, but
1892 sometimes they work just like criminals do — by <a class="ulink" href="https://www.bbc.com/news/world-us-canada-24751821" target="_top">sneaking data out of
1893 companies’ databases</a>.
1894 </p><p>
1895 The over-collection of data has a host of terrible social consequences, from
1896 the erosion of our authentic selves to the undermining of social progress,
1897 from state surveillance to an epidemic of online crime. Commercial
1898 surveillance is also a boon to people running influence campaigns, but
1899 that’s the least of our troubles.
1900 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="critical-tech-exceptionalism-is-still-tech-exceptionalism"></a>Critical tech exceptionalism is still tech exceptionalism</h2></div></div></div><p>
1901 Big Tech has long practiced technology exceptionalism: the idea that it
1902 should not be subject to the mundane laws and norms of
1903 <span class="quote"><span class="quote">meatspace.</span></span> Mottoes like Facebook’s <span class="quote"><span class="quote">move fast and break
1904 things</span></span> attracted justifiable scorn for the companies' self-serving
1905 rhetoric.
1906 </p><p>
1907 Tech exceptionalism got us all into a lot of trouble, so it’s ironic and
1908 distressing to see Big Tech’s critics committing the same sin.
1909 </p><p>
1910 Big Tech is not a <span class="quote"><span class="quote">rogue capitalism</span></span> that cannot be cured
1911 through the traditional anti-monopoly remedies of trustbusting (forcing
1912 companies to divest of competitors they have acquired) and bans on mergers
1913 to monopoly and other anti-competitive tactics. Big Tech does not have the
1914 power to use machine learning to influence our behavior so thoroughly that
1915 markets lose the ability to punish bad actors and reward superior
1916 competitors. Big Tech has no rule-writing mind-control ray that necessitates
1917 ditching our old toolbox.
1918 </p><p>
1919 The thing is, people have been claiming to have perfected mind-control rays
1920 for centuries, and every time, it turned out to be a con — though sometimes
1921 the con artists were also conning themselves.
1922 </p><p>
1923 For generations, the advertising industry has been steadily improving its
1924 ability to sell advertising services to businesses while only making
1925 marginal gains in selling those businesses’ products to prospective
1926 customers. John Wanamaker’s lament that <span class="quote"><span class="quote">50% of my advertising budget
1927 is wasted, I just don’t know which 50%</span></span> is a testament to the triumph
1928 of <span class="emphasis"><em>ad executives</em></span>, who successfully convinced Wanamaker
1929 that only half of the money he spent went to waste.
1930 </p><p>
1931 The tech industry has made enormous improvements in the science of
1932 convincing businesses that they’re good at advertising while their actual
1933 improvements to advertising — as opposed to targeting — have been pretty
1934 ho-hum. The vogue for machine learning — and the mystical invocation of
1935 <span class="quote"><span class="quote">artificial intelligence</span></span> as a synonym for straightforward
1936 statistical inference techniques — has greatly boosted the efficacy of Big
1937 Tech’s sales pitch as marketers have exploited potential customers’ lack of
1938 technical sophistication to get away with breathtaking acts of overpromising
1939 and underdelivering.
1940 </p><p>
1941 It’s tempting to think that if businesses are willing to pour billions into
1942 a venture that the venture must be a good one. Yet there are plenty of times
1943 when this rule of thumb has led us astray. For example, it’s virtually
1944 unheard of for managed investment funds to outperform simple index funds,
1945 and investors who put their money into the hands of expert money managers
1946 overwhelmingly fare worse than those who entrust their savings to index
1947 funds. But managed funds still account for the majority of the money
1948 invested in the markets, and they are patronized by some of the richest,
1949 most sophisticated investors in the world. Their vote of confidence in an
1950 underperforming sector is a parable about the role of luck in wealth
1951 accumulation, not a sign that managed funds are a good buy.
1952 </p><p>
1953 The claims of Big Tech’s mind-control system are full of tells that the
1954 enterprise is a con. For example, <a class="ulink" href="https://www.frontiersin.org/articles/10.3389/fpsyg.2020.01415/full" target="_top">the
1955 reliance on the <span class="quote"><span class="quote">Big Five</span></span> personality traits</a> as a
1956 primary means of influencing people even though the <span class="quote"><span class="quote">Big Five</span></span>
1957 theory is unsupported by any large-scale, peer-reviewed studies and is
1958 <a class="ulink" href="https://www.wired.com/story/the-noisy-fallacies-of-psychographic-targeting/" target="_top">mostly
1959 the realm of marketing hucksters and pop psych</a>.
1960 </p><p>
1961 Big Tech’s promotional materials also claim that their algorithms can
1962 accurately perform <span class="quote"><span class="quote">sentiment analysis</span></span> or detect people's
1963 moods based on their <span class="quote"><span class="quote">microexpressions,</span></span> but <a class="ulink" href="https://www.npr.org/2018/09/12/647040758/advertising-on-facebook-is-it-worth-it" target="_top">these
1964 are marketing claims, not scientific ones</a>. These methods are largely
1965 untested by independent scientific experts, and where they have been tested,
1966 they’ve been found sorely wanting. Microexpressions are particularly
1967 suspect as the companies that specialize in training people to detect them
1968 <a class="ulink" href="https://theintercept.com/2017/02/08/tsas-own-files-show-doubtful-science-behind-its-behavior-screening-program/" target="_top">have
1969 been shown</a> to underperform relative to random chance.
1970 </p><p>
1971 Big Tech has been so good at marketing its own supposed superpowers that
1972 it’s easy to believe that they can market everything else with similar
1973 acumen, but it’s a mistake to believe the hype. Any statement a company
1974 makes about the quality of its products is clearly not impartial. The fact
1975 that we distrust all the things that Big Tech says about its data handling,
1976 compliance with privacy laws, etc., is only reasonable — but why on Earth
1977 would we treat Big Tech’s marketing literature as the gospel truth? Big Tech
1978 lies about just about <span class="emphasis"><em>everything</em></span>, including how well
1979 its machine-learning fueled persuasion systems work.
1980 </p><p>
1981 That skepticism should infuse all of our evaluations of Big Tech and its
1982 supposed abilities, including our perusal of its patents. Zuboff vests these
1983 patents with enormous significance, pointing out that Google claimed
1984 extensive new persuasion capabilities in <a class="ulink" href="https://patents.google.com/patent/US20050131762A1/en" target="_top">its patent
1985 filings</a>. These claims are doubly suspect: first, because they are so
1986 self-serving, and second, because the patent itself is so notoriously an
1987 invitation to exaggeration.
1988 </p><p>
1989 Patent applications take the form of a series of claims and range from broad
1990 to narrow. A typical patent starts out by claiming that its authors have
1991 invented a method or system for doing every conceivable thing that anyone
1992 might do, ever, with any tool or device. Then it narrows that claim in
1993 successive stages until we get to the actual <span class="quote"><span class="quote">invention</span></span> that
1994 is the true subject of the patent. The hope is that the patent examiner —
1995 who is almost certainly overworked and underinformed — will miss the fact
1996 that some or all of these claims are ridiculous, or at least suspect, and
1997 grant the patent’s broader claims. Patents for unpatentable things are still
1998 incredibly useful because they can be wielded against competitors who might
1999 license that patent or steer clear of its claims rather than endure the
2000 lengthy, expensive process of contesting it.
2001 </p><p>
2002 What’s more, software patents are routinely granted even though the filer
2003 doesn’t have any evidence that they can do the thing claimed by the
2004 patent. That is, you can patent an <span class="quote"><span class="quote">invention</span></span> that you haven’t
2005 actually made and that you don’t know how to make.
2006 </p><p>
2007 With these considerations in hand, it becomes obvious that the fact that a
2008 Big Tech company has patented what it <span class="emphasis"><em>says</em></span> is an
2009 effective mind-control ray is largely irrelevant to whether Big Tech can in
2010 fact control our minds.
2011 </p><p>
2012 Big Tech collects our data for many reasons, including the diminishing
2013 returns on existing stores of data. But many tech companies also collect
2014 data out of a mistaken tech exceptionalist belief in the network effects of
2015 data. Network effects occur when each new user in a system increases its
2016 value. The classic example is fax machines: A single fax machine is of no
2017 use, two fax machines are of limited use, but every new fax machine that’s
2018 put to use after the first can reach every machine already on the network, so the number of possible fax-to-fax links grows roughly with the square of the number of machines.
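A minimal sketch (in Python; the machine counts are illustrative, not drawn
from the text) shows how quickly the number of possible links grows as a
network adds members:
</p><pre class="programlisting">
# Pairwise links possible among n interoperating machines: n * (n - 1) / 2,
# so the count grows with the square of the network's size.
def possible_links(n: int) -> int:
    return n * (n - 1) // 2

for n in (2, 10, 100, 1000):
    print(f"{n:>5} machines -> {possible_links(n):>7} possible links")
</pre><p>
Going from 100 machines to 1,000 multiplies the possible links roughly a
hundredfold, which is the sense in which each new node raises the value of
every node already on the network.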
2019 </p><p>
2020 Data mined for predictive systems doesn’t necessarily produce these
2021 dividends. Think of Netflix: The predictive value of the data mined from a
2022 million English-speaking Netflix viewers is hardly improved by the addition
2023 of one more user’s viewing data. Most of the data Netflix acquires after
2024 that first minimum viable sample duplicates existing data and produces only
2025 minimal gains. Meanwhile, retraining models with new data gets progressively
2026 more expensive as the number of data points increases, and manual tasks like
2027 labeling and validating data do not get cheaper at scale.
2028 </p><p>
2029 Businesses pursue fads to the detriment of their profits all the time,
2030 especially when the businesses and their investors are not motivated by the
2031 prospect of becoming profitable but rather by the prospect of being acquired
2032 by a Big Tech giant or by having an IPO. For these firms, ticking faddish
2033 boxes like <span class="quote"><span class="quote">collects as much data as possible</span></span> might realize a
2034 bigger return on investment than <span class="quote"><span class="quote">collects a business-appropriate
2035 quantity of data.</span></span>
2036 </p><p>
2037 This is another harm of tech exceptionalism: The belief that more data
2038 always produces more profits in the form of more insights that can be
2039 translated into better mind-control rays drives firms to over-collect and
2040 over-retain data beyond all rationality. And since the firms are behaving
2041 irrationally, a good number of them will go out of business and become ghost
2042 ships whose cargo holds are stuffed full of data that can harm people in
2043 myriad ways — but which no one is responsible for any longer. Even if the
2044 companies don’t go under, the data they collect is maintained behind the
2045 minimum viable security — just enough security to keep the company viable
2046 while it waits to get bought out by a tech giant, an amount calculated to
2047 spend not one penny more than is necessary on protecting data.
2048 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="how-monopolies-not-mind-control-drive-surveillance-capitalism-the-snapchat-story"></a>How monopolies, not mind control, drive surveillance capitalism: The
2049 Snapchat story</h2></div></div></div><p>
2050 For the first decade of its existence, Facebook competed with the social
2051 media giants of the day (Myspace, Orkut, etc.) by presenting itself as the
2052 pro-privacy alternative. Indeed, Facebook justified its walled garden —
2053 which let users bring in data from the web but blocked web services like
2054 Google Search from indexing and caching Facebook pages — as a pro-privacy
2055 measure that protected users from the surveillance-happy winners of the
2056 social media wars like Myspace.
2057 </p><p>
2058 Despite frequent promises that it would never collect or analyze its users’
2059 data, Facebook periodically created initiatives that did just that, like the
2060 creepy, ham-fisted Beacon tool, which spied on you as you moved around the
2061 web and then added your online activities to your public timeline, allowing
2062 your friends to monitor your browsing habits. Beacon sparked a user
2063 revolt. Every time, Facebook backed off from its surveillance initiative,
2064 but not all the way; inevitably, the new Facebook would be more surveilling
2065 than the old Facebook, though not quite as surveilling as the intermediate
2066 Facebook following the launch of the new product or service.
2067 </p><p>
2068 The pace at which Facebook ramped up its surveillance efforts seems to have
2069 been set by Facebook’s competitive landscape. The more competitors Facebook
2070 had, the better it behaved. Every time a major competitor foundered,
2071 Facebook’s behavior <a class="ulink" href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247362" target="_top">got
2072 markedly worse</a>.
2073 </p><p>
2074 All the while, Facebook was prodigiously acquiring companies, including a
2075 company called Onavo. Nominally, Onavo made a battery-monitoring mobile
2076 app. But the permissions that Onavo required were so expansive that the app
2077 was able to gather fine-grained telemetry on everything users did with their
2078 phones, including which apps they used and how they were using them.
2079 </p><p>
2080 Through Onavo, Facebook discovered that it was losing market share to
2081 Snapchat, an app that — like Facebook a decade before — billed itself as the
2082 pro-privacy alternative to the status quo. Through Onavo, Facebook was able
2083 to mine data from the devices of Snapchat users, including both current and
2084 former Snapchat users. This spurred Facebook to acquire Instagram — some
2085 features of which competed with Snapchat — and then allowed Facebook to
2086 fine-tune Instagram’s features and sales pitch to erode Snapchat’s gains and
2087 ensure that Facebook would not have to face the kinds of competitive
2088 pressures it had earlier inflicted on Myspace and Orkut.
2089 </p><p>
2090 The story of how Facebook crushed Snapchat reveals the relationship between
2091 monopoly and surveillance capitalism. Facebook combined surveillance with
2092 lax antitrust enforcement to spot the competitive threat of Snapchat on its
2093 horizon and then take decisive action against it. Facebook’s surveillance
2094 capitalism let it avert competitive pressure with anti-competitive
2095 tactics. Facebook users still want privacy — Facebook hasn’t used
2096 surveillance to brainwash them out of it — but they can’t get it because
2097 Facebook’s surveillance lets it destroy any hope of a rival service emerging
2098 that competes on privacy features.
2099 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="a-monopoly-over-your-friends"></a>A monopoly over your friends</h2></div></div></div><p>
2100 A decentralization movement has tried to erode the dominance of Facebook and
2101 other Big Tech companies by fielding <span class="quote"><span class="quote">indieweb</span></span> alternatives —
2102 Mastodon as a Twitter alternative, Diaspora as a Facebook alternative,
2103 etc. — but these efforts have failed to attain any kind of liftoff.
2104 </p><p>
2105 Fundamentally, each of these services is hamstrung by the same problem:
2106 Every potential user for a Facebook or Twitter alternative has to convince
2107 all their friends to follow them to a decentralized web alternative in order
2108 to continue to realize the benefit of social media. For many of us, the only
2109 reason to have a Facebook account is that our friends have Facebook
2110 accounts, and the reason they have Facebook accounts is that
2111 <span class="emphasis"><em>we</em></span> have Facebook accounts.
2112 </p><p>
2113 All of this has conspired to make Facebook — and other dominant platforms —
2114 into <span class="quote"><span class="quote">kill zones</span></span> that investors will not fund new entrants
2115 for.
2116 </p><p>
2117 And yet, all of today’s tech giants came into existence despite the
2118 entrenched advantage of the companies that came before them. To understand
2119 how that happened, you have to understand both interoperability and
2120 adversarial interoperability.
2121 </p><div class="blockquote"><blockquote class="blockquote"><p>
2122 The hard problem of our species is coordination.
2123 </p></blockquote></div><p>
2124 <span class="quote"><span class="quote">Interoperability</span></span> is the ability of two technologies to work
2125 with one another: Anyone can make an LP that will play on any record player,
2126 anyone can make a filter you can install in your stove’s extractor fan,
2127 anyone can make gasoline for your car, anyone can make a USB phone charger
2128 that fits in your car’s cigarette lighter receptacle, anyone can make a
2129 light bulb that works in your light socket, anyone can make bread that will
2130 toast in your toaster.
2131 </p><p>
2132 Interoperability is often a source of innovation and consumer benefit: Apple
2133 made the first commercially successful PC, but millions of independent
2134 software vendors made interoperable programs that ran on the Apple II
2135 Plus. The simple analog antenna inputs on the back of TVs first allowed
2136 cable operators to connect directly to TVs, then they allowed game console
2137 companies and then personal computer companies to use standard televisions
2138 as displays. Standard RJ-11 telephone jacks allowed for the production of
2139 phones from a variety of vendors in a variety of forms, from the free
2140 football-shaped phone that came with a <span class="emphasis"><em>Sports
2141 Illustrated</em></span> subscription to business phones with speakers, hold
2142 functions, and so on and then answering machines and finally modems, paving
2143 the way for the internet revolution.
2144 </p><p>
2145 <span class="quote"><span class="quote">Interoperability</span></span> is often used interchangeably with
2146 <span class="quote"><span class="quote">standardization,</span></span> which is the process when manufacturers and
2147 other stakeholders hammer out a set of agreed-upon rules for implementing a
2148 technology, such as the electrical plug on your wall, the CAN bus used by
2149 your car’s computer systems, or the HTML instructions that your browser
2150 interprets.
2151 </p><p>
2152 But interoperability doesn’t require standardization — indeed,
2153 standardization often proceeds from the chaos of ad hoc interoperability
2154 measures. The inventor of the cigarette-lighter USB charger didn’t need to
2155 get permission from car manufacturers or even the manufacturers of the
2156 dashboard lighter subcomponent. The automakers didn’t take any
2157 countermeasures to prevent the use of these aftermarket accessories by their
2158 customers, but they also didn’t do anything to make life easier for the
2159 chargers’ manufacturers. This is a kind of <span class="quote"><span class="quote">neutral
2160 interoperability.</span></span>
2161 </p><p>
2162 Beyond neutral interoperability, there is <span class="quote"><span class="quote">adversarial
2163 interoperability.</span></span> That’s when a manufacturer makes a product that
2164 interoperates with another manufacturer’s product <span class="emphasis"><em>despite the
2165 second manufacturer’s objections</em></span> and <span class="emphasis"><em>even if that means
2166 bypassing a security system designed to prevent interoperability</em></span>.
2167 </p><p>
2168 Probably the most familiar form of adversarial interoperability is
2169 third-party printer ink. Printer manufacturers claim that they sell printers
2170 below cost and that the only way they can recoup the losses they incur is by
2171 charging high markups on ink. To prevent the owners of printers from buying
2172 ink elsewhere, the printer companies deploy a suite of anti-customer
2173 security systems that detect and reject both refilled and third-party
2174 cartridges.
2175 </p><p>
2176 Owners of printers take the position that HP and Epson and Brother are not
2177 charities and that customers for their wares have no obligation to help them
2178 survive, and so if the companies choose to sell their products at a loss,
2179 that’s their foolish choice and their consequences to live with. Likewise,
2180 competitors who make ink or refill kits observe that they don’t owe printer
2181 companies anything, and their erosion of printer companies' margins is the
2182 printer companies' problem, not their competitors'. After all, the printer
2183 companies shed no tears when they drive a refiller out of business, so why
2184 should the refillers concern themselves with the economic fortunes of the
2185 printer companies?
2186 </p><p>
2187 Adversarial interoperability has played an outsized role in the history of
2188 the tech industry: from the founding of the <span class="quote"><span class="quote">alt.*</span></span> Usenet
2189 hierarchy (which was started against the wishes of Usenet’s maintainers and
2190 which grew to be bigger than all of Usenet combined) to the browser wars
2191 (when Netscape and Microsoft devoted massive engineering efforts to making
2192 their browsers incompatible with the other’s special commands and
2193 peccadilloes) to Facebook (whose success was built in part by helping its
2194 new users stay in touch with friends they’d left behind on Myspace because
2195 Facebook supplied them with a tool that scraped waiting messages from
2196 Myspace and imported them into Facebook, effectively creating a
2197 Facebook-based Myspace reader).
2198 </p><p>
2199 Today, incumbency is seen as an unassailable advantage. Facebook is where
2200 all of your friends are, so no one can start a Facebook competitor. But
2201 adversarial compatibility reverses the competitive advantage: If you were
2202 allowed to compete with Facebook by providing a tool that imported all your
2203 users’ waiting Facebook messages into an environment that competed on lines
2204 that Facebook couldn’t cross, like eliminating surveillance and ads, then
2205 Facebook would be at a huge disadvantage. It would have assembled all
2206 possible ex-Facebook users into a single, easy-to-find service; it would
2207 have educated them on how a Facebook-like service worked and what its
2208 potential benefits were; and it would have provided an easy means for
2209 disgruntled Facebook users to tell their friends where they might expect
2210 better treatment.
2211 </p><p>
2212 Adversarial interoperability was once the norm and a key contributor to the
2213 dynamic, vibrant tech scene, but now it is stuck behind a thicket of laws
2214 and regulations that add legal risks to the tried-and-true tactics of
2215 adversarial interoperability. New rules and new interpretations of existing
2216 rules mean that a would-be adversarial interoperator needs to steer clear of
2217 claims under copyright, terms of service, trade secrecy, tortious
2218 interference, and patent.
2219 </p><p>
2220 In the absence of a competitive market, lawmakers have resorted to assigning
2221 expensive, state-like duties to Big Tech firms, such as automatically
2222 filtering user contributions for copyright infringement or terrorist and
2223 extremist content or detecting and preventing harassment in real time or
2224 controlling access to sexual material.
2225 </p><p>
2226 These measures put a floor under how small we can make Big Tech because only
2227 the very largest companies can afford the humans and automated filters
2228 needed to perform these duties.
2229 </p><p>
2230 But that’s not the only way in which making platforms responsible for
2231 policing their users undermines competition. A platform that is expected to
2232 police its users’ conduct must prevent many vital adversarial
2233 interoperability techniques lest these subvert its policing measures. For
2234 example, if someone using a Twitter replacement like Mastodon is able to
2235 push messages into Twitter and read messages out of Twitter, they could
2236 avoid being caught by automated systems that detect and prevent harassment
2237 (such as systems that use the timing of messages or IP-based rules to make
2238 guesses about whether someone is a harasser).
2239 </p><p>
2240 To the extent that we are willing to let Big Tech police itself — rather
2241 than making Big Tech small enough that users can leave bad platforms for
2242 better ones and small enough that a regulation that simply puts a platform
2243 out of business will not destroy billions of users’ access to their
2244 communities and data — we build the case that Big Tech should be able to
2245 block its competitors and make it easier for Big Tech to demand legal
2246 enforcement tools to ban and punish attempts at adversarial
2247 interoperability.
2248 </p><p>
2249 Ultimately, we can try to fix Big Tech by making it responsible for bad acts
2250 by its users, or we can try to fix the internet by cutting Big Tech down to
2251 size. But we can’t do both. To replace today’s giant products with
2252 pluralistic protocols, we need to clear the legal thicket that prevents
2253 adversarial interoperability so that tomorrow’s nimble, personal,
2254 small-scale products can federate themselves with giants like Facebook,
2255 allowing the users who’ve left to continue to communicate with users who
2256 haven’t left yet, reaching tendrils over Facebook’s garden wall that
2257 Facebook’s trapped users can use to scale the walls and escape to the
2258 global, open web.
2259 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="fake-news-is-an-epistemological-crisis"></a>Fake news is an epistemological crisis</h2></div></div></div><p>
2260 Tech is not the only industry that has undergone massive concentration since
2261 the Reagan era. Virtually every major industry — from oil to newspapers to
2262 meatpacking to sea freight to eyewear to online pornography — has become a
2263 clubby oligarchy that just a few players dominate.
2264 </p><p>
2265 At the same time, every industry has become something of a tech industry as
2266 general-purpose computers and general-purpose networks and the promise of
2267 efficiencies through data-driven analysis infuse every device, process, and
2268 firm with tech.
2269 </p><p>
2270 This phenomenon of industrial concentration is part of a wider story about
2271 wealth concentration overall as a smaller and smaller number of people own
2272 more and more of our world. This concentration of both wealth and industries
2273 means that our political outcomes are increasingly beholden to the parochial
2274 interests of the people and companies with all the money.
2275 </p><p>
2276 That means that whenever a regulator asks a question with an obvious,
2277 empirical answer (<span class="quote"><span class="quote">Are humans causing climate change?</span></span> or
2278 <span class="quote"><span class="quote">Should we let companies conduct commercial mass surveillance?</span></span>
2279 or <span class="quote"><span class="quote">Does society benefit from allowing network neutrality
2280 violations?</span></span>), the answer that comes out is only correct if that
2281 correctness meets with the approval of rich people and the industries that
2282 made them so wealthy.
2283 </p><p>
2284 Rich people have always played an outsized role in politics and more so
2285 since the Supreme Court’s <span class="emphasis"><em>Citizens United</em></span> decision
2286 eliminated key controls over political spending. Widening inequality and
2287 wealth concentration means that the very richest people are now a lot richer
2288 and can afford to spend a lot more money on political projects than ever
2289 before. Think of the Koch brothers or George Soros or Bill Gates.
2290 </p><p>
2291 But the policy distortions of rich individuals pale in comparison to the
2292 policy distortions that concentrated industries are capable of. The
2293 companies in highly concentrated industries are much more profitable than
2294 companies in competitive industries — no competition means not having to
2295 reduce prices or improve quality to win customers — leaving them with bigger
2296 capital surpluses to spend on lobbying.
2297 </p><p>
2298 Concentrated industries also find it easier to collaborate on policy
2299 objectives than competitive ones. When all the top execs from your industry
2300 can fit around a single boardroom table, they often do. And
2301 <span class="emphasis"><em>when</em></span> they do, they can forge a consensus position on
2302 regulation.
2303 </p><p>
2304 Rising through the ranks in a concentrated industry generally means working
2305 at two or three of the big companies. When there are only relatively few
2306 companies in a given industry, each company has a more ossified executive
2307 rank, leaving ambitious execs with fewer paths to higher positions unless
2308 they are recruited to a rival. This means that the top execs in concentrated
2309 industries are likely to have been colleagues at some point and socialize in
2310 the same circles — connected through social ties or, say, serving as
2311 trustees for each others’ estates. These tight social bonds foster a
2312 collegial, rather than competitive, attitude.
2313 </p><p>
2314 Highly concentrated industries also present a regulatory conundrum. When an
2315 industry is dominated by just four or five companies, the only people who
2316 are likely to truly understand the industry’s practices are its veteran
2317 executives. This means that top regulators are often former execs of the
2318 companies they are supposed to be regulating. These turns in government are
2319 often tacitly understood to be leaves of absence from industry, with former
2320 employers welcoming their erstwhile watchdogs back into their executive
2321 ranks once their terms have expired.
2322 </p><p>
2323 All this is to say that the tight social bonds, small number of firms, and
2324 regulatory capture of concentrated industries give the companies that
2325 comprise them the power to dictate many, if not all, of the regulations that
2326 bind them.
2327 </p><p>
2328 This is increasingly obvious. Whether it’s payday lenders <a class="ulink" href="https://www.washingtonpost.com/business/2019/02/25/how-payday-lending-industry-insider-tilted-academic-research-its-favor/" target="_top">winning
2329 the right to practice predatory lending</a> or Apple <a class="ulink" href="https://www.vice.com/en_us/article/mgxayp/source-apple-will-fight-right-to-repair-legislation" target="_top">winning
2330 the right to decide who can fix your phone</a> or Google and Facebook
2331 winning the right to breach your private data without suffering meaningful
2332 consequences or victories for pipeline companies or impunity for opioid
2333 manufacturers or massive tax subsidies for incredibly profitable dominant
2334 businesses, it’s increasingly apparent that many of our official,
2335 evidence-based truth-seeking processes are, in fact, auctions for sale to
2336 the highest bidder.
2337 </p><p>
2338 It’s really impossible to overstate what a terrifying prospect this is. We
2339 live in an incredibly high-tech society, and none of us could acquire the
2340 expertise to evaluate every technological proposition that stands between us
2341 and our untimely, horrible deaths. You might devote your life to acquiring
2342 the media literacy to distinguish good scientific journals from corrupt
2343 pay-for-play lookalikes and the statistical literacy to evaluate the quality
2344 of the analysis in the journals as well as the microbiology and epidemiology
2345 knowledge to determine whether you can trust claims about the safety of
2346 vaccines — but that would still leave you unqualified to judge whether the
2347 wiring in your home will give you a lethal shock <span class="emphasis"><em>and</em></span>
2348 whether your car’s brakes’ software will cause them to fail unpredictably
2349 <span class="emphasis"><em>and</em></span> whether the hygiene standards at your butcher are
2350 sufficient to keep you from dying after you finish your dinner.
2351 </p><p>
2352 In a world as complex as this one, we have to defer to authorities, and we
2353 keep them honest by making those authorities accountable to us and binding
2354 them with rules to prevent conflicts of interest. We can’t possibly acquire
2355 the expertise to adjudicate conflicting claims about the best way to make
2356 the world safe and prosperous, but we <span class="emphasis"><em>can</em></span> determine
2357 whether the adjudication process itself is trustworthy.
2358 </p><p>
2359 Right now, it’s obviously not.
2360 </p><p>
2361 The past 40 years of rising inequality and industry concentration, together
2362 with increasingly weak accountability and transparency for expert agencies,
2363 has created an increasingly urgent sense of impending doom, the sense that
2364 there are vast conspiracies afoot that operate with tacit official approval
2365 despite the likelihood they are working to better themselves by ruining the
2366 rest of us.
2367 </p><p>
2368 For example, it’s been decades since Exxon’s own scientists concluded that
2369 its products would render the Earth uninhabitable by humans. And yet those
2370 decades were lost to us, in large part because Exxon lobbied governments and
2371 sowed doubt about the dangers of its products and did so with the
2372 cooperation of many public officials. When the survival of you and everyone
2373 you love is threatened by conspiracies, it’s not unreasonable to start
2374 questioning the things you think you know in an attempt to determine whether
2375 they, too, are the outcome of another conspiracy.
2376 </p><p>
2377 The collapse of the credibility of our systems for divining and upholding
2378 truths has left us in a state of epistemological chaos. Once, most of us
2379 might have assumed that the system was working and that our regulations
2380 reflected our best understanding of the empirical truths of the world —
2381 now we have to find our own experts to help us
2382 sort the true from the false.
2383 </p><p>
2384 If you’re like me, you probably believe that vaccines are safe, but you
2385 (like me) probably also can’t explain the microbiology or statistics. Few of
2386 us have the math skills to review the literature on vaccine safety and
2387 describe why its statistical reasoning is sound. Likewise, few of us can
2388 review the stats in the (now discredited) literature on opioid safety and
2389 explain how those stats were manipulated. Both vaccines and opioids were
2390 embraced by medical authorities, after all, and one is safe while the other
2391 could ruin your life. You’re left with a kind of inchoate constellation of
2392 rules of thumb about which experts you trust to fact-check controversial
2393 claims and then to explain how all those respectable doctors with their
2394 peer-reviewed research on opioid safety <span class="emphasis"><em>were</em></span> an
2395 aberration and then how you know that the doctors writing about vaccine
2396 safety are <span class="emphasis"><em>not</em></span> an aberration.
2397 </p><p>
2398 I’m 100% certain that vaccinating is safe and effective, but I’m also at
2399 something of a loss to explain exactly, <span class="emphasis"><em>precisely,</em></span> why
2400 I believe this, given all the corruption I know about and the many times the
2401 stamp of certainty has turned out to be a parochial lie told to further
2402 enrich the super rich.
2403 </p><p>
2404 Fake news — conspiracy theories, racist ideologies, scientific denialism —
2405 has always been with us. What’s changed today is not the mix of ideas in the
2406 public discourse but the popularity of the worst ideas in that
2407 mix. Conspiracy and denial have skyrocketed in lockstep with the growth of
2408 Big Inequality, which has also tracked the rise of Big Tech and Big Pharma
2409 and Big Wrestling and Big Car and Big Movie Theater and Big Everything Else.
2410 </p><p>
2411 No one can say for certain why this has happened, but the two dominant camps
2412 are idealism (the belief that the people who argue for these conspiracies
2413 have gotten better at explaining them, maybe with the help of
2414 machine-learning tools) and materialism (the ideas have become more
2415 attractive because of material conditions in the world).
2416 </p><p>
2417 I’m a materialist. I’ve been exposed to the arguments of conspiracy
2418 theorists all my life, and I have not experienced any qualitative leap in
2419 the quality of those arguments.
2420 </p><p>
2421 The major difference is in the world, not the arguments. In a time where
2422 actual conspiracies are commonplace, conspiracy theories acquire a ring of
2423 plausibility.
2424 </p><p>
2425 We have always had disagreements about what’s true, but today, we have a
2426 disagreement over how we know whether something is true. This is an
2427 epistemological crisis, not a crisis over belief. It’s a crisis over the
2428 credibility of our truth-seeking exercises, from scientific journals (in an
2429 era where the biggest journal publishers have been caught producing
2430 pay-to-play journals for junk science) to regulations (in an era where
2431 regulators are routinely cycling in and out of business) to education (in an
2432 era where universities are dependent on corporate donations to keep their
2433 lights on).
2434 </p><p>
2435 Targeting — surveillance capitalism — makes it easier to find people who are
2436 undergoing this epistemological crisis, but it doesn’t create the
2437 crisis. For that, you need to look to corruption.
2438 </p><p>
2439 And, conveniently enough, it’s corruption that allows surveillance
2440 capitalism to grow by dismantling monopoly protections, by permitting
2441 reckless collection and retention of personal data, by allowing ads to be
2442 targeted in secret, and by foreclosing on the possibility of going somewhere
2443 else where you might continue to enjoy your friends without subjecting
2444 yourself to commercial surveillance.
2445 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="tech-is-different"></a>Tech is different</h2></div></div></div><p>
2446 I reject both iterations of technological exceptionalism. I reject the idea
2447 that tech is uniquely terrible and led by people who are greedier or worse
2448 than the leaders of other industries, and I reject the idea that tech is so
2449 good — or so intrinsically prone to concentration — that it can’t be blamed
2450 for its present-day monopolistic status.
2451 </p><p>
2452 I think tech is just another industry, albeit one that grew up in the
2453 absence of real monopoly constraints. It may have been first, but it isn’t
2454 the worst nor will it be the last.
2455 </p><p>
2456 But there’s one way in which I <span class="emphasis"><em>am</em></span> a tech
2457 exceptionalist. I believe that online tools are the key to overcoming
2458 problems that are much more urgent than tech monopolization: climate change,
2459 inequality, misogyny, and discrimination on the basis of race, gender
2460 identity, and other factors. The internet is how we will recruit people to
2461 fight those fights, and how we will coordinate their labor. Tech is not a
2462 substitute for democratic accountability, the rule of law, fairness, or
2463 stability — but it’s a means to achieve these things.
2464 </p><p>
2465 The hard problem of our species is coordination. Everything from climate
2466 change to social change to running a business to making a family work can be
2467 viewed as a collective action problem.
2468 </p><p>
2469 The internet makes it easier than at any time before to find people who want
2470 to work on a project with you — hence the success of free and open-source
2471 software, crowdfunding, and racist terror groups — and easier than ever to
2472 coordinate the work you do.
2473 </p><p>
2474 The internet and the computers we connect to it also possess an exceptional
2475 quality: general-purposeness. The internet is designed to allow any two
2476 parties to communicate any data, using any protocol, without permission from
2477 anyone else. The only production design we have for computers is the
2478 general-purpose, <span class="quote"><span class="quote">Turing complete</span></span> computer that can run every
2479 program we can express in symbolic logic.
2480 </p><p>
2481 This means that every time someone with a special communications need
2482 invests in infrastructure and techniques to make the internet faster,
2483 cheaper, and more robust, this benefit redounds to everyone else who is
2484 using the internet to communicate. And this also means that every time
2485 someone with a special computing need invests to make computers faster,
2486 cheaper, and more robust, every other computing application is a potential
2487 beneficiary of this work.
2488 </p><p>
2489 For these reasons, every type of communication is gradually absorbed into
2490 the internet, and every type of device — from airplanes to pacemakers —
2491 eventually becomes a computer in a fancy case.
2492 </p><p>
2493 While these considerations don’t preclude regulating networks and computers,
2494 they do call for gravitas and caution when doing so because changes to
2495 regulatory frameworks could ripple out to have unintended consequences in
2496 many, many other domains.
2497 </p><p>
2498 The upshot of this is that our best hope of solving the big coordination
2499 problems — climate change, inequality, etc. — is with free, fair, and open
2500 tech. Our best hope of keeping tech free, fair, and open is to exercise
2501 caution in how we regulate tech and to attend closely to the ways in which
2502 interventions to solve one problem might create problems in other domains.
2503 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="ownership-of-facts"></a>Ownership of facts</h2></div></div></div><p>
2504 Big Tech has a funny relationship with information. When you’re generating
2505 information — anything from the location data streaming off your mobile
2506 device to the private messages you send to friends on a social network — it
2507 claims the rights to make unlimited use of that data.
2508 </p><p>
2509 But when you have the audacity to turn the tables — to use a tool that
2510 blocks ads or slurps your waiting updates out of a social network and puts
2511 them in another app that lets you set your own priorities and suggestions or
2512 crawls their system to allow you to start a rival business — they claim that
2513 you’re stealing from them.
2514 </p><p>
2515 The thing is, information is a very bad fit for any kind of private property
2516 regime. Property rights are useful for establishing markets that can lead to
2517 the effective development of fallow assets. These markets depend on clear
2518 titles to ensure that the things being bought and sold in them can, in fact,
2519 be bought and sold.
2520 </p><p>
2521 Information rarely has such a clear title. Take phone numbers: There’s
2522 clearly something going wrong when Facebook slurps up millions of users’
2523 address books and uses the phone numbers it finds in them to plot out social
2524 graphs and fill in missing information about other users.
2525 </p><p>
2526 But the phone numbers Facebook nonconsensually acquires in this transaction
2527 are not the <span class="quote"><span class="quote">property</span></span> of the users they’re taken from nor do
2528 they belong to the people whose phones ring when you dial those numbers. The
2529 numbers are mere integers, 10 digits in the U.S. and Canada, and they
2530 appear in millions of places, including somewhere deep in pi as well as
2531 numerous other contexts. Giving people ownership titles to integers is an
2532 obviously terrible idea.
2533 </p><p>
2534 Likewise for the facts that Facebook and other commercial surveillance
2535 operators acquire about us, like that we are the children of our parents or
2536 the parents to our children or that we had a conversation with someone else
2537 or went to a public place. These data points can’t be property in the sense
2538 that your house or your shirt is your property because the title to them is
2539 intrinsically muddy: Does your mom own the fact that she is your mother? Do
2540 you? Do both of you? What about your dad — does he own this fact too, or
2541 does he have to license the fact from you (or your mom or both of you) in
2542 order to use this fact? What about the hundreds or thousands of other people
2543 who know these facts?
2544 </p><p>
2545 If you go to a Black Lives Matter demonstration, do the other demonstrators
2546 need your permission to post their photos from the event? The online fights
2547 over <a class="ulink" href="https://www.wired.com/story/how-to-take-photos-at-protests/" target="_top">when and
2548 how to post photos from demonstrations</a> reveal a nuanced, complex
2549 issue that cannot be easily hand-waved away by giving one party a property
2550 right that everyone else in the mix has to respect.
2551 </p><p>
2552 The fact that information isn’t a good fit with property and markets doesn’t
2553 mean that it’s not valuable. Babies aren’t property, but they’re inarguably
2554 valuable. In fact, we have a whole set of rules just for babies as well as a
2555 subset of those rules that apply to humans more generally. Someone who
2556 argues that babies won’t be truly valuable until they can be bought and sold
2557 like loaves of bread would be instantly and rightfully condemned as a
2558 monster.
2559 </p><p>
2560 It’s tempting to reach for the property hammer when Big Tech treats your
2561 information like a nail — not least because Big Tech are such prolific
2562 abusers of property hammers when it comes to <span class="emphasis"><em>their</em></span>
2563 information. But this is a mistake. If we allow markets to dictate the use
2564 of our information, then we’ll find that we’re sellers in a buyers’ market
2565 where the Big Tech monopolies set a price for our data that is so low as to
2566 be insignificant or, more likely, set at a nonnegotiable price of zero in a
2567 click-through agreement that you don’t have the opportunity to modify.
2568 </p><p>
2569 Meanwhile, establishing property rights over information will create
2570 insurmountable barriers to independent data processing. Imagine that we
2571 require a license to be negotiated when a translated document is compared
2572 with its original, something Google has done and continues to do billions of
2573 times to train its automated language translation tools. Google can afford
2574 this, but independent third parties cannot. Google can staff a clearances
2575 department to negotiate one-time payments to the likes of the EU (one of the
2576 major repositories of translated documents) while independent watchdogs
2577 wanting to verify that the translations are well-prepared, or to root out
2578 bias in translations, will find themselves needing a staffed-up legal
2579 department and millions for licenses before they can even get started.
2580 </p><p>
2581 The same goes for things like search indexes of the web or photos of
2582 people’s houses, which have become contentious thanks to Google’s Street
2583 View project. Whatever problems may exist with Google’s photographing of
2584 street scenes, resolving them by letting people decide who can take pictures
2585 of the facades of their homes from a public street will surely create even
2586 worse ones. Think of how street photography is important for newsgathering —
2587 including informal newsgathering, like photographing abuses of authority —
2588 and how being able to document housing and street life is important for
2589 contesting eminent domain, advocating for social aid, reporting planning and
2590 zoning violations, documenting discriminatory and unequal living conditions,
2591 and more.
2592 </p><p>
2593 The ownership of facts is antithetical to many kinds of human progress. It’s
2594 hard to imagine a rule that limits Big Tech’s exploitation of our collective
2595 labors without inadvertently banning people from gathering data on online
2596 harassment or compiling indexes of changes in language or simply
2597 investigating how the platforms are shaping our discourse — all of which
2598 require scraping data that other people have created and subjecting it to
2599 scrutiny and analysis.
2600 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="persuasion-works-slowly"></a>Persuasion works… slowly</h2></div></div></div><p>
2601 The platforms may oversell their ability to persuade people, but obviously,
2602 persuasion works sometimes. Whether it’s the private realm that LGBTQ people
2603 used to recruit allies and normalize sexual diversity or the decadeslong
2604 project to convince people that markets are the only efficient way to solve
2605 complicated resource allocation problems, it’s clear that our societal
2606 attitudes <span class="emphasis"><em>can</em></span> change.
2607 </p><p>
2608 The project of shifting societal attitudes is a game of inches and
2609 years. For centuries, svengalis have purported to be able to accelerate this
2610 process, but even the most brutal forms of propaganda have struggled to make
2611 permanent changes. Joseph Goebbels was able to subject Germans to daily,
2612 mandatory, hourslong radio broadcasts, to round up and torture and murder
2613 dissidents, and to seize full control over their children’s education while
2614 banning any literature, broadcasts, or films that did not comport with his
2615 worldview.
2616 </p><p>
2617 Yet, after 12 years of terror, once the war ended, Nazi ideology was largely
2618 discredited in both East and West Germany, and a program of national truth
2619 and reconciliation was put in its place. Racism and authoritarianism were
2620 never fully abolished in Germany, but neither were the majority of Germans
2621 irrevocably convinced of Nazism — and the rise of racist authoritarianism in
2622 Germany today tells us that the liberal attitudes that replaced Nazism were
2623 no more permanent than Nazism itself.
2624 </p><p>
2625 Racism and authoritarianism have also always been with us. Anyone who’s
2626 reviewed the kind of messages and arguments that racists put forward today
2627 would be hard-pressed to say that they have gotten better at presenting
2628 their ideas. The same pseudoscience, appeals to fear, and circular logic
2629 that racists presented in the 1980s, when the cause of white supremacy was
2630 on the wane, are to be found in the communications of leading white
2631 nationalists today.
2632 </p><p>
2633 If racists haven’t gotten more convincing in the past decade, then how is it
2634 that more people were convinced to be openly racist in that time? I believe
2635 that the answer lies in the material world, not the world of ideas. The
2636 ideas haven’t gotten more convincing, but people have become more
2637 afraid. Afraid that the state can’t be trusted to act as an honest broker in
2638 life-or-death decisions, from those regarding the management of the economy
2639 to the regulation of painkillers to the rules for handling private
2640 information. Afraid that the world has become a game of musical chairs in
2641 which the chairs are being taken away at a never-before-seen rate. Afraid
2642 that justice for others will come at their expense. Monopolism isn’t the
2643 cause of these fears, but the inequality, material desperation, and policy
2644 malpractice that monopolism fuels contribute significantly to
2645 these conditions. Inequality creates the conditions for both conspiracies
2646 and violent racist ideologies, and then surveillance capitalism lets
2647 opportunists target the fearful and the conspiracy-minded.
2648 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="paying-wont-help"></a>Paying won’t help</h2></div></div></div><p>
2649 As the old saw goes, <span class="quote"><span class="quote">If you’re not paying for the product, you’re the
2650 product.</span></span>
2651 </p><p>
2652 It’s a commonplace belief today that the advent of free, ad-supported media
2653 was the original sin of surveillance capitalism. The reasoning is that the
2654 companies that charged for access couldn’t <span class="quote"><span class="quote">compete with free</span></span>
2655 and so they were driven out of business. Their ad-supported competitors,
2656 meanwhile, declared open season on their users’ data in a bid to improve
2657 their ad targeting and make more money and then resorted to the most
2658 sensationalist tactics to generate clicks on those ads. If only we’d pay for
2659 media again, we’d have a better, more responsible, more sober discourse that
2660 would be better for democracy.
2661 </p><p>
2662 But the degradation of news products long precedes the advent of
2663 ad-supported online news. Long before newspapers were online, lax antitrust
2664 enforcement had opened the door for unprecedented waves of consolidation and
2665 roll-ups in newsrooms. Rival newspapers were merged, reporters and ad sales
2666 staff were laid off, physical plants were sold and leased back, leaving the
2667 companies loaded up with debt through leveraged buyouts and subsequent
2668 profit-taking by the new owners. In other words, it wasn’t merely shifts in
2669 the classified advertising market, which was long held to be the primary
2670 driver in the decline of the traditional newsroom, that made news companies
2671 unable to adapt to the internet — it was monopolism.
2672 </p><p>
2673 Then, as news companies <span class="emphasis"><em>did</em></span> come online, the ad
2674 revenues they commanded dropped even as the number of internet users (and
2675 thus potential online readers) increased. That shift was a function of
2676 consolidation in the ad sales market, with Google and Facebook emerging as
2677 duopolists who made more money every year from advertising while paying less
2678 and less of it to the publishers whose work the ads appeared
2679 alongside. Monopolism created a buyer’s market for ad inventory with
2680 Facebook and Google acting as gatekeepers.
2681 </p><p>
2682 Paid services continue to exist alongside free ones, and often it is these
2683 paid services — anxious to prevent people from bypassing their paywalls or
2684 sharing paid media with freeloaders — that exert the most control over their
2685 customers. Apple’s iTunes and App Stores are paid services, but to maximize
2686 their profitability, Apple has to lock its platforms so that third parties
2687 can’t make compatible software without permission. These locks allow the
2688 company to exercise both editorial control (enabling it to exclude <a class="ulink" href="https://ncac.org/news/blog/does-apples-strict-app-store-content-policy-limit-freedom-of-expression" target="_top">controversial
2689 political material</a>) and technological control, including control
2690 over who can repair the devices it makes. If we’re worried that ad-supported
2691 products deprive people of their right to self-determination by using
2692 persuasion techniques to nudge their purchase decisions a few degrees in one
2693 direction or the other, then the near-total control a single company holds
2694 over the decision of who gets to sell you software, parts, and service for
2695 your iPhone should have us very worried indeed.
2696 </p><p>
2697 We shouldn’t just be concerned about payment and control: The idea that
2698 paying will improve discourse is also dangerously wrong. The poor success
2699 rate of targeted advertising means that the platforms have to incentivize
2700 you to <span class="quote"><span class="quote">engage</span></span> with posts at extremely high levels to generate
2701 enough pageviews to safeguard their profits. As discussed earlier, to
2702 increase engagement, platforms like Facebook use machine learning to guess
2703 which messages will be most inflammatory and make a point of shoving those
2704 into your eyeballs at every turn so that you will hate-click and argue with
2705 people.
2706 </p><p>
2707 Perhaps paying would fix this, the reasoning goes. If platforms could be
2708 economically viable even if you stopped clicking on them once your
2709 intellectual and social curiosity had been slaked, then they would have no
2710 reason to algorithmically enrage you to get more clicks out of you, right?
2711 </p><p>
2712 There may be something to that argument, but it still ignores the wider
2713 economic and political context of the platforms and the world that allowed
2714 them to grow so dominant.
2715 </p><p>
2716 Platforms are world-spanning and all-encompassing because they are
2717 monopolies, and they are monopolies because we have gutted our most
2718 important and reliable anti-monopoly rules. Antitrust was neutered as a key
2719 part of the project to make the wealthy wealthier, and that project has
2720 worked. The vast majority of people on Earth have a negative net worth, and
2721 even the dwindling middle class is in a precarious state, undersaved for
2722 retirement, underinsured for medical disasters, and undersecured against
2723 climate and technology shocks.
2724 </p><p>
2725 In this wildly unequal world, paying doesn’t improve the discourse; it
2726 simply prices discourse out of the range of the majority of people. Paying
2727 for the product is dandy, if you can afford it.
2728 </p><p>
2729 If you think today’s filter bubbles are a problem for our discourse, imagine
2730 what they’d be like if rich people inhabited free-flowing Athenian
2731 marketplaces of ideas where you have to pay for admission while everyone
2732 else lives in online spaces that are subsidized by wealthy benefactors who
2733 relish the chance to establish conversational spaces where the <span class="quote"><span class="quote">house
2734 rules</span></span> forbid questioning the status quo. That is, imagine if the
2735 rich seceded from Facebook, and then, instead of running ads that made money
2736 for shareholders, Facebook became a billionaire’s vanity project that also
2737 happened to ensure that nobody talked about whether it was fair that only
2738 billionaires could afford to hang out in the rarified corners of the
2739 internet.
2740 </p><p>
2741 Behind the idea of paying for access is a belief that free markets will
2742 address Big Tech’s dysfunction. After all, to the extent that people have a
2743 view of surveillance at all, it is generally an unfavorable one, and the
2744 longer and more thoroughly one is surveilled, the less one tends to like
2745 it. Same goes for lock-in: If HP’s ink or Apple’s App Store were really
2746 obviously fantastic, they wouldn’t need technical measures to prevent users
2747 from choosing a rival’s product. The only reason these technical
2748 countermeasures exist is that the companies don’t believe their customers
2749 would <span class="emphasis"><em>voluntarily</em></span> submit to their terms, and they want
2750 to deprive them of the choice to take their business elsewhere.
2751 </p><p>
2752 Advocates for markets laud their ability to aggregate the diffused knowledge
2753 of buyers and sellers across a whole society through demand signals, price
2754 signals, and so on. The argument for surveillance capitalism being a
2755 <span class="quote"><span class="quote">rogue capitalism</span></span> is that machine-learning-driven persuasion
2756 techniques distort decision-making by consumers, leading to incorrect
2757 signals — consumers don’t buy what they prefer, they buy what they’re
2758 tricked into preferring. It follows that the monopolistic practices of
2759 lock-in, which do far more to constrain consumers’ free choices, are even
2760 more of a <span class="quote"><span class="quote">rogue capitalism.</span></span>
2761 </p><p>
2762 The profitability of any business is constrained by the possibility that its
2763 customers will take their business elsewhere. Both surveillance and lock-in
2764 are anti-features that no customer wants. But monopolies can capture their
2765 regulators, crush their competitors, insert themselves into their customers’
2766 lives, and corral people into <span class="quote"><span class="quote">choosing</span></span> their services
2767 regardless of whether they want them — it’s fine to be terrible when there
2768 is no alternative.
2769 </p><p>
2770 Ultimately, surveillance and lock-in are both simply business strategies
2771 that monopolists can choose. Surveillance companies like Google are
2772 perfectly capable of deploying lock-in technologies — just look at the
2773 onerous Android licensing terms that require device-makers to bundle in
2774 Google’s suite of applications. And lock-in companies like Apple are
2775 perfectly capable of subjecting their users to surveillance if it means
2776 keeping the Chinese government happy and preserving ongoing access to
2777 Chinese markets. Monopolies may be made up of good, ethical people, but as
2778 institutions, they are not your friend — they will do whatever they can get
2779 away with to maximize their profits, and the more monopolistic they are, the
2780 more they <span class="emphasis"><em>can</em></span> get away with.
2781 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="an-ecology-moment-for-trustbusting"></a>An <span class="quote"><span class="quote">ecology</span></span> moment for trustbusting</h2></div></div></div><p>
2782 If we’re going to break Big Tech’s death grip on our digital lives, we’re
2783 going to have to fight monopolies. That may sound pretty mundane and
2784 old-fashioned, something out of the New Deal era, while ending the use of
2785 automated behavioral modification feels like the plotline of a really cool
2786 cyberpunk novel.
2787 </p><p>
2788 Meanwhile, breaking up monopolies is something we seem to have forgotten how
2789 to do. There is a bipartisan, trans-Atlantic consensus that breaking up
2790 companies is a fool’s errand at best — liable to mire your federal
2791 prosecutors in decades of litigation — and counterproductive at worst,
2792 eroding the <span class="quote"><span class="quote">consumer benefits</span></span> of large companies with massive
2793 efficiencies of scale.
2794 </p><p>
2795 But trustbusters once strode the nation, brandishing law books, terrorizing
2796 robber barons, and shattering the illusion of monopolies’ all-powerful grip
2797 on our society. The trustbusting era could not begin until we found the
2798 political will — until the people convinced politicians they’d have their
2799 backs when they went up against the richest, most powerful men in the world.
2800 </p><p>
2801 Could we find that political will again?
2802 </p><p>
2803 Copyright scholar James Boyle has described how the term
2804 <span class="quote"><span class="quote">ecology</span></span> marked a turning point in environmental
2805 activism. Prior to the adoption of this term, people who wanted to preserve
2806 whale populations didn’t necessarily see themselves as fighting the same
2807 battle as people who wanted to protect the ozone layer or fight freshwater
2808 pollution or beat back smog or acid rain.
2809 </p><p>
2810 But the term <span class="quote"><span class="quote">ecology</span></span> welded these disparate causes together
2811 into a single movement, and the members of this movement found solidarity
2812 with one another. The people who cared about smog signed petitions
2813 circulated by the people who wanted to end whaling, and the anti-whalers
2814 marched alongside the people demanding action on acid rain. This uniting
2815 behind a common cause completely changed the dynamics of environmentalism,
2816 setting the stage for today’s climate activism and the sense that preserving
2817 the habitability of the planet Earth is a shared duty among all people.
2818 </p><p>
2819 I believe we are on the verge of a new <span class="quote"><span class="quote">ecology</span></span> moment
2820 dedicated to combating monopolies. After all, tech isn’t the only
2821 concentrated industry nor is it even the <span class="emphasis"><em>most</em></span>
2822 concentrated of industries.
2823 </p><p>
2824 You can find partisans for trustbusting in every sector of the
2825 economy. Everywhere you look, you can find people who’ve been wronged by
2826 monopolists who’ve trashed their finances, their health, their privacy,
2827 their educations, and the lives of people they love. Those people have the
2828 same cause as the people who want to break up Big Tech and the same
2829 enemies. When most of the world’s wealth is in the hands of a very few, it
2830 follows that nearly every large company will have overlapping shareholders.
2831 </p><p>
2832 That’s the good news: With a little bit of work and a little bit of
2833 coalition building, we have more than enough political will to break up Big
2834 Tech and every other concentrated industry besides. First we take Facebook,
2835 then we take AT&amp;T/WarnerMedia.
2836 </p><p>
2837 But here’s the bad news: Much of what we’re doing to tame Big Tech
2838 <span class="emphasis"><em>instead</em></span> of breaking up the big companies also
2839 forecloses on the possibility of breaking them up later.
2840 </p><p>
2841 Big Tech’s concentration currently means that their inaction on harassment,
2842 for example, leaves users with an impossible choice: absent themselves from
2843 public discourse by, say, quitting Twitter, or endure vile, constant
2844 abuse. Big Tech’s over-collection and over-retention of data result in
2845 horrific identity theft. And their inaction on extremist recruitment means
2846 that white supremacists who livestream their shooting rampages can reach an
2847 audience of billions. The combination of tech concentration and media
2848 concentration means that artists’ incomes are falling even as the revenue
2849 generated by their creations is increasing.
2850 </p><p>
2851 Yet governments confronting these problems all inevitably converge on
2852 the same solution: deputize the Big Tech giants to police their users and
2853 render them liable for their users’ bad actions. The drive to force Big Tech
2854 to use automated filters to block everything from copyright infringement to
2855 sex-trafficking to violent extremism means that tech companies will have to
2856 allocate hundreds of millions to run these compliance systems.
2857 </p><p>
2858 These rules — the EU’s new Directive on Copyright, Australia’s new terror
2859 regulation, America’s FOSTA/SESTA sex-trafficking law, and more — are not
2860 just death warrants for small, upstart competitors that might challenge Big
2861 Tech’s dominance but lack the deep pockets of established incumbents to
2862 pay for all these automated systems. Worse still, these rules put a floor
2863 under how small we can hope to make Big Tech.
2864 </p><p>
2865 That’s because any move to break up Big Tech and cut it down to size will
2866 have to cope with the hard limit of not making these companies so small that
2867 they can no longer afford to perform these duties — and it’s
2868 <span class="emphasis"><em>expensive</em></span> to invest in those automated filters and
2869 outsource content moderation. It’s already going to be hard to unwind these
2870 deeply concentrated, chimeric behemoths that have been welded together in
2871 the pursuit of monopoly profits. Doing so while simultaneously finding some
2872 way to fill the regulatory void that will be left behind if these
2873 self-policing rulers were forced to suddenly abdicate will be much, much
2874 harder.
2875 </p><p>
2876 Allowing the platforms to grow to their present size has given them a
2877 dominance that is nearly insurmountable — deputizing them with public duties
2878 to redress the pathologies created by their size makes it virtually
2879 impossible to reduce that size. Lather, rinse, repeat: If the platforms
2880 don’t get smaller, they will get larger, and as they get larger, they will
2881 create more problems, which will give rise to more public duties for the
2882 companies, which will make them bigger still.
2883 </p><p>
2884 We can work to fix the internet by breaking up Big Tech and depriving them
2885 of monopoly profits, or we can work to fix Big Tech by making them spend
2886 their monopoly profits on governance. But we can’t do both. We have to
2887 choose between a vibrant, open internet and a dominated, monopolized internet
2888 commanded by Big Tech giants that we struggle with constantly to get them to
2889 behave themselves.
2890 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="make-big-tech-small-again"></a>Make Big Tech small again</h2></div></div></div><p>
2891 Trustbusting is hard. Breaking big companies into smaller ones is expensive
2892 and time-consuming. So time-consuming that by the time you’re done, the
2893 world has often moved on and rendered years of litigation irrelevant. From
2894 1969 to 1982, the U.S. government pursued an antitrust case against IBM over
2895 its dominance of mainframe computing — but the case collapsed in 1982
2896 because mainframes were being speedily replaced by PCs.
2897 </p><div class="blockquote"><blockquote class="blockquote"><p>
2898 A future U.S. president could simply direct their attorney general to
2899 enforce the law as it was written.
2900 </p></blockquote></div><p>
2901 It’s far easier to prevent concentration than to fix it, and reinstating the
2902 traditional contours of U.S. antitrust enforcement will, at the very least,
2903 prevent further concentration. That means bans on mergers between large
2904 companies, on big companies acquiring nascent competitors, and on platform
2905 companies competing directly with the companies that rely on the platforms.
2906 </p><p>
2907 These powers are all in the plain language of U.S. antitrust laws, so in
2908 theory, a future U.S. president could simply direct their attorney general
2909 to enforce the law as it was written. But after decades of judicial
2910 <span class="quote"><span class="quote">education</span></span> in the benefits of monopolies, after multiple
2911 administrations that have packed the federal courts with lifetime-appointed
2912 monopoly cheerleaders, it’s not clear that mere administrative action would
2913 do the trick.
2914 </p><p>
2915 If the courts frustrate the Justice Department and the president, the next
2916 stop would be Congress, which could eliminate any doubt about how antitrust
2917 law should be enforced in the U.S. by passing new laws that boil down to
2918 saying, <span class="quote"><span class="quote">Knock it off. We all know what the Sherman Act says. Robert
2919 Bork was a deranged fantasist. For avoidance of doubt, <span class="emphasis"><em>fuck that
2920 guy</em></span>.</span></span> In other words, the problem with monopolies is
2921 <span class="emphasis"><em>monopolism</em></span> — the concentration of power into too few
2922 hands, which erodes our right to self-determination. If there is a monopoly,
2923 the law wants it gone, period. Sure, get rid of monopolies that create
2924 <span class="quote"><span class="quote">consumer harm</span></span> in the form of higher prices, but also,
2925 <span class="emphasis"><em>get rid of other monopolies, too</em></span>.
2926 </p><p>
2927 But this only prevents things from getting worse. To help them get better,
2928 we will have to build coalitions with other activists in the anti-monopoly
2929 ecology movement — a pluralism movement or a self-determination movement —
2930 and target existing monopolies in every industry for breakup and structural
2931 separation rules that prevent, for example, the giant eyewear monopolist
2932 Luxottica from dominating both the sale and the manufacture of spectacles.
2933 </p><p>
2934 In an important sense, it doesn’t matter which industry the breakups begin
2935 in. Once they start, shareholders in <span class="emphasis"><em>every</em></span> industry
2936 will start to eye their investments in monopolists skeptically. As
2937 trustbusters ride into town and start making lives miserable for
2938 monopolists, the debate around every corporate boardroom’s table will
2939 shift. People within corporations who’ve always felt uneasy about monopolism
2940 will gain a powerful new argument to fend off their evil rivals in the
2941 corporate hierarchy: <span class="quote"><span class="quote">If we do it my way, we make less money; if we do
2942 it your way, a judge will fine us billions and expose us to ridicule and
2943 public disapprobation. So even though I get that it would be really cool to
2944 do that merger, lock out that competitor, or buy that little company and
2945 kill it before it can threaten us, we really shouldn’t — not if we don’t
2946 want to get tied to the DOJ’s bumper and get dragged up and down Trustbuster
2947 Road for the next 10 years.</span></span>
2948 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="goto-10"></a>20 GOTO 10</h2></div></div></div><p>
2949 Fixing Big Tech will require a lot of iteration. As cyber lawyer Lawrence
2950 Lessig wrote in his 1999 book, <span class="emphasis"><em>Code and Other Laws of
2951 Cyberspace</em></span>, our lives are regulated by four forces: law (what’s
2952 legal), code (what’s technologically possible), norms (what’s socially
2953 acceptable), and markets (what’s profitable).
2954 </p><p>
2955 If you could wave a wand and get Congress to pass a law that re-fanged the
2956 Sherman Act tomorrow, you could use the impending breakups to convince
2957 venture capitalists to fund competitors to Facebook, Google, Twitter, and
2958 Apple that would be waiting in the wings after they were cut down to size.
2959 </p><p>
2960 But getting Congress to act will require a massive normative shift, a mass
2961 movement of people who care about monopolies — and pulling them apart.
2962 </p><p>
2963 Getting people to care about monopolies will take technological
2964 interventions that help them to see what a world free from Big Tech might
2965 look like. Imagine if someone could make a beloved (but unauthorized)
2966 third-party Facebook or Twitter client that dampens the anxiety-producing
2967 algorithmic drumbeat and still lets you talk to your friends without being
2968 spied upon — something that made social media more sociable and less
2969 toxic. Now imagine that it gets shut down in a brutal legal battle. It’s
2970 always easier to convince people that something must be done to save a thing
2971 they love than it is to excite them about something that doesn’t even exist
2972 yet.
2973 </p><p>
2974 Neither tech nor law nor code nor markets are sufficient to reform Big
2975 Tech. But a profitable competitor to Big Tech could bankroll a legislative
2976 push; legal reform can embolden a toolsmith to make a better tool; the tool
2977 can create customers for a potential business who value the benefits of the
2978 internet but want them delivered without Big Tech; and that business can get
2979 funded and divert some of its profits to legal reform. 20 GOTO 10 (or
2980 lather, rinse, repeat). Do it again, but this time, get farther! After all,
2981 this time you’re starting with weaker Big Tech adversaries, a constituency
2982 that understands things can be better, Big Tech rivals who’ll help ensure
2983 their own future by bankrolling reform, and code that other programmers can
2984 build on to weaken Big Tech even further.
2985 </p><p>
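For readers who don’t catch the reference, the section title is the second line
of the classic two-line BASIC program, a minimal illustrative sketch of which
follows (the PRINT text is just a placeholder, not something from the essay):
line 20 jumps back to line 10 forever, the same shape as the loop described
above, except that each pass is meant to begin from a stronger position than
the last.
</p><pre class="programlisting">10 PRINT "BREAK UP BIG TECH" : REM placeholder message, purely illustrative
20 GOTO 10 : REM jump back to line 10 and run again, forever</pre><p>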
2986 The surveillance capitalism hypothesis — that Big Tech’s products really
2987 work as well as they say they do and that’s why everything is so screwed up
2988 — is way too easy on surveillance and even easier on capitalism. Companies
2989 spy because they believe their own BS, and companies spy because governments
2990 let them, and companies spy because any advantage from spying is so
2991 short-lived and minor that they have to do more and more of it just to stay
2992 in place.
2993 </p><p>
2994 As to why things are so screwed up? Capitalism. Specifically, the monopolism
2995 that creates inequality and the inequality that creates monopolism. It’s a
2996 form of capitalism that rewards sociopaths who destroy the real economy to
2997 inflate the bottom line, and they get away with it for the same reason
2998 companies get away with spying: because our governments are in thrall both
2999 to the ideology that says monopolies are actually just fine and
3000 to the ideology that says that in a monopolistic world, you’d better not
3001 piss off the monopolists.
3002 </p><p>
3003 Surveillance doesn’t make capitalism rogue. Capitalism’s unchecked rule
3004 begets surveillance. Surveillance isn’t bad because it lets people
3005 manipulate us. It’s bad because it crushes our ability to be our authentic
3006 selves — and because it lets the rich and powerful figure out who might be
3007 thinking of building guillotines and what dirt they can use to discredit
3008 those embryonic guillotine-builders before they can even get to the
3009 lumberyard.
3010 </p></div><div class="sect1"><div class="titlepage"><div><div><h2 class="title" style="clear: both"><a name="up-and-through"></a>Up and through</h2></div></div></div><p>
3011 With all the problems of Big Tech, it’s tempting to imagine solving the
3012 problem by returning to a world without tech at all. Resist that temptation.
3013 </p><p>
3014 The only way out of our Big Tech problem is up and through. If our future is
3015 not reliant upon high tech, it will be because civilization has fallen. Big
3016 Tech wired together a planetary, species-wide nervous system that, with the
3017 proper reforms and course corrections, is capable of seeing us through the
3018 existential challenge of our species and planet. Now it’s up to us to seize
3019 the means of computation, putting that electronic nervous system under
3020 democratic, accountable control.
3021 </p><p>
3022 I am, secretly, despite what I have said earlier, a tech exceptionalist. Not
3023 in the sense of thinking that tech should be given a free pass to monopolize
3024 because it has <span class="quote"><span class="quote">economies of scale</span></span> or some other nebulous
3025 feature. I’m a tech exceptionalist because I believe that getting tech right
3026 matters and that getting it wrong will be an unmitigated catastrophe — and
3027 doing it right can give us the power to work together to save our
3028 civilization, our species, and our planet.
3029 </p></div></div></body></html>