A radical proposal to keep your personal data safe
==================================================

by Richard Stallman, 2018-04-03

The surveillance imposed on us today is worse than in the Soviet
Union. We need laws to stop this data being collected in the first
place.

Journalists have been asking me whether the revulsion against the abuse of
[Facebook](https://www.theguardian.com/technology/2018/mar/31/big-data-lie-exposed-simply-blaming-facebook-wont-fix-reclaim-private-information)
data could be a turning point for the campaign to recover privacy. That
could happen if the public makes its campaign broader and deeper.

Broader, meaning extending to all surveillance systems, not just
[Facebook](https://www.theguardian.com/technology/facebook). Deeper,
meaning to advance from regulating the use of data to regulating the
accumulation of data. Because surveillance is so pervasive, restoring
privacy is necessarily a big change, and requires powerful measures.

The surveillance imposed on us today far exceeds that of the Soviet
Union. For freedom and democracy’s sake, we need to eliminate most of
it. There are so many ways to use data to hurt people that the only
safe database is the one that was never collected. Thus, instead of
the EU’s approach of mainly regulating how personal data may be used
(in its [General Data Protection Regulation](https://www.eugdpr.org/)
or GDPR), I propose a law to stop systems from collecting personal
data.

The robust way to do that, the way that can’t be set aside at the whim
of a government, is to require systems to be built so as not to
collect data about a person. The basic principle is that a system must
be designed not to collect certain data, if its basic function can be
carried out without that data.

Data about who travels where is particularly sensitive, because it is
an ideal basis for repressing any chosen target. We can take the
London trains and buses as a case for study.

The Transport for London digital payment card system centrally records
the trips any given Oyster or bank card has paid for. When a passenger
feeds the card digitally, the system associates the card with the
passenger’s identity. This adds up to complete surveillance.

I expect the transport system can justify this practice under the
GDPR’s rules. My proposal, by contrast, would require the system to
stop tracking who goes where. The card’s basic function is to pay for
transport. That can be done without centralising that data, so the
transport system would have to stop doing so. When it accepts digital
payments, it should do so through an anonymous payment system.

Frills on the system, such as the feature of letting a passenger
review the list of past journeys, are not part of the basic function,
so they can’t justify incorporating any additional surveillance.

These additional services could be offered separately to users who
request them. Even better, users could use their own personal systems
to privately track their own journeys.

Black cabs demonstrate that a system for hiring cars with drivers does
not need to identify passengers. Therefore such systems should not be
allowed to identify passengers; they should be required to accept
privacy-respecting cash from passengers without ever trying to
identify them.

However, convenient digital payment systems can also protect
passengers’ anonymity and privacy. We have already developed one: [GNU
Taler](https://taler.net/en/). It is designed to be anonymous for the
payer, but payees are always identified. We designed it that way so as
not to facilitate tax dodging. All digital payment systems should be
required to defend anonymity using this or a similar method.

What about security? Such systems in areas where the public are
admitted must be designed so they cannot track people. Video cameras
should make a local recording that can be checked for the next few
weeks if a crime occurs, but should not allow remote viewing without
physical collection of the recording. Biometric systems should be
designed so they only recognise people on a court-ordered list of
suspects, to respect the privacy of the rest of us. An unjust state is
more dangerous than terrorism, and too much security encourages an
unjust state.

The EU’s GDPR is well-meaning, but it does not go very far. It will
not deliver much privacy, because its rules are too lax. They permit
collecting any data if it is somehow useful to the system, and it is
easy to come up with a way to make any particular data useful for
something.

The GDPR makes much of requiring users (in some cases) to give consent
for the collection of their data, but that doesn’t do much
good. System designers have become expert at manufacturing consent (to
repurpose Noam Chomsky’s phrase). Most users consent to a site’s terms
without reading them; a company that
[required](https://www.theguardian.com/technology/2014/sep/29/londoners-wi-fi-security-herod-clause)
users to trade their first-born child got consent from plenty of
users. Then again, when a system is crucial for modern life, like
buses and trains, users ignore the terms because refusal of consent is
too painful to consider.

To restore privacy, we must stop surveillance before it even asks for
consent.

Finally, don’t forget the software in your own computer. If it is the
non-free software of Apple, Google or Microsoft, it [spies on you
regularly](https://gnu.org/malware/). That’s because it is controlled
by a company that won’t hesitate to spy on you. Companies tend to lose
their scruples when that is profitable. By contrast, free (libre)
software is [controlled by its
users](https://gnu.org/philosophy/free-software-even-more-important.html). That
user community keeps the software honest.

Richard Stallman is president of the Free
[Software](https://www.theguardian.com/technology/software)
Foundation, which launched the development of a free/libre operating
system, GNU.

Copyright 2018 Richard Stallman. Released under [Creative Commons
Attribution NoDerivatives License
4.0](https://creativecommons.org/licenses/by-nd/4.0/).

Originally published by [The
Guardian](https://www.theguardian.com/commentisfree/2018/apr/03/facebook-abusing-data-law-privacy-big-tech-surveillance)