# A radical proposal to keep your personal data safe

By Richard Stallman, 2018-04-03

The surveillance imposed on us today is worse than that imposed in the Soviet Union. To begin with, we need laws to stop this data collection.
Journalists have been asking me whether the revulsion against the abuse of [Facebook](https://www.theguardian.com/technology/2018/mar/31/big-data-lie-exposed-simply-blaming-facebook-wont-fix-reclaim-private-information) data could be a turning point for the campaign to recover privacy. That could happen, if the public makes its campaign broader and deeper.
Broader, meaning extending to all surveillance systems, not just [Facebook](https://www.theguardian.com/technology/facebook). Deeper, meaning to advance from regulating the use of data to regulating the accumulation of data. Because surveillance is so pervasive, restoring privacy is necessarily a big change, and requires powerful measures.
The surveillance imposed on us today far exceeds that of the Soviet Union. For freedom and democracy’s sake, we need to eliminate most of it. There are so many ways to use data to hurt people that the only safe database is the one that was never collected. Thus, instead of the EU’s approach of mainly regulating how personal data may be used (in its [General Data Protection Regulation](https://www.eugdpr.org/) or GDPR), I propose a law to stop systems from collecting personal data.
The robust way to do that, the way that can’t be set aside at the whim of a government, is to require systems to be built so as not to collect data about a person. The basic principle is that a system must be designed not to collect certain data, if its basic function can be carried out without that data.
Data about who travels where is particularly sensitive, because it is an ideal basis for repressing any chosen target. We can take the London trains and buses as a case study.
The Transport for London digital payment card system centrally records the trips any given Oyster or bank card has paid for. When a passenger tops up the card digitally, the system associates the card with the passenger’s identity. This adds up to complete surveillance.
I expect the transport system can justify this practice under the GDPR’s rules. My proposal, by contrast, would require the system to stop tracking who goes where. The card’s basic function is to pay for transport. That can be done without centralising that data, so the transport system would have to stop doing so. When it accepts digital payments, it should do so through an anonymous payment system.
Frills on the system, such as the feature of letting a passenger review the list of past journeys, are not part of the basic function, so they can’t justify incorporating any additional surveillance.
These additional services could be offered separately to users who request them. Even better, users could use their own personal systems to privately track their own journeys.
Black cabs demonstrate that a system for hiring cars with drivers does not need to identify passengers. Therefore such systems should not be allowed to identify passengers; they should be required to accept privacy-respecting cash from passengers without ever trying to identify them.
However, convenient digital payment systems can also protect passengers’ anonymity and privacy. We have already developed one: [GNU Taler](https://taler.net/en/). It is designed to be anonymous for the payer, but payees are always identified. We designed it that way so as not to facilitate tax dodging. All digital payment systems should be required to defend anonymity using this or a similar method.
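GNU Taler’s payer anonymity rests on Chaum-style blind signatures: the payer obtains the exchange’s signature on a coin without the exchange ever seeing the coin itself. The toy RSA sketch below is only an illustration of that core idea, not Taler’s actual protocol; the key sizes are insecure and all names and values are hypothetical.

```python
# Toy Chaum-style RSA blind signature: the "bank" signs a coin it never sees.
# Insecure toy parameters, for illustration only.
import math

# Bank's toy RSA key pair
p, q = 61, 53
n = p * q          # modulus: 3233
e = 17             # public exponent
d = 2753           # private exponent, satisfying (e * d) % ((p-1)*(q-1)) == 1

def blind(m, r):
    """Payer blinds coin m with secret random factor r before sending it."""
    assert math.gcd(r, n) == 1
    return (m * pow(r, e, n)) % n

def sign(blinded):
    """Bank signs the blinded value without learning m."""
    return pow(blinded, d, n)

def unblind(s_blinded, r):
    """Payer strips the blinding factor, recovering a valid signature on m."""
    return (s_blinded * pow(r, -1, n)) % n

def verify(m, s):
    """Anyone can check the signature with the bank's public key."""
    return pow(s, e, n) == m % n

coin = 42          # the coin's serial number (toy value)
r = 99             # payer's secret blinding factor, coprime to n
sig = unblind(sign(blind(coin, r)), r)
assert verify(coin, sig)        # signature is valid on the unblinded coin
assert blind(coin, r) != coin   # yet the bank only ever saw the blinded value
```

The unblinded signature equals `m^d mod n`, exactly what the bank would have produced had it signed the coin directly, which is why the payer stays anonymous while the coin remains verifiable.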
What about security? Systems in areas open to the public must be designed so they cannot track people. Video cameras should make a local recording that can be checked for the next few weeks if a crime occurs, but should not allow remote viewing without physical collection of the recording. Biometric systems should be designed so they only recognise people on a court-ordered list of suspects, to respect the privacy of the rest of us. An unjust state is more dangerous than terrorism, and too much security encourages an unjust state.
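The local-recording rule above can be enforced mechanically: keep footage on the camera’s own storage and purge anything older than the review window. A minimal sketch, assuming a four-week window and one file per recording (the function name and layout are hypothetical):

```python
# Hypothetical local-retention sketch: purge recordings older than the window,
# so nothing beyond the review period exists to be seized or viewed remotely.
import os
import time

RETENTION_SECONDS = 28 * 24 * 3600   # assumed four-week review window

def purge_old_recordings(directory, now=None):
    """Delete recording files whose modification time exceeds the window."""
    now = time.time() if now is None else now
    removed = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and now - os.path.getmtime(path) > RETENTION_SECONDS:
            os.remove(path)
            removed.append(name)
    return removed
```

A camera running this on a schedule holds only the last few weeks of footage locally; investigators who need it must physically collect the device, exactly as the text proposes.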
The EU’s GDPR is well-meaning, but it does not go very far. It will not deliver much privacy, because its rules are too lax. They permit collecting any data if it is somehow useful to the system, and it is easy to come up with a way to make any particular data useful for something.
The GDPR makes much of requiring users (in some cases) to give consent for the collection of their data, but that doesn’t do much good. System designers have become expert at manufacturing consent (to repurpose Noam Chomsky’s phrase). Most users consent to a site’s terms without reading them; a company that [required](https://www.theguardian.com/technology/2014/sep/29/londoners-wi-fi-security-herod-clause) users to trade their first-born child got consent from plenty of users. Then again, when a system is crucial for modern life, like buses and trains, users ignore the terms because refusal of consent is too painful to consider.
To restore privacy, we must stop surveillance before it even asks for consent.
Finally, don’t forget the software in your own computer. If it is the non-free software of Apple, Google or Microsoft, it [spies on you regularly](https://gnu.org/malware/). That’s because it is controlled by a company that won’t hesitate to spy on you. Companies tend to lose their scruples when that is profitable. By contrast, free (libre) software is [controlled by its users](https://gnu.org/philosophy/free-software-even-more-important.html). That user community keeps the software honest.
Richard Stallman is president of the Free Software Foundation, which launched the development of the free/libre operating system GNU.
Copyright 2018 Richard Stallman. Released under the [Creative Commons Attribution NoDerivatives License 4.0](https://creativecommons.org/licenses/by-nd/4.0/). The original English version was published in [The Guardian](https://www.theguardian.com/commentisfree/2018/apr/03/facebook-abusing-data-law-privacy-big-tech-surveillance).