The controversial and compulsory inclusion of fingerprints in passports has been in place in the EU since 2009. From that year on, fingerprints were also included in Dutch identity cards, even though under EU law there was no such obligation. While the inclusion of fingerprints in identity cards in the Netherlands was reversed in January 2014 due to privacy concerns, there is now new European legislation that will make the inclusion of fingerprints in identity cards compulsory as of August 2, 2021.
Dutch citizens can apply for a new identity card without fingerprints until August 2. After that date, only people who are ‘temporarily or permanently physically unable to have fingerprints taken’ can do so.
The Dutch Senate is expected to debate and vote on the amendment of the Dutch Passport Act in connection with the reintroduction of fingerprints in Dutch identity cards on July 13. In that context, Privacy First sent the following email to the Dutch Senate yesterday:
Dear Members of Parliament,
Since Privacy First was founded in 2008, we have opposed the mandatory collection of fingerprints for passports and identity cards. Since the introduction of the new Passport Act in 2009, Privacy First has done so through lawsuits, campaigns, freedom of information requests, political lobbying and by activating the media. Although the Netherlands discontinued the (planned) central storage of fingerprints in both national and municipal databases in 2011, everyone’s fingerprints are still taken when applying for a passport, and soon, as a result of the new European Regulation on identity cards, this will once again apply to Dutch ID cards, after having been abolished in 2014.
To date, however, the millions of fingerprints taken from virtually the entire adult population in the Netherlands have hardly been used in practice, as the biometric technology had already proven to be unsound and unworkable in 2009. The compulsory collection of everyone’s fingerprints under the Dutch Passport Act therefore still constitutes the most massive and longest-lasting privacy violation that the Netherlands has ever known.
Having read the current report of the Senate on the amendment of the Passport Act to reintroduce fingerprints in ID cards, Privacy First hereby draws your attention to the following concerns. In this context, we ask you to vote against this amendment, even if that means deviating from European policy. After all:
- As early as May 2016, the Dutch Council of State (Raad van State) ruled that fingerprints in Dutch identity cards violated the right to privacy due to a lack of necessity and proportionality, see https://www.raadvanstate.nl/pers/persberichten/tekst-persbericht.html?id=956 (in Dutch).
- Freedom of information requests from Privacy First have revealed that the phenomenon to be tackled (look-alike fraud with passports and identity cards) is so small in scale that the compulsory collection of everyone’s fingerprints is completely disproportionate and therefore unlawful. See: https://www.privacyfirst.nl/rechtszaken-1/wob-procedures/item/524-onthullende-cijfers-over-look-alike-fraude-met-nederlandse-reisdocumenten.html.
- In recent years, fingerprints in passports and identity cards have had a biometric error rate as high as 30%, see https://zoek.officielebekendmakingen.nl/kst-32317-163.html (Dutch State Secretary Teeven, January 31, 2013). Before that, Minister Donner (Security & Justice) admitted an error rate of 21-25%: see https://zoek.officielebekendmakingen.nl/kst-25764-47.html (April 27, 2011). How high are these error rates today?
- Partly because of the high error rates mentioned above, fingerprints in passports and ID cards have hardly been used to date, whether domestically, at borders or at airports.
- Because of these high error percentages, former Dutch State Secretary Bijleveld (Interior and Kingdom Relations) instructed all Dutch municipalities as early as September 2009 to (in principle) refrain from conducting biometric fingerprint verifications when issuing passports and identity cards. After all, in the event of a ‘mismatch’, the ID document concerned would have to be returned to the passport manufacturer, which would lead to rapid societal disruption if the numbers were high. In this respect, the Ministry of the Interior and Kingdom Relations was also concerned about large-scale unrest and even possible violence at municipal counters. These concerns and the instruction of State Secretary Bijleveld still apply today.
- Since 2016, several individual Dutch lawsuits have been pending at the European Court of Human Rights in Strasbourg, challenging the mandatory taking of fingerprints for passports and ID cards on the grounds of violation of Art. 8 ECHR (right to privacy).
- In any case, an exception should be negotiated for people who, for whatever reason, do not wish to give their fingerprints (biometric conscientious objectors, Art. 9 ECHR).
- Partly for the above reasons, fingerprints have not been taken for the Dutch identity card since January 2014. It is up to your Chamber to maintain this status quo and also to push for the abolition of fingerprints for passports.
For background information, see the report ‘Happy Landings’ by the Scientific Council for Government Policy (WRR) that Privacy First director Vincent Böhre wrote in 2010. Partly as a result of this critical report (and the large-scale lawsuit brought by Privacy First et al. against the Passport Act), the decentralized (municipal) storage of fingerprints was largely abolished in 2011 and the planned central storage of fingerprints was halted.
For further information or questions regarding the above, Privacy First can be reached at any time.
The Privacy First Foundation
As an NGO that promotes civil rights and privacy protection, Privacy First has been concerned with financial privacy for years. Since 2017, we have been keeping close track of the developments surrounding the second European Payment Services Directive (PSD2), pointing out the dangers to the privacy of consumers. In particular, we focus on privacy issues related to ‘account information service providers’ (AISPs) and on the dangerous possibilities offered by PSD2 to process personal data in more extensive ways.
At the end of 2017, we assumed that providing more adequate information and more transparency to consumers would be sufficient to mitigate the risks associated with PSD2. However, these risks turned out to be greater and of a more fundamental nature. We therefore decided to launch a bilingual (Dutch & English) website called PSD2meniet.nl in order to outline both our concerns and our solutions with regard to PSD2.
Central to our project is the Don’t-PSD2-Me-Register, an idea we launched on 7 January 2019 in the Dutch television program Radar and in this press release. The aim of the Don’t-PSD2-Me-Register is to provide a real tool to consumers with which they can filter out and thus protect their personal data. In time, more options to filter out and restrict the use of data should become available. With this project, Privacy First aims to contribute to positive improvements to PSD2 and its implementation.
Protection of special personal data
In this project, which is supported by the SIDN Fund, Privacy First has focused particularly on ‘special personal data’, such as those generated through payments made to trade unions, political parties, religious organizations, LGBT advocacy groups or medical service providers. Payments made to the Dutch Central Judicial Collection Agency equally reveal parts of people’s lives that require extra protection. These special personal data directly touch upon the issue of fundamental human rights. When consumers use AISPs under PSD2, their data can be shared more widely among third parties. PSD2 indirectly allows data that are currently protected, to become widely known, for example by being included in consumer profiles or black lists.
The best form of protection is to prevent special personal data from getting processed in the first place. That is why we have built the Don’t-PSD2-Me-Register, with an Application Programming Interface (API) – essentially a privacy filter – wrapped around it. With this filter, AISPs can detect and filter out account numbers and thus prevent special personal data from being unnecessarily processed or provided to third parties. Moreover, the register informs consumers and gives them a genuine choice as to whether or not they wish to share their data.
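To make this concrete, the following is a minimal, hypothetical sketch of the kind of privacy filter the register could offer. All names, account numbers and the data structure are invented for illustration; they are not the actual register API. The idea is simply that an AISP checks each transaction's counterparty against a register of (hashed) protected account numbers and drops matching transactions before any further processing.

```python
import hashlib

def _digest(iban: str) -> str:
    """Hash an account number so the register never stores it in the clear."""
    return hashlib.sha256(iban.encode("utf-8")).hexdigest()

# Hypothetical register entry: the account of, say, a trade union whose
# incoming payments would reveal special personal data.
PROTECTED = {_digest("NL91UNIE0417164300")}

def filter_transactions(transactions):
    """Return only transactions whose counterparty is NOT in the register."""
    return [t for t in transactions if _digest(t["counterparty"]) not in PROTECTED]

transactions = [
    {"counterparty": "NL91UNIE0417164300", "amount": 12.50},  # union fee: filtered out
    {"counterparty": "NL20INGB0001234567", "amount": 40.00},  # ordinary payment: kept
]
shared = filter_transactions(transactions)  # only the ordinary payment remains
```

Because the register holds hashes rather than plain account numbers, the filter itself never needs to expose which organizations are on the protected list.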
We have outlined many of the results we have achieved in a Whitepaper, which has been sent to stakeholders such as the European Commission, the European Data Protection Board (EDPB) and the Dutch Data Protection Authority. And of course, to as many AISPs as possible, because if they decide to adopt the measures we propose, they would be protecting privacy by design. Our Whitepaper contains a number of examples and good practices on how to enhance privacy protection. Among other things, it lays out how to improve the transparency of account information services. We hope that AISPs will take the recommendations in our Whitepaper to heart.
Our Application Programming Interface (API) has already been adopted by a service provider called Gatekeeper for Open Banking. We support this start-up’s continued development and make suggestions on how the privacy filter can best be incorporated into their design and services. When AISPs use Gatekeeper, consumers get the control over their data that they deserve.
Knowing that the European Commission will not be evaluating PSD2 until 2022, we are glad to have been able to convey our own thoughts through our Whitepaper. Along with the API we have developed and distributed, it is an important tool for any AISP that takes the privacy of its consumers seriously.
Privacy First will continue to monitor all developments related to the second Payment Services Directive. Our website PSD2meniet.nl will remain up and running and will continue to be the must-visit platform for any updates on this topic.
Today – on European Data Protection Day – the 2021 Dutch Privacy Awards were handed out during the Dutch National Privacy Conference, a joint initiative by Privacy First and the Dutch Platform for the Information Society (ECP). These Awards provide a platform for companies and governments that see privacy as an opportunity to distinguish themselves positively and to make privacy-friendly entrepreneurship and innovation the norm. The winners of the Dutch Privacy Awards 2021 are STER, NLdigital, Schluss, FCInet and the Dutch Ministry of Justice and Security.
Advertising without storage of personal data, contextual targeting: proven effectiveness
The Dutch Stichting Ether Reclame (Ether Advertising Foundation), better known as STER, was one of the first organizations in the Netherlands to abandon the common model of offering advertisements based on information collected via cookies. Instead, STER has developed a procedure that uses only information about the webpages visited. No personal data, such as browser version, IP address or click-through behaviour, are collected at all. Advertisers submit their advertisements to STER, which places them on the website in accordance with a protocol STER developed, based on a small number of simple categories. These categories are linked to the content being shown, such as the TV program a visitor has selected. The protocol has been refined over time and now works reliably.
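The essence of such contextual targeting can be sketched in a few lines. This is an illustrative toy example, not STER's actual protocol; the categories and advertisements are invented. The point is that ad selection takes the content category as its only input, so no visitor data ever enters the process.

```python
# Map of content categories to the advertisements booked for them.
# In a real system this mapping would come from advertisers' bookings.
ADS_BY_CATEGORY = {
    "sports": ["sportswear brand", "energy drink"],
    "cooking": ["kitchen appliances", "supermarket"],
}

def select_ads(content_category: str) -> list[str]:
    """Pick ads based solely on the page/program category, never on the user."""
    return ADS_BY_CATEGORY.get(content_category, [])

# A visitor watching a cooking program sees cooking-related ads;
# nothing about the visitor (IP, browser, history) is consulted or stored.
print(select_ads("cooking"))
```

Because the lookup key is the content, not the person, there is simply no personal data to leak, profile or trade.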
In this way, STER kills several birds with one stone. Most importantly, early use shows that this approach is at least as effective for advertisers as the old cookie-based model. Secondly, the approach removes parties from the chain: data brokers who played a role in the old system are now superfluous. Apart from the financial gain for the chain, this also prevents data from ending up with parties that should not have them. And thirdly, STER stays in control of its own advertising campaigns.
This makes STER a deserved winner of the Dutch Privacy Awards. The concept developed is innovative and helps to protect the privacy of citizens without them having to make any effort. STER is also investigating the possibility of using the approach more broadly. This too is an innovation that the expert panel applauds.
In that sense STER’s approach is also a well-founded response to the data-driven superpowers on the market as it demonstrates that the endless collection of personal data is not at all necessary to get your message across, whether it is commercial or idealistic.
STER could perhaps also have been submitted as a Business-to-Business entry, but the direct interests of consumers meant that it was listed in the category of consumer solutions.
Organisational innovation and practical application: Data Pro Code
Entries for the Dutch Privacy Awards often involve technical innovations. At NLdigital it is not the technology but the approach that is innovative: the organization has given concrete meaning to GDPR obligations through standardized agreements, focusing mainly on data processors rather than on controllers. This enables processors to conclude processing agreements quickly, practically and with due care, in a way that is also verifiable. Many companies provide services by making applications available that involve data processing, which requires processing agreements that are not easy for every organization to draw up. Filling in the corresponding statement results in an appropriate processing agreement for clients.
NLdigital’s code of conduct, the Data Pro Code, is a practical instrument tailor-made for its target group: IT companies that process data on behalf of others. With the help of its (600) participants/members, the Code was drawn up as an elaboration of Art. 28 of the GDPR. It has been approved by the Dutch Data Protection Authority and has led to a publicly accessible certification.
Winner: FCInet & Ministry of Justice and Security
Ma³tch, privacy on the government agenda: innovative data minimization
FCInet is an innovative, privacy-enhancing technology developed by the Dutch Ministry of Justice and Security and the Dutch Ministry of Finance to assist in the fight against (international) crime. Part of FCInet is Ma³tch, which stands for Autonomous Anonymous Analysis. With this feature, Financial Criminal Investigation Services (FCIS) can share secure, pseudonymized datasets nationally (for example with the Financial Intelligence Unit-Netherlands and the Fiscal Information and Investigation Service) as well as internationally. Ma³tch supports, and indeed compels, the parties involved to weigh carefully, per data field, which data they want to compare and under which conditions. This allows the infrastructure to be set up in such a way that it is technically enforced that data are exchanged only on a legitimate basis.
Through hashing, organization A encodes (bundles of) personal data in such a way that a receiving party B can check whether a person known to B is also known to A. Only if there is a match (because B checks its own hashed list of known persons against the list it received) does the next step take place, in which organization B actually requests information about the person concerned from organization A. The check takes place in a secure, decentralized environment, so organization A does not know whether there is a hit. The technology thus prevents the unnecessary perusal of personal data in the course of comparisons.
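The matching step described above can be illustrated with a small sketch. This is not FCInet's actual implementation (Ma³tch uses more sophisticated filter structures); the identifiers and the plain SHA-256 hashing are simplifying assumptions. What it shows is the principle: A shares only hashes, B compares locally, and only a confirmed hit triggers a targeted follow-up request.

```python
import hashlib

def pseudonymize(identifier: str) -> str:
    """One-way hash: the receiver sees the hash, never the identifier itself."""
    return hashlib.sha256(identifier.encode("utf-8")).hexdigest()

# Organization A's subjects of interest; only the hashed set leaves A.
org_a_known = ["person-001", "person-007"]
org_a_filter = {pseudonymize(p) for p in org_a_known}

# Organization B checks its own persons against A's hashed list, locally.
org_b_known = ["person-007", "person-042"]
hits = [p for p in org_b_known if pseudonymize(p) in org_a_filter]

# B learns only that "person-007" matches and can now request details on
# that one person; A has learned nothing from the comparison itself.
```

The key property is data minimization: persons without a match are never named to the other party, in either direction.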
The open source code technology of FCInet offers broader possibilities for application, which is encouraged by the expert panel and was an important reason for the submission: it can be reused in many other organizations and systems. The panel therefore assessed this initiative as a good investment in privacy by the government, where, clearly, the issue of privacy really is on the agenda.
Schluss applied for the Dutch Privacy Awards in 2021 for the third time. That is not the reason for the Incentive Award, even though it may encourage others to persevere in a similar way.
The reason is that it is a commendable initiative focused on the self-management of personal data. In the form of an app, private users are offered a vault for their personal data, whether medical, financial or of any other nature. Users decide which people or organizations get access to their data. The idea is that others who are allowed to see the data no longer need to store these data themselves. Schluss has no insight into who uses the app; its role is solely to facilitate the process. The technology, which is open source, guarantees transparency about the operation of the app.
Schluss won the prestigious Incentive Award because thus far the app has had only a beta release. However, promising projects have been started with the Volksbank and there is a pilot in collaboration with the Royal Dutch Association of Civil-law Notaries. With its mission statement (‘With Schluss, only you decide who gets to know which of your details’) in mind, Schluss chose to become a cooperative, an organizational form that appealed to the expert panel. With this national Incentive Award the panel hopes to encourage the initiators to continue along this path and to persuade parties to join forces with Schluss.
Awards are presented in four categories:
1. the category of Consumer solutions (business-to-consumer)
2. the category of Business solutions (within a company or business-to-business)
3. the category of Public services (public authority-to-citizen)
4. the incentive award for a groundbreaking technology or person.
From the various entries, the independent expert panel chose the following nominees per category (listed in arbitrary order):
Roseman Labs (Secure Multiparty Computation)
Ministry of Health (CoronaMelder)
NLdigital (Data Pro Code)
FCInet & Ministry of Justice (Ma³tch)
STER (Contextual targeting)
During the National Privacy Conference all nominees presented their projects to the audience in Award pitches. Thereafter, the Awards were handed out. Click HERE for the entire expert panel report (pdf in Dutch), which includes participation criteria and explanatory notes on all the nominees and winners.
National Privacy Conference
The Dutch National Privacy Conference is a joint initiative of ECP | Platform for the Information Society and Privacy First. Once a year, the conference brings together Dutch industry, public authorities, the academic community and civil society with the aim of building a privacy-friendly information society. The mission of both the National Privacy Conference and Privacy First is to turn the Netherlands into a guiding nation in the field of privacy. To this end, privacy by design is key.
These were the speakers at the 2021 National Privacy Conference, in order of appearance:
- Monique Verdier (vice chairwoman of the Dutch Data Protection Authority)
- Judith van Schie (Considerati)
- Erik Gerritsen (Secretary General of the Dutch Ministry of Health, Welfare and Sport)
- Mieke van Heesewijk (SIDN Fund)
- Peter Verkoulen (Dutch Blockchain Coalition)
- Paul Tang (MEP for PvdA)
- Ancilla van de Leest (Privacy First chairwoman)
- Chris van Dam (Member of the Dutch House of Representatives for CDA)
- Evelyn Austin (director of Bits of Freedom)
- Wilmar Hendriks (chairman of the expert panel of the Dutch Privacy Awards).
The entire conference was livestreamed from Nieuwspoort in The Hague: see https://www.nieuwspoort.nl/agenda/overzicht/privacy-conferentie-2021/stream and https://youtu.be/asEX1jy4Tv0.
Dutch Privacy Awards expert panel
The independent expert Award panel consists of privacy experts from different fields:
- Wilmar Hendriks, founder of Control Privacy and member of the Privacy First advisory board (panel chairman)
- Ancilla van de Leest, Privacy First chairwoman
- Paul Korremans, partner at Comfort Information Architects and Privacy First board member
- Marc van Lieshout, managing director at iHub, Radboud University Nijmegen
- Alex Commandeur, senior advisor BMC Advies
- Melanie Rieback, CEO and co-founder of Radically Open Security
- Nico Mookhoek, privacy lawyer and founder of DePrivacyGuru
- Rion Rijker, privacy and data protection expert, IT lawyer and partner at Fresa Consulting.
To ensure that the Award process is run objectively, no panel member judges entries from his or her own organization.
In collaboration with the Dutch Platform for the Information Society (ECP), Privacy First organizes the Dutch Privacy Awards with the support of the Democracy & Media Foundation and The Privacy Factory.
Pre-registrations for the 2022 Dutch Privacy Awards are welcome!
Would you like to become a sponsor of the Dutch Privacy Awards? Please contact Privacy First!
It is with great concern that Privacy First has taken note of the Dutch draft bill on COVID-19 test certificates. Under this bill, a negative COVID-19 test certificate will become mandatory for access to sporting and youth activities, all sorts of events, and public places including bars, restaurants, and cultural and higher education institutions. Those without such a certificate risk high fines. This puts everyone’s right to privacy under pressure.
Serious violation of fundamental rights
The draft bill severely infringes numerous fundamental and human rights, including the right to privacy, physical integrity and freedom of movement in combination with other relevant human rights such as the right to participate in cultural life, the right to education and various children’s rights such as the right to recreation. Any curtailment of these rights must be strictly necessary, proportionate and effective. However, the current draft bill fails to demonstrate this, while the required necessity in the public interest is simply assumed. More privacy-friendly alternatives to reopen and normalize society do not seem to have been considered. For these reasons alone, the proposal cannot pass the human rights test and should therefore be withdrawn.
The proposal also violates the general prohibition of discrimination, as it introduces a broad social distinction based on medical status. This puts pressure on social life and may lead to large-scale inequality, stigmatization, social segregation and even possible tensions, as large groups in society will not (or not systematically) want to or will not be able to get tested (for various reasons). During the recent Dutch National Privacy Conference organized by Privacy First and the Platform for the Information Society (ECP), it already became clear that the introduction of a mandatory ‘corona passport’ could have a socially disruptive effect. On that occasion the Dutch Data Protection Authority, among others, took a strong stand against it. Such social risks apply all the more strongly to the indirect vaccination obligation that follows on from the corona test certificate. In this regard, Privacy First wants to recall that recently both the Dutch House of Representatives and the Parliamentary Assembly of the Council of Europe have expressed their opposition to a direct or indirect vaccination requirement. In addition, the draft bill under consideration will have the potential to set precedents for other medical conditions and other sectors of society, putting pressure on a much broader range of socio-economic rights. For all of these reasons, Privacy First strongly recommends that the Dutch government withdraw this draft bill.
Multiple privacy violations
Moreover, from the perspective of the right to privacy, a number of specific objections and questions apply. First of all, the draft bill introduces a mandatory ‘proof of healthiness’ for participation in a large part of social life, in flagrant violation of the right to privacy and the protection of personal data. The draft bill also introduces an identification requirement at the entrance of public places, in violation of the right to anonymity in public spaces. Furthermore, the bill results in the inconsistent application of existing legislation to one and the same act, namely testing, with far-reaching consequences for a precious achievement like medical confidentiality and citizens’ trust in that confidentiality on the one hand, and for the practical implementation of retention periods on the other, even though the processing of the test result does not change. After all, it is not the result of the test that should determine whether the file falls under the Dutch Medical Treatment Contracts Act (WGBO, which entails medical secrecy and a retention period of 20 years) or under the Public Health Act (with a retention period of five years), but the act of testing itself. Moreover, it is unclear why the current draft bill seeks to connect to the Public Health Act and/or the WGBO if it only concerns obtaining a test certificate for the purpose of participating in society (and thus involves no medical treatment or public health task). Here, the only possible basis for processing, and for breaching medical confidentiality, would be consent. In this case, however, there can be no legally required freely given consent, since testing will be a compelling condition for participation in society.
Privacy requires clarity
Many other issues are still unclear: which data will be stored, where, by whom, and which data may possibly be exchanged? To what extent will there be personal localization and identification as opposed to occasional verification and authentication? Why may test results be kept for an unnecessarily long time (five or even 20 years)? How great are the risks of hacking, data breaches, fraud and forgery? To what extent will there be decentralized, privacy-friendly technology, privacy by design, open source software, data minimization and anonymization? Will test certificates remain free of charge and to what extent will privacy-friendly diversity and choice in testing applications be possible? Is work already underway to introduce an ‘alternative digital carrier’ in place of the Dutch CoronaCheck app, namely a chip, with all the risks that entails? How will function creep and profiling be prevented and are there any arrangements when it comes to data protection supervision? Will non-digital, paper alternatives always remain available? What will happen to the test material taken, i.e. everyone’s DNA? And when will the corona test certificates be abolished?
As long as such concerns and questions remain unanswered, submission of this bill makes no sense at all and the corona test certificate will only lead to the destruction of social capital. Privacy First therefore reiterates its request that the current proposal be withdrawn and not submitted to Parliament. Failing this, Privacy First will reserve the right to have the matter reviewed by the courts and declared unlawful.
See the Dutch National Privacy Conference, 28 January 2021, https://youtu.be/asEX1jy4Tv0?t=9378, starting at 2h 36min 18s.
See Council of Europe, Parliamentary Assembly, Resolution 2361 (2021): Covid-19 vaccines: ethical, legal and practical considerations, https://pace.coe.int/en/files/29004/html, par. 7.3.1-7.3.2: “Ensure that citizens are informed that the vaccination is NOT mandatory and that no one is politically, socially, or otherwise pressured to get themselves vaccinated, if they do not wish to do so themselves; ensure that no one is discriminated against for not having been vaccinated, due to possible health risks or not wanting to be vaccinated.” See also, for example, Dutch House of Representatives, Motion by Member Azarkan on No Corona Vaccination Obligation (28 October 2020), Parliamentary Document 25295-676, https://zoek.officielebekendmakingen.nl/kst-25295-676.html: “The House (...) pronounces that there should never be a direct or indirect coronavirus vaccination obligation in the future”; Motion by Member Azarkan on Access to Public Benefits for All Regardless of Vaccination or Testing Status (5 January 2021), Parliamentary Document 25295-864, https://zoek.officielebekendmakingen.nl/kst-25295-864.html: “The House (...) requests the government to enable access to public services for all regardless of vaccination or testing status.”
Under the Corona Pandemic Emergency Act, the Dutch government has the option to introduce all kinds of restrictive measures, including the wide-ranging and mandatory use of face masks, unless the Dutch House of Representatives rejects this measure later this week. In this context, Privacy First today sent the following email to the House of Representatives:
Dear Members of Parliament,
On 19 November, the government submitted to you the Regulation concerning additional requirements for face masks under COVID-19. Under this regulation, wearing a face mask will become mandatory in numerous places (including shops, railway stations, airports and schools) as of 1 December 2020. This obligation can be periodically extended by the government without the consent of Parliament. Based on the Corona Pandemic Emergency Act, you currently have seven days to exercise your right of veto and prevent the entry into force of a wide-ranging face mask obligation. By 26 November at the latest, you will be able to vote on this issue and reject this measure.
The wearing of face masks has been the subject of much public debate for months. Both the government and the National Institute for Public Health and the Environment (RIVM) have repeatedly stated that wearing non-medical face masks is hardly effective in combating the coronavirus. Scientists seem to be divided on this. At the same time, wearing a face mask can also have the opposite effect, i.e. harm people's health. There is a consensus, however, that in a legal sense the compulsory use of face masks is an infringement of the right to privacy and self-determination.
This accordingly falls within the scope of Privacy First. The right to privacy is a universal human right that is protected in the Netherlands by international and European treaties and by our national Constitution. Any infringement of the right to privacy must therefore be strictly necessary, proportionate and effective. If that is not the case, it is an unjustified breach and therefore a violation of the right to privacy, both as a human right and as a constitutional right. As long as the wearing of non-medical face masks to defeat the coronavirus has not proven effective and can even have adverse health effects, there can be no social necessity for the introduction of a general face mask obligation. Such an obligation would thus amount to a social experiment with unforeseen consequences. This is not in keeping with a free and democratic constitutional society under the rule of law. Privacy First therefore advises you to reject the proposed regulation for the introduction of compulsory face masks and instead to propose that they continue to be worn on a voluntary basis.
The Privacy First Foundation
In the fight against the coronavirus, the Dutch government this week made clear that the introduction of a curfew is imminent. Because of this, Privacy First today has sent the following appeal to the Dutch House of Representatives:
Dear Members of Parliament,
This week the Netherlands finds itself at a historic human rights crossroads: is a nationwide curfew going to be introduced for the first time since World War II? For Privacy First, such a far-reaching, generic measure would be disproportionate and far from necessary in virtually every situation. Moreover, in the fight against the coronavirus the effectiveness of such a measure remains unknown to date. For that reason alone, the legally required social necessity of a curfew is absent. A curfew could in fact be counterproductive, as it would harm the mental and (thereby also) physical health of large groups in society. Besides, a curfew in the Netherlands is yet another step towards a surveillance society. The use of lighter, targeted and more effective measures is always preferable. Should a curfew nonetheless be introduced, Privacy First would consider it a massive violation of the right to privacy and freedom of movement. Privacy First therefore calls on you not to let this happen and to thwart the introduction of a curfew.
The Privacy First Foundation
Update 17 February 2021: this week, in summary proceedings, the district court of The Hague handed down a ground-breaking ruling which holds that the curfew was wrongly introduced under the Dutch Extraordinary Powers Act. The current Dutch curfew is therefore unlawful. Moreover, the court found that there are "major question marks regarding the factual substantiation by the State of the necessity of the curfew. (...) Before a far-reaching restriction such as a curfew is introduced, it must be clear that no other, less far-reaching measures are available and that the introduction of the curfew will actually have a substantial effect", and the court was not convinced that this was the case. In addition, the court raised the question of why an urgent (but voluntary) curfew advice had not been chosen instead. The court also noted that "the Dutch Outbreak Management Team, according to the team itself, has no evidence that the curfew will make a substantial contribution to reducing the spread of the virus." All this "makes the State's assertion that a curfew is inevitable at least debatable and without convincing justification", the court concluded. (See judgment (in Dutch), paragraphs 4.12-4.14.)
The judgment of the district court of The Hague is in line with Privacy First’s earlier position. Privacy First hopes that this will be confirmed on appeal by the Hague Court of Appeal and that it will also lead to the rejection of the curfew by both the Dutch House of Representatives and the Senate.
This week the Dutch House of Representatives will debate the ‘temporary’ Corona emergency law under which the movements of everyone in the Netherlands can henceforth be monitored ‘anonymously’. Privacy First previously criticized this plan in a television broadcast of the current affairs program Nieuwsuur. Today, Privacy First sent the following letter to the House of Representatives:
Dear Members of Parliament,
With great concern, Privacy First has taken note of the ‘temporary’ legislative proposal to provide COVID-19 related telecommunications data to the Dutch National Public Health Institute (RIVM). Privacy First advises to reject this proposal on account of the following fundamental concerns and risks:
Violation of fundamental administrative and privacy principles
- There is no societal necessity for this legislative proposal. Other forms of monitoring have already proven sufficiently effective. The necessity of this proposal has not been demonstrated, and there is no other country in which the application of similar technologies has made any significant contribution.
- The proposal is entirely disproportionate as it encompasses all telecom location data in the entire country. Any form of differentiation is absent. The same applies to data minimization: a sample would be sufficient.
- The proposal goes into effect retroactively on 1 January 2020. This violates legal certainty and the principle of legality, particularly because this date is long before the Dutch ‘start’ of the pandemic (11 March 2020).
- The system of ‘further instructions from the minister’ that has been chosen for the proposal is completely undemocratic. This further erodes the democratic rule of law and the oversight of parliament.
- The proposal does not mention 'privacy by design' or the implementation thereof, while this should actually be one of its prominent features.
Alternatives are less invasive: subsidiarity
- The State Secretary has failed to adequately investigate more privacy-friendly alternatives. Does she even have any interest in this at all?
- Data in the possession of telecom providers are pseudonymized with unique ID numbers and as such are submitted to Statistics Netherlands (CBS). This means that huge amounts of sensitive personal data become very vulnerable. Anonymization by CBS happens only at a later stage.
- When used, the data are filtered based on geographical origin. This creates a risk of discrimination on the basis of nationality, which is prohibited.
- It is unclear whether the CBS and the RIVM intend to ‘enrich’ these data with other data, which could lead to function creep and potential data misuse.
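The re-identification risk of pseudonymized telecom data can be illustrated with a minimal, hypothetical sketch. To be clear: this does not describe the actual telecom/CBS pipeline, whose technical details are not public; the phone numbers, cell names and hashing scheme are invented for illustration. The point is that a pseudonym derived deterministically from a phone number can be inverted by exhaustively hashing the small space of possible numbers.

```python
import hashlib

# Hypothetical sketch only: NOT the actual telecom/CBS pipeline.
# Pseudonymization replaces an identifier with a stable token,
# here an (unsalted) SHA-256 hash of the phone number.
def pseudonymize(phone_number: str) -> str:
    return hashlib.sha256(phone_number.encode()).hexdigest()[:16]

# Two fictitious call records: (phone number, cell tower).
records = [("0612345678", "cell-A12"), ("0687654321", "cell-B07")]
pseudonymized = [(pseudonymize(number), cell) for number, cell in records]

# Because Dutch mobile numbers span a small keyspace (06 + 8 digits,
# i.e. at most 10^8 values), an attacker can precompute the pseudonym
# of every candidate number and invert the mapping by simple lookup.
# For brevity we search only a small slice of that space here.
lookup = {pseudonymize(f"06{i:08d}"): f"06{i:08d}"
          for i in range(12_340_000, 12_350_000)}
recovered = [lookup.get(pseudonym) for pseudonym, _ in pseudonymized]
print(recovered)  # → ['0612345678', None]: the first number is recovered
```

Moreover, because the pseudonym is stable over time, all records of one person remain linkable to each other even without this reversal step, which is precisely why such data are "pseudonymous" rather than anonymous under the GDPR.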
Lack of transparency and independent oversight
- Up until now, the Privacy Impact Assessment (PIA) of the proposal has not been made public.
- There is no independent oversight on the measures and effects (by a judge or an independent commission).
- The GDPR may be applicable to the proposal only partially as anonymous data and statistics are exempt from the GDPR. This gives rise to new risks of data misuse, poor digital protection, data breaches, etc. General privacy principles should therefore be made applicable in any case.
Structural changes and chilling effect
- This proposal seems to be temporary, but the history of similar legislation shows that it will most likely become permanent.
- Regardless of the ‘anonymization’ of various data, this proposal will make many people feel like they are being monitored, which in turn will make them behave unnaturally. The risk of a societal chilling effect is huge.
Faulty method with a significant impact
- The effectiveness of the legislative proposal is unknown. In essence, it constitutes a large scale experiment. However, Dutch society is not meant to be a living laboratory.
- By means of data fusion, it appears that individuals could still be identified on the basis of anonymous data. Even at the chosen threshold of 15 units per data point, the risk of unique singling out and identification is likely still too large.
- The proposal will lead to false signals and blind spots due to people with several telephones as well as vulnerable groups without telephones, etc.
- There is a large risk of function creep, of surreptitious use and misuse of data (including the international exchange thereof) by other public services (including the intelligence services) and future public authorities.
- This proposal puts pressure not just on the right to privacy, but on other human rights as well, including the right to freedom of movement and the right to demonstrate. The proposal can easily lead to structural crowd control that does not belong in a democratic society.
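The risk of "singling out" despite aggregation, mentioned in the list above, can be illustrated with a hedged, hypothetical sketch of a classic differencing attack: even when every released aggregate meets the 15-unit threshold, the difference between two overlapping aggregates can refer to exactly one person. All numbers and names below are invented for illustration and do not describe the proposal's actual query interface.

```python
# Hypothetical sketch of a differencing attack on aggregated counts.
# Assumption: counts below the proposal's threshold of 15 are suppressed.
THRESHOLD = 15

def release(count: int):
    """Release a count only if it meets the threshold; otherwise suppress it."""
    return count if count >= THRESHOLD else None

# Invented example: 16 phones were in cell X during hour H. Suppose an
# adversary can also obtain the count for the same cell with one sub-area
# excluded, a sub-area in which only the target's phone is registered.
phones_in_cell = release(16)            # 16 >= 15: released
phones_excluding_subarea = release(15)  # 15 >= 15: also released

# Both aggregates pass the threshold individually, yet their
# difference refers to exactly one device: the target's.
if phones_in_cell is not None and phones_excluding_subarea is not None:
    print("devices singled out:", phones_in_cell - phones_excluding_subarea)
```

This is why a fixed minimum-count threshold alone is widely regarded as insufficient protection against re-identification once an adversary can combine overlapping releases or auxiliary data.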
Specific prior consent
Quite apart from the above concerns and risks, Privacy First doubts whether the use of telecom data by telecom providers, as envisaged by the legislative proposal, is lawful in the first place. In the view of Privacy First, this would require either explicit, specific and prior consent (opt-in) from customers, or the possibility for them to opt-out at a later stage and to have the right to have all their data removed.
It is up to you as Members of Parliament to protect our society from this legislative proposal. If you fail to do so, Privacy First reserves the right to take legal action against this law.
The Privacy First Foundation
Yesterday, there was a hearing in the Dutch House of Representatives in which the by now notorious Corona app was critically discussed. The House had invited various experts and organizations (including Privacy First) to submit position papers and take part in the hearing. Below is the full text of our position paper, as well as the text that was read out at the hearing. A video of the entire hearing (in Dutch) can be found HERE. Click HERE for the program, all speakers and position papers.
Dear Members of Parliament,
Thank you kindly for your invitation to take part in this roundtable discussion about the so-called Corona app. In the view of Privacy First, apps like these are a threat to everyone’s privacy. We will briefly clarify this below.
Lack of necessity and effectiveness
With great concern, Privacy First has taken note of the intention of the Dutch government to employ a contact tracing app in the fight against the coronavirus. Thus far, the social necessity of such apps has not been proven, while the experience of other countries gives grounds to seriously doubt their benefit and effectiveness. In fact, these apps may even be counterproductive, as their use leads to a false sense of safety. Moreover, it is very hard to involve the most vulnerable group of people (the elderly) by this means. This alone should be reason enough to refrain from using Corona apps.
In Privacy First’s view, the use of such apps is a dangerous development because it could lead to stigmatization and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when ‘anonymized’, the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of, and which would lead to an imminent societal chilling effect.
Risks of misuse
There is a significant risk that the collected data will be used for multiple purposes (function creep) and be misused by both companies and public authorities. The risk of surreptitious access, hacking, data breaches and misuse is substantial, particularly in the case of central instead of decentralized (on-device) storage, and when the software is not open source. However, not even on-device storage offers any guarantee against misuse, malware and spyware, nor does it make users less dependent on technical vulnerabilities. Moreover, if the data fall into the hands of criminal organizations, they will be a gold mine for criminal activities.
For Privacy First, the risks of Corona apps do not outweigh their presumed benefits. Therefore, Privacy First advises the House to urge the cabinet not to proceed with the introduction of such apps.
Testing instead of apps
According to Privacy First, there is a better and more effective solution in the fight against the coronavirus, one based on the principles of proportionality and subsidiarity: large-scale testing of people to learn about infection rates and immunity. To this end, the necessary test capacity should become available as soon as possible.
Haste is rarely a good thing
If, despite all the above-mentioned objections, it is nevertheless decided that there is going to be a Corona app, then this should come about only after a careful social and democratic process with sufficiently critical, objective and independent scrutiny. Judging by the developments of the past few days, this has not been the case so far. In this context, Privacy First recommends that the House call on the cabinet to put its plans on ice and impose a moratorium on the use of Corona apps.
Privacy by design
The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional state. Any democratic decision to nullify this right is simply unacceptable. If ‘Corona apps’ are indeed deployed on a wide scale, then at the very least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. If these conditions are not met, Privacy First will not hesitate to bring proceedings before a court.
The Privacy First Foundation
Dear Members of Parliament,
You have received our position paper; this is our oral explanation.
First of all: Privacy First is firmly against any form of surveillance infrastructure, with or without apps.
With this in mind, we look at three legal principles:
- Legitimate purpose limitation. What is the problem? What is the scale of the problem? What are possible objectives, how can we achieve these objectives, and how can we measure progress towards them?
It is already impossible to answer the first question, as we currently test only partially and selectively. The total infected population is unknown; the number of people who have recovered is also unknown and goes unreported. There is, however, fearmongering as a result of emotions and selective reporting: deaths with multiple causes (dying with as opposed to dying from Corona) and admissions to critical care units.
Let us be clear: we will first have to map out the causes of this problem before we can draw conclusions and talk about solutions. This should involve not only IT professionals and virologists, but to no lesser extent philosophers, legal scholars, sociologists, entrepreneurs and others who represent society.
- Necessity and proportionality. In terms of test capacity, critical care units, medical materials and medical personnel, we essentially have a capacity problem. So there is no doubt in our mind about what we should be focusing on, also in view of future outbreaks: testing the entire population in order to tell who is infected and who is immune, and thus be able to determine the real problem. 97% of the population is unaffected. Ensure separation and proper care for high-risk groups. Halt crisis communication and start crisis management. Take all treatment methods seriously, including those that are not profitable for Big Pharma and Big Tech.
- Subsidiarity. Once we know the problem, we may ask what the solutions are. Additional personnel at municipal health centers? Building a critical care unit hospital specifically for situations like these? Increasing the test capacity in order to be able to take decisions based on figures? All of this is possible within our current health system, with the general practitioner as the first point of contact.
On the basis of trust, we have given our government six weeks to get its act together. And what do we get in return? Distrust and monitoring tools. And still shortages of medical equipment. So, fix the fundamentals, deal with the treatment and test capacity, and stop building new technological gadgets and the kind of draconian apps used by dictatorial regimes in Asia. And take the Netherlands out of this prolonged lockdown as soon as possible. Privacy First is opposed to a ‘1.5-meter society’ as the new normal, and is instead in favor of a common-sense society based on trust in mature citizens.
With great concern, Privacy First has taken note of the intention of the Dutch government to employ special apps in the fight against the coronavirus. In Privacy First’s view, the use of such apps is a dangerous development because it could lead to stigmatization and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when ‘anonymized’, the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of, and which would lead to an imminent societal chilling effect. Furthermore, there is a substantial risk that the collected data will be used and misused for multiple (illegitimate) purposes by companies and public authorities. Moreover, if these data fall into the hands of criminal organizations, they will be a gold mine for criminal activities. For Privacy First, these risks of Corona apps do not outweigh their presumed benefits.
The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional State. Any democratic decision to nullify this right is simply unacceptable. If ‘Corona apps’ are indeed deployed on a wide scale, then at the very least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. If these conditions are not met, Privacy First will not hesitate to bring proceedings before a court.
The world is hit exceptionally hard by the coronavirus. This pandemic is not only a health hazard, but can also lead to a human rights crisis, endangering privacy among other rights.
The right to privacy includes the protection of everyone’s private life, personal data, confidential communication, home inviolability and physical integrity. Privacy First was founded to protect and promote these rights. Not only in times of peace and prosperity, but also in times of crisis.
Now more than ever, it is vital to stand up for our social freedom and privacy. Fear should not play a role in this. However, various countries have introduced draconian laws, measures and infrastructures. Much is at stake here, namely preserving everyone’s freedom, autonomy and human dignity.
Privacy First monitors these developments and reacts proactively as soon as governments are about to take measures that are not strictly necessary and proportionate. In this respect, Privacy First holds that the following measures are in essence illegitimate:
- Mass surveillance
- Forced inspections in the home
- Abolition of anonymous or cash payments
- Secret use of camera surveillance and biometrics
- Every form of infringement on medical confidentiality
Privacy First will see to it that justified measures will only apply temporarily and will be lifted as soon as the Corona crisis is over. It should be ensured that no new, structural and permanent emergency legislation is introduced. While the measures are in place, effective legal means should remain available and privacy supervisory bodies should remain critical.
Moreover, in order to control the coronavirus effectively, we should rely on the individual responsibility of citizens. Much is possible on the basis of voluntariness and individual, fully informed, specific and prior consent.
As always, Privacy First is prepared to assist in the development of privacy-friendly policies and of solutions based on privacy by design, preferably in collaboration with relevant organizations and experts. Especially in these times, the Netherlands (and the European Union) can become an international point of reference when it comes to fighting a pandemic while preserving democratic values and the right to privacy. Only in this way can we prevent the Corona crisis from lastingly weakening our world, and instead emerge from it stronger together.