Yesterday, there was a hearing in the Dutch House of Representatives in which the by now notorious Corona app was critically discussed. The House had invited various experts and organizations (among which Privacy First) to submit position papers and take part in the hearing. Below is both the full text of our position paper, as well as the text which was read out at the hearing. A video of the entire hearing (in Dutch) can be found HERE. Click HERE for the program, all speakers and position papers.
Dear Members of Parliament,
Thank you kindly for your invitation to take part in this roundtable discussion about the so-called Corona app. In the view of Privacy First, apps like these are a threat to everyone’s privacy. We will briefly clarify this below.
Lack of necessity and effectiveness
With great concern, Privacy First has taken note of the Dutch government's intention to employ a contact tracing app in the fight against the coronavirus. Thus far, the social necessity of such apps has not been proven, while the experience of other countries gives serious grounds to doubt their benefit and effectiveness. In fact, these apps may even be counterproductive, as their use leads to a false sense of safety. Moreover, it is very hard to reach the most vulnerable group of people (the elderly) by these means. This alone should be reason enough to refrain from using Corona apps.
In Privacy First's view, the use of such apps is a dangerous development: it could lead to stigmatization and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when 'anonymized', the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of, leading to an imminent societal chilling effect.
Risks of misuse
There is a significant risk that the collected data will be used for multiple purposes (function creep) and be misused by both companies and public authorities. The risk of surreptitious access, hacking, data breaches and misuse is substantial, particularly in the case of centralized instead of decentralized (on-device) storage, as well as a lack of open-source software. However, not even on-device storage offers any guarantee against misuse, malware and spyware, nor does it make users any less exposed to technical vulnerabilities. Moreover, if the data fall into the hands of criminal organizations, they will be a gold mine for criminal activities.
For Privacy First, the risks of Corona apps do not outweigh their presumed benefits. Therefore, Privacy First advises the House to urge the cabinet not to proceed with the introduction of such apps.
Testing instead of apps
According to Privacy First, there is a better and more effective solution in the fight against the coronavirus, one based on the principles of proportionality and subsidiarity: large-scale testing of people to learn about infection rates and immunity. To this end, the necessary test capacity should become available as soon as possible.
Haste is rarely a good thing
If, despite all the above-mentioned objections, it is decided that there is going to be a Corona app after all, then this should come about only through a careful social and democratic process with sufficiently critical, objective and independent scrutiny. Judging by the developments of the past few days, this has not been the case so far. In this context, Privacy First recommends that the House call on the cabinet to put its plans on ice and impose a moratorium on the use of Corona apps.
Privacy by design
The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional state. Any democratic decision to nullify this right is simply unacceptable. If 'Corona apps' are indeed deployed widely, then at least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent given without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. Should these conditions not be met, Privacy First will not hesitate to bring proceedings before a court.
The Privacy First Foundation
Dear Members of Parliament,
You have received our position paper; what follows is our oral explanation.
First of all: Privacy First is firmly against any form of surveillance infrastructure, with or without apps.
With this in mind, we look at three legal principles:
- Legitimate purpose limitation. This raises three questions:
  - What is the problem?
  - What is the scale of the problem?
  - What are possible objectives, how can we achieve these objectives, and how can we measure progress towards them?
It is already impossible to answer the first question, as we currently test only partially and selectively. The total infected population is unknown; the number of people who have recovered is unknown as well and goes unreported. What we do have is fearmongering, driven by emotions and selective reporting: deaths with multiple causes (dying with as opposed to dying from Corona) and admissions to critical care units.
Let us be clear: we will first have to map out the causes of this problem before we can draw conclusions and talk about solutions. Not only IT professionals and virologists should be involved in this; to no lesser extent we need philosophers, legal scholars, sociologists, entrepreneurs and others who represent society.
- Necessity and proportionality. In terms of test capacity, critical care units, medical materials and medical personnel, we essentially have a capacity problem. So there is no doubt in our mind what we should be focusing on, also in view of future outbreaks: testing the entire population in order to tell who is infected and who is immune, and thereby determine the real problem. 97% of the population is unaffected. Ensure separation and proper care for high-risk groups. Halt crisis communication and start crisis management. Take all treatment methods seriously, including those that are not profitable for Big Pharma and Big Tech.
- Subsidiarity. Once we know the problem, we may ask what the solutions are. Additional personnel at municipal health centers? Building a critical care unit hospital specifically for situations like these? Increasing the test capacity in order to be able to take decisions based on figures? All of this is possible within our current health system, with the general practitioner as the first point of contact.
On the basis of trust, we have given our government six weeks to get its act together. And what do we get in return? Distrust and monitoring tools. And still shortages of medical equipment. So fix the fundamentals, deal with treatment and test capacity, and stop building new technological gadgets and draconian apps of the kind used by dictatorial regimes in Asia. And take the Netherlands out of this prolonged lockdown as soon as possible. Privacy First is opposed to a '1.5-meter society' as the new normal, and is instead in favor of a common-sense society based on trust in mature citizens.
With great concern, Privacy First has taken note of the Dutch government's intention to employ special apps in the fight against the coronavirus. In Privacy First's view, the use of such apps is a dangerous development: it could lead to stigmatization and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when 'anonymized', the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of, leading to an imminent societal chilling effect. Furthermore, there is a substantial risk that the collected data will be used and misused for multiple (illegitimate) purposes by companies and public authorities. Moreover, if these data fall into the hands of criminal organizations, they will be a gold mine for criminal activities. For Privacy First, these risks of Corona apps do not outweigh their presumed benefits.
The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional state. Any democratic decision to nullify this right is simply unacceptable. If 'Corona apps' are indeed deployed widely, then at least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent given without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. Should these conditions not be met, Privacy First will not hesitate to bring proceedings before a court.
The world is hit exceptionally hard by the coronavirus. This pandemic is not only a health hazard, but can also lead to a human rights crisis, endangering privacy among other rights.
The right to privacy includes the protection of everyone’s private life, personal data, confidential communication, home inviolability and physical integrity. Privacy First was founded to protect and promote these rights. Not only in times of peace and prosperity, but also in times of crisis.
Now more than ever, it is vital to stand up for our social freedom and privacy. Fear should not play a role in this. However, various countries have introduced draconian laws, measures and infrastructures. Much is at stake here, namely preserving everyone’s freedom, autonomy and human dignity.
Privacy First monitors these developments and reacts proactively as soon as governments are about to take measures that are not strictly necessary and proportionate. In this respect, Privacy First holds that the following measures are in essence illegitimate:
- Mass surveillance
- Forced inspections in the home
- Abolition of anonymous or cash payments
- Secret use of camera surveillance and biometrics
- Every form of infringement on medical confidentiality.
Privacy First will see to it that justified measures apply only temporarily and are lifted as soon as the Corona crisis is over. It must be ensured that no new, structural and permanent emergency legislation is introduced. While the measures are in place, effective legal remedies should remain available and privacy supervisory bodies should remain critical.
Moreover, in order to control the coronavirus effectively, we should rely on the individual responsibility of citizens. Much is possible on the basis of voluntariness and individual, fully informed, specific and prior consent.
As always, Privacy First is prepared to assist in the development of privacy-friendly policies and solutions based on privacy by design, preferably in collaboration with relevant organizations and experts. Especially in these times, the Netherlands (and the European Union) can become an international point of reference for fighting a pandemic while preserving democratic values and the right to privacy. Only in this way will the Corona crisis not lastingly weaken our world; instead, we will emerge from it stronger together.
On July 1 and 2, 2019, the Netherlands will be examined in Geneva by the United Nations Human Rights Committee. This UN body is tasked with supervising compliance with one of the oldest and most important human rights treaties in the world: the International Covenant on Civil and Political Rights (ICCPR). Each country that is a contracting party to the ICCPR is subject to periodic review by the UN Human Rights Committee. At the beginning of next week, the Dutch government must answer before the Committee for various current privacy issues that have been put on the agenda by Privacy First, among others.
The previous Dutch session before the UN Human Rights Committee dates back to July 2009, when Dutch minister of Justice Ernst Hirsch Ballin had to answer for the then proposed central storage of fingerprints under the new Dutch Passport Act. This drew considerable criticism of the Dutch government. Now, ten years on, the situation in the Netherlands will be examined once more. Against this background, Privacy First submitted a critical report (pdf) to the Committee at the end of 2016, and recently supplemented it with a new report (pdf). In a nutshell, Privacy First has brought the following current issues to the attention of the Committee:
- the limited admissibility of interest groups in class action lawsuits
- the Dutch ban on judicial review of the constitutionality of laws
- Automatic Number Plate Recognition (ANPR)
- border control camera system @MIGO-BORAS
- the Dutch public transport chip card ('OV-chipkaart')
- Electronic Health Record systems
- possible reintroduction of the Telecommunications Data Retention Act
- the new Dutch Intelligence and Security Services Act (‘Tapping Law’)
- Passenger Name Records (PNR)
- the Dutch abolition of consultative referendums
- the Dutch non-recognition of the international prohibition of propaganda for war.
The entire Dutch session before the Committee can be watched live on UN Web TV on Monday afternoon, July 1, and Tuesday morning, July 2. In addition to privacy issues, several Dutch organizations have put numerous other human rights issues on the agenda of the Committee; click HERE for an overview, which also features the previously established List of Issues (including the new Intelligence and Security Services Act, the possible reintroduction of the retention of telecommunications data, camera system @MIGO-BORAS, and medical confidentiality with health insurance companies). The Committee will likely present its ‘Concluding Observations’ within a matter of weeks. Privacy First awaits the outcome of these observations with confidence.
Update July 26, 2019: yesterday afternoon the Committee published its Concluding Observations on the human rights situation in the Netherlands, which include critical opinions on two privacy issues that were brought to the attention of the Committee by Privacy First:
The Intelligence and Security Services Act
The Committee is concerned about the Intelligence and Security Act 2017, which provides intelligence and security services with broad surveillance and interception powers, including bulk data collection. It is particularly concerned that the Act does not seem to provide for a clear definition of bulk data collection for investigation related purpose; clear grounds for extending retention periods for information collected; and effective independent safeguards against bulk data hacking. It is also concerned by the limited practical possibilities for complaining, in the absence of a comprehensive notification regime to the Dutch Oversight Board for the Intelligence and Security Services (CTIVD) (art. 17).
The State party should review the Act with a view to bringing its definitions and the powers and limits on their exercise in line with the Covenant and strengthen the independence and effectiveness of CTIVD and the Committee overseeing intelligence efforts and competences that has been established by the Act.
The Market Healthcare Act
The Committee is concerned that the Act to amend the Market Regulation (Healthcare) Act allows health insurance company medical consultants access to individual records in the electronic patient registration without obtaining a prior, informed and specific consent of the insured and that such practice has been carried out by health insurance companies for many years (art. 17).
The State party should require insurance companies to refrain from consulting individual medical records without a consent of the insured and ensure that the Bill requires health insurance companies to obtain a prior and informed consent of the insured to consult their records in the electronic patient registration and provide for an opt-out option for patients that oppose access to their records.
During the session in Geneva, the abolition of the referendum and the camera system @MIGO-BORAS were also critically examined. However, Privacy First regrets that the Committee makes no mention of these and various other current issues in its Concluding Observations. Nevertheless, the Committee's report shows that privacy is ever higher on the agenda of the United Nations. Privacy First welcomes this development and will continue to encourage the Committee to pursue this path in the coming years. Moreover, Privacy First will see to it that the Netherlands indeed implements the Committee's various recommendations.
Today an important debate will take place in the Dutch House of Representatives about the introduction of Passenger Name Records (PNR): the large-scale, years-long storage of all sorts of data on airline passengers, supposedly to fight crime and terrorism. Privacy First has major objections and sent the following letter to the House at the end of last week. The parliamentary debate was originally scheduled for 14 May 2018, but was postponed until further notice (following a similar letter from Privacy First). After new parliamentary questions, the debate will now take place today after all. Here is the full text of our most recent letter:
Dear Members of the House of Representatives,
On Monday afternoon, this 11 March, you will discuss the Dutch implementation of the European directive on Passenger Name Records (PNR) with minister Grapperhaus (Justice and Security). In Privacy First’s view, both the European PNR directive as well as the Dutch implementation thereof are legally untenable. We shall here briefly elucidate our position.
Under the minister's legislative proposal concerning PNR, numerous data on every single airline passenger travelling to or from the Netherlands will be stored for five years in a central government database of the new Passenger Information Unit and used to prevent, investigate and prosecute crimes and terrorism. Sensitive personal data (such as names, addresses, telephone numbers, email addresses, dates of birth, travel data, ID document numbers, destinations, fellow passengers and payment data) of many millions of passengers will, as a result, become available for many years for the purpose of data mining and profiling. In essence, this means that every airline passenger will be treated as a potential criminal or terrorist. In 99.9% of all cases, however, this concerns perfectly innocent citizens, mainly holidaymakers and business travellers. This is a flagrant breach of their right to privacy and freedom of movement. Last year, Privacy First already made these arguments in the Volkskrant and on BNR Nieuwsradio.

Because of privacy objections, there has been a lot of political resistance in recent years to such large-scale storage of PNR data, which has been rejected by both the House of Representatives and the European Parliament on several occasions since 2010. In 2015, Dutch ruling parties VVD and PvdA were absolutely opposed to PNR as well. Back then, they called it a 'holiday register' and themselves threatened to take the matter to the European Court of Justice should the PNR directive be adopted. After the attacks in Paris and Brussels, however, many political restraints seemed to have evaporated, and in 2016 the PNR directive finally came about after all. To this day, however, the legally required necessity and proportionality of this directive have yet to be demonstrated.
In the summer of 2017, the European Court of Justice issued an important ruling with regard to the similar PNR agreement between the EU and Canada. The Court declared this agreement invalid because it violates the right to privacy. Among other things, the Court held that the envisaged agreement must "limit the retention of PNR data after the air passengers' departure to that of passengers in respect of whom there is objective evidence from which it may be inferred that they may present a risk in terms of the fight against terrorism and serious transnational crime." (See Opinion 1/15 (26 July 2017), par. 207.) Ever since this ruling, the European PNR directive has been on legally uncertain ground. The Dutch government therefore has valid "concerns about the future viability of the PNR directive" (see Note in response to report, p. 23, in Dutch). Privacy First expects that the current PNR directive will soon be submitted to the European Court of Justice for judicial review and will then be declared unlawful. A situation will then arise similar to the one we witnessed a few years ago with regard to European telecommunications data retention: as soon as that European directive was annulled, the Dutch implementing legislation was likewise invalidated in interim injunction proceedings.
The current Dutch PNR legislative proposal seems unlawful a priori because of a lack of demonstrable necessity, proportionality and subsidiarity. The legislative proposal comes down to mass surveillance of mostly innocent citizens; in the 2016 Tele2 case, the European Court already ruled that this type of legislation is unlawful. Thereupon the Netherlands pledged before the UN Human Rights Council "to ensure that the collection and maintenance of data for criminal [investigation] purposes does not entail massive surveillance of innocent persons." The Netherlands now seems to renege on that promise. After all, a lot of completely unnecessary data on every airline passenger will be stored for years and can be used by various Dutch, European and even non-European government agencies. Moreover, the effectiveness of PNR has never been demonstrated to date, as the minister himself affirmed: "There is no statistical support" (see Note in response to report, p. 8, in Dutch). The risk of unjust suspicion and discrimination (due to fallible algorithms used for profiling) under the proposed PNR system is serious, which also increases the likelihood of delays and missed flights for innocent passengers. All the while, wanted persons will often stay under the radar and choose alternative travel routes. Furthermore, the legislative proposal entirely fails to address the role and capabilities of the secret services, which will be granted secret and shielded access to the central PNR database under the new Dutch Intelligence and Security Services Act. The most questionable aspect of the Dutch PNR legislative proposal, however, is that it goes two steps further than the European PNR directive itself: it is the Dutch government's own decision to also store the data of passengers on all intra-EU flights. This is not obligatory under the PNR directive, and the Netherlands could have limited collection to preselected (high-risk) flights only. That would have been in line with the advice of most experts in this field, who argue for targeted action as opposed to mass surveillance. In other words: focus on persons against whom there is a reasonable suspicion, in accordance with the principles of our democracy under the rule of law.
Privacy First Advice
Privacy First strongly advises you to reject the current legislative proposal and to replace it with a privacy-friendly version. Should this lead to the European Commission referring the Netherlands to the European Court of Justice for failing to implement the present PNR directive, Privacy First is confident this would end in a clear victory for the Netherlands. EU Member States simply cannot be expected to implement privacy-violating EU rules. The same applies to the national implementation of relevant UN Security Council resolutions (in this case UNSC Res. 2396 (2017)), which is similarly at odds with international human rights law. In this respect, Privacy First has already warned of the abuse of the Dutch TRIP system (which is also used for PNR) by other UN Member States. In this regard, the Netherlands has its own responsibility under the Dutch Constitution as well as under international law.
Privacy First Foundation
Update 19 March 2019: Regrettably, the House of Representatives today adopted the legislative proposal almost unchanged; only GroenLinks, SP, PvdD and Denk voted against. Unfortunately, a motion by GroenLinks and SP to provoke legal action by the European Commission against the Dutch government over the PNR directive was rejected. The only bright spot is the widely adopted motion calling for judicial reassessment and possible revision of the PNR directive at the European political level. (Only PVV and FvD voted against this motion.) Next stop: the Senate.
Update 4 June 2019: despite our sending the above letter a second time and despite other critical input by Privacy First, the Senate today unfortunately adopted the legislative proposal. Only GroenLinks, PvdD and SP voted against. This despite the enormous error rates (false positives) of 99.7% that recently came to light in the comparable German PNR system, see https://www.sueddeutsche.de/digital/fluggastdaten-bka-falschtreffer-1.4419760. Meanwhile, large-scale cases have been brought against the European PNR directive in Germany and Austria in order for the European Court of Justice to nullify it on account of violations of the right to privacy; see the German-English campaign website https://nopnr.eu and https://www.nrc.nl/nieuws/2019/05/15/burgers-in-verzet-tegen-opslaan-passagiersgegevens-a3960431. As soon as the European Court rules that the PNR directive is unlawful, Privacy First will start interim injunction proceedings to render the Dutch PNR law inoperative. Moreover, yesterday Privacy First put the PNR law on the agenda of the UN Human Rights Committee in Geneva. On 1 and 2 July 2019, the overall human rights situation in the Netherlands (including violations of the right to privacy) will be critically reviewed by this Committee.