A coalition of civil rights organizations in the Netherlands that had previously won a lawsuit against System Risk Indication (SyRI) is calling on the Dutch Senate to reject an even more sweeping Bill dubbed ‘Super SyRI’. According to the parties, the proposal is on a collision course with the rule of law while the Dutch government refuses to learn lessons from the childcare benefits scandal, one of the largest scandals in Dutch politics in recent decades.
The Data Processing by Partnerships Act (Wet Gegevensverwerking door Samenwerkingsverbanden, WGS) enables Dutch government agencies and companies to link together the data stored about citizens and companies through partnerships. Public authorities and companies that take part in such cooperative frameworks are obliged to pool together their data. This should help in the fight against all kinds of crime and offenses.
Under the Act, it is not just data that companies and public authorities share with each other. Signals, suspicions and blacklists are also exchanged and linked together. On the basis of this form of shadow record-keeping, these parties can coordinate enforcement ‘interventions’ with each other against citizens who end up in their crosshairs.
Public authorities and companies targeting citizens through data surveillance
In order to enable the large-scale sharing of personal data between public authorities and companies, the Act casts aside numerous confidentiality obligations, privacy rights and legal safeguards that have traditionally applied to the processing of personal data. This leads to a "far-reaching, large-scale erosion of the legal protection of citizens", according to the opposing coalition of which Privacy First is a member: "If this Bill is adopted, the door will be left wide open for the executive branch of the government and private parties to subject both citizens and companies to arbitrary data surveillance."
Through the Act, the Dutch government also wants to create the possibility of setting up new partnerships in cases of ‘urgency’, without giving Parliament the opportunity to examine them beforehand. The Dutch House of Representatives would be informed of such partnerships only after their establishment, and would then have to decide whether to enshrine them in law. This is contrary to the Dutch Constitution, which stipulates that restrictions on the right to privacy must be laid down in legislation approved by Parliament. The parties find it unacceptable that Parliament is not involved in the formation of new partnerships and can decide on them only after the fact.
Legitimizing unlawful practices that have lasted for years
In addition to the possibility of establishing new partnerships, the Act includes four partnerships that have been around for years, but have never been laid down in law. The cabinet now wants to retroactively create a legal basis for these partnerships.
The parties that brought legal proceedings against System Risk Indication (SyRI) point out that SyRI, which was prohibited by the court, was also used for years without a legal basis. According to the parties, there are strong similarities with the partnerships that the new Bill is now intended to legitimize: "Drastic practices in which personal data are processed in violation of the fundamental rights of citizens were set up as a trial and continued for years, only to be given a legal basis as a fait accompli. Fundamental rights that should protect citizens against unjustified government action thereby become mere obstacles for the government to overcome."
Risk assessments, blacklists and suspicions
The coalition previously wrote that the practices under the Act are in many ways similar to the data processing that preceded the childcare benefits scandal that sent shock waves through Dutch society. Based on secret data analyses, lists of citizens who had been falsely labeled by the tax authorities as criminal fraudsters were distributed through various agencies, ruining the personal lives of tens of thousands of families. Under the partnerships that would be made possible by the Act, public authorities and companies would be able to abundantly share risk analyses, blacklists and many other types of data, suspicions and signals about citizens. The Dutch Data Protection Authority advised the Senate in November 2021 not to pass the law, stating that the proposal could lead to "Kafkaesque situations for large numbers of people".
The civil society coalition against SyRI consists of the Dutch Civil Rights Platform (Platform Bescherming Burgerrechten), the Dutch Lawyers Committee for Human Rights (NJCM), Dutch trade union FNV, the Dutch National Clients Council, Privacy First, the KDVP Foundation and authors Maxim Februari and Tommy Wieringa.
Download the recent letter by the coalition to the Dutch Senate HERE (pdf in Dutch).
Source: https://bijvoorbaatverdacht.nl/syri-coalitie-eerste-kamer-moet-datasurveillancewet-super-syri-afwijzen/, 15 February 2022.
Summary proceedings against massive privacy violation by Automatic Number Plate Recognition (ANPR) camera surveillance
Challenging large-scale privacy violations in court has long been Privacy First’s established practice. In recent years, Privacy First has successfully done so against the central storage in the Netherlands of everyone’s fingerprints under the Dutch Passport Act, against the storage of everyone’s communications data under the Dutch Telecommunications Data Retention Act and – in coalition with other parties – against large-scale risk profiling of innocent citizens through the Dutch System Risk Indication (SyRI).
A current and urgent issue that equally merits going to court over is the Dutch legislation on Automatic Number Plate Recognition (ANPR), in force since 2019 under Art. 126jj of the Dutch Code of Criminal Procedure. Under this law, the number plates of millions of cars in the Netherlands (i.e. everyone’s travel movements) are continuously stored for four weeks in a central police database for criminal investigation purposes, regardless of whether one is suspected of anything. This is totally unnecessary, completely disproportionate and also ineffective, as was revealed in evaluation reports published today by the Dutch Research and Documentation Center (‘WODC’, part of the Dutch Ministry of Justice and Security). Supervision is lacking and the system can easily be abused, as newspaper NRC Handelsblad recently confirmed in its reporting.
Privacy First has therefore prepared a lawsuit to have the ANPR legislation repealed on account of violation of European privacy law. Summary proceedings against the Dutch government will take place at the district court of The Hague on 10 November 2021. Through Pro Bono Connect, Privacy First has engaged CMS as the law firm that will take care of the litigation in this case. Our summons in summary proceedings can be found HERE (pdf in Dutch). If necessary, these preliminary proceedings will be followed by broader proceedings on the merits. After all, there is no doubt that the current ANPR law constitutes a massive privacy violation and simply does not belong in a free democratic society. Considering the relevant European case law, Privacy First deems the likelihood of successful legal action very high.
Case details: Privacy First vs. the State (Dutch Ministry of Justice and Security), Wednesday 10 November 2021 11.00 am, The Hague district court. You are welcome to attend the court hearing. A route description in Dutch can be found here.
Update November 8, 2021: due to Corona restrictions, it appears that the court is only willing to allow two (already registered) visitors at the court hearing. However, due to high public interest, there will be a livestream: https://www.rechtspraak.nl/Organisatie-en-contact/Organisatie/Rechtbanken/Rechtbank-Den-Haag/Nieuws/Paginas/Livestream-rechtszaak-stichting-Privacy-First-tegen-de-Staat.aspx.
Update November 10, 2021: the court hearing took place today; click HERE for our lawyer's pleading (pdf in Dutch). The court's ruling is scheduled for December 1st.
Update December 1, 2021: today the district court of The Hague rendered its judgment. The court first of all established that Privacy First is admissible in this case as a non-profit interest group for the protection of the privacy of all citizens in the Netherlands. This confirms once again that Privacy First can conduct these and subsequent legal proceedings in the public interest. Subsequently, however, the court ruled that these preliminary relief proceedings lacked a sufficiently urgent interest. Privacy First finds this judgment incomprehensible: a massive privacy violation that takes place daily by definition creates an urgent interest in having that violation legally reviewed and stopped. Privacy First will now commence proceedings on the merits against the ANPR legislation and is also considering lodging an urgent appeal against the current judgment with the Court of Appeal of The Hague. In view of relevant European case law, Privacy First still considers the chances of successful legal action exceptionally high.
The ANPR legislation at issue in Privacy First's lawsuit relates to the mass collection and storage of everyone's "historical" ANPR data, also known as "no hits". This should be distinguished from the many years of police practice where license plates of suspects (so-called "hits") can be used for criminal investigations. Dutch media are regularly confused about this as a result of misleading government information, for example on the websites of the Dutch National Police and the Public Prosecution Service. Privacy First regrets such deception and hopes that the media will not be misled by this.
Would you like to support these legal proceedings? Then please consider becoming a donor! Privacy First consists largely of volunteers and is entirely dependent on sponsorship and donations to pursue litigation.
Yesterday, there was a hearing in the Dutch House of Representatives in which the by now notorious Corona app was critically discussed. The House had invited various experts and organizations (among which Privacy First) to submit position papers and take part in the hearing. Below is both the full text of our position paper, as well as the text which was read out at the hearing. A video of the entire hearing (in Dutch) can be found HERE. Click HERE for the program, all speakers and position papers.
Dear Members of Parliament,
Thank you kindly for your invitation to take part in this roundtable discussion about the so-called Corona app. In the view of Privacy First, apps like these are a threat to everyone’s privacy. We will briefly clarify this below.
Lack of necessity and effectiveness
With great concern, Privacy First has taken note of the intention of the Dutch government to employ a contact tracing app in the fight against the coronavirus. Thus far, the social necessity of such apps has not been proven, while the experience of other countries indicates there is ground to seriously doubt their benefit and effectiveness. In fact, these apps may even be counterproductive as their use leads to a false sense of safety. Moreover, it’s very hard to involve the most vulnerable group of people (the elderly) through this means. This should already be enough reason to refrain from using Corona apps.
In Privacy First’s view, the use of such apps is a dangerous development because it could lead to stigmatization and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when ‘anonymized’, the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, the result will be a surveillance society in which everyone is continuously monitored – something people will be acutely aware of, with an imminent societal chilling effect as a consequence.
Risks of misuse
There is a significant risk that the collected data will be used for multiple purposes (function creep) and be misused by both companies and public authorities. The risk of surreptitious access, hacking, data breaches and misuse is substantial, particularly in the case of central rather than decentralized (personal) storage, and where open source software is lacking. However, not even personal storage offers any guarantee against misuse, malware and spyware, nor does it make users less dependent on technical vulnerabilities. Moreover, if the data fall into the hands of criminal organizations, they will be a gold mine for criminal activities.
For Privacy First, the risks of Corona apps do not outweigh their presumed benefits. Therefore, Privacy First advises the House to urge the cabinet not to proceed with the introduction of such apps.
Testing instead of apps
According to Privacy First, there is a better and more effective solution in the fight against the coronavirus, one based on the principles of proportionality and subsidiarity: large-scale testing of people to learn about infection rates and immunization. To this end, the necessary test capacity should become available as soon as possible.
Haste is rarely a good thing
If, despite all the above-mentioned objections, it is nevertheless decided to introduce a Corona app, then this should come about only after a careful social and democratic process with sufficiently critical, objective and independent scrutiny. Judging by the developments of the past few days, this has not been the case so far. In this context, Privacy First recommends that the House call on the cabinet to put its plans on ice and impose a moratorium on the use of Corona apps.
Privacy by design
The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional state. Any democratic decision to nullify this right is simply unacceptable. If indeed the deployment of ‘Corona apps’ will be widespread, then at least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. In case these conditions are not met, Privacy First will not hesitate to bring proceedings before a court.
The Privacy First Foundation
Dear Members of Parliament,
You have received our position paper; what follows is our oral explanation.
First of all: Privacy First is firmly against any form of surveillance infrastructure, with or without apps.
With this in mind, we look at three legal principles:
- Legitimate purpose limitation.
  - What is the problem?
  - What is the scale of the problem?
  - What are possible objectives, how can we achieve these objectives, and how can we measure progress towards them?
It is already impossible to answer the first question, as testing is currently partial and selective. The total infected population is unknown, and those who have recovered are unknown as well and do not get reported. There is, however, fearmongering as a result of emotions and selective reporting: deaths with multiple causes (dying with as opposed to dying from Corona) and admissions to critical care units.
Let us be clear, we will first have to map out the causes of this problem before we can draw conclusions and talk about solutions. Not only IT professionals and virologists should be involved in this, to no lesser extent we need philosophers, legal scholars, sociologists, entrepreneurs and others who represent society also.
- Necessity and proportionality. In terms of test capacity, critical care units, medical materials and medical personnel, we essentially have a capacity problem. So there is no doubt in our mind what we should be focusing on, also in view of future outbreaks: testing the entire population in order to tell who is infected and who is immune, and thus determine the real extent of the problem. 97% of the population is unaffected. Make sure there is separation and proper care for high-risk groups. Halt crisis communication and start crisis management. Take all treatment methods seriously, including those that are not profitable for Big Pharma and Big Tech.
- Subsidiarity. Once we know the problem, we may ask what the solutions are. Additional personnel at municipal health centers? Building a critical care unit hospital specifically for situations like these? Increasing the test capacity in order to be able to take decisions based on figures? All of this is possible within our current health system, with the general practitioner as the first point of contact.
On the basis of trust, we have given our government six weeks to get its act together. And what do we get in return? Distrust and monitoring tools. And still shortages of medical equipment. So, fix the fundamentals, deal with the treatment and test capacity and stop building new technological gadgets and draconian apps used in dictatorial regimes in Asia. And take The Netherlands out of this prolonged lockdown as soon as possible. Privacy First is opposed to a ‘1.5-meter society’ as the new normal, and is instead in favor of a common-sense society based on trust in mature citizens.
Today, the district court of The Hague ruled on the use of the algorithm-based system SyRI (System Risk Indication) by the Dutch government. The judges decided that the government, in trying to detect social services fraud, has to stop profiling citizens on the basis of large scale data analysis. As a result, people in the Netherlands are no longer 'suspected from the very start’ ("bij voorbaat verdacht").
The case against the Dutch government was brought by a coalition of NGOs, consisting of the Dutch Platform for the Protection of Civil Rights (Platform Bescherming Burgerrechten), the Netherlands Committee of Jurists for Human Rights (Nederlands Juristen Comité voor de Mensenrechten, NJCM), Privacy First, the KDVP Foundation (privacy in mental healthcare), Dutch trade union FNV, the National Clients Council (LCR) and authors Tommy Wieringa and Maxim Februari.
The court concludes that SyRI is in violation of the European Convention on Human Rights. SyRI impinges disproportionately on the private life of citizens. This concerns not only those that SyRI has flagged as an 'increased risk', but everyone whose data are analysed by the system. According to the court, SyRI is non-transparent and therefore cannot be scrutinized. Citizens can neither anticipate the intrusion into their private life, nor can they guard themselves against it.
Moreover, the court draws attention to the actual risk of discrimination and stigmatization on the grounds of socio-economic status and possibly migration background, of citizens in disadvantaged urban areas where SyRI is being deployed. There is a risk – which cannot be examined – that SyRI operates on the basis of prejudices. The attorneys of the claimant parties, Mr. Ekker and Mr. Linders, had this to say: "The court confirms that the large scale linking of personal data is in violation of EU law, Dutch law and fundamental human rights, including the protection of privacy. Therefore, this ruling is also important for other European countries and on a wider international level."
From now on, as long as there is no well-founded suspicion, personal data from different sources may no longer be combined.
Line in the sand
"This ruling is an important line in the sand against the unbridled collection of data and risk profiling. The court puts a clear stop to the massive surveillance that innocent citizens have been under. SyRI and similar systems should be abolished immediately", states Privacy First director Vincent Böhre.
"Today we have been proved right on all fundamental aspects. This is a well-timed victory for the legal protection of all citizens in the Netherlands", says Tijmen Wisman of the Platform for the Protection of Civil Rights.
Another plaintiff in the case, trade union FNV, equally rejects SyRI on grounds of principle. "We are delighted that the court has now definitively cancelled SyRI", comments Kitty Jong, vice chair of FNV.
The parties hope that the ruling will herald a turning point in the way in which the government deals with the data of citizens. They believe this viewpoint is endorsed by the considerations of the court, which apply not only to SyRI but also to similar practices. Many municipalities in the Netherlands have their own data linking systems which profile citizens for all sorts of policy purposes. When it comes to combining data, a legislative proposal even greater in scope than SyRI, one that would enable lumping together the databases of private parties and those of public authorities, is anything but unthinkable. The decision by the Hague district court, however, clamps down on these Big Data practices. According to the claimant parties, it is therefore of crucial importance that the SyRI ruling affect both current and future political policies.
The case against SyRI serves both a legal and a social goal. With this ruling, both goals are reached. Merel Hendrickx of PILP-NJCM: "Apart from stopping SyRI, we also aimed at initiating a public debate about the way the government deals with citizens in a society undergoing digitisation. This ruling shows how important it is to have that discussion."
Although SyRI was adopted in 2014 without any fuss, the discussion about its legality intensified after the lawsuit was announced. At the start of 2019, the use of SyRI in two Rotterdam neighbourhoods led to protests among inhabitants and a discussion in the municipal council. Soon after, the mayor of Rotterdam, Ahmed Aboutaleb, pulled the plug on the SyRI program because of doubts over its legal basis. In June 2019, Dutch newspaper Volkskrant revealed that SyRI had not detected a single fraudster since its inception. In October 2019, the UN Special Rapporteur on extreme poverty and human rights, Philip Alston, wrote a critical letter to the district court of The Hague expressing serious doubts over the legality of SyRI. Late November 2019, SyRI won a Big Brother Award.
The coalition of parties was represented in court by Anton Ekker (Ekker Advocatuur) and Douwe Linders (SOLV Attorneys). The proceedings were coordinated by the Public Interest Litigation Project (PILP) of the NJCM.
The full ruling of the court can be found HERE (official translation in English).
Fundamental lawsuit against mass risk profiling of unsuspected citizens
On Tuesday October 29 at 9:30 am in the district court of The Hague the court hearing will take place in the main proceedings of a broad coalition of Dutch civil society organizations against Systeem Risico Indicatie (System Risk Indication - SyRI). SyRI uses secret algorithms to screen entire residential areas to profile citizens on the risk of fraud with social services. According to the coalition of plaintiffs, this system poses a threat to the rule of law and SyRI must be declared unlawful.
The group of plaintiffs, consisting of the Dutch Platform for the Protection of Civil Rights, the Netherlands Committee of Jurists for Human Rights (NJCM), the Privacy First Foundation, the KDVP Foundation and the National Client Council (LCR), in March 2018 sued the Dutch Ministry of Social Affairs. Authors Tommy Wieringa and Maxim Februari, who previously spoke very critically about SyRI, joined the proceedings in their personal capacity. In July 2018, Dutch labour union FNV also joined the coalition.
The parties are represented by Anton Ekker (Ekker Advocatuur) and Douwe Linders (SOLV Attorneys). The case is coordinated by the Public Interest Litigation Project (PILP) of the NJCM.
Trawl method on unsuspected citizens
SyRI links the personal data of citizens from various government databases on a large scale. These centrally collected data are subsequently analyzed by secret algorithms, which are supposed to show whether citizens run the risk of being guilty of one of the many forms of fraud and violations that the system covers. If the analysis by SyRI leads to a risk notification, the citizen in question is included in the so-called Risk Notices Register (Register Risicomeldingen), which can be accessed by government authorities.
SyRI uses this trawl method to screen all residents of a neighborhood or area. For this, the system uses almost all data that government authorities store about citizens. It comprises 17 data categories, which together provide a very intrusive picture of someone's private life. SyRI currently covers the databases of the Dutch Tax Authorities, Inspectorate of Social Affairs, Employment Office, Social Security Bank, municipalities and the Immigration Service. According to the Dutch Council of State (Raad van State), which gave a negative opinion on the SyRI bill, it was hard to imagine any data that did not fall within the scope of the system. Former chairman Kohnstamm of the Dutch Data Protection Authority, which also issued a negative opinion on the system, called the adoption of the SyRI legislation "dramatic" at the time.
Threat to the rule of law
According to the claimants, SyRI is a black box that poses major risks to the democratic rule of law. It is completely unclear to any citizen – who can be screened by SyRI without cause – what data are used, which analysis is carried out with them, and what makes him or her a 'risk'. Moreover, due to the secret operation of SyRI, citizens are also unable to refute an incorrect risk indication. The use of SyRI makes the legal process and the associated procedures opaque.
SyRI thereby undermines the relationship of trust between the government and its citizens; these citizens are in fact suspected in advance. Virtually all information that they share with the government, often to be eligible for basic services, can be used against them secretly without any suspicion.
The plaintiffs in this lawsuit are not opposed to the government combating fraud. They just think that this should be done on the basis of a concrete suspicion. There should be no trawl searches in the private life of unsuspected Dutch citizens to look for possible fraud risks. According to the claimants, this disproportionate method does more harm than good. There are better and less radical forms of fraud prevention than SyRI.
Not one fraudster detected yet
The total of five SyRI investigations announced since the system's legal introduction have by now turned the lives of tens of thousands of citizens inside out, but have not detected a single fraudster. This was revealed at the end of June 2019 by Dutch newspaper de Volkskrant, which managed to get hold of evaluations of SyRI investigations. The investigations failed because the analyses were incorrect, because of a lack of capacity and time at the implementing bodies, but also because there is disagreement within the government about SyRI.
For example, mayor Aboutaleb of Rotterdam pulled the plug on the SyRI investigation in two neighborhoods in Rotterdam South last summer, because the Ministry, unlike the municipality, also wanted to use police and healthcare data in the investigation. The deployment of SyRI also led to protests among the neighborhoods' residents, who made it clear that they felt insulted and unfairly treated.
UN expresses concern about SyRI
The UN Special Rapporteur on extreme poverty and human rights Philip Alston wrote to the court earlier this month about his concerns about SyRI and urged the judges to thoroughly assess the case. According to the rapporteur, several fundamental rights are at stake. SyRI is described in his letter as a digital equivalent of a social detective who visits every household in an area without permission and searches for fraudulent cases; in the analogue world such a massive manhunt would immediately lead to great resistance, but with a digital instrument such as SyRI, it is wrongly claimed that 'ignorance is bliss'.
The court hearing is open to the public and will take place on Tuesday October 29th from 9.30 am in the Palace of Justice, Prins Clauslaan 60 in The Hague. Case number: C/09/550982 HA ZA 18/388 (Nederlands Juristen Comité c.s./Staat).
Source: campaign website Bijvoorbaatverdacht.nl.
On July 1 and 2, 2019, the Netherlands will be examined in Geneva by the United Nations Human Rights Committee. This UN body is tasked with supervising compliance with one of the oldest and most important human rights treaties in the world: the International Covenant on Civil and Political Rights (ICCPR). Each country that is a contracting party to the ICCPR is subject to periodic review by the UN Human Rights Committee. At the beginning of next week, the Dutch government must answer before the Committee for various current privacy issues that have been put on the agenda by Privacy First, among others.
The previous Dutch session before the UN Human Rights Committee dates from July 2009, when the Dutch minister of Justice Ernst Hirsch Ballin had to answer for the then proposed central storage of fingerprints under the new Dutch Passport Act. This was a cause for considerable criticism of the Dutch government. Now, ten years on, the situation in the Netherlands will be examined once more. Against this background, Privacy First had submitted to the Committee a critical report (pdf) at the end of 2016, and has recently supplemented this with a new report (pdf). In a nutshell, Privacy First has brought the following current issues to the attention of the Committee:
- the limited admissibility of interest groups in class action lawsuits
- the Dutch ban on judicial review of the constitutionality of laws
- Automatic Number Plate Recognition (ANPR)
- border control camera system @MIGO-BORAS
- the Dutch public transport chip card ('OV-chipkaart')
- Electronic Health Record systems
- possible reintroduction of the Telecommunications Data Retention Act
- the new Dutch Intelligence and Security Services Act (‘Tapping Law’)
- Passenger Name Records (PNR)
- the Dutch abolition of consultative referendums
- the Dutch non-recognition of the international prohibition of propaganda for war.
The entire Dutch session before the Committee can be watched live on UN Web TV on Monday afternoon, July 1, and Tuesday morning, July 2. In addition to privacy issues, several Dutch organizations have put numerous other human rights issues on the agenda of the Committee; click HERE for an overview, which also features the previously established List of Issues (including the new Intelligence and Security Services Act, the possible reintroduction of the retention of telecommunications data, camera system @MIGO-BORAS, and medical confidentiality with health insurance companies). The Committee will likely present its ‘Concluding Observations’ within a matter of weeks. Privacy First awaits the outcome of these observations with confidence.
Update July 26, 2019: yesterday afternoon the Committee published its Concluding Observations on the human rights situation in the Netherlands, including critical opinions on two privacy issues that Privacy First had brought to its attention:
The Intelligence and Security Services Act
The Committee is concerned about the Intelligence and Security Act 2017, which provides intelligence and security services with broad surveillance and interception powers, including bulk data collection. It is particularly concerned that the Act does not seem to provide for a clear definition of bulk data collection for investigation related purpose; clear grounds for extending retention periods for information collected; and effective independent safeguards against bulk data hacking. It is also concerned by the limited practical possibilities for complaining, in the absence of a comprehensive notification regime to the Dutch Oversight Board for the Intelligence and Security Services (CTIVD) (art. 17).
The State party should review the Act with a view to bringing its definitions and the powers and limits on their exercise in line with the Covenant and strengthen the independence and effectiveness of CTIVD and the Committee overseeing intelligence efforts and competences that has been established by the Act.
The Healthcare Market Regulation Act
The Committee is concerned that the Act to amend the Market Regulation (Healthcare) Act allows health insurance company medical consultants access to individual records in the electronic patient registration without obtaining a prior, informed and specific consent of the insured and that such practice has been carried out by health insurance companies for many years (art. 17).
The State party should require insurance companies to refrain from consulting individual medical records without a consent of the insured and ensure that the Bill requires health insurance companies to obtain a prior and informed consent of the insured to consult their records in the electronic patient registration and provide for an opt-out option for patients that oppose access to their records.
During the session in Geneva, the abolition of the referendum and the camera system @MIGO-BORAS also received critical scrutiny. Privacy First regrets, however, that the Committee makes no mention of these and various other current issues in its Concluding Observations. Nevertheless, the Committee's report shows that privacy is ever higher on the agenda of the United Nations. Privacy First welcomes this development and will continue to encourage the Committee down this path in the coming years. Moreover, Privacy First will press for the Netherlands to actually implement the Committee's various recommendations.
A group of civil society organizations is bringing a case against the Dutch government because of System Risk Indication, better known by the abbreviation SyRI. According to the plaintiffs, this risk profiling system is a black box that should be stopped, as it poses a risk to the democratic rule of law.
The coalition of plaintiffs consists of the Netherlands Committee of Jurists for Human Rights (NJCM), the Dutch Platform for the Protection of Civil Rights (Platform Bescherming Burgerrechten), Privacy First, the KDVP Foundation (privacy in mental healthcare) and the National Clients Council (LCR). Two well-known authors, Tommy Wieringa and Maxim Februari, have in their individual capacities joined the case as plaintiffs. As ‘ambassadors’ to this lawsuit, they have fiercely criticized SyRI on multiple occasions.
The proceedings are carried out by Deikwijs Attorneys under the guidance of the Public Interest Litigation Project (PILP) of the NJCM.
Trawl-net actions on the basis of secret algorithms, targeting innocent citizens
SyRI links together, on a large scale, the personal data of innocent citizens from the databases of public authorities and companies. Using secret algorithms, citizens are then subjected to a risk analysis. When the analysis indicates an increased risk of violating one of the many laws that SyRI covers, individuals are included in the Risk Reports Register, which is accessible to many government agencies.
SyRI is a black box that poses a major threat to the democratic rule of law. Citizens who are examined through SyRI without any justification have no idea which of their data are used for the analyses, what kind of analyses are carried out, or what actually determines whether or not they are a 'risk'. Because SyRI operates surreptitiously, citizens are in no position to refute any incorrect flagging that may concern them.
According to the coalition, SyRI breaches various fundamental rights while simultaneously undermining the relationship of trust between citizens and those in power. Citizens are suspect from the very start, and all the information they share with public authorities may secretly be used against them without any charge or concrete grounds.
Ministry refuses to operate in a transparent manner
Despite fundamental objections from the Dutch Council of State (Raad van State) and the Dutch Data Protection Authority about the lawfulness of the system, the legislation for SyRI was rubber-stamped by the Dutch Senate and House of Representatives at the end of 2014. However, SyRI had already been in use since 2008. Since then, dozens of investigations have been carried out, including examinations of entire neighborhoods in several Dutch cities. Once the system was laid down in law, it was applied in Eindhoven and Capelle aan den IJssel, among other places. It was recently announced that SyRI will be used in the Rotterdam neighborhoods of Bloemhof and Hillesluis and in the Haarlem neighborhood of Schalkwijk.
A FOIA request submitted by the coalition has yielded barely any information about the dozens of SyRI investigations carried out before and after the system was laid down in law in 2014. The Dutch Ministry of Social Affairs is unwilling to provide insight into its practices, arguing that disclosure of the data and risk models used in SyRI would allow cunning citizens to learn what to look out for when committing fraud. The claimants, in turn, assert that this is incompatible with the obligation to inform and the right to a fair trial.
In the context of this lawsuit, a public information campaign called ‘Bij Voorbaat Verdacht’ (‘Suspect From The Very Start’) has been launched. On the (Dutch) campaign website you can find updates about the legal proceedings as well as a simplified summary of the subpoena. The complete subpoena (in Dutch) can be found on the website of Deikwijs Attorneys (pdf). Click HERE for the English version on the website of PILP (pdf).
Update 16 October 2018: the District Court of The Hague has allowed the Dutch Federation of Trade Unions (FNV) as co-plaintiff in the lawsuit.
Tomorrow morning the Netherlands will be examined in Geneva by the highest human rights body in the world: the United Nations Human Rights Council. Since 2008, the Human Rights Council has reviewed the human rights situation in each UN Member State once every five years. This procedure is called the Universal Periodic Review (UPR).
Privacy First shadow report
During the previous two UPR sessions, in 2008 and 2012, the Netherlands endured a fair amount of criticism. At the moment, the outlook for privacy in the Netherlands is worse than it has ever been. This has prompted Privacy First to actively bring a number of issues to the attention of the UN. Privacy First did so in September 2016 (a week before the UN deadline) through a so-called shadow report: a report in which civil society organizations express their concerns about certain issues. (It's worth pointing out that the Human Rights Council imposes rigorous requirements on these reports, a strict word limit being one of them.) UN diplomats rely on such reports to do their job properly; otherwise, they would depend on one-sided State-written reports that mostly paint a far too optimistic picture. Privacy First therefore submitted its own report about the Netherlands (pdf), which includes the following recommendations:
Better opportunities in the Netherlands for civil society organizations to collectively institute legal proceedings.
Introduction of constitutional review of laws by the Dutch judiciary.
Better legislation pertaining to profiling and datamining.
No introduction of automatic number plate recognition (ANPR) as is currently being envisaged.
Suspension of the unregulated border control system @MIGO-BORAS.
No reintroduction of large scale data retention (general Data Retention Act).
No mass surveillance under the new Intelligence and Security Services Act and closer judicial supervision over secret services.
Withdrawal of the Computer Criminality Act III, which would allow the Dutch police to hack into any ICT device.
A voluntary and regionally organized (instead of a national) Electronic Health Record system with privacy by design.
Introduction of a public transport chip card that is truly anonymous.
Privacy First sent its report not only to the Human Rights Council but also to all the foreign embassies in The Hague. Consequently, in recent months Privacy First held extensive (confidential) meetings with the embassies of Argentina, Australia, Bulgaria, Chile, Germany, Greece and Tanzania. Our interlocutors ranged from senior diplomats to ambassadors. Furthermore, Privacy First received positive reactions to its report from the embassies of Mexico, Sweden and the United Kingdom. Moreover, several passages from our report were incorporated into the UN summary of the overall human rights situation in the Netherlands; click HERE ('Summary of stakeholders' information', par. 47-50).
Our efforts will hopefully prove effective tomorrow. This cannot be guaranteed, however, as it concerns an inter-State, diplomatic process, and many issues in our report (and in recent talks) are sensitive subjects in countless other UN Member States as well.
UN Human Rights Committee
In December 2016, Privacy First submitted a similar report to the UN Human Rights Committee in Geneva. This Committee periodically reviews the compliance of the Netherlands with the International Covenant on Civil and Political Rights (ICCPR). Partly as a result of this report, last week the Committee put the Intelligence and Security Services Act, camera system @MIGO-BORAS and the Data Retention Act, among other things, on the agenda for the upcoming Dutch session in 2018 (see par. 11, 27).
We hope that our input will be used by both the UN Human Rights Council and the UN Human Rights Committee, and that it will lead to constructive criticism and internationally exchangeable best practices.
The Dutch UPR session will take place tomorrow between 9am and 12.30pm and can be followed live online.
Update 10 May 2017: during the UPR session in Geneva today, the Dutch government delegation (led by Dutch Minister of Home Affairs Ronald Plasterk) received critical recommendations on human rights and privacy in relation to counter-terrorism by Canada, Germany, Hungary, Mexico and Russia. The entire UPR session can be viewed HERE. Publication of all recommendations by the UN Human Rights Council follows May 12th.
Update 12 May 2017: Today all recommendations to the Netherlands have been published by the UN Human Rights Council, click HERE (pdf). Useful recommendations to the Netherlands regarding the right to privacy were made by Germany, Canada, Spain, Hungary, Mexico and Russia, see paras. 5.29, 5.30, 5.113, 5.121, 5.128 & 5.129. You can find these recommendations below. Further comments by Privacy First will follow.
Extend the National Action Plan on Human Rights to cover all relevant human rights issues, including counter-terrorism, government surveillance, migration and human rights education (Germany);
Extend the National Action Plan on Human Rights, published in 2013 to cover all relevant human rights issues, including respect for human rights while countering terrorism, and ensure independent monitoring and evaluation of the Action Plan (Hungary);
Review any adopted or proposed counter-terrorism legislation, policies, or programs to provide adequate safeguards against human rights violations and minimize any possible stigmatizing effect such measures might have on certain segments of the population (Canada);
Take necessary measures to ensure that the collection and maintenance of data for criminal [investigation] purposes does not entail massive surveillance of innocent persons (Spain);
Adopt and implement specific legislation on collection, use and accumulation of meta-data and individual profiles, including in security and anti-terrorist activities, guaranteeing the right to privacy, transparency, accountability, and the right to decide on the use, correction and deletion of personal data (Mexico);
Ensure the protection of private life and prevent cases of unwarranted access of special agencies in personal information of citizens in the Internet that have no connection with any illegal actions (Russian Federation). [sic]
Update 26 May 2017: a more comprehensive UN report of the UPR session has now been published (including the 'interactive dialogue' between UN Member States and the Netherlands); click HERE (pdf). In September this year, the Dutch government will announce which recommendations it will accept and implement.
Christmas column by Bas Filippini,
Chairman of the Privacy First Foundation
Principles of our democratic constitutional State are still very relevant
‘Your choice in a free society’ is the slogan of the Privacy First Foundation. Privacy First has defined its principles on the basis of universal human rights and the Dutch Constitution, and is known for professional and, where necessary, legal action in line with our free constitutional State. The mere fact that Privacy First exists means that in recent years these principles have come under increasing pressure. We base our (legal) actions and judgements on thorough fact-finding, to the extent possible in our field of work.
‘The Netherlands as a secure global pioneer in the field of privacy’: that's our motto. This country should also serve as an example of how to use technology while maintaining the principles of our open and free society. This can be achieved through legislative, executive and IT infrastructures that start from privacy by design and make use of privacy-enhancing technology.
Whereas the industrial revolution had environmental pollution as a negative side effect, the information revolution has the ‘pollution of privacy and freedom’ as an unwanted side effect.
The question, therefore, is how to preserve the basic principles of our democratic constitutional State while supporting new structures and services for the future. As far as we're concerned, these basic principles are neither negotiable nor exchangeable. Yet time and again we see the same incident-driven politics, based on the misconceptions of the day, strike at moments when the constitutional State is at its most vulnerable and cannot defend itself against the emotional tide of the moment.
Paris as yet another excuse to push through ‘new’ laws
Various politicians feed on the attacks in Paris and tumble over one another in Orwellian macho talk, going further and further in legislative proposals or in emotional speeches characterized by belligerence and rhetoric. And it's always so predictable: further restricting the existing freedoms of all citizens instead of focusing on the group of adolescents (terrorist attackers are, on average, between 18 and 30 years old) that intelligence agencies already have in their sights. Instead of discussing how intelligence agencies can more effectively tackle the already identified group that needs to be monitored, and taking preventive measures in the communication with and education of this target group, the focus all too easily shifts to familiar measures in which necessity, proportionality and subsidiarity are hard to find.
So in the meantime we have witnessed the prolonged state of emergency in France, the far-reaching extension of the powers of the police, the judiciary and the intelligence services (also to the detriment of innocent citizens), extra controls in public space, the retention of passenger data, and so on. All this for apparently legitimate reasons in the heat of the moment, but it will be disastrous for our freedom in both the short and the long run. In this respect, the blurring definition of the term ‘terrorism’ is striking. Privacy First focuses on government powers in relation to citizens' presumption of innocence. We are in favour of applying special powers to citizens who are under reasonable suspicion of criminal offences and who violate the rights of others with their hate and violence. In fact, that's exactly what the law says. Let's first implement this properly, instead of introducing legislative proposals that throw out the baby with the bathwater.
The government is committed to impossible 100 per cent security solutions
What often strikes me in conversations with civil servants is the idea that the government should provide 100 per cent solutions for citizens and must apply a risk-exclusion principle. This leads to a great deal of compartmentalization and paralysis when it comes to possible government solutions in the area of security. Technology-based quick fixes are adopted by default, without properly analyzing the cause of problems or looking at the implementation of existing legislation.
The government thinks of itself as separate from citizens, who are not trusted to have legal capacity and are regarded as a necessary evil: troublesome and inconvenient in the performance of the government's tasks. The idea that the government, serving its citizens, should offer as high a degree of security as possible but certainly not 100 per cent (the final 10 per cent is very costly on the one hand and suffocating for society on the other) is not commonly shared. No civil servant and no politician is prepared to introduce policies that maintain an open society today (and 50 years from now) yet entail any risk factors. In reality, however, there will always be risks in an open society, and it should be noted that such a society is not a matter of course but something we should treat with great care.
Here in the Netherlands we have seen other forms of government before: from rule by royal decree to a bourgeois society and an actual wartime dictatorship. Each time, we rejected these forms of society. What could possibly be a reason to go back to any of them and give up our freedoms, instead of increasing those freedoms and reinforcing them with technology? Especially in a highly educated society in which citizens show themselves perfectly able to take their own decisions on a wide range of issues. We hire the government and politicians as our representatives, not the other way around. Yet we now put up with a government that doesn't trust us, is only prepared to release information on the basis of FOIA requests, and requires us to hand over all information and communications about ourselves and our deepest private lives as if we were prima facie suspects. That puts everything back to front, and to me it embodies a one-way trip to North Korea. You'll be more than welcome there!
Political lobby of the industry
The industry's persistence in overloading the government and citizens with ICT solutions is unprecedented. Again and again, here in the Netherlands and in Silicon Valley, the same companies pop up wanting to secure their Christmas bonus by marketing their products in exchange for our freedom. We're talking about various electronic health records such as the Child Record and the Orwellian, centralized electronic patient record, the all-encompassing System Risk Indication database, travel and residency records, road pricing, chips in number plates and cars, so-called automated guided vehicles (including illegal data collection by car manufacturers), number plate parking, automatic number plate recognition cameras, facial recognition in public space and counter-hacking by government agencies, while voting computers are back on the agenda. Big Data, the Internet of Things: the list goes on.
With huge budgets, these companies promote these allegedly smart solutions without caring about the dangers they pose to our freedom. It's alienating to see the reversal of legal principles creeping in, supported by various government and industry mantras. It's as if a parasitic wasp were hollowing out civil liberties: the outside looks intact, but the inside is already empty and rotten.
From street terrorism to State terrorism
As indicated above, the information revolution leads to the restriction of freedom. It's imperative to realize that after 4,000 years of struggle, development and evolution, we have arrived at our refined form of society and at principles that are (relatively) universal for every free citizen. Just as most of us are born out of love, freedom and trust, to me these are also the best principles on which to build a society. We are all too familiar with societies founded on hate, fear and government control, and not so long ago we renounced them as disastrous and exceptionally unpleasant. At the cost of many sacrifices and lives, these principles have been enshrined in treaties, charters and constitutions, and they are therefore non-negotiable.
It's high time to keep acting on the basis of these principles and to make policy implementation and technology subordinate to them, taking into account people's needs and their own responsibility. In my eyes, a civil servant in the service of the people who places security above everything else is nothing more than a State terrorist or a white-collar terrorist, who in the long term causes much more damage to our constitutional State and freedom than a so-called street terrorist. The government and industry should hold an immediate integrity discussion about this, after which clear codes can be introduced for privacy-sustainable governance and entrepreneurship.
Towards a secure global pioneer in the field of privacy
Privacy First would like to see government and industry take responsibility for protecting and promoting the personal freedom of citizens, applying an 80/20 rule as far as security is concerned. By focusing on risk groups, a lot of money and misery can be saved. Exceptions prove the rule, which in this case is a free and democratic constitutional State, and not the other way around. Say yes to a free and secure Netherlands as a global pioneer in the field of privacy!