The hearing at the court of appeal in The Hague in the proceedings of Privacy First against the register for Ultimate Beneficial Owners (UBO) is scheduled for Monday, 27 September 2021.

Following the very critical advice of the European Data Protection Supervisor (EDPS), the district court of The Hague confirmed on 18 March 2021 that there is every reason to doubt the validity of the European anti-money laundering directives that form the basis for the UBO register. The judge ruled that it cannot be excluded that the highest European court, the Court of Justice of the EU (CJEU), will conclude that the public nature of the UBO register is not in line with the principle of proportionality. Since a Luxembourg court has already referred questions about this to the CJEU, the Dutch court in summary proceedings did not find it necessary to ask such questions as well. Privacy First has appealed the judgment in these summary proceedings, taking the case to the court of appeal of The Hague. Our appeal summons can be found here (pdf in Dutch).

Privacy First requests the court of appeal to refer preliminary questions on the UBO register to the European Court of Justice and to suspend the operation of the UBO register until those questions have been answered. In particular, Privacy First asks the court to temporarily suspend the public accessibility of the UBO register, at least until the CJEU has ruled on the matter. The court of appeal's ruling is expected a few weeks after the hearing on 27 September 2021.

“The UBO register will put privacy-sensitive data of millions of people up for grabs”, Privacy First’s attorney Otto Volgenant of Boekx Attorneys comments. “There are doubts from all sides whether this is an effective tool in the fight against money laundering and terrorism financing. It’s like using a sledgehammer to crack a nut. The Court of Justice of the EU will ultimately rule on this. I expect that it will annul the UBO register – at least its public accessibility. Until then, I advise UBOs not to submit any data to the UBO register. Once data have been made public, they cannot be retrieved.”

Background of the lawsuit against the UBO register

Privacy First is bringing a lawsuit against the Dutch government regarding the UBO register, which was introduced in 2020. In the summary proceedings, the invalidity of the European legislation on which the UBO register is based is being invoked. The consequences of this new legislation are far-reaching. After all, it concerns very privacy-sensitive information. Data about the financial situation of natural persons will be out in the open. More than 1.5 million legal entities in the Netherlands that are listed in the Dutch Trade Register will have to disclose information about their ultimate beneficial owners. The UBO register is accessible to everyone, for €2.50 per retrieval. This level of public accessibility is not proportionate.

On 24 June 2020, the Dutch ‘Implementation Act on Registration of Ultimate Beneficial Owners of Companies and Other Legal Entities’ entered into force. Based on this new Act, a new UBO register linked to the Trade Register of the Netherlands Chamber of Commerce will contain information on all ultimate beneficial owners of companies and other legal entities incorporated in the Netherlands. This information must indicate the interest of the UBO, i.e. 25-50%, 50-75% or more than 75%. In any case, the UBO’s name, month and year of birth as well as nationality will be publicly available for everyone to consult, with all the privacy risks this entails.

Since 27 September 2020, newly established entities must register their UBOs in the UBO register. Existing legal entities have until 27 March 2022 to register their UBOs. The law offers only very limited options for shielding this information: only for persons under police protection, for minors and for those under guardianship. The result is that the interests of almost all UBOs will become public knowledge.

European Anti-Money Laundering Directive

This new law stems from the Fifth European Anti-Money Laundering Directive, which requires EU Member States to register and disclose to the public the personal data of UBOs. The aim of this is to combat money laundering and terrorist financing. According to the European legislator, the registration and subsequent disclosure of personal data of UBOs, including the interest that the UBO has in a company, contributes to that objective. The public nature of the register would have a deterrent effect on persons wishing to launder money or finance terrorism. But the effectiveness of a UBO register in the fight against money laundering and terrorism has never been substantiated.

Massive privacy violation and fundamental criticism

The question is whether the means is not worse than the end it serves. Registering the personal data of all UBOs and making them accessible to everyone is a blanket measure of a preventive nature, while 99.99% of all UBOs have nothing to do with money laundering or terrorist financing. Even if collecting information on UBOs were proportionate, it would suffice to make that information available to the government agencies involved in combating money laundering and terrorism. Making the information completely public goes too far. The European Data Protection Supervisor has already concluded that this privacy violation is not proportionate, but that opinion has not led to an amendment of the European directive.

In the run-up to the debate on this law in the Dutch House of Representatives, fundamental criticism came from various quarters. The business community protested because it feared – and now experiences – an increase in administrative burdens and sees privacy risks. UBOs of family-owned companies that have so far remained out of the public eye run major privacy and security risks. There was also a great deal of attention to the position of parties that attach great importance to the protection of data subjects, such as church communities and social organizations. For associations and foundations that do not have owners, things are cumbersome: they have to enter data that are already in the Trade Register into yet another register. Unfortunately, none of this has led to any changes in the regulations.

Dutch investigative journalism platform Follow the Money looked into the social costs of the Dutch UBO register. Follow the Money writes: “The UBO register entails costs, hassle and sometimes slightly absurd bureaucracy for millions of entrepreneurs and directors. The Ministry of Finance estimates the total costs of the register for the business community at 99 million euros. Another 9 million euros must be added for one-time implementation costs. When lawyer Volgenant hears about this amount, he reacts with dismay: ‘The total costs are much higher than I thought! If you extrapolate that to the whole EU, the costs are astronomical.’”

Favourable outcome of lawsuit is likely

Privacy First has initiated a lawsuit against the UBO register for violation of the fundamental right to privacy and the protection of personal data. Privacy First requests the Dutch judiciary to render the UBO register inoperative in the short term and to submit preliminary questions on this subject to the Court of Justice of the European Union. It would not be the first time privacy-violating regulations are repealed by the courts, something that previous Privacy First lawsuits attest to.

The Dutch law, and also the underlying European directive, are in conflict with the EU Charter of Fundamental Rights as well as the General Data Protection Regulation. The legislator has created these regulations, but it is up to the courts to review them thoroughly; ultimately the judge has the final say. If the (European) legislator does not pay enough attention to the protection of fundamental rights, the (European) courts can set the regulations aside. The Court of Justice of the European Union has previously declared regulations invalid because of privacy violations, for example the Data Retention Directive and the Privacy Shield. The Dutch courts also regularly invalidate privacy-invading regulations. Privacy First has previously successfully challenged the validity of legislation, for example in the proceedings about the Dutch Telecommunications Data Retention Act and in the proceedings against SyRI. Viewed against this background, the lawsuit against the UBO register is considered very promising.

Update 27 September 2021: this afternoon the court session took place in The Hague; click HERE for the pleading of our lawyer (pdf in Dutch). The judgment of the court of appeal is scheduled for 16 November 2021.

Do you have any questions? Please contact us or our attorney Otto Volgenant of Boekx Attorneys. Privacy First can use your help and would appreciate it if you would become a donor.

Published in Litigation

Summary proceedings against massive privacy violation by Automatic Number Plate Recognition (ANPR) camera surveillance

Challenging large-scale privacy violations in court has long been Privacy First’s established practice. In recent years, Privacy First has successfully done so against the central storage in the Netherlands of everyone’s fingerprints under the Dutch Passport Act, against the storage of everyone’s communications data under the Dutch Telecommunications Data Retention Act and – in coalition with other parties – against large-scale risk profiling of innocent citizens through the Dutch System Risk Indication (SyRI).

A current and urgent issue that equally merits going to court over concerns the Dutch legislation on Automatic Number Plate Recognition (ANPR), which has applied since 2019 under Art. 126jj of the Dutch Code of Criminal Procedure. Under this law, the number plates of millions of cars in the Netherlands (i.e. everyone’s travel movements) are continuously stored for four weeks in a central police database for criminal investigation purposes, regardless of whether one is suspected of anything. This is totally unnecessary, completely disproportionate and also ineffective, as was revealed in evaluation reports published today by the Dutch Research and Documentation Center (‘WODC’, part of the Dutch Ministry of Justice and Security). Supervision is lacking and the system can easily be abused, as newspaper NRC Handelsblad recently confirmed in its reporting.

Privacy First has therefore prepared a lawsuit to have the ANPR legislation repealed on account of violation of European privacy law. Summary proceedings against the Dutch government will take place at the district court of The Hague on 10 November 2021. Through Pro Bono Connect, Privacy First has engaged CMS as the law firm that will take care of the litigation in this case. Our summons in summary proceedings can be found HERE (pdf in Dutch). If necessary, these preliminary proceedings will be followed by broader proceedings on the merits. After all, there is no doubt that the current ANPR law constitutes a massive privacy violation and simply does not belong in a free democratic society. Considering the relevant European case law, Privacy First deems the likelihood of successful legal action very high.

Case details: Privacy First vs. the State (Dutch Ministry of Justice and Security), Wednesday 10 November 2021, 11.00 am, The Hague district court. You are welcome to attend the court hearing. A route description (in Dutch) can be found here.

Update November 8, 2021: due to Corona restrictions, it appears that the court is only willing to allow two (already registered) visitors at the court hearing. However, due to high public interest, there will be a livestream: https://www.rechtspraak.nl/Organisatie-en-contact/Organisatie/Rechtbanken/Rechtbank-Den-Haag/Nieuws/Paginas/Livestream-rechtszaak-stichting-Privacy-First-tegen-de-Staat.aspx.

Update November 10, 2021: the court hearing took place today; click HERE for our lawyer's pleading (pdf in Dutch). The court's ruling is scheduled for December 1st.

Update December 1, 2021: today the district court of The Hague rendered its judgment. In the judgment, the court first of all established that Privacy First is admissible in this case as a non-profit interest group for the protection of the privacy of all citizens in the Netherlands. This again establishes that Privacy First can conduct these and subsequent legal proceedings in the public interest. Subsequently, however, the court ruled that in these preliminary relief proceedings there was no sufficiently urgent interest. Privacy First finds this judgment incomprehensible, since in the case of a daily massive privacy violation by definition there is an urgent interest to have that violation legally reviewed and to have it stopped. Privacy First will now commence proceedings on the merits against the ANPR legislation and is also considering lodging an urgent appeal against the current judgment with the Court of Appeal of The Hague. In view of relevant European case law, Privacy First still considers the chances of successful legal action exceptionally high.

The ANPR legislation at issue in Privacy First's lawsuit relates to the mass collection and storage of everyone's "historical" ANPR data, also known as "no hits". This should be distinguished from the many years of police practice where license plates of suspects (so-called "hits") can be used for criminal investigations. Dutch media are regularly confused about this as a result of misleading government information, for example on the websites of the Dutch National Police and the Public Prosecution Service. Privacy First regrets such deception and hopes that the media will not be misled by this.

Would you like to support these legal proceedings? Then please consider becoming a donor! Privacy First consists largely of volunteers and is entirely dependent on sponsorship and donations to pursue litigation.

Published in Litigation

The controversial and compulsory inclusion of fingerprints in passports has been in place in the EU since 2009. From that year on, fingerprints were also included in Dutch identity cards, even though under EU law there was no such obligation. While the inclusion of fingerprints in identity cards in the Netherlands was reversed in January 2014 due to privacy concerns, there is now new European legislation that will make the inclusion of fingerprints in identity cards compulsory as of August 2, 2021.

Dutch citizens can apply for a new identity card without fingerprints until August 2. After that date, only people who are ‘temporarily or permanently physically unable to have fingerprints taken’ can do so.

The Dutch Senate is expected to debate and vote on July 13 on the amendment of the Dutch Passport Act in connection with the reintroduction of fingerprints in Dutch identity cards. In that context, Privacy First sent the following email to the Dutch Senate yesterday:


Dear Members of Parliament,

Since Privacy First was founded in 2008, we have opposed the mandatory collection of fingerprints for passports and identity cards. Since the introduction of the new Passport Act in 2009, Privacy First has done so through lawsuits, campaigns, freedom of information requests, political lobbying and by activating the media. Although the (planned) central storage of fingerprints in national and municipal databases in the Netherlands was discontinued in 2011, everyone’s fingerprints are still taken when applying for a passport, and soon – as a result of the new European Regulation on ID cards – they will again be taken for Dutch ID cards, after this practice was abandoned in 2014.

To date, however, the millions of fingerprints taken from virtually the entire adult population in the Netherlands have hardly been used in practice, as the biometric technology had already proven to be unsound and unworkable in 2009. The compulsory collection of everyone’s fingerprints under the Dutch Passport Act therefore still constitutes the most massive and longest-lasting privacy violation that the Netherlands has ever known.

Having read the Senate’s current report on the amendment of the Passport Act to reintroduce fingerprints in ID cards, Privacy First hereby draws your attention to the following concerns. In this context, we ask you to vote against the amendment, even if that goes against European policy. After all:

  1. As early as May 2016, the Dutch Council of State (Raad van State) ruled that fingerprints in Dutch identity cards violated the right to privacy due to a lack of necessity and proportionality, see https://www.raadvanstate.nl/pers/persberichten/tekst-persbericht.html?id=956 (in Dutch).
  2. Freedom of information requests from Privacy First have revealed that the phenomenon to be tackled (look-alike fraud with passports and identity cards) is so small in scale that the compulsory collection of everyone’s fingerprints is completely disproportionate and therefore unlawful. See: https://www.privacyfirst.nl/rechtszaken-1/wob-procedures/item/524-onthullende-cijfers-over-look-alike-fraude-met-nederlandse-reisdocumenten.html.
  3. In recent years, fingerprints in passports and identity cards have had a biometric error rate as high as 30%, see https://zoek.officielebekendmakingen.nl/kst-32317-163.html (Dutch State Secretary Teeven, January 31, 2013). Before that, Minister Donner (Security & Justice) admitted an error rate of 21-25%: see https://zoek.officielebekendmakingen.nl/kst-25764-47.html (April 27, 2011). How high are these error rates today?
  4. Partly because of the high error rates mentioned above, fingerprints in passports and ID cards are virtually not used to date, either domestically, at borders or at airports.
  5. Because of these high error percentages, former Dutch State Secretary Bijleveld (Interior and Kingdom Relations) instructed all Dutch municipalities as early as September 2009 to (in principle) refrain from conducting biometric fingerprint verifications when issuing passports and identity cards. After all, in the event of a ‘mismatch’, the ID document concerned would have to be returned to the passport manufacturer, which would lead to rapid societal disruption if the numbers were high. In this respect, the Ministry of the Interior and Kingdom Relations was also concerned about large-scale unrest and even possible violence at municipal counters. These concerns and the instruction of State Secretary Bijleveld still apply today.
  6. Since 2016, several individual Dutch lawsuits have been pending at the European Court of Human Rights in Strasbourg, challenging the mandatory taking of fingerprints for passports and ID cards on the grounds of violation of Art. 8 ECHR (right to privacy).
  7. In any case, an exception should be negotiated for people who, for whatever reason, do not wish to give their fingerprints (biometric conscientious objectors, Art. 9 ECHR).
  8. Partly for the above reasons, fingerprints have not been taken for the Dutch identity card since January 2014. It is up to your Chamber to maintain this status quo and also to push for the abolition of fingerprints for passports.

For background information, see the report ‘Happy Landings’ that Privacy First director Vincent Böhre wrote in 2010 for the Scientific Council for Government Policy (WRR). Partly as a result of this critical report (and the large-scale lawsuit brought by Privacy First et al. against the Passport Act), the decentralized (municipal) storage of fingerprints was largely abolished in 2011 and the planned central storage of fingerprints was halted.

For further information or questions regarding the above, Privacy First can be reached at any time.

Yours sincerely,

The Privacy First Foundation

Published in Law & Politics

As an NGO that promotes civil rights and privacy protection, Privacy First has been concerned with financial privacy for years. Since 2017, we have been keeping close track of the developments surrounding the second European Payment Services Directive (PSD2), pointing out the dangers to the privacy of consumers. In particular, we focus on privacy issues related to ‘account information service providers’ (AISPs) and on the dangerous possibilities offered by PSD2 to process personal data in more extensive ways.

At the end of 2017, we assumed that providing more adequate information and more transparency to consumers would be sufficient to mitigate the risks associated with PSD2. However, these risks turned out to be greater and of a more fundamental nature. We therefore decided to launch a bilingual (Dutch & English) website called PSD2meniet.nl in order to outline both our concerns and our solutions with regard to PSD2.

Central to our project is the Don’t-PSD2-Me-Register, an idea we launched on 7 January 2019 in the Dutch television program Radar and in this press release. The aim of the Don’t-PSD2-Me-Register is to provide a real tool to consumers with which they can filter out and thus protect their personal data. In time, more options to filter out and restrict the use of data should become available. With this project, Privacy First aims to contribute to positive improvements to PSD2 and its implementation.

Protection of special personal data

In this project, which is supported by the SIDN Fund, Privacy First has focused particularly on ‘special personal data’, such as those generated through payments made to trade unions, political parties, religious organizations, LGBT advocacy groups or medical service providers. Payments made to the Dutch Central Judicial Collection Agency equally reveal parts of people’s lives that require extra protection. These special personal data directly touch upon the issue of fundamental human rights. When consumers use AISPs under PSD2, their data can be shared more widely among third parties. PSD2 indirectly allows data that are currently protected, to become widely known, for example by being included in consumer profiles or black lists.

The best form of protection is to prevent special personal data from getting processed in the first place. That is why we have built the Don’t-PSD2-Me-Register, with an Application Programming Interface (API) – essentially a privacy filter – wrapped around it. With this filter, AISPs can detect and filter out account numbers and thus prevent special personal data from being unnecessarily processed or provided to third parties. Moreover, the register informs consumers and gives them a genuine choice as to whether or not they wish to share their data.
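To make the idea concrete, here is a minimal sketch (in Python) of how such a filter could work in principle. Everything in it – the register contents, names and data structures – is invented for illustration and does not describe the actual Don’t-PSD2-Me API:

```python
# Minimal, hypothetical sketch of an account-number privacy filter.
# The register contents and all names below are invented for illustration;
# they do not describe the real Don't-PSD2-Me-Register API.

from dataclasses import dataclass

@dataclass
class Transaction:
    counterparty_iban: str   # account number of the other party to the payment
    amount_cents: int
    description: str

# In practice this set would be retrieved from the register via its API.
PROTECTED_IBANS = {
    "NL91ABNA0417164300",    # e.g. a trade union, church or medical provider
    "NL20INGB0001234567",
}

def filter_transactions(transactions: list[Transaction]) -> list[Transaction]:
    """Drop every transaction whose counterparty appears in the register,
    so that special personal data are never processed or passed on."""
    return [t for t in transactions
            if t.counterparty_iban not in PROTECTED_IBANS]
```

An AISP would apply such a filter immediately after retrieving account information and before any profiling or sharing with third parties.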

What’s next?

We have outlined many of the results we have achieved in a Whitepaper, which has been sent to stakeholders such as the European Commission, the European Data Protection Board (EDPB) and the Dutch Data Protection Authority. And of course, to as many AISPs as possible, because if they decide to adopt the measures we propose, they would be protecting privacy by design. Our Whitepaper contains a number of examples and good practices on how to enhance privacy protection. Among other things, it lays out how to improve the transparency of account information services. We hope that AISPs will take the recommendations in our Whitepaper to heart.

Our Application Programming Interface (API) has already been adopted by a service provider called Gatekeeper for Open Banking. We support this start-up’s continued development and make suggestions on how the privacy filter can best be incorporated into their design and services. When AISPs use Gatekeeper, consumers get the control over their data that they deserve.

Knowing that the European Commission will not be evaluating PSD2 until 2022, we are glad to have been able to convey our own thoughts through our Whitepaper. Along with the API we have developed and distributed, it is an important tool for any AISP that takes the privacy of its consumers seriously.

Privacy First will continue to monitor all developments related to the second Payment Services Directive. Our website PSD2meniet.nl will remain up and running and will continue to be the must-visit platform for any updates on this topic.

If you want to know how things develop, or in case you have any suggestions, please send an email to Martijn van der Veen.

Today – on European Data Protection Day – the 2021 Dutch Privacy Awards were handed out during the Dutch National Privacy Conference, a joint initiative by Privacy First and the Dutch Platform for the Information Society (ECP). These Awards provide a platform for companies and governments that see privacy as an opportunity to distinguish themselves positively and to make privacy-friendly entrepreneurship and innovation the norm. The winners of the Dutch Privacy Awards 2021 are STER, NLdigital, Schluss, FCInet and the Dutch Ministry of Justice and Security.

Consumer solutions

Winner: STER

Advertising without storage of personal data, contextual targeting: proven effectiveness

The Dutch Stichting Ether Reclame (Ether Advertising Foundation), better known as STER, was one of the first organizations in the Netherlands to abandon the common model of serving advertisements based on information collected via cookies. STER has developed a procedure that only uses relevant information about the webpages visited; no personal data (such as browser version, IP address and click-through behaviour) are collected at all. Advertisers submit their advertisements to STER, which places them on the website in accordance with a protocol STER developed, based on a number of simple categories. These categories are linked to the information that is shown, such as the TV program someone has selected. The protocol has been built up and refined over time and now works well.
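By way of illustration only – the categories and advertisements below are invented and do not reflect STER’s actual protocol – the core of contextual targeting can be sketched in a few lines of Python: the advertisement follows from the context being viewed (for example the selected TV program), never from data about the viewer.

```python
# Hypothetical sketch of contextual (category-based) ad selection.
# Categories and advertisements are invented examples; no data about the
# visitor (IP address, browser, click history) is used or stored anywhere.

import random

ADS_BY_CATEGORY = {
    "sports":  ["sportswear brand", "energy drink"],
    "cooking": ["kitchen appliances", "supermarket"],
    "news":    ["newspaper subscription", "insurer"],
}

PROGRAM_CATEGORY = {
    "Studio Sport": "sports",
    "Heel Holland Bakt": "cooking",
    "NOS Journaal": "news",
}

def select_ad(program_title: str) -> str:
    """Choose an ad based only on the context (the selected program)."""
    category = PROGRAM_CATEGORY.get(program_title, "news")  # default category
    return random.choice(ADS_BY_CATEGORY[category])         # visitor-independent

print(select_ad("Studio Sport"))   # e.g. 'energy drink'
```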

In this way, STER kills several birds with one stone. Most importantly, initial campaigns show that this approach is at least as effective for advertisers as the old cookie-based one. Secondly, the approach removes parties from the chain: data brokers who played a role in the old system are now superfluous. Apart from the financial gain for the chain, this also prevents data from ending up with parties that should not have them. And thirdly, STER stays in control of its own advertising campaigns.

This makes STER a deserved winner of the Dutch Privacy Awards. The concept developed is innovative and helps to protect the privacy of citizens without them having to make any effort. STER is also investigating the possibility of using the approach more broadly. This too is an innovation that the expert panel applauds.

In that sense STER’s approach is also a well-founded response to the data-driven superpowers on the market as it demonstrates that the endless collection of personal data is not at all necessary to get your message across, whether it is commercial or idealistic.

STER could perhaps also have been submitted as a Business-to-Business entry, but the direct interests of consumers meant that it was listed in the category of consumer solutions.

Business solutions

Winner: NLdigital

Organisational innovation and practical application: Data Pro Code

Entries for the Dutch Privacy Awards often relate to technical innovations. At NLdigital it is not the technology but the approach that is innovative. It gives concrete meaning to GDPR obligations through standard agreements and focuses mainly on data processors rather than on controllers. This enables processors to conclude agreements more quickly, practically and with sufficient care – agreements that are also verifiable in this regard. Many companies provide services by making applications available that involve data processing, and that requires processing agreements, which are not easy for every organization to draw up. Filling in the accompanying statement results in an appropriate processing agreement for clients.

NLdigital’s code of conduct, the Data Pro Code, is a practical instrument tailor-made for the target group: IT companies that process data on behalf of others. With the help of its some 600 participants/members, the Code was drawn up as an elaboration of Art. 28 GDPR. It has been approved by the Dutch Data Protection Authority and has led to a publicly accessible certification.

Public services

Winner: FCInet & Ministry of Justice and Security

Ma³tch, privacy on the government agenda: innovative data minimization

FCInet is an innovative, privacy-enhancing technology developed by the Dutch Ministry of Justice and Security and the Dutch Ministry of Finance. It is meant to assist in the fight against (international) crime. Part of FCInet is Ma³tch, which stands for Autonomous Anonymous Analysis. With this feature, Financial Criminal Investigation Services (FCIS) can share secure, pseudonymized datasets at the national level (for example with the Financial Intelligence Unit-Netherlands and the Fiscal Information and Investigation Service), but also internationally. Ma³tch supports, and in part enforces, careful decisions by the parties concerned for each data field: which data do they want to compare, and under which conditions? The parties can thus set up the infrastructure in such a way that it is technically enforced that data are exchanged only on a legitimate basis.

Through hashing, organization A converts (bundles of) personal data into a form that cannot be read back, so that receiving party B can check whether a person known to B is also known to A. Only if there is a match (B checks its own list of known persons, in hashed form, against the list it has received) does the next step take place, in which B actually requests information about the person concerned from A. The check takes place in a secure, decentralized environment, so organization A does not even know whether there is a hit. The technology thus prevents personal data from being perused unnecessarily in the course of such comparisons.
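A much-simplified sketch of this flow is given below. The real Ma³tch technology uses more advanced, filter-based pseudonymization; here a salted hash stands in purely to illustrate that only matches, and never readable personal data, cross the organizational boundary:

```python
# Simplified, hypothetical illustration of the matching flow described above.
# The real Ma³tch technology works differently (filter-based pseudonymization);
# a salted SHA-256 hash is used here only to show the principle.

import hashlib

SHARED_SALT = b"agreed-between-A-and-B"   # assumption: both parties share a salt

def pseudonymize(name: str, date_of_birth: str) -> str:
    """Hash a bundle of identifying fields so it cannot be read back."""
    payload = f"{name}|{date_of_birth}".encode()
    return hashlib.sha256(SHARED_SALT + payload).hexdigest()

# Organization A shares only hashes of the persons it knows.
hashes_from_A = {
    pseudonymize("J. Jansen", "1980-05-01"),
    pseudonymize("P. de Vries", "1975-11-23"),
}

# Organization B hashes its own subjects locally and looks for overlap;
# A does not learn whether there is a hit.
own_subjects_B = [("J. Jansen", "1980-05-01"), ("A. Bakker", "1990-02-14")]
hits = [p for p in own_subjects_B if pseudonymize(*p) in hashes_from_A]

# Only for an actual hit would B send a follow-up request to A for the
# underlying information; for everyone else, no personal data are exchanged.
print(hits)   # [('J. Jansen', '1980-05-01')]
```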

The open source code technology of FCInet offers broader possibilities for application, which is encouraged by the expert panel and was an important reason for the submission: it can be reused in many other organizations and systems. The panel therefore assessed this initiative as a good investment in privacy by the government, where, clearly, the issue of privacy really is on the agenda.

Incentive Award

Winner: Schluss

Schluss applied for the Dutch Privacy Awards in 2021 for the third time. That is not the reason for the Incentive Award, even though it may encourage others to persevere in a similar way.

The reason is that it is a very fine initiative, focused on the self-management of personal data. In the form of an app, private users are offered a vault for their personal data, whether these are of a medical, financial or other nature. Users decide which people or organizations get access to their data. The idea is that others who are allowed to see the data no longer need to store these data themselves. Schluss has no insight into who uses the app; its role is only to facilitate the process. The technology, which is open source, guarantees transparency about the operation of the app.

Schluss won the Incentive Award because thus far the app has only had a beta release. However, promising projects have been started with the Volksbank and there is a pilot in collaboration with the Royal Dutch Association of Civil-law Notaries. With its mission statement (‘With Schluss, only you decide who gets to know which of your details’) in mind, Schluss chose to become a cooperative, an organizational form that appealed to the expert panel. With this national Incentive Award the panel hopes to encourage the initiators to continue along this path and to persuade other parties to join forces with Schluss.

Nominations  

There are four categories in which applicants are awarded:

1. the category of Consumer solutions (business-to-consumer)

2. the category of Business solutions (within a company or business-to-business)

3. the category of Public services (public authority-to-citizen)

4. the Incentive Award for a ground-breaking technology or person.

From the various entries, the independent expert panel chose the following nominees per category (listed in arbitrary order):

Consumer solutions:

- NKey
- Schluss
- STER (Contextual targeting)
- 4MedBox (4LifeSupport)

Business solutions:

- Roseman Labs (Secure Multiparty Computation)
- NLdigital (Data Pro Code)
- Simple Analytics

Public services:

- Ministry of Health (CoronaMelder)
- FCInet & Ministry of Justice (Ma³tch)

During the National Privacy Conference all nominees presented their projects to the audience in Award pitches. Thereafter, the Awards were handed out. Click HERE for the entire expert panel report (pdf in Dutch), which includes participation criteria and explanatory notes on all the nominees and winners.

National Privacy Conference

The Dutch National Privacy Conference is a joint initiative of ECP | Platform for the Information Society and Privacy First. Once a year, the conference brings together Dutch industry, public authorities, the academic community and civil society with the aim of building a privacy-friendly information society. The mission of both the National Privacy Conference and Privacy First is to turn the Netherlands into a guiding nation in the field of privacy. To this end, privacy by design is key.

These were the speakers during the 2021 National Privacy Conference in successive order:
- Monique Verdier (vice chairwoman of the Dutch Data Protection Authority)
- Judith van Schie (Considerati)
- Erik Gerritsen (Secretary General of the Dutch Ministry of Health, Welfare and Sport)
- Mieke van Heesewijk (SIDN Fund) 
- Peter Verkoulen (Dutch Blockchain Coalition)
- Paul Tang (MEP for PvdA)
- Ancilla van de Leest (Privacy First chairwoman)
- Chris van Dam (Member of the Dutch House of Representatives for CDA)
- Evelyn Austin (director of Bits of Freedom)
- Wilmar Hendriks (chairman of the expert panel of the Dutch Privacy Awards).

The entire conference was livestreamed from Nieuwspoort in The Hague: see https://www.nieuwspoort.nl/agenda/overzicht/privacy-conferentie-2021/stream and https://youtu.be/asEX1jy4Tv0.

Dutch Privacy Awards expert panel

The independent expert Award panel consists of privacy experts from different fields:

  • Wilmar Hendriks, founder of Control Privacy and member of the Privacy First advisory board (panel chairman)
  • Ancilla van de Leest, Privacy First chairwoman
  • Paul Korremans, partner at Comfort Information Architects and Privacy First board member
  • Marc van Lieshout, managing director at iHub, Radboud University Nijmegen
  • Alex Commandeur, senior advisor BMC Advies
  • Melanie Rieback, CEO and co-founder of Radically Open Security
  • Nico Mookhoek, privacy lawyer and founder of DePrivacyGuru
  • Rion Rijker, privacy and data protection expert, IT lawyer and partner at Fresa Consulting.

To ensure that the Award process is run objectively, panel members may not judge any entry from their own organization.

In collaboration with the Dutch Platform for the Information Society (ECP), Privacy First organizes the Dutch Privacy Awards with the support of the Democracy & Media Foundation and The Privacy Factory.

Pre-registrations for the 2022 Dutch Privacy Awards are welcome!

Would you like to become a sponsor of the Dutch Privacy Awards? Please contact Privacy First! 

 


Published in Actions

It is with great concern that Privacy First has taken note of the Dutch draft bill on COVID-19 test certificates. Under this bill, a negative COVID-19 test certificate will become mandatory for access to sporting and youth activities, all sorts of events, and public places including bars and restaurants and cultural and higher education institutions. Those without such a certificate risk high fines. This will put pressure on everyone's right to privacy.

Serious violation of fundamental rights

The draft bill severely infringes numerous fundamental and human rights, including the right to privacy, physical integrity and freedom of movement in combination with other relevant human rights such as the right to participate in cultural life, the right to education and various children’s rights such as the right to recreation. Any curtailment of these rights must be strictly necessary, proportionate and effective. However, the current draft bill fails to demonstrate this, while the required necessity in the public interest is simply assumed. More privacy-friendly alternatives to reopen and normalize society do not seem to have been considered. For these reasons alone, the proposal cannot pass the human rights test and should therefore be withdrawn.

Social exclusion

The proposal also violates the general prohibition of discrimination, as it introduces a broad social distinction based on medical status. This puts pressure on social life and may lead to large-scale inequality, stigmatization, social segregation and even possible tensions, as large groups in society will not (or not systematically) want to or will not be able to get tested (for various reasons). During the recent Dutch National Privacy Conference organized by Privacy First and the Platform for the Information Society (ECP), it already became clear that the introduction of a mandatory ‘corona passport’ could have a socially disruptive effect.[1] On that occasion the Dutch Data Protection Authority, among others, took a strong stand against it. Such social risks apply all the more strongly to the indirect vaccination obligation that follows on from the corona test certificate. In this regard, Privacy First wants to recall that recently both the Dutch House of Representatives and the Parliamentary Assembly of the Council of Europe have expressed their opposition to a direct or indirect vaccination requirement.[2] In addition, the draft bill under consideration will have the potential to set precedents for other medical conditions and other sectors of society, putting pressure on a much broader range of socio-economic rights. For all of these reasons, Privacy First strongly recommends that the Dutch government withdraw this draft bill.

Multiple privacy violations

Moreover, from the perspective of the right to privacy, a number of specific objections and questions apply. First of all, the draft bill introduces a mandatory ‘proof of health’ for participation in a large part of social life, in flagrant violation of the right to privacy and the protection of personal data. The draft bill also introduces an identification requirement at the entrance of public places, in violation of the right to anonymity in public space. Furthermore, the bill results in existing legislation being applied inconsistently to one and the same act, namely testing. This has far-reaching consequences, on the one hand for a precious achievement like medical confidentiality and citizens' trust in that confidentiality, and on the other hand for the practical implementation of retention periods, even though the processing of the test result does not change. After all, it should not be the result of the test that determines whether the file falls under the Dutch Medical Treatment Contracts Act (WGBO, which entails medical confidentiality and a retention period of 20 years) or under the Public Health Act (with a retention period of five years), but the act of testing itself. Moreover, it is unclear why the current draft bill seeks to connect to the Public Health Act and/or the WGBO if it only concerns obtaining a test certificate for the purpose of participating in society (and thus no medical treatment or public health task). In that case, the only possible basis for processing the data and for breaching medical confidentiality would be consent. Here, however, there can be no freely given consent as required by law, since testing will be a compelling condition for participation in society.

Privacy requires clarity

Many other issues are still unclear: which data will be stored, where, by whom, and which data may possibly be exchanged? To what extent will there be personal localization and identification as opposed to occasional verification and authentication? Why may test results be kept for an unnecessarily long time (five or even 20 years)? How great are the risks of hacking, data breaches, fraud and forgery? To what extent will there be decentralized, privacy-friendly technology, privacy by design, open source software, data minimization and anonymization? Will test certificates remain free of charge and to what extent will privacy-friendly diversity and choice in testing applications be possible? Is work already underway to introduce an ‘alternative digital carrier’ in place of the Dutch CoronaCheck app, namely a chip, with all the risks that entails? How will function creep and profiling be prevented and are there any arrangements when it comes to data protection supervision? Will non-digital, paper alternatives always remain available? What will happen to the test material taken, i.e. everyone’s DNA? And when will the corona test certificates be abolished?

As long as such concerns and questions remain unanswered, submission of this bill makes no sense at all and the corona test certificate will only lead to the destruction of social capital. Privacy First therefore reiterates its request that the current proposal be withdrawn and not submitted to Parliament. Failing this, Privacy First will reserve the right to have the matter reviewed by the courts and declared unlawful.

[1] See the Dutch National Privacy Conference, 28 January 2021, https://youtu.be/asEX1jy4Tv0?t=9378, starting at 2h 36 min 18 sec.
[2] See Council of Europe, Parliamentary Assembly, Resolution 2361 (2021): Covid-19 vaccines: ethical, legal and practical considerations, https://pace.coe.int/en/files/29004/html, par. 7.3.1-7.3.2: “Ensure that citizens are informed that the vaccination is NOT mandatory and that no one is politically, socially, or otherwise pressured to get themselves vaccinated, if they do not wish to do so themselves; ensure that no one is discriminated against for not having been vaccinated, due to possible health risks or not wanting to be vaccinated.” See also, for example, Dutch House of Representatives, Motion by Member Azarkan on No Corona Vaccination Obligation (28 October 2020), Parliamentary Document 25295-676, https://zoek.officielebekendmakingen.nl/kst-25295-676.html: “The House (...) pronounces that there should never be a direct or indirect coronavirus vaccination obligation in the future”; Motion by Member Azarkan on Access to Public Benefits for All Regardless of Vaccination or Testing Status (5 January 2021), Parliamentary Document 25295-864, https://zoek.officielebekendmakingen.nl/kst-25295-864.html: “The House (...) requests the government to enable access to public services for all regardless of vaccination or testing status.”

Published in Law & Politics

Under the Corona Pandemic Emergency Act, the Dutch government has the option to introduce all kinds of restrictive measures, including the wide-ranging and mandatory use of face masks. This is unless the Dutch House of Representatives rejects this measure later this week. In this context, Privacy First today has sent the following email to the House of Representatives:

Dear Members of Parliament,

On 19 November, the government submitted to you the Regulation concerning additional requirements for face masks under COVID-19. Under this regulation, wearing a face mask will become mandatory in numerous places (including shops, railway stations, airports and schools) as of 1 December 2020. This obligation can be periodically extended by the government without the consent of Parliament. Based on the Corona Pandemic Emergency Act, you currently have seven days to exercise your right of veto and prevent the entry into force of a wide-ranging face mask obligation. By 26 November at the latest, you will be able to vote on this issue and reject this measure.

The wearing of face masks has been the subject of much public debate for months. Both the government and the National Institute for Public Health and the Environment (RIVM) have repeatedly stated that wearing non-medical face masks is hardly effective in combating the coronavirus. Scientists seem to be divided on this. At the same time, wearing a face mask can also have the opposite effect, i.e. harm people's health. There is a consensus, however, that in a legal sense the compulsory use of face masks is an infringement of the right to privacy and self-determination.

This accordingly falls within the scope of Privacy First. The right to privacy is a universal human right that is protected in the Netherlands by international and European treaties and by our national Constitution. Any infringement of the right to privacy must therefore be strictly necessary, proportionate and effective. If that is not the case, it is an unjustified breach and therefore a violation of the right to privacy, both as a human right and as a constitutional right. As long as the wearing of non-medical face masks to defeat the coronavirus has not proven effective and can even have adverse health effects, there cannot be any social necessity for the introduction of a general face mask obligation. Such an obligation would thus amount to a social experiment with unforeseen consequences. This is not in keeping with a free and democratic constitutional society under the rule of law. Privacy First therefore advises you to reject the proposed regulation for the introduction of compulsory face masks and instead to propose that face masks continue to be worn on a voluntary basis.

Yours faithfully,

The Privacy First Foundation

Published in Law & Politics

In the fight against the coronavirus, the Dutch government this week made clear that the introduction of a curfew is imminent. Because of this, Privacy First today has sent the following appeal to the Dutch House of Representatives:

Dear Members of Parliament,

This week the Netherlands finds itself at a historic human rights crossroads: is a nation-wide curfew going to be introduced for the first time since World War II? For Privacy First, such a far-reaching, generic measure would be disproportionate and far from necessary in virtually every situation. Moreover, in the fight against the coronavirus the effectiveness of such a measure remains unknown to this date. For that reason alone, there can be no legally required social necessity for a curfew. A curfew could in fact also be counterproductive, as it would harm the mental and (therefore also) physical health of large groups in society. Besides, a curfew in the Netherlands would be yet another step towards a surveillance society. The use of lighter, targeted and more effective measures is always preferable. Should a curfew nonetheless be introduced, Privacy First would consider it a massive violation of the right to privacy and freedom of movement. Privacy First therefore calls on you not to let this happen and to thwart the introduction of a curfew.

Yours faithfully,

The Privacy First Foundation


Update 17 February 2021: this week, in summary proceedings, the district court of The Hague handed down a ground-breaking ruling stating that the curfew was wrongly introduced under the Dutch Extraordinary Powers Act. The current Dutch curfew is therefore unlawful. Moreover, the court found that there are "major question marks regarding the factual substantiation by the State of the necessity of the curfew. (...) Before a far-reaching restriction such as a curfew is introduced, it must be clear that no other, less far-reaching measures are available and that the introduction of the curfew will actually have a substantial effect", and the court was not convinced that this was the case. In addition, the court raised the question of why an urgent (but voluntary) curfew advisory had not been chosen instead. The court also noted that "the Dutch Outbreak Management Team, according to the team itself, has no evidence that the curfew will make a substantial contribution to reducing the spread of the virus." All this "makes the State's assertion that a curfew is inevitable at least debatable and without convincing justification", the court concluded. (See the judgment (in Dutch), paragraphs 4.12-4.14.)

The judgment of the district court of The Hague is in line with Privacy First’s earlier position. Privacy First hopes that this will be confirmed on appeal by the Hague Court of Appeal and that it will also lead to the rejection of the curfew by both the Dutch House of Representatives and the Senate.

Published in Mobility

This week the Dutch House of Representatives will debate the ‘temporary’ Corona emergency law under which the movements of everyone in the Netherlands can henceforth be monitored ‘anonymously’. Privacy First has previously criticized this plan in a television broadcast by current affairs program Nieuwsuur. Subsequently, today Privacy First has sent the following letter to the House of Representatives:

Dear Members of Parliament,

With great concern, Privacy First has taken note of the ‘temporary’ legislative proposal to provide COVID-19 related telecommunications data to the Dutch National Public Health Institute (RIVM). Privacy First advises to reject this proposal on account of the following fundamental concerns and risks:

Violation of fundamental administrative and privacy principles

- There is no societal necessity for this legislative proposal. Other forms of monitoring have already proven sufficiently effective. The necessity of this proposal has not been demonstrated and there is no other country where the application of similar technologies made any significant contribution.
- The proposal is entirely disproportionate as it encompasses all telecom location data in the entire country. Any form of differentiation is absent. The same applies to data minimization: a sample would be sufficient.
- The proposal goes into effect retroactively on 1 January 2020. This violates legal certainty and the principle of legality, particularly because this date is long before the Dutch ‘start’ of the pandemic (11 March 2020).
- The system of ‘further instructions from the minister’ that has been chosen for the proposal is completely undemocratic. This further erodes the democratic rule of law and the oversight of parliament.
- The proposal does not mention 'privacy by design' or the implementation thereof, while this should actually be one of its prominent features.

Alternatives are less invasive: subsidiarity

- The State Secretary failed to adequately investigate alternatives which are more privacy friendly. Does she even have any interest in this at all?
- Data in the possession of telecom providers are pseudonymized with unique ID numbers and as such are submitted to Statistics Netherlands (CBS). This means that huge amounts of sensitive personal data become very vulnerable. Anonymization by CBS happens only at a later stage.
- When used, the data are filtered based on geographical origin. This creates a risk of discrimination on the basis of nationality, which is prohibited.
- It is unclear whether the CBS and the RIVM intend to ‘enrich’ these data with other data, which could lead to function creep and potential data misuse.

Lack of transparency and independent oversight

- Up until now, the Privacy Impact Assessment (PIA) of the proposal has not been made public.
- There is no independent oversight on the measures and effects (by a judge or an independent commission).
- The GDPR may be applicable to the proposal only partially as anonymous data and statistics are exempt from the GDPR. This gives rise to new risks of data misuse, poor digital protection, data breaches, etc. General privacy principles should therefore be made applicable in any case.

Structural changes and chilling effect

- This proposal seems to be temporary, but the history of similar legislation shows that it will most likely become permanent.
- Regardless of the ‘anonymization’ of various data, this proposal will make many people feel like they are being monitored, which in turn will make them behave unnaturally. The risk of a societal chilling effect is huge.

Faulty method with a significant impact

- The effectiveness of the legislative proposal is unknown. In essence, it constitutes a large scale experiment. However, Dutch society is not meant to be a living laboratory.
- By means of data fusion, it appears that individuals could still be identified on the basis of anonymous data. Even at the chosen threshold of 15 units per data point, the risk of unique singling out and identification is likely still too large.
- The proposal will lead to false signals and blind spots due to people with several telephones as well as vulnerable groups without telephones, etc.
- There is a large risk of function creep, of surreptitious use and misuse of data (including the international exchange thereof) by other public services (including the intelligence services) and future public authorities.
- This proposal puts pressure not just on the right to privacy, but on other human rights as well, including the right to freedom of movement and the right to demonstrate. The proposal can easily lead to structural crowd control that does not belong in a democratic society.

Specific prior consent

Quite apart from the above concerns and risks, Privacy First doubts whether the use of telecom data by telecom providers, as envisaged by the legislative proposal, is lawful in the first place. In the view of Privacy First, this would require either explicit, specific and prior consent (opt-in) from customers, or the possibility for them to opt-out at a later stage and to have the right to have all their data removed.

It is up to you as Members of Parliament to protect our society from this legislative proposal. If you fail to do so, Privacy First reserves the right to take legal action against this law.

For further information or questions with regard to everything discussed above, Privacy First can be contacted at all times by telephone (+31-20-8100279) and by email.

Yours sincerely,

The Privacy First Foundation

Published in Law & Politics

Yesterday, there was a hearing in the Dutch House of Representatives in which the by now notorious Corona app was critically discussed. The House had invited various experts and organizations (among which Privacy First) to submit position papers and take part in the hearing. Below is both the full text of our position paper, as well as the text which was read out at the hearing. A video of the entire hearing (in Dutch) can be found HERE. Click HERE for the program, all speakers and position papers.

Dear Members of Parliament,

Thank you kindly for your invitation to take part in this roundtable discussion about the so-called Corona app. In the view of Privacy First, apps like these are a threat to everyone’s privacy. We will briefly clarify this below.

Lack of necessity and effectiveness

With great concern, Privacy First has taken note of the intention of the Dutch government to employ a contact tracing app in the fight against the coronavirus. Thus far, the social necessity of such apps has not been proven, while the experience of other countries indicates there is ground to seriously doubt their benefit and effectiveness. In fact, these apps may even be counterproductive as their use leads to a false sense of safety. Moreover, it’s very hard to involve the most vulnerable group of people (the elderly) through this means. This should already be enough reason to refrain from using Corona apps.

Surveillance society

In Privacy First’s view, the use of such apps is a dangerous development because it could lead to stigmatization and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when ‘anonymized’, the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of and which will lead to an imminent societal chilling effect.

Risks of misuse

There is a significant risk that the collected data will be used for multiple purposes (function creep) and be misused by both companies and public authorities. The risk of surreptitious access, hacking, data breaches and misuse is substantial, particularly in the case of central rather than decentralized (personal) storage and in the absence of open source software. However, not even decentralized storage offers any guarantee against misuse, malware and spyware, nor does it make users less dependent on technical vulnerabilities. Moreover, if the data fall into the hands of criminal organizations, they will be a gold mine for criminal activities.

For Privacy First, the risks of Corona apps do not outweigh their presumed benefits. Therefore, Privacy First advises the House to urge the cabinet not to proceed with the introduction of such apps.

Testing instead of apps

According to Privacy First, there is a better and more effective solution in the fight against the coronavirus, one that is based on the principles of proportionality and subsidiarity: large-scale testing of people to learn about infection rates and immunity. To this end, the necessary test capacity should become available as soon as possible.

Haste is rarely a good thing

If, despite all the above-mentioned objections, it will be decided there is going to be a Corona app after all, then this should come about only after a careful social and democratic process with sufficiently critical, objective and independent scrutiny. This has not been the case so far, judging by the developments of the past few days. In this context, Privacy First recommends that the House calls on the cabinet to put its plans on ice and impose a moratorium on the use of Corona apps.

Privacy by design

The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional state. Any democratic decision to nullify this right is simply unacceptable. If ‘Corona apps’ are nevertheless deployed widely, then at least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. If these conditions are not met, Privacy First will not hesitate to bring proceedings before a court.

Yours faithfully,

The Privacy First Foundation
(...)


Dear Members of Parliament,

You have received our position paper; this is our oral explanation.

First of all: Privacy First is firmly against any form of surveillance infrastructure, with or without apps.

With this in mind, we look at three legal principles:

  •  Legitimate purpose limitation.
    - What is the problem?
    - What is the scale of the problem?
    - What are possible objectives, how can we achieve these objectives, and how can we measure progress towards them?

    It’s already impossible to answer the first question, as we currently test only partially and selectively. The total infected population is unknown; the number of people who have recovered is also unknown and goes unreported. What there is, however, is fearmongering driven by emotions and selective reporting: deaths with multiple causes (dying with as opposed to dying from Corona) and admissions to critical care units.

    Let us be clear: we will first have to map out the causes of this problem before we can draw conclusions and talk about solutions. This should involve not only IT professionals and virologists, but to no lesser extent philosophers, legal scholars, sociologists, entrepreneurs and others who represent society.

  • Necessity and proportionality. In terms of test capacity, critical care units, medical materials and medical personnel, we essentially have a capacity problem. So there is no doubt in our mind what we should be focusing on, also in view of future outbreaks: testing the entire population in order to tell who is infected and who is immune, and thus to be able to determine the real problem. 97% of the population is unaffected. Ensure separation of and proper care for high-risk groups. Halt crisis communication and start crisis management. Take all treatment methods seriously, including those that are not profitable for Big Pharma and Big Tech.

  • Subsidiarity. Once we know the problem, we may ask what the solutions are. Additional personnel at municipal health centers? Building a critical care unit hospital specifically for situations like these? Increasing the test capacity in order to be able to take decisions based on figures? All of this is possible within our current health system, with the general practitioner as the first point of contact.

On the basis of trust, we have given our government six weeks to get its act together. And what do we get in return? Distrust and monitoring tools. And still shortages of medical equipment. So fix the fundamentals, deal with treatment and test capacity, and stop building new technological gadgets and the kind of draconian apps used by dictatorial regimes in Asia. And take the Netherlands out of this prolonged lockdown as soon as possible. Privacy First is opposed to a ‘1.5-meter society’ as the new normal, and is instead in favor of a common-sense society based on trust in mature citizens.

Published in Law & Politics