15.10.2013

Data Protection Regulation in the European Parliament - Three Key Issues

Niko Härting

Next Monday, 21 October 2013, the LIBE Committee of the European Parliament is expected to vote on the proposed amendments to the Data Protection Regulation tabled by the European Commission in January 2012. Whatever the outcome of the vote may be, here are three key issues that the LIBE Committee will, hopefully, address:

1. Incentives for Pseudonymity and Anonymity

When, in an internet chat, I am allowed to use a pseudonym (e.g. "darling88") or am given the option of total anonymity ("anonymous user"), I will regard such options as privacy-friendly. It is, therefore, necessary to encourage service providers to allow the anonymous or pseudonymous use of services. There need to be clear incentives. It follows that there need to be specific rules for anonymous or pseudonymous data, and these rules need to be less stringent on the provider than the rules applying to the processing of data containing "real names".

It is not enough

  • for anonymous data, to rely on data protection laws being inapplicable for the lack of an "identifiable" user, because there is no such thing as "total anonymity". There will always be some way of "lifting the veil" of anonymity. Just the information that a user is male, resides in the centre of Berlin and is a lawyer interested in privacy matters, foreign affairs and jazz music may suffice for identification (see the sketch after this list). From a service provider's point of view, this means that they are well advised not to treat the respective data as "anonymous" and, therefore, out of the reach of data protection law. Rather, the service provider will consider the possibility of de-anonymization and follow data protection rules. If such rules do not differentiate between "anonymous" and (other) personal data, the provider may as well do without anonymity, as the requirements for processing (especially consent) are identical.
  • for pseudonymous data, merely to mention them as a specific category of personal data (as is the case in German data protection law) without setting up rules for the processing of such data that differ from the rules applying to (other) personal data. If the rules are identical, a service provider has no incentive to allow the pseudonymous use of its service. If the service provider needs consent, be it for the use of "full name" data or for pseudonymity, they might as well go for the "full name".
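
To illustrate how fragile "total anonymity" is, here is a minimal sketch (with purely hypothetical, made-up records and pseudonyms): filtering a dataset on a handful of seemingly harmless attributes can already single out one user.

```python
# Minimal sketch with hypothetical data: a few "harmless" attributes
# (quasi-identifiers) can single out one user in an ostensibly anonymous dataset.

records = [
    {"user": "a1f3", "gender": "male", "district": "Berlin-Mitte",
     "profession": "lawyer", "interests": {"privacy", "foreign affairs", "jazz"}},
    {"user": "b7c2", "gender": "female", "district": "Berlin-Mitte",
     "profession": "lawyer", "interests": {"privacy", "sports"}},
    {"user": "c9d8", "gender": "male", "district": "Hamburg-Nord",
     "profession": "engineer", "interests": {"jazz", "cycling"}},
]

# Attributes known about the target person from outside the dataset.
known_attributes = {"gender": "male", "district": "Berlin-Mitte", "profession": "lawyer"}
known_interests = {"privacy", "jazz"}

matches = [
    r for r in records
    if all(r[key] == value for key, value in known_attributes.items())
    and known_interests <= r["interests"]
]

# If exactly one record matches, the "anonymous" user is re-identified.
print(matches)  # -> only the record with pseudonym "a1f3" remains
```

The point of the sketch is not the code itself but the consequence for providers: since even sparse attributes may identify a user, a provider cannot safely assume its data fall outside data protection law merely because no name is stored.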

2. Strict Rules for the States

The Regulation is to cover data processing both by public authorities and by private businesses. As it is a Regulation, the principle is that the existing member states' laws on the protection of citizens' data will expire once the Regulation comes into effect. For Germany alone, that would mean that hundreds of specific data protection laws - from social security data to data on foreign nationals to census data and so forth - would cease to exist. While the Regulation - in its title - insinuates that it is "general", it is entirely unclear what this is supposed to mean:

  • Should the "general" nature of the Regulation mean to allow "specific" data protection laws for the member states in the public sector, the question would be: Why a Regulation if the outcome is exactly the same as with a Directive setting a minimum standard and giving the member states' the right to enhance data protection for their citizens by national laws?
  • Should the Regulation be "general" in the sense that it still needs to be amended by additional rules specific to various sectors, the questions would be: Who will enact such rules (if not the EU Commission, as suggested by the EU Commission itself)? And how can EU citizens' rights be adequately protected against the member states until additional, specific rules are agreed on and enacted by the EU?

3. Consent or Accountability?

Take the example of cookies and tracking: There needs to be a clear decision on who is to bear the burden of protecting privacy:

Is it the citizens themselves who - by consent - have the option of allowing as many cookies and as much tracking as they like?

Or is it the service providers who have to refrain from intrusions into privacy as much as possible - be it by outlawing cookies and tracking altogether or by imposing legal duties to minimize intrusions, for example by means of privacy by design and/or by default, or by making the composition of algorithms transparent?

It would be unsatisfactory if

  • consent were - as a rule - required when, at the same time, citizens are not trusted to make the right choice. If consent is regarded as a (if not the) key element of protecting privacy, there is no reason to make it hard for citizens to give consent: if consent is regarded as safeguarding self-determination, why should such self-determination need to be "explicit" or "unambiguous"? And why should consent be irrelevant when there is a "significant imbalance" between the citizen and a service provider?
  • the concept of accountability were entirely misunderstood. Nobody will deny that there is a need for efficient sanctions when data protection laws are disregarded. But that is not what accountability is all about. Accountability is a concept that shifts the responsibility for privacy from the citizen to the data processor. Instead of leaving it up to the citizens whether to give or refuse consent, accountability holds the data processors responsible by putting them under a duty to design and administer processes in a minimally invasive fashion. Under a regime of accountability, the data processor needs to be obliged to follow statutory guidelines when processing data. The data processor will then be held accountable for any breach of privacy (e.g. "leaks").

 
