Pseudonymous Identifiers and the Law

Wednesday, June 6, 2012 - 10:18

For a while I've been trying to understand how pseudonymous identifiers, such as IP addresses and the TargetedID value used in Federated Access Management, fit into privacy law. In most cases the organisation that issues such identifiers can link them to the people who use them, but other organisations that receive the identifiers can't. Indeed, Access Management federations spend a lot of effort making that link as difficult as possible to establish, using both technical and legal means to protect the privacy of users.
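To make the idea concrete, here is a minimal sketch (not the actual TargetedID algorithm, whose details vary between federation implementations) of how an identity provider might derive an identifier that is stable for one user at one service, yet uncorrelatable across services. All names and the secret value are hypothetical; the key point is that only the issuing organisation, which holds the secret, can link the identifier back to a person.

```python
import hmac
import hashlib

def pairwise_id(user_id: str, service_provider: str, idp_secret: bytes) -> str:
    """Derive a pseudonymous identifier that is stable for one user at one
    service provider, but different (and uncorrelatable) at every other one.
    Only the identity provider, which holds idp_secret, can reverse the link."""
    message = f"{user_id}!{service_provider}".encode()
    return hmac.new(idp_secret, message, hashlib.sha256).hexdigest()

# Hypothetical secret, held only by the issuing institution
secret = b"institution-private-salt"

alice_at_journal = pairwise_id("alice", "https://journal.example.org", secret)
alice_at_library = pairwise_id("alice", "https://library.example.net", secret)

# Same user, same service: the identifier is stable, so it supports personalisation
assert alice_at_journal == pairwise_id("alice", "https://journal.example.org", secret)

# Same user, different services: the identifiers differ, so the two
# services cannot combine their records about the user
assert alice_at_journal != alice_at_library
```

Because the derivation is one-way, a service provider holding such an identifier cannot recover the user's identity without the issuer's cooperation, which is precisely why the legal status of these values is so awkward to pin down.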

Both UK and EU law recognise such identifiers, but don't give them any special status: they are either personal data (just like names or e-mail addresses) or not personal data, depending on whether courts and regulators think it "likely" that the link between the identifier and the individual will be made. This all-or-nothing approach makes the decision very significant: personal data is subject to a lot of regulation, non-personal data to none. It is producing odd results, with courts seemingly deciding first whether the proposed processing is desirable and then finding a way to reach the corresponding conclusion on whether it involves processing personal data. Thus courts in various parts of Europe (as far as I know there haven't yet been any cases on the question in the UK) have reached contradictory conclusions depending on whether the question arises in the context of enforcing rights (good) or "surveillance" (bad).

Yesterday I presented a paper on this at a law conference in Edinburgh, suggesting that the current situation is unhelpful both for systems designers and for privacy. Instead, I think, the law ought to work out how much risk a given identifier poses to privacy and then require a proportionate level of protection. Unfortunately that doesn't seem to be the way either the Data Protection Act or the European Privacy Directive is being applied at the moment. The audience of lawyers seemed to agree both with my analysis and with my ideas for fixing the problem, which is reassuring for any future discussions with the Commission and UK regulators.