Consent and the Role of the Regulator

Tuesday, January 6, 2015 - 09:29

Reading yet another paper on privacy and big data that concluded processing should be based on the individual's consent, I was struck by how much that approach limits the scope and powers of privacy regulators. When consent is used to justify processing, pretty much the only question left for regulators is whether the consent was fairly obtained – effectively they are reduced to commenting and ruling on privacy notices. And, indeed, a surprising number of recent opinions and cases do seem to be about physical and digital signage.

But in an area as complicated as big data, where both the potential risks and benefits to individuals and society are huge, I'd like privacy regulators to be doing more than that. It seems pretty clear that some possible uses of big data should be prohibited – no matter how persuasive the privacy notice – as harmful to individuals and society. Conversely, there are other uses where the benefits to both individuals and society should legitimise them without everyone having to agree individually. Privacy regulators ought, I think, to be playing a key role in those decisions, something that invoking "consent" prevents them from doing.

There is an existing legal provision that would let regulators discuss much meatier questions: whether processing is "necessary for a legitimate interest" and whether that interest is "overridden by the fundamental rights of the individual"; however, until recently it hasn't been much used. The Article 29 Working Party's Opinion on Legitimate Interests is a promising start, but it would be good to see regulators routinely discussing new types of processing in those terms. Looking at big data, and other technologies with complex privacy effects, explicitly in terms of the benefits they might provide and the harms they might cause – maximising the former and minimising the latter – seems a much better way to protect privacy than simply handing the question to individuals and then considering, after it is too late, whether or not their consent was fairly obtained.

Analysing applications in terms of legitimate interests and personal rights could even benefit those organisations that want to do the right thing. A business that can demonstrate, in terms approved by a privacy regulator, how its activities provide a significant benefit without threatening the fundamental rights of its customers would seem to have a strong ethical and legal position: at least as good as one claiming "those consequences were clear from our privacy policy that you consented to". An interesting survey of trust in different public sector organisations suggests this may be a calculation we are already making instinctively. And if this approach were to become the norm then it might even provide a signal of its own – that a proposition that doesn't make the legitimate interest/fundamental rights case, but relies instead on user consent, should be examined very closely by those users.