Automated Processing for Network and Information Security

Friday, February 16, 2018 - 10:21

Article 22 of the GDPR contains a new, and oddly worded, "right not to be subject to a decision based solely on automated processing". This only applies to decisions that "produce[] legal effects … or similarly significantly affect[]" the individual. Last year, the Article 29 Working Party's draft guidance on interpreting this Article noted that an automated refusal of an application to hire a bicycle – because of an insufficient credit rating – might reach this threshold.

This raised the concern, discussed in our consultation response, that automated processes that the Working Party has previously approved of – such as automatically filtering e-mails for viruses and spam – might now require human intervention. They do, after all, aim to cause disadvantage to the person who hopes to hold your files to ransom.
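
Filtering of this kind is, by design, a decision taken with no human in the loop before it takes effect. A minimal sketch of what such a fully automated quarantine decision looks like (the rules, keywords and message format here are hypothetical illustrations, not any particular product's logic):

```python
# Hypothetical sketch of fully automated mail filtering: the decision
# to quarantine is made entirely by code, with no human review first.

SPAM_KEYWORDS = {"ransom", "bitcoin", "urgent payment"}  # illustrative only

def quarantine(message: dict) -> bool:
    """Return True if the message should be quarantined automatically."""
    subject = message.get("subject", "").lower()
    # Rule 1: block executable attachments outright
    if any(name.lower().endswith(".exe") for name in message.get("attachments", [])):
        return True
    # Rule 2: crude keyword heuristic on the subject line
    if any(keyword in subject for keyword in SPAM_KEYWORDS):
        return True
    return False

# The sender is certainly "affected" by this decision, but not at the
# level of refused citizenship, benefits or employment.
print(quarantine({"subject": "URGENT PAYMENT required", "attachments": []}))
```

The point of the sketch is that disadvantaging the sender is the whole purpose of the filter, which is why a low threshold for Article 22 would have been so awkward for security automation.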

Fortunately the Working Party's final guidance, published this week, clarifies that the threshold is, in fact, much higher than this. Their examples of sufficiently serious effects are now at the level of automated refusal of citizenship, social benefits or job opportunities. So automation to defend our systems, networks and data against attack should be well within the boundaries where normal data protection law, not Article 22's special provisions, applies.

Interestingly there's also a suggestion that some flexibility may be allowed where the volume of data makes human inspection impractical. Although GDPR Recital 71 mentions 'e-recruiting practices without any human intervention', the example on page 23 of the guidance approves of automated short-listing where the volume of job applications makes it "not practically possible to identify fitting candidates without first using fully automated means to sift out irrelevant applications".