In thinking about the legal arrangements for Jisc's learning analytics services we consciously postponed incorporating medical and other information that Article 9(1) of the General Data Protection Regulation (GDPR) classifies as Special Category Data (SCD): "personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation".
Reflecting on the scope chosen by Blackboard for our working group - "Ethical use of AI in Education" - it's worth considering what, if anything, makes education different as a venue for artificial intelligence. Education is, I think, different from commercial businesses because our measure of success should be what pupils/students achieve. Educational institutions should have the same goal as those they teach, unlike commercial settings where success is often a zero-sum game.
Last week I was invited to a fascinating discussion on ethical use of artificial intelligence in higher education, hosted by Blackboard. Obviously that's a huge topic, so I've been trying to come up with a way to divide it into smaller ones without too many overlaps. So far, it seems a division into three may be possible:
One of the concerns commonly raised about Artificial Intelligence is that it may not be clear how a system reached its conclusion from the input data. The same could well be said of human decision makers: AI at least lets us choose an approach based on the kind of explainability we want. Discussions at last week's Ethical AI in HE meeting revealed several different options:
One of my guidelines for when consent may be an appropriate basis for processing personal data is whether the individual is able to lie or walk away. If they can, then that practical possibility may indicate a legal possibility too.
The Article 29 Working Party has published its draft guidelines on transparency. For those of us who have already been working on GDPR privacy notices, there don’t seem to be any surprises: this is largely a compilation of the relevant sections of the Regulation and other guidance.
Concern has sometimes been expressed about whether the General Data Protection Regulation's (GDPR) requirement to notify individuals of all processing of their personal data would cause difficulties for security and incident response teams. These activities involve a lot of processing of IP addresses, which the GDPR and case law seem to indicate will normally count as personal data. But a law that required us to tell attackers how much we knew about their activities would help them far more than us.
For those who couldn't make it to the Jisc GDPR conference last week (and those who did, but want a refresher) the slides are now available.
The Article 29 Working Party of European Data Protection Supervisors has published draft guidance on consent under the General Data Protection Regulation. Since the Working Party has already published extensive guidance on the existing Data Protection Directive rules on consent, this new paper concentrates on what has changed under the GDPR.
The Forum of Incident Response and Security Teams (FIRST) invited me to write a piece on how GDPR affects security and incident response.
Summary: it makes them pretty much essential :)