Should we just log everything?

Thursday, January 30, 2020 - 09:22

In a world where data storage is almost unlimited and algorithms promise to interrogate data to answer any question, it's tempting for security teams to simply follow a "log everything, for ever" approach. At this week's CSIRT Task Force in Malaga, Xavier Mertens suggested that traditional approaches are still preferable.

With the speed of modern networks and systems, logging everything is almost guaranteed to produce files far too big for humans to interpret, so incident responders become entirely dependent on algorithms. And, since those algorithms don't know which events or incidents really matter to the organisation, they may well highlight or explain the wrong things. Xavier suggested that having too many logs may well give organisations a false sense of security.

Another problem with this approach is that no one knows which logs actually matter, so it's hard to decide which are worth spending time on when, for example, their format changes (if we even notice the change), or their retention is challenged by accountants or regulators.

So it seems it's still better to start from a purposive approach: think through the kinds of incident that it's most important for you to be able to deal with, work out which logs you need to investigate those, and ensure those are available for as long as there is any point in investigating.
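One way to make that purposive approach concrete is to write the policy down as data: a mapping from the incident types you most need to investigate to the log sources each one requires, and how long those logs stay useful. The sketch below is a minimal, hypothetical illustration of the idea; the incident types, log source names and retention periods are invented examples, not recommendations.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class LogRequirement:
    source: str          # which log source to keep
    retention_days: int  # how long investigation remains worthwhile

# Hypothetical policy: the incidents we have decided matter most,
# and the logs needed to investigate each of them.
POLICY = {
    "phishing": [
        LogRequirement("mail-gateway", 90),
        LogRequirement("web-proxy", 90),
    ],
    "account-compromise": [
        LogRequirement("authentication", 180),
        LogRequirement("vpn", 180),
    ],
    "malware": [
        LogRequirement("endpoint-av", 30),
        LogRequirement("dns", 30),
    ],
}


def required_retention(source: str) -> int:
    """Longest retention any planned investigation demands of this source."""
    return max(
        (req.retention_days
         for reqs in POLICY.values()
         for req in reqs
         if req.source == source),
        default=0,  # no incident type needs it: a candidate for not logging at all
    )


print(required_retention("authentication"))  # 180
print(required_retention("netflow"))         # 0: no planned use for this log
```

A table like this also answers the questions raised above: when a log's format changes, or its retention is questioned, you can point to exactly which investigations depend on it, and a source that no incident type needs is a candidate for not being collected at all.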

If, as still seems to be too common, a breach remains undiscovered for months or years, investigation is likely to be more trouble than it is worth, since it's likely that some essential knowledge will have been lost, and the attacker will have had ample time to do all the damage they want. Belated discovery of breaches is a sign that we need to improve our detection processes, not that we need to retain even more logs for even longer.