Information Security in the Real World. Confidentiality, Availability, Integrity, Practicality.
Monday, 21 November 2011
There is a reason I called this column "Practically Secure": I know how Mr. Lacey feels. Pragmatism is way down the list of objectives for the authors of today's security standards.
[NB I have finally edited this post to add my commentary, sorry for the delay!]
Thursday, 16 June 2011
"LulzSec website break-ins look to have been languorously orchestrated, using nothing more sophisticated than entry-level automatic web database bug-finding tools, available for free online.
In other words, LulzSec is a timely wake-up call to better security if you are still asleep at the wheel. Your customers' data is important - both to them and to you."
We InfoSec professionals need to heed the warning. It's going to get worse before it gets better. The Advanced Persistent Threat is actually the "Simple Persistent Threat". The online organisation without any weak spots, the impregnable network is a fantasy. We need to wake up, improve security but also reduce the potential impact of a breach, with encryption, data cleansing and segregation, and a decent Incident Response plan.
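One of those impact-reducing measures can be sketched in a few lines. The example below pseudonymises a direct identifier with a keyed hash before a data extract leaves the core system, so a stolen copy cannot be trivially linked back to individuals without the key. The field names and in-source key are hypothetical stand-ins; a real deployment would fetch the key from a managed secret store.

```python
import hmac
import hashlib

# Hypothetical key for illustration only. In practice this would come
# from a key-management system, never from source code.
PSEUDONYM_KEY = b"replace-with-a-managed-secret"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. a National Insurance number)
    with a keyed hash. Deterministic, so records can still be joined,
    but not reversible without the key."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# A hypothetical customer record, before and after cleansing.
record = {"name": "A Customer", "nino": "QQ123456C", "balance": 1042}
safe_record = {**record, "nino": pseudonymise(record["nino"])}
```

It's a sketch, not a data-protection programme, but it shows how cheaply the blast radius of a lost extract can be reduced.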
But for now, back to Sophos' take on LulzSec, for those who are ambivalent about their activities:
"But the end doesn't justify the means. Time spent throwing bricks through other people's digital windows doesn't actually teach anyone anything about glassmaking, glazing or civil engineering. If you consider yourself a hacker and you have time to spare, but you're tempted by "hacking" such as DDoSes or gratuitous break-ins, why not use your skills for active benefit instead? Follow the lead of a guy like Johnny Long and hackersforcharity.org"
Tuesday, 26 April 2011
"Unfortunately, it's all based on collections of ancient practices, with a heavy emphasis on documentation and audits. And if you don't want to pay for security, you simply accept the risk. Your security might be completely ineffective but your paperwork will gain you full marks."Lacey then goes on to describe "Real Security" as distinct from both compliance and the "business enablement" view of security we sell to management. With a doom and gloom conclusion that "most organizations are sleepwalking into a future crisis" Lacey paints a grim picture of the current state of Information Security.
Is he right? Certainly the continuous stream of breach notifications and the ever-growing threat landscape seem to bear this out. We once knew what we were dealing with, or at least thought we did. We don't, not if we are relying on standards first written in the 1990s and last revised six years ago, when the term "cloud computing" was still met with giggles and shrugs and "virtualisation" was a software tool fit only for development environments. The future of Infosec demands imagination, foresight, a step change fit for the 201Xs just as BS7799 was a step change in the 1990s. Because if current trends show anything, it's that we're even worse prepared against the enormous imagination and technical skill of today's malicious agents than we thought we were. But in truth, no worse than we deserve to be.
Sunday, 27 March 2011
A good education programme is worth a dozen new technical controls.
In 2007, details of 25 million UK citizens went missing on two CDs because a junior employee didn't know the rules and management procedures were lax. Similar mistakes led to huge data losses in more recent years at Zurich insurance, UK railways operator Network Rail and the British Ministry of Defence.
One possible response to these breaches is a technological one. Maybe Data-Loss Prevention (DLP) technology could have helped, or even something as simple as disabling the writable disc drive on employee workstations. If the employee had not been able to copy the data to disc, it could not have been lost.
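For illustration, the core of what a DLP tool does is pattern-match outbound content against known shapes of sensitive data and flag or block the transfer. The sketch below uses deliberately simplified patterns (a rough UK National Insurance number and a rough payment-card number); real products use far richer detection than these hypothetical rules.

```python
import re

# Simplified stand-ins for the detection rules a DLP product ships with.
SENSITIVE_PATTERNS = {
    "uk_nino": re.compile(r"\b[A-Z]{2}\s?\d{2}\s?\d{2}\s?\d{2}\s?[A-D]\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_for_sensitive_data(text: str) -> list:
    """Return the names of any sensitive-data patterns found in text
    that is about to leave the workstation."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(text)]
```

A tool like this can stop the careless copy, but as argued below, it does nothing to change what the employee understands about why the copy was dangerous.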
But are we missing the point? These were human failings. Like many security issues, they were entirely preventable human errors. A system of people is capable of myriad different failings, and if we continue to throw expensive technological solutions at human error we will never be finished. Wouldn't we feel better knowing that our employees know what is expected of them in the fight to remain secure and compliant? That they are on the side of Information Security and work with us to prevent fraud, loss and service disruption?
This is what a security awareness programme does. A good one will change people's understanding of security, will encourage them to feel part of the solution, and engender good habits in their day to day activities. If the HMRC junior employee had had some education around the value of sensitive information, the trust placed in them by their customers - the British people - and the risks inherent in moving that data from a secure place to an insecure one, then maybe that breach would never have happened.
Much talk after the events above was about technological prevention and improving procedures. But human nature suggests that whatever technical or administrative control you put in place, there will be a tendency to resent it, to see it as a barrier to productivity and to work around it. More so if the subjects of the control - the employees under pressure to get the job done - perceive it to be too restrictive, or don't appreciate the risk you are mitigating.
Technical and administrative controls have their place; they are a major weapon against data breaches. But a far more effective weapon is the power of human nature. Education programmes can go a long way to changing staff behaviour and keeping your data safe.
Why then do we spend so much money on technological solutions to human problems? DLP and Security Incident and Event Management (SIEM) are often recommended after a breach with its roots in human error. While these have their place, human element measures such as education are often more cost-effective. So why the technological focus?
Maybe it has something to do with the people making the recommendations. Maybe the auditors, analysts and CISOs feel they have to justify their position and sizeable fee by sounding knowledgeable. Recommending staff training does not sound like expert Information Security advice. It's too simple, and not what we expect from a CISSP/CISA/whatever. So several new appliances and desktop software suites are recommended instead - the latest wizardry - and the CIO feels he has received value for money from his security experts.
This needs to change. We need to value the human element in our information systems, and recognise that it needs managing at least as expertly as the digital elements. Our people need help, encouragement and empowerment to become security advocates.
Once you've established a permanent, rolling security education programme, you might want to review your technical controls and ensure they are appropriate to the risk you are managing. Who knows, you may find you can relax some controls without degrading your risk posture, and at the same time make your staff more productive. And what CIO doesn't want that?
Wednesday, 16 February 2011
Intel's use of the phrase recognises the fact that employees, associates and outside agents regularly find ways around our efforts to contain our data and many do so without malice but in order to get their job done. We should therefore recognise this behaviour and manage it, instead of trying to limit or quash it. This is genuinely refreshing stuff from a big name, and is a timely response to David Lacey's call for new standards and security models. The five "laws" in Intel's model are:
- Information wants to be free
- Code wants to be wrong
- Services want to be on
- Users want to click
- Even a security feature can be used for harm