Years ago I attended a lecture given by the head of IT Security at a major oil company. He said that nobody takes IT security seriously, and nobody would until an attack was severe enough to bankrupt a major corporation. Yesterday we heard that the NHS, Nissan and many other organisations have been severely hit by the so-called WannaCry malware. Could this be the one?
Following incidents such as these there are cries of “Why did this happen?”, and experts make the usual recommendations about ensuring systems are patched and staff adequately trained, but these are tactical measures. The real reasons IT systems are so insecure run deeper, and the bottom line is that management don't take IT security seriously. In addition, two systemic trends which underpin most large organisations impede the implementation of good security: complexity and Business Process Reengineering.
Perhaps complexity is inevitable with technology today, but the risks of massive complexity are not adequately appreciated, and the NHS in particular has often been guilty of biting off more than it can chew. Risks, including security risks, are neglected by over-ambitious managers egged on by service companies which over-promise. IT professionals would do well to follow the U.S. Navy design principle from the 1960s: Keep It Simple, Stupid (KISS). The idea is that systems often work best when they are kept simple, so simplicity should be a key design goal.
In a way, large organisations can be considered “systems” akin to computer systems, with information and human beings as their components. Starting around the 1990s the corporate world began trying to design intelligence into the system. They called it Process Reengineering and, in part, it is responsible for the efficiency gains of the past couple of decades. The idea was that you paid consultants to ask all your most gifted staff what they did and why. The consultants then drew up a lot of workflows and “processes” to reflect how the organisation worked. After that they reengineered the processes to make them more efficient. That was the idea. In fact what happened was that, wherever possible, they ensured that each process was simple enough to be performed by a numpty, so that staff costs could be drastically reduced. This was good for efficiency, but it resulted in those idiotic call centres staffed by people incapable of responding to new situations.
“No problem!” cried the consultants. “We'll build another set of processes called ‘Kaizen’, or continuous improvement, where failures will be recorded and teams will work to refine the processes.” While removing the need for gifted people, they would design intelligence into the system itself. This sounds good too, but a problem is emerging which cannot be solved by more of the same.
We have become indoctrinated with the idea that all human endeavour can be reduced to a written document, a set of processes which can be followed like a computer program. But it can’t. The idea that the system can be intelligent is bollocks.
Modern organisations and systems are so complex that staff are given a view of only a small part. Not all staff are numpties, obviously, but managers will use cheap, low-skill labour wherever possible, and most work in silos, with experience only of following instructions rather than thinking for themselves. As staff become more senior they move away from technical matters and become bogged down in management processes such as annual assessments and negotiations. Nobody is encouraged to gain an understanding of the system as a whole. Standards such as ITIL and PRINCE2 are useful, but they are often treated as disciplines in themselves rather than tools to assist the technical work.
I recall being a member of an improvement team which gathered eight people together for an hour in its first meeting, during which nothing was discussed but the name of the team! Another team had a mission to bring the elapsed time of a process down to zero, and no matter how hard people explained that it is physically impossible to achieve anything in no time at all, the “customer representative” could not understand this.
The nature of good IT security is that it exhibits no obvious results other than an absence of breaches. This, combined with the obsession with process, means that many companies pay lip service to security and cross their fingers. Security becomes, like everything else, a paper exercise, where the staff tasked with operating the “security controls” that prevent malware are of such a junior level that they are unable to understand or defend the controls' importance in the face of constant pressure for greater efficiency from non-technical managers.
It is time to treat IT security the way we treat safety. In a reputable shipping company, if the safety officer decides that a ship is unsafe, it doesn't sail. They don't blame the safety officer; they blame the people tasked with keeping the ship in good repair. The management will not accept the risk of losing expensive ships or cargo, or, worse still, loss of life. Most large-scale engineering infrastructure is likewise subject to stringent safety standards which must be met before it passes into production. Contrast this with the sloppy way many IT systems are implemented.
The ongoing palaver surrounding fake news is not a security breach in itself, but it is related to the integrity of our information systems. In a recent BBC Panorama programme the Facebook Policy Director, Simon Milner, was asked repeatedly to quantify how much money Facebook makes through “fake news”. Instead of being open, or admitting that he didn't know, he flagrantly repeated his silly little sound-bite, prefaced by “we take the issue very seriously”, and this is what the IT industry does with IT security. It trots out platitudes while treating security as secondary to business as usual.
At the time of writing, the attacks on the NHS had yet to result in fatalities, but it can only be a matter of time. Perhaps then we'll all start taking it seriously.