False Sense of Security

Recently, as reported in the Wall Street Journal, Investor's Business Daily, and Fox News, President Obama has proposed that organizations share information security data, along with legislation requiring companies to report breaches within 30 days. There are certainly pros and cons to these proposals. I see one problem already: IT and security departments do not necessarily report transparently to the board and the executive suite, out of fear of losing jobs. Mandates will not change that reality, as companies may already have a false sense of security.

Generally, I would argue that greater transparency is better. Someone once said to me that "sunshine is a great disinfectant," and so it is with our view into the security of our organizations. Greater transparency allows us to take an objective view across the entire enterprise and to measure what is working, as well as what is not, through the lens of continual improvement. Greater transparency also builds credibility and may lead to competitive advantage.

From the perspective of a couple of decades managing IT and security organizations, I can attest firsthand that there is a tendency to hide the truth. When something goes wrong, human nature often defaults to self-preservation, and it is true that heads sometimes roll when systems or networks go down, or when there are serious security breaches. Someone has to take the fall, right? It may happen, but I still argue for honesty and transparency: I would rather hire someone who lost a job because something went wrong, and is honest about it, than someone who pretends that nothing has ever gone wrong and no mistake has ever been made. Mistakes happen, and what matters is what we do about them.

Over the last 15-20 years, I have interviewed hundreds of candidates for technical and security roles. One thing I always ask is to tell me about a time when they made a mistake, or when something went wrong, and how they handled it. Responses are mixed: some claim they cannot think of anything, while others give quite detailed accounts of how they, and the organization, learned and improved from the experience. Needless to say, the ones who chose to be transparent made the short list and were hired. The others did not progress.

A few years back, while leading a managed security services team, we took on a channel account with around 400 end customers sold through them. With such a large estate, a mistake or misdiagnosed incident is inevitable, and while I do not recall the exact details, I do recall the process of generating an incident report. The service delivery manager sent me the draft copy and asked how we should edit it before sharing it with our operations counterparts at the channel. I made it clear to the SDM that the unfiltered analysis was to hit our counterpart's inbox at the same time it hit mine. From that day forward, we shared a very transparent working relationship, discussing and working through the occasional mishap with a shared eye toward solutions, driving improvement by learning from mistakes.

The point of these examples is that failures and mistakes are opportunities to establish credibility with customers, as well as with executives and the board. Anyone can paint a rosy picture and tell customers or executives that all is well. It takes courage and integrity to step up and convey bad news along with a plan for improvement and mitigation.

No CISO, security team, or MSSP can guarantee to their constituency that they can prevent all hacking and security breaches. In fact, it is a guarantee that sooner or later, any organization will be hacked. The important element is detection: finding the security breach before substantial harm is done to the organization.

So how should we package this news for executives and the board? The monthly dashboard would be a great place to start. Roughly 67% of organizations have some type of malware present, and on average it takes over six months to discover it. Worse, two thirds will not find out until notified by a third party. Instead of reporting only "confirmed kills" such as virus detections and IPS exploits blocked, why not also report false positives subsequently discovered, and the trend in the detection gap between breach and discovery?
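To make the detection-gap metric concrete, a dashboard could compute it directly from incident records. The sketch below is purely illustrative; the dates and record structure are hypothetical, and a real implementation would pull from an incident-tracking system:

```python
from datetime import date
from statistics import mean

# Hypothetical incident records: (breach date, discovery date).
# In practice these would come from an incident-tracking system.
incidents = [
    (date(2014, 1, 5), date(2014, 7, 20)),
    (date(2014, 3, 12), date(2014, 6, 1)),
    (date(2014, 6, 2), date(2014, 8, 15)),
]

def detection_gap_days(breach: date, discovered: date) -> int:
    """Days between the breach occurring and the team detecting it."""
    return (discovered - breach).days

gaps = [detection_gap_days(b, d) for b, d in incidents]

# Dashboard lines: the trend in these numbers, period over period,
# is what tells executives whether detection is improving.
print(f"Average detection gap: {mean(gaps):.0f} days")
print(f"Worst case: {max(gaps)} days")
```

Reporting the average and worst-case gap each period, and whether those numbers are shrinking, turns "we will eventually be breached" into a measurable improvement story.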

The progression of security starts with protection, then moves through detection and response, and finally to recovery. The dashboard presented to executives should set the expectation that there will eventually be breaches, and that the security team is taking steps to shorten the detection gap so that minimal harm is done before recovery is achieved. The key is reporting continual improvement and minimizing organizational harm.

Failure to report reality to management may achieve job security in the short term, albeit at the cost of credibility. When a major breach occurs, and sooner or later it will, if the security team has not previously established credibility, jobs may indeed be lost when someone's head must roll, and justifiably so.

Creating a false sense of security accomplishes little in our quest to improve information security. It is far better to take the high road and report transparently on both problems and improvement than to hide the bad news.

Ted Lloyd, CISM
