Tuesday, January 5, 2010

Measuring the Success of Our Efforts

One of the things we have lacked in the IT risk and information security fields is a decent way to measure how we are doing.

A decade ago, it was popular to measure virus levels.  Some of us (not this chick) would report the number of viruses on the network every month.  If the number was up, things were trending badly; if it was down, we claimed victory.

Hindsight makes everything clearer, and if one thinks about it, that is a measurement filled with fallacy.  I could report lower, or even zero, viruses on my network and it could mean a number of things: (1) viruses were higher and now they are lower, or (2) I am incapable of seeing how many viruses there really are.  It also raises the question of whether viruses on the network are a particularly bad thing.  When I was a CISO at Microsoft, we proudly (and publicly) acknowledged that the corporate network was as dirty as the Internet.  But for most corporations, giving management a number - ANY number - made everyone feel better, and we happily measured the "success" of the security program on metrics that meant absolutely nothing at all.  I suppose in the coming months we will start to measure our improvement of airport security by how many body scanners are in place and how many people are added to "no fly" lists - again, completely meaningless measurements because they don't reflect reality.

The way we measure things is called a "measurement system."  We must understand the root cause of the problem - or the multiple root causes in complex problems - and choose measurements that tell us how well we are addressing that root cause.  That is the primary measurement.  We also need a secondary measurement - one that ensures we didn't "fix" the problem in one department only to create a nightmare for another department.  That would be "transferring the problem" in a way that hurts the organization without appropriately reducing the risk.
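To make the primary/secondary idea concrete, here is a minimal sketch (in Python, not from the post) of how a team might pair a root-cause metric with a guardrail metric so that a good-looking primary number can't quietly transfer the problem elsewhere.  The metric names, values, and thresholds are hypothetical illustrations only.

```python
# Sketch: pair a primary (root-cause) measurement with a secondary
# (guardrail) measurement. All names and numbers below are hypothetical.

from dataclasses import dataclass

@dataclass
class Measurement:
    name: str
    value: float          # current observation
    target: float         # what "addressing the root cause" looks like
    higher_is_better: bool = True

    def on_track(self) -> bool:
        # A metric is on track when it meets its target in the right direction.
        return (self.value >= self.target) if self.higher_is_better \
            else (self.value <= self.target)

def evaluate(primary: Measurement, secondary: Measurement) -> str:
    """Success requires BOTH the root-cause metric and the guardrail metric."""
    if primary.on_track() and secondary.on_track():
        return "improving"
    if primary.on_track() and not secondary.on_track():
        return "problem transferred - primary looks good, guardrail is failing"
    return "not improving"

# Hypothetical example: patch latency as the primary metric, with the rate of
# emergency changes that cause outages as the guardrail.
primary = Measurement("median days to patch critical vulns",
                      value=12, target=14, higher_is_better=False)
secondary = Measurement("% of emergency changes causing outages",
                        value=9, target=5, higher_is_better=False)

print(evaluate(primary, secondary))
# -> "problem transferred - primary looks good, guardrail is failing"
```

The point of the sketch is simply that "success" is only declared when both measurements hold; a fast patch cycle that breaks production is the weight-loss-by-starvation pattern described below.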

I'll give another example of transferring the problem, something almost all of us can relate to this time of year: weight loss.  If I need to lose 10 pounds (which I always do) and I starve myself, I will lose weight.  But it will be the wrong kind of weight - I will lose muscle AND slow my metabolism, so as soon as I eat again it will all come back with interest, and my lean muscle mass will be worse than before.  What happened?  I lost weight, but I chose the wrong metric by measuring only weight loss.  The root cause of the problem is not my extra 10 pounds; it is a sluggish metabolism that is the direct result of not enough exercise.  In that scenario, an effective measurement would be how far I walk every day.

In the world of information security, and security in general, we react to the superficial and content ourselves far too easily with responses that really don't solve anything, and often make things worse.  How can that happen?  Because we don't measure results properly, so we don't set expectations properly.  As you listen to the discussions about national security in the days ahead, realize that it will not be solved by the superficial improvements being discussed.  We could very well end up monitoring the wrong controls, spending billions of dollars, and finding ourselves in an even worse security situation than we are in today.  Think about what we are doing, and how the problem is being moved from place to place.  We may also have to ask ourselves: is this a problem that can really be eliminated?  Or, as individuals, do we accept some level of accountability for our personal security and safety and manage our personal risk accordingly?

Just something to think about.

2 comments:

  1. A couple of infosec areas where I'd like to see more measurement are process and business service, e.g.
    - % of apps through SDL and post production security bugs
    - # of internal consulting engagements
    - % of infosec services with RACIs and SIPOCs
    - % of LOB strategy alignment reviews (not just IT)
    Security teams provide a lot of free lunch and don't often market themselves very well.
    Now if the TSA could only publish some of their measurements.
    No comment on my fitness metrics...

  2. Interesting thinking. I'll pile up my thoughts as well in terms of realistic measurements of accomplishment:
    % of critical systems and data assets being monitored by IT security (log collection, event correlation/generation).
    # of verified/exploitable high-level vulnerabilities in critical systems and applications.
    $ security budget to remediate the risk. :-)
