Indicators of failure for information security projects


Fail early, fail often? Just make it happen? Because you are smart!

  • More data breaches get reported, customer records appear on Pastebin, and more database instances with no passwords show up on Shodan. In rare cases companies can say - or want to say - whether PII has been lost.

  • Although credit card fraud is on the rise (and that has very little to do with skimming), no one wants to “fix it”. Insecure web shops are everywhere, and malicious advertisements and anti-ad-blocker movements have become common threats.

  • Phishing mails get opened in 80% of all cases, and attachments get executed. Infosec 1999… ahem, 2016.

What have we done in the last 10 years, if this is the state we are in? Have we had success? How does it feel when an information security project fails? What does that even mean? - In an almost Shakespearean way we need to ask:

Too smart to fail or smart enough to fail?

Information security technology is expensive. If you take a look at the prices of the usual appliances (and I don’t want to name vendors here), you will realize that you need big budgets and a lot of manpower to get these up and running - leaving aside the maintenance and the business-process-related “issues / tasks”.
Long story short: infosec projects can fail. It’s rarely spoken of, and few people know what to look for in order to determine success or failure.

Indicator 1: oversimplification loop

One of the best indicators of failure is that staff (including management) oversimplify tasks and don’t surface issues.

The more undocumented issues get fixed ad hoc, the less healthy the infosec infrastructure becomes. To be exact: the more frequently these issues lead to security exceptions, the less likely it is that your security posture is efficient.

Security holes often originate from design issues, lack of domain expertise or lack of time. Often “small issues” get simplified in order to meet certain “ticket throughput” criteria, because “things need to work in this company” and “we cannot block this”. If you hear these phrases in your infosec department, you should stop oversimplifying information security aspects. It is a very expensive loop of inefficiency escalation, which your department has started to cycle its workload through. Because the more exceptions, the more complexity. The more complexity, the more exceptions. And each cycle of this loop lowers the efficiency of the security posture. It falls apart.
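The loop above can be sketched as a toy feedback model. All numbers here are invented for illustration - this is not a real security metric, just a way to show how two quantities that feed each other erode a third:

```python
# Toy model of the oversimplification loop: exceptions add complexity,
# complexity invites further exceptions, and each cycle erodes the
# efficiency of the security posture. All coefficients are made up.

def simulate_loop(cycles, exceptions=1.0, complexity=1.0, efficiency=1.0):
    """Run the feedback loop for a number of cycles and record efficiency."""
    history = []
    for _ in range(cycles):
        complexity += 0.5 * exceptions   # more exceptions -> more complexity
        exceptions += 0.3 * complexity   # more complexity -> more exceptions
        efficiency *= 0.9                # each cycle lowers efficiency
        history.append(round(efficiency, 2))
    return history

print(simulate_loop(5))  # -> [0.9, 0.81, 0.73, 0.66, 0.59]
```

The exact numbers are meaningless; the shape is the point - efficiency only ever goes down while the loop is running.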

Indicator 2: security managers with no domain expertise

If the staff responsible for information security (with competences geared towards their position) does not understand the difference between process execution and a residual binary (for example), chances are good they will not be able to detect the oversimplification loop (Indicator 1).

Detecting these oversimplification problems is a required competence in security management, because all the departments of a modern company have specific (if not subjective) security requirements.
A professional information security manager needs to wear many hats: finance insights, software development (process) internals, physical building security details… It’s quite likely that there are three business meetings about three different domains scheduled on a single day. Breaking the security requirements down for the individual experts is impossible without security domain expertise.
You don’t need to be an expert, let alone a specialist in “everything” (see Indicator 7), but you need to be a domain expert with a broad horizon, who thinks outside the box with passion.

“But engineers are not good with people, and management is a people skill…”

I have been told things like that. At my desk, when I was supposed to commit a bugfix “yesterday”[tm].
The thing is that this is actually discrimination, and a prejudice. Whoever starts an argument like this should be questioned. When I was a junior developer I didn’t dare to do that, but these days… I know better. Ask early, ask often, and make clear that you want to be treated “with people skills”.
It’s not exactly a people skill to walk up to the desk of an employee who is working on a complex, time-critical problem and interrupt him. One usually waits, and makes sure not to disturb others. That’s why you knock on a door. Domain expertise matters. People skills are a habit, which gets developed in a work environment.

Indicator 3: non-configuration

Day 1: you get an email washing machine; some sort of prevent this-and-that or anti-Boom. Day 2: it’s still in the co-location server room, not even unpacked. Day 3: no one has got any training for this. Day 4: someone racks it in. Same shit, different day style: cables in, lights on, rack closed. Day 5: it’s installed and the compliance standard doesn’t mention any configuration specifics. “We are done”[tm] (Indicator 2).

Non-configuration is a big problem in information security, especially if the oversimplification loop (Indicator 1) has escalated. Checklist-oriented behavior of staff often shows a lack of domain discipline and an apparent disrespect for the security posture, or for the information security domain as a whole. This hides risks that should surface. An invisible risk, covered up by an oversimplifying (compliance) practice, is a real problem. - Because it’s a short-term win with consequences.

Non-configuration leads to dangerous assumptions about the risk matrix. Elements that appear to have a mitigation have none in reality. This is a disconnect that marks absolute project failure. The money spent on the washing machine is wasted, because it doesn’t wash anything.
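The disconnect is easy to make concrete. In this sketch (risk names, controls and flags are all invented), a checklist-driven review counts a control as a mitigation merely because it is assigned, while reality also requires it to be configured:

```python
# Illustrative sketch with invented data: an unconfigured appliance looks
# like a mitigation in the risk matrix while mitigating nothing.

risk_matrix = [
    {"risk": "malware via email",   "control": "mail gateway",  "configured": False},
    {"risk": "credential stuffing", "control": "rate limiting", "configured": True},
]

def mitigated_on_paper(matrix):
    """What a checklist-driven review reports: a control is assigned."""
    return [e["risk"] for e in matrix if e["control"]]

def mitigated_in_reality(matrix):
    """What actually holds: the control is assigned AND configured."""
    return [e["risk"] for e in matrix if e["control"] and e["configured"]]

print(mitigated_on_paper(risk_matrix))    # both risks look covered
print(mitigated_in_reality(risk_matrix))  # only one really is
```

The gap between the two lists is exactly the set of invisible risks the paragraph above describes.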

Indicator 4: severity escalation and assumption management

I call it the “dirty laundry syndrome”. Someone sells a Machine Learning based next-next-generation security appliance with Artificial Intelligence algorithms to your company. A great investment in security! Of course Indicator 3 is present, because who has expertise in this? How do you rate the severity of the results from such a system? Call your old professor?
No, you need a Big Data SIEM workflow. And correlation of results, and multiples of these appliances. And of course that’s all simple to do (Indicator 1). In order to produce value you start filtering for issues that are known to exist. These re-surface… and now you have your findings. Of course, because they re-surfaced now, the severity is high - based on the assumption that this is a correlated result.

What that means in layman’s terms: it’s guesswork with a search engine.

In order to cover this up, a lot of complexity gets added. The usual name for this complexity obfuscation is “next-generation”. Because if you tagged it as “current generation”, people would try to understand it. The more “next-generation” appliances you have, the more likely it is that you are doing assumption management instead of risk management, and that you escalate the severities based on fear, uncertainty and doubt (FUD).
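The circular workflow can be spelled out in a few lines. Everything here is invented (issue IDs, feed entries); the point is only the shape of the reasoning - the severity comes from the assumption, not from new evidence:

```python
# Toy illustration of "assumption management": filter a feed for issues
# you already know about, then escalate them because they "re-surfaced".

known_issues = {"weak-admin-password", "CVE-2014-0160"}

siem_feed = [
    {"id": "CVE-2014-0160",    "severity": "medium"},
    {"id": "novel-anomaly-42", "severity": "low"},
]

def assumption_management(feed):
    """Known issues re-surface and get escalated; everything else is dropped."""
    findings = [dict(f) for f in feed if f["id"] in known_issues]
    for f in findings:
        f["severity"] = "high"  # "correlated", therefore "high"
    return findings

print(assumption_management(siem_feed))
```

Note what happens to the genuinely novel anomaly: it is discarded, because the filter only confirms what was assumed going in. That is guesswork with a search engine.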

Indicator 5: license loops

If you buy a new computer at the store, there are some trials installed - usually for Microsoft Office, and other products you have never heard of. The companies sponsored the retailer, and that’s why you got them. If you are licensing your appliance / tool and an add-on needs an additional license… and the add-on of the add-on too, then chances are good you are license-looping your expectations based on the influence of 3rd-party sales consultants.

This marks a severe disconnect from reality, and this indicator does not appear in isolation. If you have a product-focused workflow in the SOC, that is fine. But if you have a 9-to-5 SOC and the product is showing traces of Indicators 3 and 4, you need to re-evaluate what you are doing. Spending money is not an accomplishment.

Indicator 6: the security first syndrome

“Let’s get it out of the way: any security concerns? No? Then let’s go on with the meeting.”

“Security first” is the exclusion of the security posture instead of its incorporation. Security can only be incorporated if it incorporates itself. If you have traces of Indicator 2, this is impossible. Information security is also about leading by example, and if you cannot make time to assist other departments directly, chances are good they don’t even acknowledge what you are doing.

Indicator 7: executive summary page 1 contains "Everything"[tm]

Assurance is an important deliverable. But if you need executive summaries on page 1 because no executive will care about page 2, it’s very likely that there is a lack of transparency. - Because executives do have time to read financial or legal reports. They are liable - for information security as well, even if there is no CSO.

Obviously, if Indicator 2 is present, you might not have enough content in the reports. “Everything is fine, we have been audited and…” The first word marks the end of the sentence for the executive. There is no “Everything”.
Period.

If I get a report from an assurance / consulting company and I find the word “everything” in the executive summary, I send it back before I forward it. This is basic quality control.
Unless you are serious about your expectations, don’t even try to write executive summaries. Because you don’t need executive support if you have “Everything” already.

Indicator 8: repetitive meetings versus ignorance - the shrug off

Security awareness or security development training happens every 3-6 months, and let’s say you really perform regular lessons, tests and reviews. Good for you. But what if the same employees or projects fail every time, and they shrug it off?

If people can shrug it off - I mean, if people don’t care about putting their workplace or projects at risk - there need to be consequences. That being said, before you even think about making an effort for something like this, you need to eliminate all the failure indicators in your own department.
Chances are good that your guidance is inefficient because your own security posture is inefficient. - You can’t share what you don’t practice. If you don’t eat your own dog food, don’t expect other people to do so.

Bothering people (with policies and block-lists) is the wrong way to target ignorance. Instead you need to make sure that good examples get the spotlight. In most cases, ignorance is an indicator of the internal failures of the security department.
Easy to say? No, it’s just a hard truth. If you don’t like it, call it something different. That will surely help - and get you into those repetitive meetings your coworkers ignore.

Indicator 9: 8, 7, 6, 5, 4, 3, 2, 1 - fix

“We need to do something about this.” Correct. And what are the OKRs and KPIs for “this”? If people throw their “this” and “that” around and don’t even bother to refer directly to the core issue, chances are good you only think in “this”es and “that”s and not in “it”s. In other words: the less specific you get, the less related (read: relevant) to the company you are. Communication - and I mean non-specific references like these - is a good indicator of failure.
What is not visible and transparent cannot be referred to. What is not clear to everyone has no common name. What is not common has no meaning. What has no meaning is not relevant. What is not relevant can fail without consequences. This leads to Indicator 8, 8 leads to 7, and so on.

I call it the “reverse domination” effect. “We need to do something about this.” On the surface the sentence embodies a call for action. In reality it’s the opposite, because there cannot be action without a goal.

Summary: better fail than sorry

Don’t set up projects if you have any of these indicators present. Save the effort, lay out your guidance internally, and let it shine. It’s better not to build on top of a bad basis. Security projects fail - often. Otherwise we’d see fewer data breaches, fewer customer records being lost, fewer companies being hacked, and far less credit card fraud.
Often infosec professionals excuse this with “it’s all the lack of awareness” or cite the common “you can’t patch stupid”. In reality the problem is the attitude. If you are too scared to say that you have failed, how do you expect to be taken seriously?
Look at the past incidents: are you sure that it’s all due to awareness issues? None of this is your fault? You did everything you could? Welcome to the state indicated by Indicator 8: the shrug-off. :wink: