Security Awareness training: use cultural awareness and cognitive systems to message the audience



Security awareness and work cultures

I have given security awareness trainings. You have too. Here, there, somewhere. Audiences can be difficult, because in these trainings you don’t want to exclude anyone. It’s not a technology topic, where you can limit your crowd to the likes of you: most likely engineers, techies or generally curious and cunning people like you and me. It’s a general topic.

In my past security awareness trainings there were Chinese artists next to Icelandic coders, next to German managers… all kinds of people. Janitors, kitchen staff, developers, sound artists… not the easiest target audience. Not the easiest jobs. Not the easiest job to do: to reach them, I mean ;).

Security is a mind game

Games appeal to us. They hold the promise of joy and relaxation.

Is it that, for you? There is behavioral analysis, a domain where you find the really scary people. You talk to them, just casually, and all of a sudden you get the feeling you owe them. Owe them honesty, the truth, all of it. Why? I mean, why do you still play that game of honesty?

There is a relation between awareness training and performance coaching, because what security awareness and performance motivation have in common is that the triggers need to stick for the training to work.

What do I mean by “triggers”? Think of Las Vegas: people are motivated to put their hard-earned coins into slot machines, all day long. The promise of performance sounds very similar: to do it, all day long. This appeals to all kinds of people. Janitors, kitchen staff, coders, me and you… Not the easiest jobs. It is not hard to trigger people to pay attention for 30 minutes of security awareness training. If it’s possible to set up a slot machine from the ’60s which makes thousands of dollars every day, what are the odds against these 30 minutes?

Some put in extra hours because they like their jobs. Others don’t. Some remain aware of security, others don’t. Let’s find out why that is, and how to change it with triggers. Let’s find out why the mind game of security awareness is a cultural challenge, and why cognitive systems can be useful to trigger your audience into the right mindset. The crowd is not going to cheer. Who needs that? But the message is going to stick. They owe you that. Because security.

Cognitive systems: use triggers and stick it to them

See, performance is a game. A game you play. Performance means something to you. It’s the game you beat yourself at, or you don’t play it right. True? Right.

Security awareness is an employee performance matter. Just like being on time, having a clean shirt on, and not parking in the goddamn parking spot where your boss wants to put his Porsche. Performance has many dimensions. Some people perform working from home. Some just chill and clock in hours. Some people read before they double-click. Some people have double-clicked already. Maybe we should have triple-clicks…

Security awareness is this mind game. It means you beat yourself at this game, and attackers at theirs. Win-win. For an attacker an unaware employee is just like a slot machine which spits out a credit card. Attackers probe companies all day long, because one credit card is enough for thousands of dollars.
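To put rough numbers on that slot-machine analogy, here is a back-of-the-envelope sketch. Every figure (mail volume, click rate, payout, cost) is an invented assumption, purely for illustration:

```python
# Back-of-the-envelope attacker economics for the slot-machine analogy.
# Every number below is an invented assumption, for illustration only.

phishing_mails_per_day = 10_000  # probes the attacker sends out
click_rate = 0.001               # 1 in 1000 unaware employees double-clicks
payout_per_hit = 2_000           # dollars per stolen credit card
cost_per_day = 50                # sending the campaign is nearly free

expected_hits = phishing_mails_per_day * click_rate
expected_profit = expected_hits * payout_per_hit - cost_per_day

print(f"Expected hits per day:   {expected_hits:.0f}")
print(f"Expected profit per day: ${expected_profit:,.0f}")
```

Even with these modest made-up numbers, the “slot machine” pays out every single day, which is why the probing never stops.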

The three issues and how to solve them with framing and Game Theory

The first issue is that 99% of all IT security professionals create their awareness training for the projector. They spend more time with PowerPoint or Prezi than with their audience. From a Game Theory perspective that’s like playing a game against yourself as well, with nothing to win. You have two opponents: yourself and your audience. Is that a good strategic position? No, of course not. It’s just that they think there is no alternative.
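That strategic position can be sketched as a toy payoff matrix. The strategy names and payoff values are my own arbitrary assumptions, just to make the Game Theory framing concrete:

```python
# Toy payoff matrix for the trainer-vs-audience "game".
# Strategies and payoff values are arbitrary assumptions for illustration.
# Payoffs are (trainer, audience) in units of "messages that stick".

payoffs = {
    ("slides_only", "tunes_out"): (0, 0),   # the usual outcome
    ("slides_only", "listens"):   (1, -1),  # sitting through slides costs attention
    ("triggers",    "tunes_out"): (0, 0),
    ("triggers",    "listens"):   (3, 2),   # triggers reward paying attention
}

def best_response(trainer_strategy):
    """The audience picks whichever response maximizes its own payoff."""
    options = {resp: payoffs[(strat, resp)][1]
               for strat, resp in payoffs if strat == trainer_strategy}
    return max(options, key=options.get)

for strategy in ("slides_only", "triggers"):
    response = best_response(strategy)
    print(strategy, "->", response, "payoffs:", payoffs[(strategy, response)])
```

With slides only, tuning out is the audience’s best response and the trainer wins nothing. With triggers, paying attention becomes the audience’s rational choice, and both sides gain.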

The second issue is that you cannot tell people to become responsible while you are treating them like kids. Internet filters, restricted user rights, performance hogs like DLP agents and AV products… who needs security awareness if you have all of that? And if it’s not working perfectly well, why do you insist on it? Because security?

The third issue is that people don’t see their risk. Unless you have a company credit card and have to revert transactions, there is practically no personal risk. If the company loses money due to fraud, it’s not going to be deducted from a salary. And it’s also not their job that is on the line if some malware steals all the data. So if your InfoSec department has all these nice tools, it’s on them. Isn’t it? Why? Because it’s security.

In addition you are competing against malware campaigns, with their social engineering schemes and suggestive content. Maybe the ransomware shows an attractive person smiling next to a Download button. What do you have? A degree in engineering. Sexy…

In summary: this position makes no sense. You treat people like kids and fence them into a restricted environment. Then you ask them to be mindful and self-responsible, and to do that you want to use PowerPoint or Prezi. You know, in order to present efficiently. And you know how malware campaigns look and what makes them appealing. Hint: it’s not PowerPoint. Or a projector.

Reframe responsibility

Sorry, I overstated a bit. You probably have security monitoring gaps: systems where the AV does not get installed, where there is no DLP, in a forgotten network, with Windows 2000. Sure you do. Why don’t you start with that? With the fact that there are systems where “traceability” is limited.

Your message is that the security posture isn’t perfect at this point in time, and that things regularly need a fix or two. People generally do not know that, and it’s easy to relate to.

Security is not about surveillance. Don’t lecture people like an “armchair IT guy” who is nosy and wants to peek into everyone’s browser history. If there are things you want to do better, there is no shame in that. On the contrary. Security is not your problem. It’s everyone’s. Every day.

What you target is that moment of lapse where the double-click is misplaced. Ideally people request better security at the end of the training, because the malware that steals data and credit card info is only a double-click away. Or a triple-click, if we ever get that. One day we will.

The cognitive appraiser here is attention. That’s why it comes first. You must speak about being vulnerable. That is a key that unlocks the others.

The safety net

You are building a safety net for them, like in a skiing area: if you go too fast, the net catches you. Isn’t that a nice cognitive trigger?

Vacation? Everyone plans it. Everyone has it. And it makes so much sense to avoid falling off a cliff. Now you say… what’s the difference? A picture more or less in the PowerPoint presentation on the projector. What’s the big deal?

The deal is that unawareness gets associated with pain, with accidents. There is always one guy from marketing who wants to tell his pals about his hardcore skiing accident. While he is doing that, the triggers sink in. He is your “agent”, willing or not.

The cognitive appraiser here is based upon legitimacy. Instead of teaching from the pulpit, you have re-framed the anticipation.

The scandal

It sucks if your credit card gets declined at a hotel far away from home, or at a petrol station. They call the local police and you can get into trouble. Unless you have enough cash, of course. How much do you have?

Sure, you can revert the transaction. But if you are getting ready for departure, and the airport taxi is waiting at the hotel while you are checking out and want to pay for the minibar, you are in trouble. Not your company.

In other words: we recommend that you keep your credit card info safe. Not in the browser, not in a Word file, and not on a sticky note. It’s not just about the financial damage. It’s about “the scandal”. Chances are good that this sinks in deep. I can add a personal story or two, if you want.

The cognitive appraisal here is based upon one’s own responsibility for bringing about an event that arouses emotions. Emotions are very strong carriers for messages here.


The license audit

I am sure we can keep this one short and simple. I don’t like licensing software. You don’t like it. But I also don’t like paying for petrol. It turns out that it’s still necessary.

If your corporate antivirus solution is full of keygen and crack alerts, you need to start the awareness training on a different level.
Why do people do that at work? Do they “just” use their USB sticks? Personal external drives? It can also be cultural, because there are countries where this is more acceptable. It’s not about the security metrics here, it’s about the why. I’d recommend not looking the other way, because I have been through software license audits. They can get expensive.

The cognitive appraisal here is anticipated effort. The effort is having to explain this, like someone who is responsible.

What does the practice of the law have in common with the practice of information security?

Did you know that there is no 100% guarantee in information security? You most likely did. No antivirus tool, no protection measure you can buy anywhere, will give a 100% guarantee. There is no 100% security.

Is there a 100% guarantee that you will win a case in court? No, of course not. That’s one thing information security and the practice of the law have in common. Legal processes and information security processes are both created to reduce risk.

By the way: is there a 100% guarantee that your “cyber insurance” will pay? Or any other insurance? Car insurance, house insurance (fire, storm, etc.)… I think you know the answer to that.

This makes use of an appraisal rebound effect. It’s about removing the “security stress” from the audience, and preventing the messages from becoming chronically ignored. Instead, the certainty of the situation is re-rendered by sharing the insight that in infosec things are unpredictable to a certain extent.

Cultural awareness

Awareness training in China - please do not generalize

  • In China (Shanghai) my experience was that it’s good to localize the materials and to have concrete examples of what you do not want to see. People can get offended if you do not have that. You have to render the problem concretely; then people will avoid these “additional complications”.
    I think that is a good strategy to align the cognitive appraisers culturally in China.

  • In my experience, hierarchies are very important in China. If a local manager wants to introduce you, that is a helpful gesture.

  • It’s also important that you do not use security metrics and reports which reference cyber attacks “originating from China”, because these are often US “propaganda” materials.

Awareness training in Germany - structure is key

This is an easy one for me.

  • The materials need to have a certain structure.
  • You should not overstate risks and vulnerabilities, otherwise people won’t take it seriously. No US-based “cyber cyber”.

Information security is alien to typical “German engineering”, because it requires out-of-the-box thinking. Generally, Germans don’t want to be confronted with this, so it makes sense to use cognitive appraisers based on legitimacy and responsibility.

Awareness training in Iceland - risk takers

A good way to describe this: I would not want Icelanders to design the architecture of a boat. But if there is a storm and you need a good crew, they work hand in hand as one. Icelandic culture favors risk takers, but only for risks which are socially acceptable.

  • Icelanders do not like being told what to do. You do not need a manager to introduce you.
  • Make sure there is beer, otherwise they won’t show up :). Beer is expensive in Iceland.
  • Ah, and they don’t like extra effort. So if you can show that security problems cause a lot of extra work (for them), chances are good they will stop causing them.

Iceland is a very interesting place, but don’t underestimate the cultural differences. People in Iceland feel quite safe on their island, far away from both Europe and the USA.


If you position security awareness as an employee performance matter, you have good options to set cognitive triggers. These can be used to make certain messages stick. All that is left to do is to be a good presenter. But if you regularly speak at security conferences, this won’t be a problem.

There are more topics which are interesting in this context, like using auto-appraisers as primers for direct attention, utility theory, or influence models. Material for later blog posts.