Build a Security Culture (2015)


In this chapter, we look at how security awareness is only one part of security culture, and how security culture succeeds where awareness alone is doomed to fail.

In the previous chapter I discussed how security culture is more than people and competence; culture includes the rules, laws and regulations, as well as the technology we use. Security awareness belongs in the people and competence part of the triangle.

Security awareness is a limited area, as well as a poorly defined one. There is no commonly agreed definition of security awareness, which in turn means there is no common understanding of what security awareness really is. Almost everyone I talk to has their own idea of what security awareness is, and of how to create it.

Ideas for building security awareness range from using baseball bats to enforce a certain behaviour, via boring, generic and unproductive security awareness trainings, to doing nothing at all. What is even worse is that very few of these efforts are measured; at best they are judged by anecdote: “I did this, and it did/did not work.” Metrics are simply waved off with “awareness is impossible to measure”.

Because measuring your progress is important when working with culture, Chapter 6 looks deeper into the topic.

For the sake of clarity, I will use the following definition of awareness in this book:

“Knowledge or perception of a situation or fact.”

(Oxford dictionary)

What does it mean to have knowledge or perception of a situation or fact? It boils down to two things: the right competence, and the ability to apply said competence in a particular situation.

So far, so good. Building competence can be done; we see that all around us where we learn new skills and information almost daily. The human mind is an amazing machine when it comes to collecting, analysing and using new information and skills. The more we know, the easier it is for our brain to do even more, which is truly amazing11.

The flipside is that if you learn the wrong skills, outdated information and erroneous mental patterns, your brain still does a great job but just turns out the wrong answers and responses.

Let us quickly visit the brain and how it works. Take a look around you. Chances are that there is a cup of coffee, tea or other beverage nearby. For the sake of simplicity, I will call it a cup of coffee.

Take a look at the cup. The light that reflects on the object is caught by light-receptors in your eyes. The receptors pick up the wavelengths of the light, and different receptors pick up different wavelengths. The receptors trigger events that are sent down nerve paths into your brain. The light has been transformed into chemical signals.

When the different chemicals reach your brain, your brain recreates an image of the object (that cup of coffee) using one of two methods12:

- Slow: You have never seen anything like a cup of coffee before, and your brain does not know what it sees. Your brain slows down and starts to create a mental image - what we call a mental pattern - of the object and its significance. Since this is a new observation, your brain may or may not register the important properties of the object: that its contents are black and hot, that it carries liquid, that there is only one opening, that the handle is there to hold the cup, that the liquid is drinkable, that the liquid is a drug, and so on. For your brain to understand all of these details, it must be taught. You must teach it the significance of the properties, and help it eliminate the unimportant ones, like the colour of the cup, the size of the table and the shape of the room. This slow processing and learning requires a large amount of energy (viewed from your brain’s perspective), so the brain prefers a faster and less expensive way of processing information.

- Fast: Most of the time, your brain interprets information using a fast method. In this scenario, the signals your brain receives from the coffee cup are matched to an already existing mental pattern. Your brain recognises the object as a cup, and automagically interprets that cup to mean hot, black liquid that helps you (your brain) to be sharper and quicker. This processing takes very little energy (again, from the perspective of your brain), and is lightning fast compared to the slow method. No wonder your brain prefers this one!

Now that we have taken a very shallow crash-course in human sensing, perception and information handling, it is time to look at how this functioning also works against us.

Growing up in Europe, I have a thing for ice-cream. During summer holidays, I love to cool down with a cone, or on special occasions a soft ice fresh from the counter. A soft ice in Europe is usually white, tastes of vanilla and you can have it dipped into a variety of powders, colours and liquids. Personally, I prefer a topping of chocolate powder. I observe others who have rainbow sprinkles, some have strawberry sprinkles and others again choose liquid chocolate that turns into a hard shell on their soft ice.

Imagine a hot summer day. The sun is burning, and you are walking through a city centre as a tourist. You see a shop promoting soft ice, and you watch happy, smiling people licking their soft ices as they pass you on the street. “The perfect day for a soft ice,” you think, and head to the counter.

Suddenly you discover there are three choices of soft ice. You can have the white soft ice. You can have the brown soft ice. Or you can choose the green soft ice. Let the drooling begin!

Before you choose, keep the image in your mind and tell me: what taste does each of the differently coloured soft ices have? The white one is vanilla, right?

A brown-coloured soft ice-cream. What taste is that? Chocolate? Or sweet beans?

What taste does the green one have? Pistachio nuts (either chemical or real)? Or is it green-tea taste?

Write your answer on a piece of paper. Your answer depends on a lot of things, all learned!

Firstly, your mental image of a soft ice-cream was stored in your brain, and was brought out just by thinking about it. Along with the image, you may also have felt the summer heat, recalled the smells and tastes from that memory, and possibly heard the noises and sounds you experienced when you had that soft ice. This is your brain in fast mode. It has learned what soft ice is, and the surroundings that go together with it. Every time you think of soft ice, it brings these memories out.

This processing is really good, because you can recall, recognise and act upon a particular situation: summer, vacation, soft ice. Properly trained, this could be you recognising a phishing scam, a Trojan or a threatening situation.

Secondly, your mental image of the soft ice represents what we can call acquired taste. Your answers to the question of colours and the taste each colour must represent are dependent on your culture. If, like me, you are a westerner who grew up and spent most of your time in Europe, Australia or North America, you have learned that when it comes to sweets, and ice-cream in particular, white is likely to mean vanilla, brown to be chocolate and green to be pistachio nut flavour.

Now recall the scenario of me in front of that counter, having the choice of vanilla, chocolate or pistachio. Of those, I would choose chocolate first, pistachio second and vanilla third. So what do I order?

This is where context matters. Am I about to order a soft ice in Europe? Australia? Or in North America? Or am I somewhere else? And if I am somewhere else, do the rules I know, the mental patterns I have learned, still apply?

In fact, I am in Kyoto in Japan. I had a one-day excursion to this fantastic city during my first visit to Japan some years ago. Since I had been in Japan for a few days already, I had picked up on their different idea of sweets. Some I loved, and some I found very hard to understand. So, before I ordered, I asked what kind of flavours they had, and was given the answer: plain (vanilla), sweet beans and green tea.

I probably did look surprised, because I had really expected the answer to be vanilla, chocolate and pistachio. I expected those particular answers due to one of the characteristics of mental patterns: they get stronger every time we use them. So every time I had seen brown ice cream with the taste of chocolate, that particular mental pattern grew stronger in my mind, until it became so strong that anything that no longer fitted the pattern surely must be wrong, impossible, or both.

This is sometimes referred to as the Expert Bias13. Experts are experts in their particular field because they have had the opportunity to narrow their focus, and to work mainly in that field for so long that their mental patterns are strong and efficient. The flipside they (and those around them) may experience is the expert’s inability to see things from other perspectives - their mental patterns are so strong that they can no longer review their own position.

Similar mechanisms apply to people who are not given the opportunity to have their current mental patterns challenged. If you grow up in a particular culture, and you are not exposed to other cultures, it becomes very hard for you to understand that other people may behave differently from you, and that their behaviour may not be malicious or even wrong.

There I was in Kyoto, by this time knowing that brown means sweet beans, not chocolate. My brain, craving the chocolate flavour, quickly convinced me that sweet bean flavour cannot be that bad, and is probably almost as good as chocolate. I ordered a brown soft ice cone without toppings.

Still expecting chocolate, my brain almost shut down when my taste buds sent the signals from my mouth to my brain. I could not believe what I was tasting. I tried several times, and each time the same thing happened: my brain expected a different taste, and the difference was too great to reconcile. All my brain could do was tell me that this did not fit the pattern, that this was not right, that this was wrong!

I experienced a strong case of cognitive dissonance, a psychological phenomenon that happens when your brain expects something particular to happen, and something else happens instead. It is like your brain just says, “This cannot be. I don’t believe this.” And then it just denies any of the novelty.

Cognitive dissonance14 is important in awareness too. If your security awareness programmes are not properly aligned with your organisation’s particular needs, they are likely to create responses in your participants similar to the one I had in Kyoto. Note that I understood the context, and I knew about these effects. Even then, I fell victim to this mental process. We all do, more often than we like to admit.

Part of our job as security officers is to help our colleagues understand risk and teach them appropriate responses. To do that, we need to understand how our human mind functions, so that we can adapt our training efforts to build knowledge and perception to deal with security issues in the correct manner.

Understanding our brain’s shortcuts and mishaps should also help you see that your employees are not stupid; what matters is how we communicate with them.

The main difference between security awareness and security culture is that culture is more than just awareness. If you recall from Chapter Two, security culture is a combination of people, policy and technology. Awareness covers only people, and only a subset of that part: knowledge. This does not mean we do not need awareness. Awareness, or competence as I prefer to call it, is vital if people are to do the right thing. The key is to consider competence as one way of building culture, not an end in itself.

Security awareness in itself only helps people know about, or be aware of, the security issue you are training them in. Knowing something is not the same as changing a behaviour, which is usually what we want to do when we train people about phishing attempts, password security or clean-desk policies. Knowing about an issue is only one of the steps towards changing that behaviour. Using social learning theory15, we discover that there is a four-step cognitive process people use to learn:

1. Attention

2. Retention

3. Reproduction

4. Motivation

Each of these steps is important, and awareness is often mostly about the first two.

What does this mean in practice? Let us take a closer look at each of the four elements:


Attention is about the learner paying attention to the learning activity. The one who is learning must be present, pay attention and take an interest in what is going on. As in life in general, several things influence this step: the learner himself, as well as the training and its content.

To enhance attention, we can provide relevance to the learner by explaining why the trained behaviour is important. We can also provide an environment where the behaviour we want is already modelled, and demonstrate that modelled behaviour.

Security awareness programmes that stop at this level are recognised by measuring attendance only; they report on the number of people taking a particular course.


Retention is about the learner’s ability to retain information. Again, the learner’s abilities are at play, and we can help them retain the knowledge by creating an environment that enables easy learning, adapting the content to the learner’s level of knowledge, and repeating as necessary.

With security culture in mind, we can adapt our programmes and their content to the needs of the learner by analysing the audience before we develop the actual training content. People are different, and may need different approaches to learn best.

Security awareness programmes that stop at this level are recognised by measuring attendance, and by repeating the same training programme at set intervals, like a yearly phishing training programme.


Reproduction is about showing that the behaviour is learned. In this stage, the learner will reproduce the learned behaviour and show that they know what to do and how to do it.

Many awareness programmes stop at this level. They use skill tests, questionnaires and other quality-assessment tools during and right after the training programme, showing some level of reproduction. An example is a phishing training programme where you measure how many learners click on a link during the training, but not afterwards.
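To make the click-rate measurement concrete, here is a minimal sketch in Python. The event records, field names and training date are hypothetical illustrations for this book, not taken from any particular phishing tool:

```python
from datetime import date

# Hypothetical simulated-phishing events: one record per email sent,
# noting whether the recipient clicked and when the email went out.
events = [
    {"user": "anna",  "clicked": True,  "sent": date(2015, 3, 2)},
    {"user": "bjorn", "clicked": False, "sent": date(2015, 3, 2)},
    {"user": "carl",  "clicked": True,  "sent": date(2015, 6, 9)},
    {"user": "dina",  "clicked": False, "sent": date(2015, 6, 9)},
    {"user": "erik",  "clicked": False, "sent": date(2015, 6, 9)},
]

TRAINING_END = date(2015, 4, 1)  # assumed end of the training programme


def click_rate(records):
    """Fraction of recipients who clicked the simulated phishing link."""
    if not records:
        return 0.0
    return sum(r["clicked"] for r in records) / len(records)


during = [e for e in events if e["sent"] < TRAINING_END]
after = [e for e in events if e["sent"] >= TRAINING_END]

print(f"click rate during training: {click_rate(during):.0%}")
print(f"click rate after training:  {click_rate(after):.0%}")
```

The same comparison extends naturally to a follow-up campaign months later, which is exactly where the difference between momentary reproduction and lasting behaviour change becomes visible.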


The final step, the target to reach for, is to motivate the learner to reproduce the behaviour consistently outside the learning situation. The learner takes into account both formal and informal information when deciding whether or not to reproduce a behaviour. Both technology and policies play important roles in motivating the learner. If, for example, you want people to discover and report phishing emails, and your reporting system requires them to file a three-page form, their motivation will be low. They may very well be aware of the problem and know how to handle it, yet the technology is too much of a burden for them to commit to the behaviour.

Security awareness programmes at this level measure behaviour on a number of levels. They may look at attendance of training courses, but use that number only as an indicator that there is activity. They implement tests to measure actual competence - the ability to reproduce - and they implement further metrics to measure the impact of behaviour on their systems using logs and data analysis.
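As one illustration of such log-based metrics, this sketch counts how many suspected phishing emails employees report per month - a simple trend indicator for a behaviour you want to see more of. The log entries are hypothetical examples, assuming a reporting system that records a timestamp for each report:

```python
from collections import Counter
from datetime import date

# Hypothetical log: one timestamped entry per reported phishing email.
reports = [
    date(2015, 1, 12), date(2015, 1, 28),
    date(2015, 2, 3),  date(2015, 2, 17), date(2015, 2, 25),
    date(2015, 3, 5),  date(2015, 3, 9),  date(2015, 3, 21),
    date(2015, 3, 30),
]


def reports_per_month(entries):
    """Count reported phishing emails per (year, month)."""
    return Counter((d.year, d.month) for d in entries)


trend = reports_per_month(reports)
for (year, month), count in sorted(trend.items()):
    print(f"{year}-{month:02d}: {count} reports")
```

A rising count of reports, read alongside a falling click rate, is a far stronger signal of changed behaviour than attendance figures alone.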

Organisations that implement programmes at this level use a structured approach that helps them focus on improving their security culture. They may still call what they do security awareness trainings, when in practice they are running successful security culture programmes, building and maintaining the kind of security culture they want.

Looking at the definition of culture again - the ideas, customs and behaviour of a society or group - it becomes clear that having knowledge or perception of a situation or a fact is not enough if we want to change culture. What we need to do is to identify the ideas, the customs and the behaviour that reside in our organisation today, and consider what ideas, customs and behaviours we want in our organisation. Bridging that gap is what our efforts should be all about.

By now you may be wondering if I expect you to do all of this by yourself. I don’t. Nor should anybody else. Changing culture is a task done by a number of people, and your job is to be part of that force. In the next chapter I will give you some pointers as to whom to ally with.

Building bridges


John, the CISO of a large, multinational bank, learned that his colleagues in other parts of the organisation viewed his work and team as a nuisance, a distraction to their own work and a hassle that was forced upon them. “They just don’t get it,” complained John, “I’m here to secure the business so that we can avoid breaches and downtime, and all they give me are complaints and negativity.” John is not alone in facing this challenge. The challenge is to demonstrate a clear business value that resonates with the rest of the organisation, even if plans must be changed along the way.

The first step in this direction is to understand the business, and how business in general functions. As part of that understanding, John realised that security’s role is to secure the business and reduce risk. The purpose of security is not to remove all risk, nor to get in the way of business. “I understood that there cannot be security without business,” he told me, “but there may very well be business without security!” After this revelation, John toured the different departments and locations, discussing security issues with department managers, country managers and many more. His focus was not on selling security, but on learning about the challenges each department faced, and how he could change the communication within the bank. He set out to build bridges instead of burning them.

John also learned that he could use help to build his security culture message and to spread it throughout the bank. He reached out to the HR department and involved them in his security culture programme. The HR department is a key resource when working with culture in any organisation, and they are also the specialists when it comes to training programmes. Next, he asked the bank’s marketing department if they would help craft a message and the collateral needed to empower his security culture campaigns. At first, he was more than a little sceptical about involving creative people who had no understanding of security. Using arguments like “they are trained in communication” and “by working with you, they will learn security”, I convinced him to try it on one campaign first.

By reaching out and building bridges, John set up a core security culture team with members from his own team, from HR and from marketing. He also invited key people from around the bank to sit on an advisory board; they were asked to test campaign ideas, comment on materials and give feedback to enhance the overall performance of the security culture programme.

11 The psychologists Gigerenzer, Kahneman and Pinker (among others) have developed a variety of models that reflect how humans make decisions, learn new things, and so on. For instance, Kahneman received the Nobel Prize for his work on Prospect Theory, which describes how humans make decisions when the probabilities of certain outcomes are known. Using heuristic processes, humans combine disparate data in order to refine their decisions, even when the individual is unaware that they have this information. (Kahneman and Tversky, “Prospect Theory: An Analysis of Decision Under Risk”, 1979.)

12 This example is derived from Kahneman’s Thinking, Fast and Slow.

13 This is also called the ‘Curse of Knowledge’, and extends beyond expertise and into the difficulties of reasoning with someone else’s beliefs. In the words of Birch and Bloom, “adults’ own knowledge of an event’s outcome can compromise their ability to reason about another person’s beliefs about that event.” (Susan Birch and Paul Bloom, “The Curse of Knowledge in Reasoning About False Beliefs”, 2007.)

14 Cognitive dissonance can be described as the state of having inconsistent thoughts, beliefs or attitudes. In general, it is a mechanism by which a person rationalises conflicting experiences or knowledge, without having to accept that these are, in fact, at odds. (Leon Festinger, A theory of cognitive dissonance, 1957.)

15 Bandura and Walters, 1963