Your day job is about security, and like most CSOs out there, you have an IT background. Most likely, you are still quite handy with the tech, and if forced to, you can set up some firewall rules, and possibly even change a routing table or two.
You have probably picked up on the trend that people are the weakest link in your security chain, and you most likely have some sort of user awareness training in place. You know it is important, and everybody does it, or at least that is what your training supplier tells you. And you can tick that box off on your compliance sheet.
Like many other CSOs, you have probably not reached the level of user awareness you imagined and hoped for, and you may have reached the level of frustration of Dave Aitel, who last year went all out and said that "You should not train employees for security awareness".
The human mind has many flaws. Yours does, and mine does too. We jump to conclusions without considering all the relevant information. We construct facts from fiction, because it lets us do what we want, not what is right. We are extremely vulnerable to peer pressure. We are blind to things we do not know about.
This implies that even if you know a lot about security, you probably do not know a lot about people: how they function, and how groups form and interact. You may (and probably do) think that you know a lot about people. Consider this, then: do you have a minor, or a major, in a social science? Do you know what social science is, anyway?
Social science is a collective term for the disciplines that study humans, human interaction, and groups, including (but not limited to):
- Social anthropology
- Organizational theory
The blind spot bias helps you concentrate on what you need to focus on, without being interrupted by thoughts that are not relevant. The flip side of the blind spot bias is that it makes it very hard to accept things you do not know. For example, if you have grown up in the Western world, you are likely to consider insects in any form to be inedible. Traveling to a part of the world where insects are part of the human diet, the blind spot bias may have you respond with disbelief and decide that the locals are crazy, wrong, and plain stupid. If, however, you grew up in an environment where insects are a regular part of the diet, you would consider such a response from a visitor strange and stupid.
The blind spot bias works against you by making it very hard to accept and recognize other possible solutions, and the further away a solution is from your "known environment", the harder the blind spot bias will oppose it.
Another interesting bias is the confirmation bias: the need to find evidence that confirms our theories and beliefs, which makes us disregard information that contradicts them. To use Dave Aitel as an example (sorry, Dave), the confirmation bias made him see only the faults of and problems with user awareness training. The more proof he found that he was right, the less likely he was to look for contradictory evidence.