Business Psychology for Business Continuity

Smart business continuity managers, like other smart people, know that shamelessly borrowing good ideas to meet objectives is an efficient way of moving forward, whether the idea was originally theirs or somebody else’s. With this in mind, here are some interesting ideas from a discussion on the challenges of effective cybersecurity, which apply equally well to business continuity.

Essentially, the concepts revolve around the mental biases and blinkers that can affect anyone, and that can lead us to dismiss or ignore threats that end up causing catastrophic problems (data breaches in the case of cybersecurity, business interruption in the case of business continuity):

  • “Normalcy bias”: If we have not experienced something in the past (such as a critical supplier going bankrupt), we assume it will not happen in the future. This, of course, makes for an unrealistic business impact analysis.
  • “Probability blindness”: We fail to see that extreme cases (such as the entire IT infrastructure being wiped out) are not isolated, all-or-nothing possibilities; they lie at the far end of a continuous curve of degradation, and we can end up there if things are allowed to get progressively worse.
  • “Hope it goes away”: We cross our fingers and hope that if we ignore a problem, it will disappear. This has about as much chance of working as ignoring your tax return, i.e. none.
  • “Parkinson’s Law of Triviality”: We focus on things we understand and neglect what we don’t. Thus, we may spend hours on RPO (recovery point objective) because we can calculate it, yet completely bypass issues like MTPD (maximum tolerable period of disruption, or how long before the entire organisation suffers serious damage).
  • “Not Invented Here”: Often linked to the “curse of knowledge” bias, this stops us from accepting good ideas from other people. After all, we are such BC experts that we already know about any good BC idea, right? Wrong, unfortunately.
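The RPO-versus-MTPD point above can be made concrete with a minimal sketch (not from the original article; all figures and function names are hypothetical). Worst-case RPO falls out of simple arithmetic on the backup schedule, which is exactly why it attracts attention, while MTPD is a business judgement that can only be compared against, not derived:

```python
def worst_case_rpo_hours(backup_interval_hours: float) -> float:
    """Worst-case data loss window: data written just after the last
    backup is lost, so the RPO equals the backup interval. Easy to
    calculate, hence easy to spend hours on."""
    return backup_interval_hours


def exceeds_mtpd(outage_hours: float, mtpd_hours: float) -> bool:
    """MTPD is not a formula but a judgement: how long an outage the
    organisation can tolerate before serious damage. All code can do
    is compare an outage estimate against that judgement."""
    return outage_hours > mtpd_hours


# Hypothetical figures: nightly backups, a 72-hour outage estimate,
# and a 48-hour MTPD agreed by the business.
print(worst_case_rpo_hours(24))  # up to 24 hours of data loss
print(exceeds_mtpd(72, 48))      # True: serious damage threshold crossed
```

The asymmetry is the point: the first function is trivially computable, the second merely encodes a number that somebody had to decide, and Parkinson’s Law of Triviality pulls attention towards the former.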

The first step towards better business continuity through better psychology is therefore to recognise that these biases exist and to spot them as they occur. Then we can start to weed out the biases and their effects, bringing BC planning and management closer to reality.

This entry was posted in Business Continuity.