Unconscious biases relating to cost, conformity and compliance are undermining effective security, argues James Wilkes from Gray Page.
I was asked to speak at a conference recently about maritime security and given 12 minutes in which to do so.
Being asked to speak for 12 minutes is notionally more attractive than being asked to talk for much longer.
However, a short slot makes you think very hard about what you want to say.
The time limit compels you to take the many things you could talk about and edit them down to no more than a few clearly communicated and meaningful points.
The quip often attributed to the American writer and humourist Mark Twain (though it originates with Blaise Pascal) captures this: an apology to a friend for sending a long letter, on the grounds that there wasn’t time to write a short one.
There is a parallel, I think, with what it takes to conceive and implement good security practice.
Security that is effective – security that works in actually protecting seafarers – is security that has been really well thought through in the first place.
If you accept that premise, then we have to ask why so much of what we’re trying to achieve security-wise is wrapped up in procedures of biblical length, where the original intent gets lost in the inclination toward quantity over quality.
Why much of the security equipment and many of the measures employed are not particularly effective at doing what we evidently need them to do.
And why we plough on, on the basis of a legacy of ideas that might have been appropriate 10 or 15 years ago, without asking how relevant and effective they are now, and are likely to be in the near future.
Let me give you an example.
If there was a bomb threat against a building, its occupants would all be evacuated and moved to a distance considered safe should an explosion occur.
What won’t happen is everyone in the building being asked to form search parties to try to locate the bomb. Why? Because it’s patently not safe. Lives would be at risk and that is unacceptable.
In contrast to that, simulated bomb searches feature prominently in the list of ship security drills conducted by crew.
By implication, if there were a bomb threat against a ship, or a crew genuinely believed a bomb might have been placed on board, the crew would conduct a search for it.
The nature of the drill implies that they wouldn’t evacuate the ship: rather, they would go and look for the bomb.
We accept that being at sea is more hazardous than working in an office environment, but if you think about it, having crew search for a potential bomb is crazy.
In their shoes, would you want to go on a bomb hunt?
No. I’m sure you’d rather get as far away from the threat as possible and leave it to someone who is actually trained to deal with it.
The drill has its genesis in the implementation of the ISPS Code.
Bomb risks are clearly a security issue and it’s sensible that the risk is considered thoroughly.
But I’m pretty sure that the idea of having crew searching for bombs is not the right response to the issue.
Indeed, it seems to me that there are quite a few things that are being done on board ships under the auspices of security that are not the right solutions to the problems they are designed to address.
I don’t say that flippantly. It’s just that, with more than 20 years’ experience in the field, I’m certain that the measure of what we do on board ships security-wise has to be its effectiveness.
In thinking long and hard about why we are where we are, I believe I may have hit on something. It is an insight I owe to the world of marketing and advertising.
For many years, some marketers have been studying behavioural psychology in an effort to understand how humans perceive value, how we make choices between products and services, and why we like some things surprisingly more than others.
What I’ve taken from that work is that the choices we make and the decisions we take are in many ways a function of our personal and professional biases; in other words, the psychological prism of prejudices and proclivities through which we see things.
I’ve come to the conclusion that we’ve got an unwieldy mass of questionably effective security practice because of three particular biases: the compliance bias, the conformity bias and the cost bias.
Are we complying with regulations and guidance? Is this what everyone else is doing? How much is this going to cost and are we willing to pay it?
If we revisit the bomb search scenario in those terms, I think you can see how they come into play.
Does the drill meet the compliance test? Yes. It’s an accepted security drill that we can readily undertake and log for audit purposes.
Does it conform to what everyone else is doing? Yes. You’ll find it in the security procedures of a lot of ships.
What’s the cost of it? Nothing. It doesn’t cost a penny – which usually makes budget holders happy.
Another example where I think these biases are evident is with Ship Security Alert Systems (SSAS).
The idea behind SSAS is a good one: its principal purpose is to enable the crew of the ship to alert relevant people ashore that the security of the ship is under threat or has been compromised.
However, I think there is a disconnect between the original idea and how SSAS works – or doesn’t – in practice, and I suspect that this disconnect is the result of compliance, conformity and cost biases.
Regulation 6 of SOLAS Chapter XI-2, which deals with Ship Security Alert Systems, says: “The system shall be capable of being activated from the navigation bridge and at least one other location.”
This, I would argue, has been interpreted largely to mean you should have two points of activation for the SSAS on your ship.
Ask yourself: how many SSAS activation points are there on each of your ships? Two. Why? Because that’s what the regulation says.
And I think that is the compliance bias kicking in.
I’d argue that conformity and cost are involved too.
Two points of activation is what everyone else has, and having more than two would obviously cost more money, albeit probably not very much.
But what happens if the crewmember that discovers that the security of the ship is under threat or has been compromised is not on the navigation bridge or anywhere close to the second activation point?
At best, precious time is lost activating the SSAS; at worst, it’s not activated at all – which defeats the point of SSAS entirely.
The regulation also says that the activation points “shall be designed so as to prevent the inadvertent initiation of the ship security alert.”
So, not only are there too few activation points on a ship, the regulation is written in a way that disincentivises activation of the system in the first place.
It seems more concerned with not activating the system accidentally, than activating it at all.
But which is worse: an alert issued accidentally, which spins up a few people ashore for 20 minutes, or an alert not issued when there is a problem, leaving no one ashore in a position to respond during those crucial first hours of an actual security incident?
What I think is missing in both of these examples, and in the way maritime security is delivered on board ships more widely, is a security bias.
What I mean by ‘security bias’ is prioritising the effectiveness of the security that is being put in place. Effectiveness must come ahead of ticking a box, conforming with how others do it, or bowing to budget diktats.
The fundamental problem with the compliance, conformity and cost biases is that they don’t demand to know if what is being done is right.
Surely delivering security that works – delivering security that is really effective in protecting seafarers – is the first purpose of security.
If, when we think about how we deliver security, we do so with a bias toward what is effective and what is needed to make it work, we’ll be asking the right questions at the outset.
The answers to those questions might be uncomfortable. The issues are usually pretty gritty.
How best to protect the lives and wellbeing of our seafarers is challenging and it demands that we think hard about what we’re doing in order that we get it right.
But that is how good security practice evolves.
I’m not suggesting that getting security right is easy.
If it was easy then everyone would be doing it, and maybe the seafarers abducted from ships operating off West Africa over the last few months would not have been taken.
If it was easy we could put a substantial dent in stowaway numbers and the $15m they cost the shipping industry each year.
Neither am I suggesting that we can ignore compliance, conformity and cost.
There are regulations that must be met.
It’s important that we act as an industry together and, of course, there is no getting away from cost-pressures.
However, if we allow these biases to prevail over the security bias, we actually retard the development of better and best practice.
We also deter innovation in a seascape where security risks are evolving and seafarers need protection accordingly.
I’m certain that it’s possible to improve the security that is being delivered on board ships.
I’m also certain that it can be done more cost effectively, assuming that the measure of cost-effectiveness includes ‘effectiveness’ and not just ‘cost’.
We just need to give it more thought.