Oops, something went wrong!

The amazing pictures of the Concordia lying helpless on its side probably induced a mixture of laughter and horror.

The Costa Concordia cruise ship ran aground off the coast of Tuscany; rescuers searched for survivors.

The arrest of the captain may be standard procedure, but the implication that it was all his fault was clear in a comment he apparently made: that the rock should not have been there.

We are often dishonest about why we want to find out whose fault it is, pretending that we need to know so we can stop making the same mistake again. But history shows, and perhaps our own experience confirms, that we make the same mistakes again and again. Hitler made the same mistake as Napoleon in invading Russia, and train and plane crashes are becoming fewer only slowly, usually because people have been taken out of the system altogether!

We want to blame someone because our feelings push us that way. Maybe we feel guilty whenever there is suffering, or we worry that someone is going to blame us, so we blame someone else as soon as possible. Maybe we learned that at school?

So at first, when these big things happen, the press goes full pelt into finding who is to blame. Then interest passes, and eventually, sometimes many years later, some reason for the failure is decided on and some blame and punishment handed out, but by then it is usually a much smaller story.

Just as I have been recommending that we take into account things like risk, complexity, the degree of collaboration and the stage of development of any project while we think about our thinking, the same considerations apply when things go wrong.

It may be that what went wrong with the Concordia was simple; maybe the captain was drunk in charge. But if so, then a lot more must have gone wrong too, because this is a big ship with many passengers' lives at stake and a great deal of money involved, so one drunk captain should not be enough.

More likely it is a complex fault: something wrong with the navigation software, a failure in the checking process before the ship was released, combined perhaps with some human error, a failure to take signals seriously, the switching off of a warning light, that kind of thing.

There may also be design faults. Ships like that are apparently not supposed to simply tip over, and life rafts that cannot be used once the ship has tipped over seem a pretty daft design, and so on.

These kinds of faults happen again and again because things are mostly complex. We treat systems with people in them as if the people were also machines, and they are not, and when working through failure scenarios, logic and reasoning tend to dominate. Few faults occur because our step-by-step logic failed or because our reasoning was unsound. They occur because we failed to imagine problems with the bigger picture; we did not use creative thinking enough to consider what might happen.

This is like the story about a plane that nearly failed to take off because the captain had done all the usual estimates of average weight per passenger, but nobody had known that the people on board were coin collectors! (EasyJet now has no weight limit on passenger hand luggage, which is a bit worrying.)

The final check is intuitive, and it is the most difficult check of all. Intuitive checks are hard because, if we do them again and again, our intuition becomes hard to 'read'. If every time you got on a plane you asked yourself, 'Do I feel safe today?', then eventually something would be shouting NO! But you would probably be OK.

But intuitive checks can save lives. There is a true story about Michael Riley, in charge of releasing defensive missiles on the destroyer HMS Gloucester in the Persian Gulf during the Gulf War. His inner fear told him to fire, even though the incoming track followed a path that US planes used repeatedly, planes which always had their 'I am on your side' signal switched off to avoid being targeted by enemy anti-aircraft guns! Only with much later analysis did they work out that the signals from this track were just a fraction different from the normal ones, but that was enough to trigger his intuition to fire.

The same was true for a Russian commander who, many years ago, did not order the firing of nuclear weapons against the West. There had been a computer error. He was praised but also disciplined. He may have saved the world.

What we all need to do, whenever something goes wrong, is hold back: be slow to blame, be slow to find fault. And if it is our job to find out what went wrong and how it might be avoided in future, we need to think about complexity, about how error is often the result of 'collaborative failure' combined with catalytic risk.

About Graham Rawlinson

I now have 5 books published as Ebooks http://amzn.to/iOyowj. They feel like part of a life's work; somehow all the different jobs I have had, from postman to psychologist to facilitator of inventions and running a food business, all build into a way of loving life, the ups and the downs. I hope you like the blogs I write, and then like the books I write. I hope you will want to take some time out of your life to share some thoughts with me. For that, I thank you. Graham
This entry was posted in Happiness, science, Transport, War.
