Hard data is immensely useful. Now there’s a surprise. That’s facts and figures. That’s accurate descriptions of occurrences. That’s measurements and readings of important factors. From this kind of data, a picture can be painted of events good and bad. However, this picture is not complete. It’s certainly not complete for any system that involves the interactions of humans and machines.
What’s often less visible is the need for what I might call “soft data”. Not that it’s literally “soft”; I’m just using that loose term to distinguish it. Fine, you could say that social media is littered with the stuff. Vast quantities of instant judgements and colourful opinions. An array of off-the-shelf solutions to life’s ills. That’s all well and good for entertainment. It’s not so useful as a means of getting to the truth.
In civil aviation, voluntary reporting systems have been around for several decades. They are not always successful, mainly because a fair amount of trust is required to use them when something major happens. Anyone volunteering information needs a level of assurance that the information will not be misused.
The human inclination to seek someone to blame is intrinsic. We wake up in the morning, look out the window, and if it’s rainy and windy then someone is to blame. Probably a weather reporter for not warning us of a coming storm. Blame is a way of making sense of negative events without having to do a lot of tedious investigation and analysis.
Don’t get me wrong. Accountability is vital. If someone does something unspeakably bad, they must be held accountable. That is a form of blame: tracing the bad event back to its root cause. If that cause is found to be negligence or malicious intent, then blame can be assigned.
Where a good safety culture exists, as is often the case in civil aviation, it is wrong to assume that undesirable outcomes can always be linked to a bad actor of some kind.
Human error is forever with us. Even with the absolute best of intent, no one is immune from this pervasive creature. It can be elusive. There are environments where owning up to making mistakes is fine. Sadly, I’m sure it’s not uncommon to have worked in environments where such openness is punished. That is the difference between a good culture and a bad one.
One of my past jobs involved negotiation with a contractor. Every change that we made to a complex contract had a cost attached to it. So, there was an understandable sensitivity to making changes. At the same time, our customer for the product kept asking for changes. There’s nothing worse than being in a tense meeting with a contractor and having my boss pull the rug out from under me, seeking to blame a change on my error rather than on a customer request. Introducing a voluntary reporting system in such an environment is pointless.
My message here is clear. Voluntary reporting in aviation is a powerful tool. Reports submitted by employees can offer insights that are not available from hard data alone. These reporting systems may be required by regulation or company policy. However, without a sound safety culture they can be all but useless. That means a safety culture defended and supported by both the employees and the senior management of an organisation.