Trust in Voluntary Reporting

Hard data is immensely useful. Now there’s a surprise. That’s facts and figures. That’s accurate descriptions of occurrences. That’s measurements and readings of important factors. From this kind of data, a picture can be painted of events good and bad. However, this picture is not complete. It’s certainly not complete for any system that involves the interactions of humans and machines.

What’s often less visible is the need for what I might call “soft” data. It’s not really “soft” as such; I’m just using that loose term to distinguish it. Fine, you could say that social media is littered with the stuff. Vast quantities of instant judgements and colourful opinions. An array of off-the-shelf solutions to life’s ills. That’s all well and good for entertainment. It’s not so useful as a means of getting to the truth.

In civil aviation, voluntary reporting systems have been around for several decades. They are not always successful, mainly because a fair amount of trust is required to use them when something major happens. Anyone volunteering information needs a level of assurance that it will not be misused.

The human inclination to seek someone to blame is intrinsic. We wake up in the morning, look out the window, and if it’s rainy and windy then someone is to blame. Probably a weather reporter for not warning us of a coming storm. Blame is a way of making sense of negative events without having to do a lot of tedious investigation and analysis.

Don’t get me wrong. Accountability is vital. If someone does something unspeakably bad, they must be held accountable. That is a form of blame: tracing the bad event back to its root cause. If that cause is found to be negligence or malicious intent, then blame can be assigned.

Where a good safety culture exists, as is often the case in civil aviation, it is wrong to assume that undesirable outcomes can always be linked to a bad actor of some kind.

Human error is forever with us. Even with the absolute best of intent, no one is immune from this pervasive creature. It can be elusive. There are environments where owning up to making mistakes is fine. Sadly, I’m sure it’s not uncommon to have worked in environments where such openness is punished. That is the difference between a good culture and a bad one.

One of my past jobs involved negotiation with a contractor. Every change that we made to a complex contract had a cost attached to it. So, there was an understandable sensitivity to making changes. At the same time, our customer for the product kept asking for changes. There’s nothing worse than being in a tense meeting with a contractor and having my boss pull the rug out from under my feet, seeking to blame a change on my error rather than a customer request. Introducing a voluntary reporting system in such an environment is pointless.

My message here is clear. Voluntary reporting in aviation is a powerful tool. Reports submitted by employees can offer insights that are not available by just looking at hard data. These reporting systems may be required by regulation or company policy. However, without a sound safety culture they can be all but useless. That means a safety culture that is defended and supported by employees and the senior management of an organisation.

Safety Culture 2

This may sound at variance with my last blog. I hope it’s not. I hope it’s complementary. What I’m highlighting here has been observed over decades of contact with a wide variety of organisations.

The term safety culture is fused into the pillars of ICAO Annex 19. The essence of building a good safety culture, one that fosters sound practices and encourages communication in a non-punitive environment, is at the heart of its standards and recommended practices. With all those decades behind us, the reader might assume that there are unambiguous and well-aligned attitudes and ways of working throughout the aviation industry. That’s not so.

On a spectrum running from what could be called “hard” to “soft”, the application of known best practices can take different forms. By the way, please disassociate those two words from both easy and difficult. That’s not what I mean.

In my interpretation, “hard” means something like the pages of Niccolo Machiavelli’s The Prince[1]: aggressive, persistent, mandatory, uncompromising and all-encompassing.

In my interpretation, “soft” means something like the pages of The Little Book of Calm by Paul Wilson[2]: harmonious, enlightened, progressive, sympathetic, and understanding.

As with extremes on any scale, going to the ends of that scale is not the best way to operate. I say “best” in terms of reaching ways of working that endure, with engagement and effectiveness. I observe that much of this depends on how power is disseminated through an organisational structure. Highly hierarchical organisations will approach culture differently from organisations with a relatively flat management system.

It may not be surprising to suggest that aviation Authorities can veer towards the “hard” approach and staff Unions towards the “soft” approach, even when both are trying to reach the same goal. Where people come from a military background, command and control can be an instinctive reaction. Where people come from an advanced technology company background, collaboration and communication can be an instinctive reaction. In my observation, there are advantages in both hard and soft safety culture approaches.

One advantage of a hard safety culture is that the time between discovering a safety problem, taking corrective action and resolving that operational problem can be short. Clearly, that has distinct safety advantages. Certain airlines come to mind.

One advantage of a soft safety culture is that it can surface safety problems that would otherwise remain hidden, where collective ownership of the problem is not in question. Again, clearly, that has distinct safety advantages too. Certain manufacturers come to mind.

I guess my message echoes much ancient thinking. All things in moderation. Try to reap the benefits of both ends of the scale. Balance.


[1] https://www.londonreviewbookshop.co.uk/stock/the-prince-niccolo-machiavelli

[2] https://www.waterstones.com/book/the-little-book-of-calm/paul-wilson/9780241257449

RAAC

Reinforced autoclaved aerated concrete (RAAC) is making the news in the UK. An unknown number of buildings are deemed dangerous because of the aging of this material[1]. RAAC has a limited lifespan. It’s inferior to standard concrete but lightweight and low-cost at the start of its life. It was typically used in precast panels in walls, roofs and sometimes floors.

The UK Government says it has been aware of RAAC in public sector buildings, including schools, since 1994. Warnings from the Health and Safety Executive (HSE) say that RAAC could collapse with little or no notice. This “bubbly” form of concrete can creep and deflect over time, and this can be aggravated by water penetration. So, regular inspection and maintenance are vital to keep this material safe. Especially in a country known for its inclement weather.

It’s reasonable to say that therein lies a problem. The public estate has been through a period of austerity. One of the first tasks to get cut back when funds are short is regular maintenance. Now, I am making some assumptions in this respect, but they are reasonable. Public sector spending has been under significant pressure for a long time.

The other dogmatic notion that has hindered a solution to this building problem is centralisation. There was a time when local authorities managed schools. They still do, but in smaller numbers. Centralised funding has decreased the power of local people to address problems with the school estate.

Aging buildings have something in common with aging structures in aviation. There’s always a demand to keep going for as long as possible. There’s always the difficulty of determining the safety margin that is acceptable. There’s always a pressure on maintenance costs.

Believe it or not, aircraft structures do fail[2]. There’s a tendency to forget this source of incidents and accidents, but it never goes away[3]. What happens in industries where safety is a priority is investigation, feedback and learning from incidents and accidents. The aim is to ensure that there’s no repeat of known failures. Rules and regulations change to address known problems.

The vulnerability to moisture and the limited lifespan of RAAC should have been a loud wake-up call. No doubt it was for some well-managed, well-resourced, enlightened organisations. Central Government has baulked at the cost of fixing this known building safety problem. A culture of delaying the fixing of difficult problems has won.

In civil aviation, there’s a powerful tool called an Airworthiness Directive (AD). It’s not something that an aircraft operator can ignore or put on the back burner. An AD can mandate inspections and changes to an aircraft when an unsafe condition exists.

In the school cases in the news, the impression is given that Government Ministers have dragged their heels and only acted at the last possible moment. Maybe the construction industry and the public estate need a strong regulator that can issue mandatory directives. Known unsafe conditions should not be left unaddressed, and fixes should not be significantly delayed.


[1] https://www.local.gov.uk/topics/housing-and-planning/information-reinforced-autoclaved-aerated-concrete-raac

[2] https://www.faa.gov/lessons_learned/transport_airplane/accidents/N73711

[3] https://www.faa.gov/lessons_learned/transport_airplane/accidents/TC-JAV

Titan’s fate

Firstly, condolences to the families and friends of those who perished in the deep ocean last week. This fatal tragedy took place in the full glare of the public spotlight. It’s time to give those affected the space to grieve for their loss.

I will address the subject of vehicle safety in a technical manner. It’s immensely sad when what is known must be re-learnt in such a tragic way. By their nature, passenger vehicles that enter hostile environments will present high risks. There is always a likelihood of an event of such severity as to cause injury. The imperative should be to reduce that probability as much as possible.

The Transportation Safety Board of Canada (TSB)[1] has launched an investigation into the events that led to the loss of the submersible Titan. That organisation will conduct a detailed investigation into the reasons behind the accident that caused the deaths of the five people on board.

There’s much conjecture about the factors involved in this catastrophe. News media and social media are awash with speculation. The facts are that contact between Titan and its support vessel was lost and a catastrophic event took place[2].

What has come to light in the aftermath of this event are the public statements made by the driving force behind the Titan project. These have been contrasted with those from the submersible community who spoke out about their concerns over the project.

My reflection on this information is simple: safety starts at the top. If the entrepreneurs who promote these adventures are not literate, humble, and vigilant, then outcomes are going to be negative. Those in leadership positions need to listen to those with expertise in their field of endeavour. Accepted, not everyone will agree all the time about operational and technical risks, but an open dialogue is vital.

I know that innovation often takes the path of trying, failing, trying again, failing, and trying again to eventually succeed. However, no vehicle should enter public service without sufficient proving. Independent oversight adds value too. The cultural framework within which this happens shapes success or failure. That’s why there’s good reason for design certification: applying time and energy to extensive testing, applying recognised standards and listening to reputable expertise. At its best it’s an opportunity to draw on widespread experiences from the past, good and bad.

Systems that prove to be safe most often come about from those who take on knowledge, experience, and learning. Yes, this work is not free. It can cost much to go from theory to practice. Yet set against the impact of failure, when the outcome is tragic for families, loved ones and colleagues, these expenses are not so large.

We must take every opportunity to learn from such fatal accidents to make them extremely rare. 

#SafetyManagement #SystemSafety #HumanFactors #SafetyCulture


[1] https://www.tsb.gc.ca/eng/medias-media/deploiement-deployment/marine/2023/m23a0169-20230623.html

[2] https://www.tsb.gc.ca/eng/enquetes-investigations/marine/2023/m23a0169/m23a0169.html