Trust in Voluntary Reporting

Hard data is immensely useful. Now there’s a surprise. That’s facts and figures. That’s accurate descriptions of occurrences. That’s measurements and readings of important factors. From this kind of data, a picture can be painted of events good and bad. However, this picture is not complete. It’s certainly not complete for any system that involves the interactions of humans and machines.

What’s often less visible is the need for what I might call “soft data”. It’s not really “soft” as such; I’m just using that loose term to distinguish it. Fine, you could say that social media is littered with the stuff. Vast quantities of instant judgements and colourful opinions. An array of off-the-shelf solutions to life’s ills. That’s all well and good for entertainment. It’s not so useful as a means of getting to the truth.

In civil aviation, voluntary reporting systems have been around for several decades. They are not always successful, mainly because a fair amount of trust is required to use them when something major happens. Anyone volunteering information needs a level of assurance that it will not be misused.

The human inclination to seek someone to blame is intrinsic. We wake up in the morning, look out the window, and if it’s rainy and windy then someone is to blame. Probably a weather reporter for not warning us of a coming storm. Blame is a way of making sense of negative events without having to do a lot of tedious investigation and analysis.

Don’t get me wrong. Accountability is vital. If someone does something unspeakably bad, they must be held accountable. That is a form of blame. Tracing the bad event back to the root cause. If that cause is found to be negligence or malicious intent, then blame can be assigned.

Where a good safety culture exists, as is often the case in civil aviation, it is wrong to assume that undesirable outcomes can always be linked to a bad actor of some kind.

Human error is forever with us. Even with the absolute best of intent, no one is immune from this pervasive creature. It can be elusive. There are environments where owning up to making mistakes is fine. Sadly, I’m sure it’s not uncommon to have worked in environments where such openness is punished. That is the difference between a good culture and a bad one.

One of my past jobs involved negotiation with a contractor. Every change that we made to a complex contract had a cost attached to it. So, there was an understandable sensitivity to making changes. At the same time, our customer for the product kept asking for changes. There’s nothing worse than being in a tense meeting with a contractor and having my boss pull the rug from under my feet by seeking to blame a change on my error rather than a customer request. Introducing a voluntary reporting system in such an environment is pointless.

My message here is clear. Voluntary reporting in aviation is a powerful tool. Reports submitted by employees can offer insights that are not available by just looking at hard data. These reporting systems may be required by regulation or company policy. However, without a sound safety culture they can be all but useless. A safety culture that is defended and supported by both employees and the senior management of an organisation.

Shifting Perspectives

Daily writing prompt
What’s a topic or issue about which you’ve changed your mind?

If you write the perfect rule, you will get the desired outcome. Authoring a specification that is robust and watertight will assure success. Having the best possible plan will deliver the best possible results. It all sounds reasonable, doesn’t it? It’s not surprising that someone like me, schooled in project management and working in engineering, would have a rational and systematic approach to problem solving. A proven, highly successful way of implementing complex technical projects and delivering successful outcomes.

As an analogy, I’ll start with mathematics. Nature is a curious beast. What we learn about complex systems is that what happens is highly dependent upon the starting point. The initial conditions. Graduate-level mathematics on feedback control systems shows that their behaviour can change dramatically with a small change in initial conditions. So, it’s reasonable to extend that thinking to a systematic approach to just about anything. It’s often true.
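To make that concrete, here’s a tiny Python sketch, entirely my own illustration rather than anything from a control-systems textbook. It runs the logistic map, a classic toy feedback system, from two initial conditions that differ by one part in a billion.

```python
# Sensitivity to initial conditions, illustrated with the logistic map
# x -> r * x * (1 - x). At r = 4 the map is chaotic, so two trajectories
# starting a hair's breadth apart soon disagree completely.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from the initial condition x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000000)
b = logistic_trajectory(0.200000001)  # shifted by one part in a billion

for step in (0, 10, 20, 30, 40, 50):
    print(f"step {step:2d}: |difference| = {abs(a[step] - b[step]):.9f}")
```

By roughly step 30 the two runs bear no resemblance to each other. That, in miniature, is why the starting point matters so much in complex systems.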

Fail to plan – plan to fail. That idiom sums up this cause and effect in a few simple words. Used by famous names and often quoted. Management training books are littered with the notion.

Twenty years ago, my team introduced the first European Aviation Safety Plan[1]. This initiative was built around the idea that a plan is the best and quickest way to reach a common objective. A roadmap, a pathway, a strategy, call it what you will.

Start by identifying problems and then propose a fix for each one. Not all problems, but the ones that fit that awkward Americanism: the low-hanging fruit. Namely, the biggest problems (fruit) that can be solved with the least effort (easily picked).

Here’s where I’ve changed my mind. Maybe not changed in a dramatic sense, but shifted perspective. It’s essential to have a plan, even if it’s just in my head, but the plan can be overstated as the most important part of a process of change.

The Plan, Do, Check, Act (PDCA) cycle starts with a plan. It must start that way. However, each of the four steps is equally important. That seems obvious to say. Even so, it’s often the case that a press release, or the like, will state: we have a plan, roadmap, pathway, strategy, as if that’s the job done.

Management teams will smile with a sense of achievement and show off their plans. A decade down the line that celebration might seem less momentous as the “do” part of the process turns out to be harder than anticipated.

This basic model for systematic change is a good one. Where I’ve changed my emphasis is in the distribution of effort. Don’t put all your available energy into constructing the perfect plan. Yes, the initial conditions are important, but they are not everything. The key part of the process is the cycle. Going around it with regularity is a way of delivering continuous improvement. After all, when it comes to a subject like aviation safety, that’s what’s needed.
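Since the shift in emphasis is from the plan to the loop, here’s a toy Python sketch of that idea. It is entirely my own illustration, with made-up issues and numbers, and not a real safety-management system.

```python
# A toy Plan-Do-Check-Act loop. The point is the cycle: a rough plan,
# repeated and corrected, beats a perfect plan executed once.
# The issue list and scores are invented for illustration only.

issues = {"runway incursions": 9, "level busts": 7, "paperwork errors": 5}

for cycle in range(1, 6):
    # Plan: pick the biggest outstanding issue (the low-hanging fruit).
    target = max(issues, key=issues.get)

    # Do: apply a fix; in this toy, every action helps a little.
    improvement = 2

    # Check: measure the effect against where we started.
    remaining = max(0, issues[target] - improvement)

    # Act: fold the result back in, then go around again.
    issues[target] = remaining
    print(f"cycle {cycle}: worked on {target!r}, score now {remaining}")
```

Nothing about the first plan is perfect here; the improvement comes from going around the loop.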


[1] 2005 – Decision of the Management Board adopting the 2006 Work Programme of the European Aviation Safety Agency

From Prescription to Performance-Based Regulation

One regulatory development that has stuck since the start of the new century is the idea that we need to transition from prescriptive requirements to performance-based requirements. It’s not too hard to understand where the motivation to change has come from, but there are several strands to the path. Here are three that come to mind.

For one, the intense dislike of overbearing governmental regulators who adopt an almost parental attitude towards industry. It’s true that safety regulatory bodies have a duty to serve the public interest. The difficulty arises in interpreting that brief. It should not be done as police officers sometimes did, imagining everyone as a potential miscreant.

My experience as a regulator started at a time when the traditional institutional approach was quite common. There was a respectful distance between the airworthiness surveyor or operations inspector and the aviation industry that they oversaw. I think even the term “surveyor” was inherited from the insurance industry at the birth of flying.

A wave of liberalisation swept in during the 1980s. It was anathema to those who started their careers as men from the Ministry. The idea that regulators should be in a partnership with industry to meet common goals was not easily accepted. Undoubtedly a change was necessary and, naturally, easier for an up-and-coming generation.

The next move away from regulatory prescription came as its value declined. That is not to say there won’t always be an element of prescription as a matter of written law. However, for detailed technical considerations it became less and less practical to say: this is the way it must be. The minute decision-makers were faced with the complexity of a microprocessor, it became clear that simply prescribing solutions is not effective.

Much of the change that took place can be traced to the evolution of system safety assessment and the use of probabilistic methods in aviation. In mechanics, prescribing a safety guard for a chain drive is straightforward. For complex electronics, saying when a flight system is safe enough requires a different approach. Regulators are now driven to set objectives rather than dictate solutions.
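To give a feel for that different approach, here’s a minimal numeric sketch in Python. The failure rates are invented for illustration; the one real anchor is the familiar order-of-magnitude guideline that catastrophic failure conditions should be extremely improbable, on the order of 1e-9 per flight hour, found in large-aeroplane certification material such as the guidance around CS 25.1309.

```python
# A toy probabilistic argument of the kind a performance-based rule
# invites. Figures are illustrative, not from any real assessment.

single_channel_failure = 1e-5  # assumed failure rate per flight hour

# With two independent, redundant channels, losing the function needs
# both to fail. Independence is itself an assumption that a real
# assessment must justify: common-mode failures break it.
dual_channel_failure = single_channel_failure ** 2

target = 1e-9  # the classic "extremely improbable" order of magnitude

print(f"one channel : {single_channel_failure:.0e} per flight hour")
print(f"two channels: {dual_channel_failure:.0e} per flight hour")
print(f"meets the 1e-9 target: {dual_channel_failure <= target}")
```

A prescriptive rule can’t anticipate every architecture. An objective like this one lets the designer choose redundancy, monitoring, or something else entirely, and then demonstrate the numbers.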

My third point is a forward-looking one. Whatever the history and heritage of aeronautical innovation, it’s true that a “conservative” but rapid adoption of new technology continues to be a source of success. Great safety success as well as commercial success.

Hidden amongst the successes are products, and ways of working, that don’t make the grade. The joke goes something like this: “How do you make a small fortune in aviation?” Answer: “Start with a large one.” Implicit in this observation is a willingness to innovate at risk. That means, amongst many things, having confidence and adaptability, and not being so constrained as to be assured of failure. An objective or performance-based approach to safety regulation opens up the opportunity to innovate more freely whilst still protecting the public interest in safety.

There’s no fixed destination for regulatory development.

Just Culture

My thought is that we’ve forgotten the discussion of more than a decade ago. There was a time when thoughtful reflection on responsibility and accountability was commonplace.

Without focusing on specific examples (there are plenty to choose from), there’s the propensity of our institutions and politicians to reach for “blame” as a first response. When situations go bad, the instinctive inclination is to hunt out someone to blame. This is an all too prevalent habit.

Naturally, in some cases there’s a strong need to identify who is accountable for bad decisions. Society does not like it when the powerful protect and cocoon themselves and grab for immunity. Certainly, some people and organisations are genuinely blameworthy. However, scrutiny doesn’t help if the finger of blame is pointed at a person’s honest errors. There isn’t a human on this planet who hasn’t made an error.

The finger of blame is easily pointed. Judgement so often falls after an event, at the time when more is known and hindsight comes into play. This tips the balance. It’s so much easier to say: why on Earth did you do that? I would never have done that.

For people to come forward and be fairly heard in an open and fair inquiry or investigation, they need to have the confidence that they are not stepping into a public blame-fest. Without trust between those on all sides of an investigation, it’s less likely that the truth will come out.

“Just Culture” is a concept written into aviation legislation and followed by others. The overriding aim is to learn from mistakes. It’s the surest way of not repeating the same mistakes time and time again. It’s beneficial to have that long-term learning objective. Why suffer the pain of a bad event when the means to avoid it are known and understood?

Now, I’m going back 20 years. I remember being part of an international working group[1] called GAIN. The group compiled guidance about organisational culture. At the time, the group was considering the subject in the context of the air traffic profession. Guidance like the document referenced emphasises that a Just Culture is not simply a no-blame culture. It’s not, and never has been, a way of circumventing accountability.

Determining culpability can be complex. There’s often a test to consider the wilfulness of the participants in a bad event. In other words, did they carelessly, intentionally, maliciously or negligently make decisions that resulted in the bad event? In these cases, the “they” could be an individual or an organisation.

Gross negligence, wilful abuses and destructive acts should be addressed by the enforcement of laws. Saying that the criminalisation of honest people involved in bad events has a negative impact is not to negate the need for enforcement. Regulators in all sorts of walks of life have a duty to apply enforcement where and when it’s appropriate. Maybe we ought to have applied that to the UK water industry.

My plea here is to first consider the nature of the events in question. Was there an element of genuine honest human error? Is the right balance being struck between the need to learn and the need to ensure accountability?

NOTE: Just Culture is defined in EU law as “a culture in which front-line operators or other persons are not punished for actions, omissions or decisions taken by them that are commensurate with their experience and training, but in which gross negligence, wilful violations and destructive acts are not tolerated” (Regulation (EU) No 376/2014, Art. 2(12)).


[1] A Roadmap to a Just Culture https://flightsafety.org/files/just_culture.pdf

Culture

Yet again, Boeing is in the news. The events of recent times are, I feel, immensely sad. Now, it is reported that the FAA has opened an investigation into a possible manufacturing quality lapse on the Boeing 787 aircraft[1]. The concern is that inspection records may have been falsified.

A company that once had a massive professional engineering reputation has sunk to a place where expectations are low. It’s not so much that the company is having a Gerald Ratner moment. Unfortunately, the constant stream of bad news indicates something deeper.

It’s interesting to note that Frank Shrontz[2] passed away last Friday at the grand age of 92. He was the CEO and Chairman of Boeing who led the company during the development of the Boeing 737NG and Boeing 777 aircraft. In the 1990s, I worked on both of those large aircraft types.

A commonly held view is that, after his time and the merger with McDonnell Douglas, the culture of the organisation changed. There’s a view that business school graduates took over and the mighty engineering ethos that Boeing was known for went into decline. Some of this may be anecdotal. After all, the whole world has changed in the last 30 years. However, it’s undoubtedly true that a lot of people lament the passing of an engineering culture that aimed to be the best.

A famous quote comes to mind: “Culture eats strategy for breakfast.” Those five sharp words get discussed time and time again. Having been involved in a lot of strategic planning in my time, it’s not nice to read. How wonderful intentions and well-described policies can be diluted or ignored is often an indicator of decline. It’s that cartoon of two cavemen pushing a cart with square wheels. One says to the other: “I’ve been so busy. Working my socks off.” Ignored, on the ground, is an unused round wheel. If an organisation’s culture is aggressively centred on short-term gain, then many of the opportunities to fix things get thrown out of the window.

We keep talking about “performance” as if it’s a magic pill. Performance-based rules, performance-based oversight, and a long list of performance indicators. That, in and of itself, is not a bad thing. Let’s face it, we all want to get better at something. The problem lies with performance only being tagged to commercial performance. Or where commercial performance trumps every other value an engineering company affirms.

To make it clear that all the above is not just a one-company problem, it’s useful to look at what confidential reporting schemes have to say. UK CHIRP is a long-standing one. Many recent CHIRP reports cite management as a predominant issue[3]. Leadership skills are a recurring concern.


[1] https://aviationweek.com/air-transport/some-787-production-test-records-were-falsified-boeing-says

[2] https://www.seattletimes.com/business/boeing-aerospace/frank-shrontz-former-ceo-and-chairman-of-boeing-dies-at-92/

[3] https://chirp.co.uk/newsletter/trust-in-management-and-cultures-is-the-key-to-promoting-confidence-in-safety-reporting/

Chat

Yesterday afternoon, I was at the till in a major supermarket, and the man in front of me was getting stressed. I was standing in line waiting without a care in the world. In front of the man in front of me, the till assistant, or checkout operator, depending on how you see it, and a customer were locked in day-to-day conversation. Just being sociable. From what I could hear, that customer may once have worked in that supermarket.

The man in front of me was getting grumpy. He turned back and muttered his disdain for the shop’s staff because they were holding him up. Their everyday conversation was an affront to him. They were wasting his valuable time. He was not overly aggressive, but he had the agitation that often comes from a degree of unwelcome stress. He was in a hurry, or at least heavily felt the pressure of time.

When I got to the till, the assistant asked: what did he say? I quickly paraphrased what was said, conscious that I had no desire to inflame a situation that had now passed. Reactions can be unpredictable. We live in an era of polarisation.

The gentleman working at the till was into small talk. He clearly loved to chat to customers. Then, for me, he put that in context. He said that when it gets to about ten in the evening, people are more than happy to talk as they pack their shopping. For some people it’s the only conversation they have in a day. He was proud that staff were encouraged to be warm and friendly.

Now, there’s a contrast. The life of Mr Busy, Busy, Busy versus the life of the forgotten. That division is at the heart of one of society’s biggest troubles. A tribe that is over-employed, anxious, and living on the edge, and a tribe that is lost, lonely and forgotten.

One prone to exasperation and impatience. The other desperate for social contact and empathy. How on earth did we construct a society that tries to work on that level? Not only that, but supermarket managers are desperately trying to automate everything[1]. Already there are several kinds of automation in that one shop: you can check out your own shopping in a couple of different ways.

The milk of human kindness shouldn’t be sneered at. Wow. You see how quickly I reverted to Shakespeare without even knowing it. That simple phrase has its origins in the play Macbeth. The play that I was forced to read at school. The play that did nothing much to lift my appallingly bad grades in English. To Lady Macbeth, the “milk of human kindness” was objectionable. To her, real men had no need of it. We all know where that led. Don’t go there.

So, next time you are standing in a short queue, stomping your feet, imagining the clock spinning around, give it a rest. If you find yourself thinking this is too much, I can’t deal with it anymore, do a double take. Relax. Breathe slowly. Dig deep and discover some small talk. It might be more meaningful than you first think.

On another subject: I agree with Graham Nash[2]. “A Day in the Life”[3] is a truly great song. It’s life as a musical tapestry. The song wanders around the mind using hardly any words but painting a picture all the way up to the sky. I’ll not heap yet more praise on The Beatles; they’ve had enough for several centuries. That said, May 1967 was a magical moment. Even if I only knew it with toy cars and in short trousers. It’s not the daily news, but I’ll bet there are now more holes in Britain’s roads than ever.


[1] https://www.bbc.com/worklife/article/20170619-how-long-will-it-take-for-your-job-to-be-automated

[2] https://en.wikipedia.org/wiki/Graham_Nash

[3] https://genius.com/The-beatles-a-day-in-the-life-lyrics