What’s happening? Two words, and what seems like the easiest question in the world. Open your phone, look at the screen, and a myriad of different sources of information are screaming for your immediate attention. They are all saying – look at me, look now, this is vital, don’t miss out. Naturally, most of us tune out a big percentage of this attention-grabbing noise. If we didn’t, life would be intolerable. The art of living sanely is separating what matters from the clutter.
So, what happens in aviation when a Chief Executive or Director turns to a Safety Manager and asks – what’s happening? It’s a test of whether that manager has a finger on the pulse and knows what’s happening in the real world as it happens.
This is a place I’ve been. It’s a good place to be if you have done your homework. It’s the way trust is built between the key players who carry the safety responsibility within an organisation.
One of the tools in the aviation safety manager’s toolbox is the Safety Performance Indicator (SPI). In fact, SPIs are part of an international standard[1], within the package of activities that make up safety assurance. Technically, we are talking about data-based parameters used for monitoring and assessing safety performance.
The idea is simple: create a dashboard that displays up-to-date results of safety analysis so that they can be viewed and discussed. Like your car’s dashboard, it’s not a random set of numbers, bar-charts, and dials. It should be a carefully designed selection of those parameters that are most useful in answering the question that started this short blog.
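To make that a little more concrete, here is a minimal sketch in Python of what might sit behind such a dashboard: a small, deliberately curated set of SPIs rather than every number the safety department can produce. The indicator names, units, and values here are invented purely for illustration, not drawn from any real operation or from the standard.

```python
from dataclasses import dataclass

@dataclass
class SafetyPerformanceIndicator:
    """A single data-based parameter tracked on the safety dashboard."""
    name: str             # what is being measured
    unit: str             # e.g. events per 1,000 flight hours
    current_value: float  # latest result from the safety analysis
    target: float         # the level the organisation is aiming for

# A deliberately small, curated selection: only those parameters that
# help answer "what's happening?" (all figures are hypothetical)
dashboard = [
    SafetyPerformanceIndicator("Unstable approaches", "per 1,000 approaches", 2.4, 2.0),
    SafetyPerformanceIndicator("Runway incursions", "per 10,000 movements", 0.3, 0.2),
    SafetyPerformanceIndicator("Unscheduled engine shutdowns", "per 10,000 flight hours", 0.1, 0.1),
]

for spi in dashboard:
    print(f"{spi.name}: {spi.current_value} {spi.unit} (target {spi.target})")
```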
That information display design requires great care and forethought, especially if serious actions are likely to be predicated on the information displayed. It seems like common sense. The trouble is that there are plenty of examples of how not to do this in circulation. Here are a few of the dangers to look out for:
Telling people what they want to hear. A dashboard that glows green all the time is useless. If the indicators become a way of showing off what a great job the safety department is doing, the whole effort loses its meaning. If the dashboard is linked to the boss’s bonus, the danger is that pressure will be applied to keep the indicators green.
Excessive volatility. It’s hard to take indicators seriously if they change at such a rate that no series of actions is likely to have an impact. Confidence can be destroyed by constantly changing the tune. New information should be presented promptly when it arises, but a Christmas tree of flashing lights often leads the viewer to disbelieve what they see.
Hardy perennials. There are indicators, like, say, the number of reported occurrences, which are broad brush and frequently used. They are useful if interpreted correctly. Unfortunately, there’s a risk of overreliance on such general abstractions. They can mask more interesting phenomena. Each operational organisation has a uniqueness that should be reflected in the data gathered, analysed, and displayed.
For each SPI there should be an alert level. It can be a switch from a traffic light indication of green to amber. Then, for the more critical parameters, there should be a level that is deemed unacceptable. That might be a red indicator that triggers a specific set of significant actions. The unscheduled removal or shutdown of a system or equipment may be tolerable up to a certain point. Beyond that threshold there are serious safety concerns to be urgently addressed.
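As a rough sketch of how those levels might be wired up, each SPI could carry an alert threshold and an unacceptable threshold, with the current value mapped to a traffic-light state. The thresholds, the example figures, and the assumption that higher values are worse are all illustrative choices, not prescribed by any standard.

```python
def spi_status(value: float, alert_level: float, unacceptable_level: float) -> str:
    """Map an SPI value to a traffic-light state.

    Assumes higher values are worse (e.g. unscheduled removals per
    1,000 flight hours); the thresholds are illustrative only.
    """
    if value >= unacceptable_level:
        return "RED"    # unacceptable: triggers a defined set of urgent actions
    if value >= alert_level:
        return "AMBER"  # alert: investigate before it gets worse
    return "GREEN"      # within the tolerable range

# Hypothetical example: unscheduled removals tolerable up to a point,
# alert at 1.5 and unacceptable beyond 3.0 per 1,000 flight hours
print(spi_status(0.8, alert_level=1.5, unacceptable_level=3.0))  # GREEN
print(spi_status(2.1, alert_level=1.5, unacceptable_level=3.0))  # AMBER
print(spi_status(3.4, alert_level=1.5, unacceptable_level=3.0))  # RED
```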
The situation to avoid is ending up with so many indicators that seeing the wood for the trees becomes more difficult than it would otherwise be. After all, this important safety tool is intended to focus minds on the riskiest parts of an operation.
[1] ICAO Annex 19 – Safety Management. Appendix 2. Framework for a Safety Management System (SMS). 3. Safety assurance. 3.1 Safety performance monitoring and measurement.