Come on, one software control system is much like another. We don’t want to know what’s inside the box. We just want to know what it does. Well, that’s one point of view. Slowly, year by year, as what’s in the box has become more and more complex, or at least more difficult to understand, that opinion gets more airtime. There’s no doubt I don’t give a lot of thought to how my iPhone does what it does in the palm of my hand. Whereas 30 years ago, I was intrigued to understand how a symbol generator created characters on an aircraft electronic display.
Levels of interconnection, integration and interoperation create interdependencies that become harder and harder to see and understand. I suppose we ought to coin a new “inter” word to sum up the high density of functions ticking away behind the curtain of everyday acceptance. The hidden workings of machines that we cannot live without. It’s much more than lines of software code that are transforming our lives. And transforming flying. Today, oceans of algorithmic data crunching go on with a high degree of autonomy. Some of it is transparent to a small set of specialist technical gurus, but most of us, even expert us, sit outside the advancing wave of change.
What I find intriguing is the discussion about how society will react when super-complex systems go badly wrong. We know something of what happens when conventional systems go wrong. A few minutes studying the recent Boeing 737 MAX saga is a good illustration of what can happen.
It’s a rule in my mind that whatever autonomy a system is given, someone somewhere cannot escape accountability for its actions. Yes, dystopian sci-fi stories are full of rogue machines running amok. Society will surely not allow that to happen – will we?
Industry and regulators both have an immensely important role in working together to manage risks. Politicians have a basic responsibility to listen to the conclusions of expert findings. When the amalgam of workings inside the box has features such as machine learning, we go way beyond the conventional approach to systems. Beyond what we have been doing successfully to assure safety for the last 30 years.
Demands for greater performance mean that we cannot be Luddite about the use of non-deterministic systems in safety-related control systems. Their adaptability, agility and flexibility can help us meet many environmental and societal aims. But the classic questions of “what if?” still need to be addressed in detail to assure resilience, robustness, and basic levels of safety.
And we must do all this at the same time as updating the airborne software of some in-service aircraft using floppy disks.