There’s not just one form of Artificial Intelligence (AI). This umbrella term hides a great panoply of different configurations, shapes, and forms of application.
One of the most impactful applications is machine learning (and, before it, expert systems). It’s where we go beyond a conventional computer’s ability to store and manipulate information against set rules: the machine has the capability to learn new ways of interpreting information, so that each day of operation it differs from the day it was switched on. That’s a little vague, but it captures the essence of the move from deterministic to non-deterministic systems.
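To make that distinction concrete, here is a minimal sketch, with names and numbers of my own invention rather than anything from a real avionics system, contrasting a fixed rule with a learner whose decision boundary drifts as it sees new data:

```python
# Hypothetical sketch: a deterministic rule vs. a system that learns online.
# All names and values here are illustrative only.

def fixed_rule(reading: float) -> str:
    """Deterministic: same input, same answer, forever."""
    return "alert" if reading > 100.0 else "ok"

class LearningThreshold:
    """Adapts its alert threshold toward the readings it has seen."""
    def __init__(self, threshold: float = 100.0, rate: float = 0.1):
        self.threshold = threshold
        self.rate = rate

    def classify(self, reading: float) -> str:
        verdict = "alert" if reading > self.threshold else "ok"
        # Online update: the decision boundary drifts toward recent data,
        # so tomorrow's answer to the same reading may differ from today's.
        self.threshold += self.rate * (reading - self.threshold)
        return verdict

learner = LearningThreshold()
for r in [90.0, 95.0, 80.0]:
    learner.classify(r)
# After a run of low readings the learner's threshold has drifted below
# 100, so an input the fixed rule calls "ok" can now trigger an alert.
```

The point of the toy is only that the learning system is a different machine after each day of data, which is exactly what makes it powerful and hard to certify.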
In all this we presuppose that such complex systems are in the hands of able and highly literate users who understand what they are doing when training that learning machine. There’s already debate about how bias in algorithms can produce unintended consequences. In addition, an otherwise reliable and trustworthy machine can be trained in a way that embeds errors and biases too[1].
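A toy illustration of that last point, using entirely made-up data: a model trained faithfully on a reviewer’s skewed labels reproduces the skew, even though the training procedure itself is sound.

```python
from collections import Counter

# Hypothetical data: sensor events labelled by a human reviewer who
# habitually dismisses one class of warning as noise.
events = [
    ("vibration", "ignore"),  # the reviewer's bad habit:
    ("vibration", "ignore"),  # genuine vibration warnings
    ("vibration", "ignore"),  # are routinely waved away
    ("overheat", "act"),
    ("overheat", "act"),
]

# A deliberately simple "learner": predict the majority label seen for
# each event type. The procedure is sound; the labels are not.
majority = {}
for event_type in {e for e, _ in events}:
    labels = [lab for e, lab in events if e == event_type]
    majority[event_type] = Counter(labels).most_common(1)[0][0]

# The trained model has faithfully absorbed the reviewer's bias:
# it now recommends ignoring every vibration warning.
```

No algorithmic fault is needed for this to happen; the error arrives with the training material, which is the sense in which a trustworthy machine can still embed bias.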
Just as a child picks up the bad habits of a parent, so “intelligent” machines can learn from pilots, controllers and engineers who may have less than optimal ways of undertaking tasks. This Human-AI interplay is likely to become a major area of study, given that the topic of Human Factors is itself already a large body of material.
The debate on social media already makes it all too obvious that the aviation community holds a wide range of views on the use of AI: all the way from outright rejection, or scepticism that deems such systems “unsafe”, to advocates who profess only their benefits and merits.
Clearly, neither extreme end of the spectrum of professional views helps much. I don’t think that the promoters of AI want to see blind overreliance on it. Equally, surely even ardent sceptics can see virtue in making the best use of the accumulated knowledge that is available.
I can foresee a system-of-systems approach. To return to my parent and child analogy: from time to time a child will ask a question that is blunt and to the point, a question that demands a straightforward answer. This can be uncomfortable, but it cuts through biases and bad habits.
In aircraft systems there are boundaries that must be respected. The physics of flight dictate that going beyond those boundaries is generally not good for life and limb. So a system programmed to question an expert system, one AI questioning another, or even questioning its own trainer, is not beyond the realms of possibility. It might even be a good idea.
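As a sketch of that idea, under my own assumptions about the interface and with illustrative limits rather than real ones, a plain rule-based monitor could challenge another system’s recommendation against hard flight-envelope boundaries:

```python
# Hypothetical sketch of one system questioning another: a simple
# rule-based monitor cross-checks an "expert" recommendation against
# fixed envelope limits. Names and limits are illustrative only.

ENVELOPE = {
    "airspeed_kt": (120.0, 350.0),  # assumed stall / structural limits
    "bank_deg": (-60.0, 60.0),
}

def challenge(recommendation: dict) -> list:
    """Return a blunt question for each value outside the envelope."""
    questions = []
    for param, value in recommendation.items():
        lo, hi = ENVELOPE.get(param, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            questions.append(
                f"Why does {param}={value} exceed the envelope [{lo}, {hi}]?"
            )
    return questions

# A recommendation that strays outside the envelope gets questioned;
# one that stays inside it passes unchallenged.
flagged = challenge({"airspeed_kt": 380.0, "bank_deg": 30.0})
clean = challenge({"airspeed_kt": 250.0, "bank_deg": 30.0})
```

Like the child’s blunt question, the monitor’s challenge is crude, but its very simplicity makes it auditable in a way the learning system it questions may not be.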