Last evening, in an online event, I listened to Professor Dr. Saskia Nagel of RWTH Aachen University[1] speak on Artificial Intelligence (AI) and Ethics. It’s a topic that arouses a lot of interest amongst engineers and just about every other profession.
The talk ranged across the subject, touching on points of debate that are far from resolved. It provided an overview of the key current ethical questions spanning the development and use of “AI” technologies. Interestingly, even the title of the talk was open to question: debate still rages over what the commonly used term “AI” actually encompasses.
Scientific and technological advances have consequences that are best anticipated, insofar as we can. Far too often, as in the case of mobile phones, a capability has been launched onto humanity because of its great utility, without much thought given to its potential impacts.
In a way, our collective mindset remains Stone Age. We do things because we can, rather than asking whether we should. The Australian movie and musical Muriel’s Wedding captures this nicely[2]: “You Can’t Stop Progress” was the election billboard slogan of the politician Bill Heslop in the story. The same theme might be posted as “Growth, growth, growth” in the current economic climate.
To an extent, that’s what’s happening in the more audacious parts of aviation innovation. Different ways in which AI technologies can be used to facilitate autonomous flight are being explored and promoted. There’s no doubt such technology can process massive amounts of information in no time at all compared with you or me. That advantage, though, is only one side of the story.
Investigating questions of autonomy quickly leads to discussions of accountability and responsibility. In flight, there are inevitably complex interactions between people and machines. On the normally rare occasions when these interactions result in harm, it’s essential to be able to say what or who was responsible.
It goes further than that, too. Even to persuade a passenger to ride in an autonomous vehicle, a good deal of confidence must be built up. A fear of flying is often counteracted by arguments based on the long history of safe flight and the trustworthiness of those operating a transport system.
A question, then, is: how do we trust something we don’t understand? It’s not a new question. Few members of the flying public understand how a modern transport aircraft works. We put our faith in independent, knowledgeable professionals asking difficult questions of the designers and builders of aircraft. We put our faith in rigorous controls and processes. If the internal workings of a complex machine cannot be explained to those independent professionals, we have a problem. Hence another key topic: explainability.
This is a fascinating research area. I’ve no doubt there are workable solutions, but we are some way from having them to hand at the moment. Applied ethics is part of the toolbox needed.
[1] https://www.ethics.rwth-aachen.de/cms/ETHICS/Das-Lehr-und-Forschungsgebiet/Team/~fcnwz/Saskia-Nagel/