More and more AI-based systems affect our lives. They include algorithms for assessing loan applications, managing assets, automatically screening CVs, assisting in medical diagnosis, and monitoring critical sites. Public decision-making is also concerned: combating tax fraud, allocating social housing, and assigning pupils to a school or students to a training course.
Whether they rely on learning techniques or on more symbolic approaches, the design of these algorithms is not neutral: the results they produce are not necessarily correct, and their performance evolves over time, either through their intrinsic capacity for learning or through changes in their environment. Moreover, the data used for learning has a critical impact on how they function.
Indeed, these algorithms (whether learning-based or symbolic) implement, often opaquely and sometimes without full control, priority, preference and ranking criteria that are generally unknown to the people concerned. This opacity can also mask abuses of all kinds: discrimination, unfair treatment, manipulation, and so on.
From the user's point of view, the real need is an intelligible explanation (explainability) rather than traceability of the reasoning behind the decision. Indeed, the inner workings of the underlying algorithm are of little interest to users: what matters is obtaining a useful explanation that lets them understand, or even interpret, the results. Providing such explanations is not simple, however, especially when the algorithms are based on learning techniques. Even the designers of these algorithms cannot analyse the reasoning behind the results these methods produce. This is the "black box" effect.
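To make the idea of explaining a black box concrete, here is a minimal sketch of one common post-hoc approach: probing how a model's output changes when each input is perturbed. The scoring function, its feature names and its weights are all hypothetical stand-ins for an opaque trained model, not any specific system mentioned above.

```python
# Sketch of a perturbation-based, post-hoc explanation.
# The "black box" is a hypothetical loan scorer whose internal
# weights stand in for an opaque trained model.

def black_box_score(applicant):
    # Hidden logic: the user never sees these weights.
    w = {"income": 0.5, "debt": -0.8, "years_employed": 0.3}
    return sum(w[k] * applicant[k] for k in w)

def explain(applicant, delta=1.0):
    """Estimate each feature's local influence on the score
    by perturbing it and observing the change in output."""
    base = black_box_score(applicant)
    influence = {}
    for feature in applicant:
        perturbed = dict(applicant, **{feature: applicant[feature] + delta})
        influence[feature] = black_box_score(perturbed) - base
    return influence

applicant = {"income": 3.0, "debt": 2.0, "years_employed": 5.0}
print(explain(applicant))
```

For this linear toy scorer the estimated influences recover the hidden weights exactly; for a real non-linear model such perturbation scores are only local approximations, which is precisely why producing faithful explanations remains hard.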
From the point of view of companies, organisations, public authorities and users, understanding these black boxes and guaranteeing their behaviour will become a major challenge. This is particularly true if we want to promote the adoption of AI components in the financial industry or in critical applications, whether in mobility, medicine, aeronautics, defence or security.