3 Questions for

Jörn Müller-Quade

Professor of Cryptography and Security at the Karlsruhe Institute of Technology (KIT) and Head of the IT Security, Privacy, Legal and Ethical Framework Working Group of Plattform Lernende Systeme

Trustworthy AI systems: "Certificates can provide orientation"

Artificial Intelligence (AI) can make our everyday lives easier, traffic safer and healthcare better. But realizing this potential requires trustworthy AI systems that people are happy to use. Certification of AI can increase trust in the technology. Jörn Müller-Quade explains what specifics need to be considered when certifying AI and why not all AI systems need to undergo testing. The professor researches cryptography and learning systems at the Karlsruhe Institute of Technology. He is also head of the "IT Security" working group of Plattform Lernende Systeme.

1

Mr Müller-Quade, why is it important to certify Artificial Intelligence?

Jörn Müller-Quade: How AI systems arrive at their decisions is often incomprehensible even to experts. Such systems are therefore referred to as black boxes. Customers cannot judge for themselves whether the use of an AI system is safe in a particular context. This is where a certificate can provide orientation and give conscientious manufacturers an advantage on the market.

2

What distinguishes the certification of Artificial Intelligence from the certification of other IT systems?

Jörn Müller-Quade: Decisions made by AI systems are often not easy to understand, especially in the case of learning systems. Yet certification becomes much more difficult if the system cannot be understood. In some cases, well-understood protection mechanisms will probably have to be combined with AI. In addition, AI systems can learn, i.e., change dynamically, which is why a one-time, static certification is not sufficient. Certification must become an open process.

3

How does certification succeed in ensuring the quality of AI systems without inhibiting innovation?

Jörn Müller-Quade: Since certification can be complex and time-consuming, only AI systems of elevated criticality should be certified. If, for example, an algorithm that is supposed to suggest pieces of music to me makes a poor recommendation, that is hardly a disaster, and certification is not necessary. Certification is far more important for autonomous driving or for AI systems in medical technology.
