Answers to the Questions

Chapter 1, Explaining Artificial Intelligence with Python

  1. Understanding the theory of an ML algorithm is enough for XAI. (True|False)

    False. Implementing an ML program requires more than theoretical knowledge.

    True. If a user just wants an intuitive explanation of an ML algorithm.

  2. Explaining the origin of datasets is not necessary for XAI. (True|False)

    True. If a third party has certified the dataset.

    False. If you are building the dataset, you need to make sure you are respecting privacy laws.

  3. Explaining the results of an ML algorithm is sufficient. (True|False)

    True. If a user is only interested in the result.

    False. If it is required to explain how a result was reached (see the sketch after this list).

  4. It is not necessary for ...

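The distinction in question 3 between reporting a result and explaining how it was reached can be made concrete with a small sketch. The following is not the book's own example; it assumes scikit-learn, the bundled breast cancer dataset, and a logistic regression whose coefficients serve as a simple explanation of the prediction.

    # A minimal sketch (not from the book): reporting a result vs. explaining it.
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Load a small tabular dataset and fit an interpretable linear model.
    data = load_breast_cancer()
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(data.data, data.target)

    # Reporting only the result: a single predicted label.
    sample = data.data[:1]
    print("Prediction:", data.target_names[model.predict(sample)[0]])

    # Explaining how the result was reached: for a linear model, each
    # coefficient times the standardized feature value is that feature's
    # contribution to the log-odds of the prediction.
    coefs = model.named_steps["logisticregression"].coef_[0]
    scaled = model.named_steps["standardscaler"].transform(sample)[0]
    contributions = sorted(
        zip(data.feature_names, coefs * scaled),
        key=lambda pair: abs(pair[1]),
        reverse=True,
    )
    for name, value in contributions[:5]:
        print(f"{name}: {value:+.3f}")

The first print satisfies a user who only wants the result; the listed contributions address the case where an explanation of how the result was reached is required.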