Answers to the Questions
Chapter 1, Explaining Artificial Intelligence with Python
- Understanding the theory of an ML algorithm is enough for XAI. (True|False)
False. Implementing an ML program requires more than theoretical knowledge.
True, if a user just wants an intuitive explanation of an ML algorithm.
- Explaining the origin of datasets is not necessary for XAI. (True|False)
True, if a third party has certified the dataset.
False. If you are building the dataset yourself, you need to make sure you respect privacy laws.
- Explaining the results of an ML algorithm is sufficient. (True|False)
True, if a user is not interested in anything but the result.
False, if it is required to explain how the result was reached, as in the sketch below.
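
  The following is a minimal, hypothetical sketch (not part of the book's answer key) contrasting merely reporting a prediction with explaining how it was reached. It assumes scikit-learn and the `shap` library are installed; the dataset, model, and parameters are purely illustrative.

  ```python
  # Sketch only: reporting a result vs. explaining how it was reached.
  # Assumes scikit-learn and shap are installed; choices below are illustrative.
  import shap
  from sklearn.datasets import load_breast_cancer
  from sklearn.ensemble import RandomForestClassifier

  data = load_breast_cancer()
  X, y = data.data, data.target

  model = RandomForestClassifier(n_estimators=100, random_state=0)
  model.fit(X, y)

  # Reporting the result alone: the predicted class for one sample.
  print("Prediction:", model.predict(X[:1]))

  # Explaining how the result was reached: per-feature contributions
  # to this prediction, computed with SHAP values.
  explainer = shap.TreeExplainer(model)
  shap_values = explainer.shap_values(X[:1])
  # The exact shape of shap_values depends on the SHAP version
  # (a list per class or a single array), so it is printed as-is here.
  print("Per-feature contributions:", shap_values)
  ```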
- It is not necessary for ...