6

AI Fairness with Google's What-If Tool (WIT)

Google's PAIR (People + AI Research) team created the What-If Tool (WIT), a visualization tool designed to investigate the fairness of an AI model. Using WIT leads us directly to a critical ethical and moral question: what do we consider fair? WIT provides the tools to represent what we view as bias so that we can build the most impartial AI systems possible.
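
Before going further, it helps to see how little code it takes to bring WIT up inside a notebook. The snippet below is only a minimal sketch, not this chapter's actual example: the tiny dataset, the placeholder predict_fn, and the label vocabulary are assumptions made purely for illustration, and it presumes the witwidget package (pip install witwidget) and TensorFlow are available in a Jupyter or Colab environment.

```python
# Minimal sketch of launching the What-If Tool from a notebook.
# The data and model below are hypothetical placeholders, not the book's dataset.
import tensorflow as tf
from witwidget.notebook.visualization import WitConfigBuilder, WitWidget

def make_example(age, income, label):
    """Pack one record into the tf.train.Example format WIT expects."""
    return tf.train.Example(features=tf.train.Features(feature={
        'age': tf.train.Feature(float_list=tf.train.FloatList(value=[age])),
        'income': tf.train.Feature(float_list=tf.train.FloatList(value=[income])),
        'label': tf.train.Feature(int64_list=tf.train.Int64List(value=[label])),
    }))

# Two toy records to populate the tool.
examples = [make_example(25.0, 30000.0, 0), make_example(52.0, 80000.0, 1)]

def predict_fn(examples_to_score):
    """Placeholder model: return [P(class 0), P(class 1)] for each example."""
    return [[0.4, 0.6] for _ in examples_to_score]

# Configure WIT with the examples, the scoring function, and readable class names.
config_builder = (WitConfigBuilder(examples)
                  .set_custom_predict_fn(predict_fn)
                  .set_label_vocab(['denied', 'approved']))

# Render the interactive widget; height is in pixels.
WitWidget(config_builder, height=600)
```

Once the widget is displayed, WIT lets us slice the data by a feature such as age, edit individual examples, and compare outcomes across groups, which is exactly the kind of probing we need to decide whether the model treats people fairly.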

Developers of machine learning (ML) systems must pay attention to both accuracy and fairness. WIT takes us to the heart of pragmatic human-machine decision-making. In Chapter 2, White Box XAI for AI Bias and Ethics, we discovered the difficulty each person experiences when trying to make the right decision. MIT's Moral Machine experiment brought ...
