EPILOGUE

AI Bias and Systems

In retrospect, it was unlikely to turn out well. In 2016, Microsoft researchers released an AI chatbot called Tay to learn how to interact on Twitter. Within hours, it learned and began to spew out offensive tweets. Tay was not alone in becoming the worst of us. Stories like this abound, and they make many, including businesses, reluctant to adopt AI — not because AI prediction performs worse than people do. Instead, AI may be too good at behaving like them.

This shouldn’t be a surprise. AI prediction requires data, and when the goal is to predict something about people, the training data comes from people. This can have merit, such as when training an AI to play a game against people, but people are imperfect, ...
