Transparency in AI Decision Making
Date: This event took place live on June 22, 2017
Presented by: Andy Hickl
Duration: Approximately 60 minutes.
State-of-the-art machine learning techniques offer extraordinary performance on everything from text analysis to feature classification, but they often function as "black boxes" that obscure their decision-making. As artificial intelligence makes its way into critical processes in every industry, stakeholders will demand transparency from their models and algorithms.
This webcast will discuss the problem of interpretability in AI and address techniques for building transparent artificial intelligence applications with explainable outcomes.
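One common family of techniques for explaining a black-box model is permutation importance: shuffle one input feature at a time and measure how much the model's accuracy degrades, revealing which features the model actually relies on. The sketch below is illustrative only and is not taken from the webcast; the `black_box_predict` function is a hypothetical stand-in for an opaque model, using pure Python.

```python
import random

# Hypothetical "black box" standing in for an opaque trained model.
# It secretly leans on feature 0, uses feature 1 slightly, and ignores feature 2.
def black_box_predict(row):
    score = 3.0 * row[0] + 0.5 * row[1] + 0.0 * row[2]
    return 1 if score > 2.0 else 0

def accuracy(rows, labels):
    return sum(black_box_predict(r) == y for r, y in zip(rows, labels)) / len(rows)

def permutation_importance(rows, labels, n_features, seed=0):
    """Drop in accuracy when each feature's column is shuffled.

    A larger drop means the model depends more heavily on that feature.
    """
    rng = random.Random(seed)
    base = accuracy(rows, labels)
    importances = []
    for j in range(n_features):
        col = [r[j] for r in rows]
        rng.shuffle(col)
        shuffled = [r[:j] + [v] + r[j + 1:] for r, v in zip(rows, col)]
        importances.append(base - accuracy(shuffled, labels))
    return importances

# Tiny synthetic dataset, labeled by the black box itself (baseline accuracy is 1.0).
rng = random.Random(1)
X = [[rng.random() * 2 for _ in range(3)] for _ in range(200)]
y = [black_box_predict(r) for r in X]

imp = permutation_importance(X, y, n_features=3)
# Expect imp[0] to be largest and imp[2] to be zero: shuffling an ignored
# feature never changes a prediction.
```

Even this simple diagnostic makes an opaque model more transparent to stakeholders: it produces a per-feature report of what the model actually attends to, without requiring access to the model's internals.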
About Andy Hickl, Chief Product Officer at Intel Saffron Cognitive Solutions Group
Prior to joining Intel, Andy was a Senior Director for Innovation at Vulcan Inc., where he co-founded the Vulcan Proving Ground (an early-stage technology incubator) and served as Executive Technical Advisor to Paul G. Allen.
A serial entrepreneur, Andy has co-founded three startups: A.R.O. Inc. (Vulcan Ventures, Paul Allen), Swingly, and Extractiv. Andy also served as CEO of Language Computer Corporation, a leading natural language processing company.
Andy has led research in natural language processing, machine learning, artificial intelligence, computer vision, and ubiquitous computing. His work can be found in the proceedings of AAAI, ACL, SIGIR, and NIPS.