For hundreds of years, scientists and philosophers have dreamed of intelligent calculating machines that could perform work otherwise done by humans. The advent, design, and development of computers moved this dream toward reality, and in 1956, artificial intelligence (AI) became an academic discipline. But only recently has computing technology caught up, delivering the scale of data and processing power needed to enable machines to intelligently “think.”
Business intelligence (BI) has undergone its own evolution since the term was first coined. Beginning in the 1960s, enterprises used mainframes to support mission-critical applications such as reconciling the general ledger. In the 1980s and 1990s, BI software became an industry in its own right. In the late 1990s and early 2000s, new vendors emphasized usability and self-service capabilities. Now, BI is being supplanted by analytics software that leverages larger scale and improved processing performance to enable search-based and AI-driven analytics capabilities.
For decades, AI was out of reach because the requisite compute scale and processing capabilities did not exist. Even once processing power advanced to adequate speeds, costs kept AI development beyond the reach of many otherwise-interested parties.
Now in the age of big data and nanosecond processing, machines can rapidly mimic aspects of human reasoning and decision making across massive volumes of data. Through neural networks ...