October 2024
Intermediate to advanced
384 pages
13h 7m
English
In 2017, a team at Google Brain introduced an advanced artificial intelligence (AI) deep learning architecture called the Transformer. Since then, the Transformer has become the standard for tackling various natural language processing (NLP) tasks in academia and industry. It is likely that you have interacted with models built on top of the Transformer architecture in recent years without even realizing it. Google, for example, experimented with using Bidirectional Encoder Representations from Transformers (BERT), an LLM the company created, to enhance its search engine by better understanding users’ search queries. More recently, Google started to use Gemini, another LLM it created, to overhaul ...