Learn how to apply state-of-the-art transformer-based models, including BERT and GPT, to solve modern NLP tasks.
Overview
Introduction to Transformer Models for NLP LiveLessons provides a comprehensive overview of transformers and the mechanisms—attention, embedding, and tokenization—that set the stage for state-of-the-art NLP models like BERT and GPT to flourish. The focus of these lessons is on providing a practical, comprehensive, and functional understanding of transformer architectures and how they are used to create modern NLP pipelines. Throughout this series, instructor Sinan Ozdemir will bring theory to life through illustrations, solved mathematical examples, and straightforward Python examples within Jupyter notebooks.
All lessons in the course are grounded in real-life case studies and hands-on code examples. After completing this course, you will be in a great position to understand and build cutting-edge NLP pipelines using transformers. You will also be provided with extensive resources and curriculum details, all of which can be found in the course's GitHub repository.
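As a taste of that hands-on style, the short sketch below shows tokenization and a ready-made sentiment pipeline using the Hugging Face transformers library. It is an illustrative example rather than code taken from the course notebooks, and the model checkpoints are the library's defaults.

    from transformers import AutoTokenizer, pipeline

    # Tokenization: map raw text to the subword IDs a transformer model consumes.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    encoded = tokenizer("Transformers set the stage for modern NLP.")
    print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))

    # Inference with a ready-made, attention-based sentiment-analysis pipeline.
    classifier = pipeline("sentiment-analysis")
    print(classifier("Attention finally makes sense to me."))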
About the Instructor
Sinan Ozdemir is currently Founder and CTO of Shiba Technologies. Sinan is a former lecturer of Data Science at Johns Hopkins University and the author of multiple textbooks on data science and machine learning. Additionally, he is the founder of the recently acquired Kylie.ai, an enterprise-grade conversational AI platform with RPA capabilities. He holds a master's degree in Pure Mathematics from Johns Hopkins University and is based in San Francisco, CA.
Skill Level
Intermediate
Advanced
Learn How To
Recognize which type of transformer-based model is best for a given task
Understand how transformers process text and make predictions
Fine-tune a transformer-based model (a brief illustrative sketch follows this list)
Create pipelines using fine-tuned models
Deploy fine-tuned models and use them in production
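The sketch below is not taken from the course materials; it illustrates what fine-tuning a small transformer for text classification and wrapping it in a pipeline might look like with the Hugging Face Trainer API. The dataset, checkpoint, and hyperparameters are placeholder choices.

    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                              Trainer, TrainingArguments, pipeline)

    checkpoint = "distilbert-base-uncased"
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

    # Tokenize a labeled dataset (IMDB reviews used purely as an example).
    dataset = load_dataset("imdb")
    tokenized = dataset.map(
        lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length"),
        batched=True,
    )

    # Fine-tune on a small subset to keep the example quick.
    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="./results", num_train_epochs=1,
                               per_device_train_batch_size=16),
        train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
    )
    trainer.train()

    # Wrap the fine-tuned model in a pipeline, ready to be served behind an API.
    classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
    print(classifier("A clear, practical walk-through of attention and tokenization."))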
Who Should Take This Course
Intermediate to advanced machine learning engineers with experience in ML, neural networks, and NLP
Those interested in state-of-the-art NLP architectures
Those interested in productionizing NLP models
Those comfortable using libraries like TensorFlow or PyTorch
Those comfortable with linear algebra and vector/matrix operations
Course Requirements
Python 3 proficiency, with some experience working in interactive Python environments, including notebooks (Jupyter/Google Colab/Kaggle Kernels)
Comfortable using the Pandas library and either TensorFlow or PyTorch
Understanding of ML/deep learning fundamentals including train/test splits, loss/cost functions, and gradient descent
About Pearson Video Training:
Pearson publishes expert-led video tutorials covering a wide selection of technology topics designed to teach you the skills you need to succeed. These professional and personal technology videos feature world-leading author instructors published by your trusted technology brands: Addison-Wesley, Cisco Press, Pearson IT Certification, Sams, and Que. Topics include IT Certification, Network Security, Cisco Technology, Programming, Web Development, Mobile Development, and more. Learn more about Pearson Video training at http://www.informit.com/video.