Book description
Natural language processing (NLP) went through a profound transformation in the mid-1980s, when it shifted to make heavy use of corpora and data-driven techniques to analyze language.
Since then, the use of statistical techniques in NLP has evolved in several ways. One such evolution took place in the late 1990s and early 2000s, when full-fledged Bayesian machinery was introduced to NLP. This Bayesian approach to NLP has come to address various shortcomings of the frequentist approach and to enrich it, especially in the unsupervised setting, where statistical learning is done without target prediction examples.
In this book, we cover the methods and algorithms that are needed to fluently read Bayesian learning papers in NLP and to do research in the area. These methods and algorithms are partially borrowed from both machine learning and statistics and are partially developed "in-house" in NLP. We cover inference techniques such as Markov chain Monte Carlo sampling and variational inference, Bayesian estimation, and nonparametric modeling. In response to rapid changes in the field, this second edition of the book includes a new chapter on representation learning and neural networks in the Bayesian context. We also cover fundamental concepts in Bayesian statistics such as prior distributions, conjugacy, and generative modeling. Finally, we review some of the fundamental modeling techniques in NLP, such as grammar modeling, neural networks and representation learning, and their use with Bayesian analysis.
Table of contents
 List of Figures
 Preface (First Edition)
 Acknowledgments (First Edition)
 Preface (Second Edition)
 Preliminaries
 Introduction
 Priors
 Bayesian Estimation

Sampling Methods
 MCMC Algorithms: Overview
 NLP Model Structure for MCMC Inference
 Gibbs Sampling
 The Metropolis–Hastings Algorithm
 Slice Sampling
 Simulated Annealing
 Convergence of MCMC Algorithms
 Markov Chain: Basic Theory
 Sampling Algorithms Not in the MCMC Realm
 Monte Carlo Integration
 Discussion
 Conclusion and Summary
 Exercises
 Variational Inference
 Nonparametric Priors
 Bayesian Grammar Models
 Representation Learning and Neural Networks
 Closing Remarks
 Basic Concepts
 Distribution Catalog
 Bibliography
 Author's Biography
Product information
 Title: Bayesian Analysis in Natural Language Processing, 2nd Edition
 Author(s):
 Release date: April 2019
 Publisher(s): Morgan & Claypool Publishers
 ISBN: 9781681735276