Natural Language Processing in Action video edition

Video description

"Learn both the theory and practical skills needed to go beyond merely understanding the inner workings of NLP, and start creating your own algorithms or models."
Dr. Arwen Griffioen, Zendesk

Natural Language Processing in Action is your guide to creating machines that understand human language using the power of Python with its ecosystem of packages dedicated to NLP and AI.

Recent advances in deep learning empower applications to understand text and speech with extreme accuracy. The result? Chatbots that can imitate real people, meaningful resume-to-job matches, superb predictive search, and automatically generated document summaries—all at a low cost. New techniques, along with accessible tools like Keras and TensorFlow, make professional-quality NLP easier than ever before.

Natural Language Processing in Action is your guide to building machines that can read and interpret human language. In it, you’ll use readily available Python packages to capture the meaning in text and react accordingly. The book expands traditional NLP approaches to include neural networks, modern deep learning algorithms, and generative techniques as you tackle real-world problems like extracting dates and names, composing text, and answering free-form questions.

  • Some sentences in this book were written by NLP! Can you guess which ones?
  • Working with Keras, TensorFlow, gensim, and scikit-learn
  • Rule-based and data-based NLP
  • Scalable pipelines
This course requires a basic understanding of deep learning and intermediate Python skills.

Hobson Lane, Cole Howard, and Hannes Max Hapke are experienced NLP engineers who use these techniques in production.

"Provides a great overview of current NLP tools in Python. I'll definitely be keeping this book on hand for my own NLP work. Highly recommended!"
Tony Mullen, Northeastern University–Seattle

"An intuitive guide to get you started with NLP. The book is full of programming examples that help you learn in a very pragmatic way."
Tommaso Teofili, Adobe Systems


Table of contents

  1. Part 1. Wordy machines
  2. Chapter 1. Packets of thought (NLP overview)
    1. Natural language vs. programming language
    2. The magic
    3. The math
    4. Practical applications
    5. Language through a computer’s “eyes”
    6. A simple chatbot
    7. Another way
    8. A brief overflight of hyperspace
    9. Word order and grammar
    10. A chatbot natural language pipeline
    11. Processing in depth
    12. Natural language IQ
  3. Chapter 2. Build your vocabulary (word tokenization)
    1. Challenges (a preview of stemming)
    2. Building your vocabulary with a tokenizer Part 1
    3. Building your vocabulary with a tokenizer Part 2
    4. Dot product
    5. A token improvement
    6. Extending your vocabulary with n-grams Part 1
    7. Extending your vocabulary with n-grams Part 2
    8. Normalizing your vocabulary Part 1
    9. Normalizing your vocabulary Part 2
    10. Normalizing your vocabulary Part 3
    11. Sentiment
    12. VADER—A rule-based sentiment analyzer
  4. Chapter 3. Math with words (TF-IDF vectors)
    1. Math with words (TF-IDF vectors)
    2. Bag of words
    3. Vectorizing
    4. Vector spaces
    5. Zipf’s Law
    6. Topic modeling
    7. Relevance ranking
    8. Okapi BM25
  5. Chapter 4. Finding meaning in word counts (semantic analysis)
    1. From word counts to topic scores
    2. TF-IDF vectors and lemmatization
    3. Thought experiment
    4. An algorithm for scoring topics
    5. An LDA classifier
    6. Latent semantic analysis
    7. Your thought experiment made real
    8. Singular value decomposition
    9. U—left singular vectors
    10. SVD matrix orientation
    11. Principal component analysis
    12. Stop horsing around and get back to NLP
    13. Using truncated SVD for SMS message semantic analysis
    14. Latent Dirichlet allocation (LDiA)
    15. LDiA topic model for SMS messages
    16. Distance and similarity
    17. Steering with feedback
    18. Topic vector power
    19. Semantic search
  6. Part 2. Deeper learning (neural networks)
  7. Chapter 5. Baby steps with neural networks (perceptrons and backpropagation)
    1. Neural networks, the ingredient list
    2. Detour through bias Part 1
    3. Detour through bias Part 2
    4. Detour through bias Part 3
    5. Let’s go skiing—the error surface
    6. Keras: Neural networks in Python
  8. Chapter 6. Reasoning with word vectors (Word2vec)
    1. Semantic queries and analogies
    2. Word vectors
    3. Vector-oriented reasoning
    4. How to compute Word2vec representations Part 1
    5. How to compute Word2vec representations Part 2
    6. How to use the gensim.word2vec module
    7. How to generate your own word vector representations
    8. fastText
    9. Visualizing word relationships
    10. Unnatural words
  9. Chapter 7. Getting words in order with convolutional neural networks (CNNs)
    1. Learning meaning
    2. Toolkit
    3. Convolutional neural nets
    4. Padding
    5. Narrow windows indeed
    6. Implementation in Keras: prepping the data
    7. Convolutional neural network architecture
    8. The cherry on the sundae
    9. Using the model in a pipeline
  10. Chapter 8. Loopy (recurrent) neural networks (RNNs)
    1. Loopy (recurrent) neural networks (RNNs)
    2. Remembering with recurrent networks
    3. Backpropagation through time
    4. Recap
    5. Putting things together
    6. Hyperparameters
    7. Predicting
  11. Chapter 9. Improving retention with long short-term memory networks
    1. LSTM Part 1
    2. LSTM Part 2
    3. Backpropagation through time
    4. Back to the dirty data
    5. My turn to chat
    6. My turn to speak more clearly
    7. Learned how to say, but not yet what
  12. Chapter 10. Sequence-to-sequence models and attention
    1. Encoder-decoder architecture
    2. Decoding thought
    3. Look familiar?
    4. Assembling a sequence-to-sequence pipeline
    5. Sequence encoder
    6. Training the sequence-to-sequence network
    7. Building a chatbot using sequence-to-sequence networks
    8. Enhancements
    9. In the real world
  13. Part 3. Getting real (real-world NLP challenges)
  14. Chapter 11. Information extraction (named entity extraction and question answering)
    1. Named entities and relations
    2. A knowledge base
    3. Regular patterns
    4. Information worth extracting
    5. Extracting dates
    6. Extracting relationships (relations)
    7. Relation normalization and extraction
    8. Why won’t split('.!?') work?
  15. Chapter 12. Getting chatty (dialog engines)
    1. Language skill
    2. Modern approaches Part 1
    3. Modern approaches Part 2
    4. Pattern-matching approach
    5. A pattern-matching chatbot with AIML Part 1
    6. A pattern-matching chatbot with AIML Part 2
    7. Grounding
    8. Retrieval (search)
    9. Example retrieval-based chatbot
    10. Generative models
    11. Four-wheel drive
    12. Design process
    13. Trickery
  16. Chapter 13. Scaling up (optimization, parallelization, and batch processing)
    1. Too much of a good thing (data)
    2. Optimizing NLP algorithms
    3. Advanced indexing
    4. Advanced indexing with Annoy
    5. Why use approximate indexes at all?
    6. Constant RAM algorithms
    7. Parallelizing your NLP computations
    8. Reducing the memory footprint during model training
  17. App A. Your NLP tools
    1. Anaconda3
    2. Mac
  18. App B. Playful Python and regular expressions
    1. Working with strings
    2. Regular expressions
  19. App C. Vectors and matrices (linear algebra fundamentals)
    1. Vectors
    2. Distances Part 2
  20. App D. Machine learning tools and techniques
    1. Data selection and avoiding bias
    2. Knowing is half the battle
    3. Holding your model back
    4. Imbalanced training sets
    5. Performance metrics
  21. App F. Locality sensitive hashing
    1. High-dimensional vectors are different
    2. High-dimensional thinking
    3. High-dimensional indexing

Product information

  • Title: Natural Language Processing in Action video edition
  • Author(s): Hobson Lane, Cole Howard, Hannes Hapke
  • Release date: April 2019
  • Publisher(s): Manning Publications