6

Text Summarization with Seq2seq Attention and Transformer Networks

Summarizing a piece of text challenges a deep learning model's understanding of language. Summarization can be considered a uniquely human ability: the gist of a piece of text must be understood and then expressed in new words. In the previous chapters, we built components that can help with summarization. First, we used BERT to encode text and perform sentiment analysis. Then, we used a decoder architecture with GPT-2 to generate text. Putting the Encoder and Decoder together yields a summarization model. In this chapter, we will implement a seq2seq Encoder-Decoder with Bahdanau Attention. Specifically, we will cover the following topics:

  • Overview of extractive and abstractive text summarization
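To preview the mechanism at the heart of this chapter, the following is a minimal sketch of additive (Bahdanau) attention written as a custom Keras layer in TensorFlow 2, in the style of the standard TensorFlow seq2seq tutorials. The class and weight names (BahdanauAttention, W1, W2, V) are illustrative assumptions rather than the chapter's exact code. The layer scores every encoder time step against the current decoder state and returns a context vector as the weighted sum of the encoder outputs:

    import tensorflow as tf

    class BahdanauAttention(tf.keras.layers.Layer):
        # Additive attention: score = V^T tanh(W1 * h_enc + W2 * h_dec)
        # (names W1, W2, V are illustrative, not the chapter's exact code)
        def __init__(self, units):
            super().__init__()
            self.W1 = tf.keras.layers.Dense(units)  # projects encoder outputs
            self.W2 = tf.keras.layers.Dense(units)  # projects decoder state
            self.V = tf.keras.layers.Dense(1)       # scalar score per time step

        def call(self, query, values):
            # query: decoder hidden state, shape (batch, hidden)
            # values: encoder outputs, shape (batch, src_len, hidden)
            query = tf.expand_dims(query, 1)  # (batch, 1, hidden) for broadcasting
            score = self.V(tf.nn.tanh(self.W1(values) + self.W2(query)))
            weights = tf.nn.softmax(score, axis=1)             # over source positions
            context = tf.reduce_sum(weights * values, axis=1)  # (batch, hidden)
            return context, weights

In a seq2seq summarizer of this kind, the decoder would call such a layer at every output step and concatenate the returned context vector with its input embedding, which is what lets the model attend to different parts of the source text as it writes each word of the summary.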
