Chapter 5. Ranking search results with word embeddings

This chapter covers

  • Statistical and probabilistic retrieval models
  • Working with the ranking algorithm in Lucene
  • Neural information retrieval models
  • Using averaged word embeddings to rank search results

Since chapter 2, we’ve been building components based on neural networks that can improve a search engine. These components aim to help the search engine better capture user intent by expanding synonyms, generating alternative representations of a query, and giving smarter suggestions while the user is typing a query. As these approaches show, a query can be expanded, adapted, and transformed before it is matched against the terms stored in the inverted indexes. Then, as mentioned ...
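To make the chapter’s central idea concrete before we look at the retrieval models themselves, here is a minimal sketch of ranking with averaged word embeddings. It assumes pretrained word vectors are already available in an in-memory `Map<String, float[]>` (a stand-in for whatever embedding model you load or train), averages the vectors of the query terms and of each document’s terms, and sorts documents by cosine similarity. The `AveragedEmbeddingRanker` class name and the whitespace tokenization are illustrative only; the rest of the chapter develops this idea in the context of Lucene’s ranking.

```java
import java.util.*;

public class AveragedEmbeddingRanker {

  // Hypothetical lookup of pretrained word vectors (for example, word2vec), keyed by term
  private final Map<String, float[]> wordVectors;
  private final int dimensions;

  public AveragedEmbeddingRanker(Map<String, float[]> wordVectors, int dimensions) {
    this.wordVectors = wordVectors;
    this.dimensions = dimensions;
  }

  // Average the vectors of all known terms in a piece of text
  float[] averageVector(String text) {
    float[] avg = new float[dimensions];
    int count = 0;
    for (String term : text.toLowerCase().split("\\s+")) {
      float[] vector = wordVectors.get(term);
      if (vector != null) {
        for (int i = 0; i < dimensions; i++) {
          avg[i] += vector[i];
        }
        count++;
      }
    }
    if (count > 0) {
      for (int i = 0; i < dimensions; i++) {
        avg[i] /= count;
      }
    }
    return avg;
  }

  // Cosine similarity between two dense vectors
  static double cosineSimilarity(float[] a, float[] b) {
    double dot = 0, normA = 0, normB = 0;
    for (int i = 0; i < a.length; i++) {
      dot += a[i] * b[i];
      normA += a[i] * a[i];
      normB += b[i] * b[i];
    }
    if (normA == 0 || normB == 0) {
      return 0;
    }
    return dot / (Math.sqrt(normA) * Math.sqrt(normB));
  }

  // Rank documents by how close their averaged vector is to the query's averaged vector
  public List<String> rank(String query, List<String> documents) {
    float[] queryVector = averageVector(query);
    List<String> ranked = new ArrayList<>(documents);
    ranked.sort(Comparator.comparingDouble(
        (String doc) -> cosineSimilarity(queryVector, averageVector(doc))).reversed());
    return ranked;
  }
}
```

Notice that nothing in this sketch depends on term statistics such as term frequency or document frequency: the score comes entirely from the learned word representations, which is exactly what distinguishes this approach from the statistical and probabilistic models we review first.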
