Book description
In today's rapidly changing AI technology environment, software engineers often struggle to build real-world applications with large language models (LLMs). The benefits of incorporating open source LLMs into existing workflows are often offset by the need to create custom components. That's where Haystack comes in. This open source framework is a collection of the most useful tools, integrations, and infrastructure building blocks to help you design and build scalable, API-driven LLM backends.
With Haystack, it's easy to build extractive or generative question answering (QA), Google-like semantic search that queries large-scale textual data, or a reliable and secure ChatGPT-like experience on top of technical documentation. This guide serves as a collection of useful retrieval-augmented generation (RAG) mental models and offers ML engineers, AI engineers, and backend engineers a practical blueprint for the LLM software development lifecycle.
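To make that concrete, here is a minimal sketch of the kind of RAG pipeline the book builds on. It assumes Haystack 2.x (the haystack-ai package), an OPENAI_API_KEY in the environment, and an in-memory document store with BM25 retrieval; the documents, model name, and question are placeholders, and a production backend would swap in its own document store, retriever, and generator.

    from haystack import Document, Pipeline
    from haystack.document_stores.in_memory import InMemoryDocumentStore
    from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
    from haystack.components.builders import PromptBuilder
    from haystack.components.generators import OpenAIGenerator

    # Index a couple of toy documents into an in-memory store.
    document_store = InMemoryDocumentStore()
    document_store.write_documents([
        Document(content="Haystack pipelines connect retrievers, prompt builders, and generators."),
        Document(content="RAG grounds an LLM's answers in retrieved documents."),
    ])

    # Prompt template that stuffs the retrieved documents into the LLM prompt.
    template = """
    Answer the question using only the documents below.
    {% for doc in documents %}
    {{ doc.content }}
    {% endfor %}
    Question: {{ question }}
    Answer:
    """

    # Wire retrieval -> prompt building -> generation into one pipeline.
    rag = Pipeline()
    rag.add_component("retriever", InMemoryBM25Retriever(document_store=document_store))
    rag.add_component("prompt_builder", PromptBuilder(template=template))
    rag.add_component("llm", OpenAIGenerator(model="gpt-4o-mini"))
    rag.connect("retriever.documents", "prompt_builder.documents")
    rag.connect("prompt_builder.prompt", "llm.prompt")

    question = "What does RAG add to an LLM?"
    result = rag.run({
        "retriever": {"query": question},
        "prompt_builder": {"question": question},
    })
    print(result["llm"]["replies"][0])

The chapters that follow layer optimization, observability, and advanced RAG patterns on top of this basic pipeline shape.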
Table of contents
- Brief Table of Contents (Not Yet Final)
- 1. Using Generative AI with Haystack
- 2. Optimizing AI
- 3. Observable AI
- 4. Advanced RAG and Keeping Pace with AI Developments
- About the Author
Product information
- Title: Retrieval Augmented Generation in Production with Haystack
- Author(s):
- Release date: February 2025
- Publisher(s): O'Reilly Media, Inc.
- ISBN: 9781098165147