In Chapter 6, you learned how to fine-tune Llama 2 using LoRA, a technique for making your model knowledgeable in a new domain, one it hasn’t specifically been trained on.
In this chapter, you’re going to learn how to monitor, test, debug, and trace LLM-powered applications using LangSmith. This is an end-to-end observability platform from the creators of LangChain, designed to help you build reliable, explainable, and debuggable applications.
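To give you a feel for how lightweight the integration is, here is a minimal sketch of enabling LangSmith tracing for a LangChain application by setting the standard environment variables before any chains run. The API key and project name shown are placeholders; you would substitute your own values from your LangSmith account.

```python
import os

# Turn on LangSmith tracing for all LangChain code executed afterwards.
os.environ["LANGCHAIN_TRACING_V2"] = "true"

# Placeholder credentials: replace with your own LangSmith API key.
os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"

# Optional: group traced runs under a named project (illustrative name).
os.environ["LANGCHAIN_PROJECT"] = "my-first-traced-app"

# From this point on, chain, LLM, and agent runs are sent to LangSmith
# automatically; the application code itself does not need to change.
```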
You’ll learn how to make your debugging and testing during the development ...