Generative AI on Google Cloud with LangChain

Understanding RAG applications – closed-book versus open-book question-answering

We are all used to modern search engines. They satisfy our information retrieval needs by guiding us through a common pattern [3]:

Figure 3.1: A typical flow that fulfills information retrieval needs

Without going into too much detail: at the architectural level, modern information retrieval systems rely on a corpus (typically stored as some kind of inverted index) and then apply retrieval and ranking to find the information that answers the query:

Figure 3.2: A typical flow that fulfills information retrieval needs
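To make the corpus-plus-ranking idea concrete, here is a minimal sketch of an inverted index with simple term-overlap ranking. The toy documents and the scoring scheme are illustrative assumptions, not how production search engines score results:

from collections import defaultdict

# Toy corpus (hypothetical documents used only for illustration).
corpus = {
    1: "LangChain helps you build applications with large language models",
    2: "Vertex AI offers managed foundation models on Google Cloud",
    3: "An inverted index maps each term to the documents that contain it",
}

# Build the inverted index: term -> set of document ids containing that term.
index = defaultdict(set)
for doc_id, text in corpus.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def retrieve(query: str) -> list[int]:
    """Return document ids ranked by how many query terms they contain."""
    scores = defaultdict(int)
    for term in query.lower().split():
        for doc_id in index.get(term, set()):
            scores[doc_id] += 1
    return [doc_id for doc_id, _ in sorted(scores.items(), key=lambda kv: -kv[1])]

print(retrieve("what is an inverted index"))  # -> [3]

Real systems replace the term-overlap score with stronger ranking functions (for example, BM25 or learned rankers), but the corpus/index/retrieve/rank split remains the same.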

As early as 2020, it was demonstrated that LLMs memorize facts they have seen during training and can be used as knowledge repositories [16]. In 2021, a model-based approach was proposed for search and question-answering tasks [2]. There are two options for how we can use LLMs. First, we can fine-tune an LLM and use it to complete...
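To illustrate the distinction in the section title, here is a minimal sketch contrasting a closed-book call (the model answers purely from knowledge memorized during training) with an open-book, RAG-style call (a retrieved passage is placed in the prompt). The model name, prompts, and the "retrieved" passage are assumptions for illustration; running it requires a Google Cloud project with Vertex AI enabled and the langchain-google-vertexai package installed:

from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model="gemini-1.5-pro")  # assumed model id

question = "When was LangChain first released?"

# Closed-book: the model answers only from what it memorized during training.
closed_book = llm.invoke(question)

# Open-book (RAG-style): a retrieved passage is put into the prompt, so the
# model answers grounded in the provided context.
context = "LangChain was released as an open-source project in October 2022."  # hypothetical retrieved passage
open_book = llm.invoke(
    "Answer the question using only the context below.\n"
    f"Context: {context}\n"
    f"Question: {question}"
)

print(closed_book.content)
print(open_book.content)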
