Generative AI on Google Cloud with LangChain

You're reading from Generative AI on Google Cloud with LangChain: Design scalable generative AI solutions with Python, LangChain, and Vertex AI on Google Cloud

Product type: Paperback
Published: Dec 2024
Publisher: Packt
ISBN-13: 9781835889329
Length: 306 pages
Edition: 1st Edition

Author: Leonid Kuligin
Table of Contents (22 chapters)

Preface
Part 1: Intro to LangChain and Generative AI on Google Cloud
Chapter 1: Using LangChain with Google Cloud
Chapter 2: Foundational Models on Google Cloud
Part 2: Hallucinations and Grounding Responses
Chapter 3: Grounding Responses
Chapter 4: Vector Search on Google Cloud
Chapter 5: Ingesting Documents
Chapter 6: Multimodality
Part 3: Common Generative AI Architectures
Chapter 7: Working with Long Context
Chapter 8: Building Chatbots
Chapter 9: Tools and Function Calling
Chapter 10: Agents
Chapter 11: Agentic Workflows
Part 4: Designing Generative AI Applications
Chapter 12: Evaluating GenAI Applications
Chapter 13: Generative AI System Design
Index
Other Books You May Enjoy
Appendix 1: Overview of Generative AI
Appendix 2: Google Cloud Foundations

Answering questions on long documents

Summarization is just the first step toward our users' real goal: answering questions on long documents. Few users have time to read such documents end to end, and while summaries are helpful, what users really want is a clear, concise answer to their question. In this section, we will discuss how you can use the long context window of modern LLMs to efficiently answer questions on long documents.

With the introduction of Gemini 1.5 Pro, Google significantly expanded the context window available in a single request to the LLM. Imagine performing Q&A on a 132-page document with a single call! To make it easy to work with long context, LangChain introduced new types of messages:

# Don't be confused by image_url when loading PDFs -- this message
# type started as an image loader and was later reused for documents
pdf_message = {
    "type": "image_url",
    "image_url": {"url": "..."},
}