Generative AI on Google Cloud with LangChain

You're reading from *Generative AI on Google Cloud with LangChain: Design scalable generative AI solutions with Python, LangChain, and Vertex AI on Google Cloud*

Product type: Paperback
Published: Dec 2024
Publisher: Packt
ISBN-13: 9781835889329
Length: 306 pages
Edition: 1st Edition
Author: Leonid Kuligin
Table of Contents (22 chapters)

Preface
Part 1: Intro to LangChain and Generative AI on Google Cloud
  Chapter 1: Using LangChain with Google Cloud
  Chapter 2: Foundational Models on Google Cloud
Part 2: Hallucinations and Grounding Responses
  Chapter 3: Grounding Responses
  Chapter 4: Vector Search on Google Cloud
  Chapter 5: Ingesting Documents
  Chapter 6: Multimodality
Part 3: Common Generative AI Architectures
  Chapter 7: Working with Long Context
  Chapter 8: Building Chatbots
  Chapter 9: Tools and Function Calling
  Chapter 10: Agents
  Chapter 11: Agentic Workflows
Part 4: Designing Generative AI Applications
  Chapter 12: Evaluating GenAI Applications
  Chapter 13: Generative AI System Design
Index
Other Books You May Enjoy
Appendix 1: Overview of Generative AI
Appendix 2: Google Cloud Foundations

Using chat models

Chat models are increasingly adopted for their ability to handle conversational tasks, and, as we'll see later, many features (such as multimodal inputs) are supported in LangChain only for chat models. There are two key differences between a chat model and an LLM interface:

  • A chat model takes either a prompt or a list of BaseMessage as input
  • A chat model provides BaseMessage as output

First, what is BaseMessage? It's a data structure that represents a message with three fields: content, type (who authored the message), and an optional additional_kwargs dict. Here's an example:

from langchain_core.messages import BaseMessage

message = BaseMessage(
    content="Hi, how are you?",
    type="human",
    additional_kwargs={"chapter": 2},
)
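Putting the two differences together, a chat model is essentially a function from a list of messages to a single message. Here is a minimal sketch of that contract using only the standard library (ToyMessage and toy_chat_model are toy stand-ins for illustration, not the real LangChain classes):

```python
from dataclasses import dataclass, field


# Toy stand-in for langchain_core.messages.BaseMessage (illustration only).
@dataclass
class ToyMessage:
    content: str
    type: str
    additional_kwargs: dict = field(default_factory=dict)


def toy_chat_model(messages: list[ToyMessage]) -> ToyMessage:
    """A fake chat model: accepts a list of messages, returns one message."""
    last = messages[-1].content
    return ToyMessage(content=f"Echo: {last}", type="ai")


reply = toy_chat_model([ToyMessage(content="Hi, how are you?", type="human")])
print(reply.type)     # ai
print(reply.content)  # Echo: Hi, how are you?
```

A real chat model replaces the echo logic with a call to an actual LLM, but the input/output shape is exactly this.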

In practice, we typically deal with HumanMessage or AIMessage (the names of these classes speak for themselves...
