Summary
In this chapter, we introduced the foundation models offered by Google Cloud and discussed how to choose the model that best serves your use case. You should now be able to call Google Cloud foundation models, such as Gemini Pro, using the LangChain SDK. You should also understand how to build simple LLM systems that follow instructions to perform tasks, guided by a few examples and a limited context passed via the input prompt. As you now know, Google Cloud allows you to connect both open source and partner models using the same logic and approach as its native models. We also looked at the distinction between the LLM and chat model interfaces in LangChain, and we discussed important concepts such as SystemMessage, PromptTemplate, OutputParser, and callbacks.
We built our first simple chains, and we used key classes – VertexAI, ChatVertexAI, and VertexAIModelGarden – to work with Google Cloud models in LangChain.
In the next chapter, we will look at how you can significantly...