Using chat models
Chat models are increasingly adopted for their ability to handle conversational tasks, and, as we'll see later, many features (such as multimodal inputs) are supported in LangChain only for chat models. There are two key differences between the chat model and LLM interfaces:
- A chat model takes either a prompt or a list of `BaseMessage` objects as input.
- A chat model returns a `BaseMessage` as output.
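To make the contract above concrete, here is a toy sketch in plain Python (this is not LangChain code; the `EchoChatModel` class and its behavior are illustrative only):

```python
from dataclasses import dataclass, field

@dataclass
class ToyMessage:
    """Minimal stand-in for a message: content, author type, extras."""
    content: str
    type: str
    additional_kwargs: dict = field(default_factory=dict)

class EchoChatModel:
    """Toy chat model: accepts a list of messages, returns one AI message."""
    def invoke(self, messages: list[ToyMessage]) -> ToyMessage:
        last = messages[-1]
        return ToyMessage(content=f"You said: {last.content}", type="ai")

model = EchoChatModel()
reply = model.invoke([ToyMessage(content="Hi!", type="human")])
print(reply.type, reply.content)  # ai You said: Hi!
```

The point is the shape of the interface: messages go in as a list, and a single message comes back out.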
First, what is `BaseMessage`? It's a data structure representing a message, with three fields: `content`, `type` (who authored the message), and `additional_kwargs`, an optional dict. Here's an example:
```python
from langchain_core.messages import BaseMessage

message = BaseMessage(
    content="Hi, how are you?",
    type="human",
    additional_kwargs={"chapter": 2},
)
```
In practice, we typically deal with `HumanMessage` or `AIMessage` (the names of these classes speak for themselves...