Controlled generation
To ensure consistent communication between different nodes and to standardize the structure of an LLM's responses, Gemini models let you constrain the output to valid JSON that conforms to a schema you specify.
You can achieve this by setting the response_schema parameter to an OpenAPI (https://spec.openapis.org/oas/latest.html) schema dictionary and the response_mime_type parameter to "application/json". This can be done either in the constructor of the ChatVertexAI class or within the invoke method.
In the following example, we'll instruct the model to structure its response as an array of objects, each containing the city_name and population properties:
from langchain_google_vertexai import ChatVertexAI

model = ChatVertexAI(
    model_name="gemini-1.5-flash-001",
    # Ask for JSON output that conforms to the schema below.
    response_mime_type="application/json",
    response_schema={
        "type": "array",
        "items": {
            "type": "object",
            "properties": {
                "city_name": {"type": "string"},
                "population": {"type": "integer"},
            },
            "required": ["city_name", "population"],
        },
    },
)
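Invoking the model requires Vertex AI credentials, so as a sketch of what the downstream code looks like, here is how a response shaped by this schema could be parsed. The payload below is a hypothetical example of the model's output, not a real API response:

```python
import json

# Hypothetical content of a model response that follows the schema:
# an array of objects with city_name and population properties.
raw = (
    '[{"city_name": "Tokyo", "population": 37400068},'
    ' {"city_name": "Delhi", "population": 29399141}]'
)

# Because the output is guaranteed to be valid JSON, it can be
# loaded directly into Python objects without extra cleanup.
cities = json.loads(raw)
for city in cities:
    print(f'{city["city_name"]}: {city["population"]}')
```

In practice, raw would come from the content attribute of the message returned by model.invoke, and each node in a pipeline can rely on the same city_name/population structure being present.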