ChatGoogleGenerativeAI
This doc will help you get started with Google AI chat models. For detailed documentation of all ChatGoogleGenerativeAI features and configurations, head to the API reference.
Google AI offers a number of different chat models. For information on the latest models, their features, context windows, etc., head to the Google AI docs.
Google's Gemini models are accessible through Google AI and through Google Cloud Vertex AI. Using Google AI just requires a Google account and an API key. Using Google Cloud Vertex AI requires a Google Cloud account (with terms agreement and billing) but offers enterprise features like customer-managed encryption keys, virtual private cloud, and more.
To learn more about the key features of the two APIs see the Google docs.
Overview
Integration details
Class | Package | Local | Serializable | JS support |
---|---|---|---|---|
ChatGoogleGenerativeAI | langchain-google-genai | ❌ | beta | ✅ |
Model features
Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
---|---|---|---|---|---|---|---|---|---|
✅ | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ |
Setup
To access Google AI models you'll need to create a Google account, get a Google AI API key, and install the langchain-google-genai integration package.
Credentials
Head to https://ai.google.dev/gemini-api/docs/api-key to generate a Google AI API key. Once you've done this, set the GOOGLE_API_KEY environment variable:
import getpass
import os
if "GOOGLE_API_KEY" not in os.environ:
os.environ["GOOGLE_API_KEY"] = getpass.getpass("Enter your Google AI API key: ")
If you want to get automated tracing of your model calls you can also set your LangSmith API key by uncommenting below:
# os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")
# os.environ["LANGSMITH_TRACING"] = "true"
Installation
The LangChain Google AI integration lives in the langchain-google-genai
package:
%pip install -qU langchain-google-genai
Instantiation
Now we can instantiate our model object and generate chat completions:
from langchain_google_genai import ChatGoogleGenerativeAI
llm = ChatGoogleGenerativeAI(
    model="gemini-1.5-pro",
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # other params...
)
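Invocation
With the model instantiated, you can generate a chat completion by calling invoke on it. Below is a minimal sketch that reuses the llm object created above and passes a list of (role, content) message tuples, one of the message formats LangChain accepts; the example prompt and the exact response content are illustrative only:
# Messages as (role, content) tuples; message objects like SystemMessage/HumanMessage work too.
messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)  # returns an AIMessage
print(ai_msg.content)
The returned AIMessage carries the generated text in its content attribute, along with metadata such as token usage when the provider reports it.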