Chat models are language models that use a sequence of messages as inputs and return messages as outputs.

Install and use

Groq

Install:
npm i @langchain/groq
Add environment variables:
GROQ_API_KEY=your-api-key
Instantiate the model:
import { ChatGroq } from "@langchain/groq";

const model = new ChatGroq({
  model: "llama-3.3-70b-versatile",
  temperature: 0
});
await model.invoke("Hello, world!")
OpenAI

Install:
npm i @langchain/openai
Add environment variables:
OPENAI_API_KEY=your-api-key
Instantiate the model:
import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({ model: "gpt-4o-mini" });
await model.invoke("Hello, world!")
Anthropic

Install:
npm i @langchain/anthropic
Add environment variables:
ANTHROPIC_API_KEY=your-api-key
Instantiate the model:
import { ChatAnthropic } from "@langchain/anthropic";

const model = new ChatAnthropic({
  model: "claude-3-sonnet-20240620",
  temperature: 0
});
await model.invoke("Hello, world!")
Google Gemini

Install:
npm i @langchain/google-genai
Add environment variables:
GOOGLE_API_KEY=your-api-key
Instantiate the model:
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";

const model = new ChatGoogleGenerativeAI({
  modelName: "gemini-2.5-flash-lite-latest",
  temperature: 0
});
await model.invoke("Hello, world!")
FireworksAI

Install:
npm i @langchain/community
Add environment variables:
FIREWORKS_API_KEY=your-api-key
Instantiate the model:
import { ChatFireworks } from "@langchain/community/chat_models/fireworks";

const model = new ChatFireworks({
  model: "accounts/fireworks/models/llama-v3p1-70b-instruct",
  temperature: 0
});
await model.invoke("Hello, world!")
MistralAI

Install:
npm i @langchain/mistralai
Add environment variables:
MISTRAL_API_KEY=your-api-key
Instantiate the model:
import { ChatMistralAI } from "@langchain/mistralai";

const model = new ChatMistralAI({
  model: "mistral-large-latest",
  temperature: 0
});
await model.invoke("Hello, world!")
VertexAI

Install:
npm i @langchain/google-vertexai
Add environment variables:
GOOGLE_APPLICATION_CREDENTIALS=credentials.json
Instantiate the model:
import { ChatVertexAI } from "@langchain/google-vertexai";

const model = new ChatVertexAI({
  model: "gemini-1.5-flash",
  temperature: 0
});
await model.invoke("Hello, world!")
Model capabilities (streaming, JSON mode, tool calling, withStructuredOutput(), and multimodal input) vary by integration:

- BedrockChat: tool calling, withStructuredOutput(), and multimodal 🟡 (Bedrock Anthropic only)
- ChatBedrockConverse
- ChatAnthropic
- ChatCloudflareWorkersAI
- ChatCohere
- ChatFireworks
- ChatGoogleGenerativeAI
- ChatVertexAI
- ChatGroq
- ChatMistralAI
- ChatOllama
- ChatOpenAI
- ChatTogetherAI
- ChatXAI

All chat models

If you’d like to contribute an integration, see Contributing integrations.