Overview
Integration details
Class | Package | Local | Serializable | JS support | Downloads | Version |
---|---|---|---|---|---|---|
ChatFireworks | langchain-fireworks | ❌ | beta | ✅ | | |
Model features
Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
---|---|---|---|---|---|---|---|---|---|
✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ✅ |
Setup
To access Fireworks models you'll need to create a Fireworks account, get an API key, and install the `langchain-fireworks` integration package.
Credentials
Head to [fireworks.ai/login](https://fireworks.ai/login) to sign up for Fireworks and generate an API key. Once you've done this, set the `FIREWORKS_API_KEY` environment variable:
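For example, in a notebook or script you can prompt for the key if it isn't already set (a minimal sketch; the prompt text is just illustrative):

```python
import getpass
import os

# Prompt for the key only if it isn't already present in the environment.
if "FIREWORKS_API_KEY" not in os.environ:
    os.environ["FIREWORKS_API_KEY"] = getpass.getpass("Enter your Fireworks API key: ")
```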
Installation
The LangChain Fireworks integration lives in the `langchain-fireworks` package:
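Assuming a notebook environment, the package can be installed with pip (drop the `%` prefix when running in a plain shell):

```python
%pip install -qU langchain-fireworks
```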
Instantiation
Now we can instantiate our model object and generate chat completions:
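A minimal instantiation sketch; the model id and parameters below are illustrative, so substitute any chat model served on fireworks.ai:

```python
from langchain_fireworks import ChatFireworks

llm = ChatFireworks(
    # Illustrative model id — see fireworks.ai for the current model list.
    model="accounts/fireworks/models/llama-v3p1-70b-instruct",
    temperature=0,
    max_tokens=None,
)
```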
Invocation
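A minimal invocation sketch, reusing the `llm` object instantiated above (the messages are illustrative):

```python
messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)
```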
API reference
For detailed documentation of all ChatFireworks features and configurations, head to the API reference: https://python.langchain.com/api_reference/fireworks/chat_models/langchain_fireworks.chat_models.ChatFireworks.html
To use the `langchain-fireworks` package, follow these installation steps:
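As in the Installation section above, a single pip command suffices (drop the `%` prefix in a plain shell):

```python
%pip install -qU langchain-fireworks
```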
Basic usage
Setting up
- Sign in to Fireworks AI to obtain an API key to access the models, and make sure it is set as the `FIREWORKS_API_KEY` environment variable. Once you've signed in and obtained an API key, follow these steps to set the `FIREWORKS_API_KEY` environment variable:
  - Linux/macOS: Open your terminal and execute the following command:
    `export FIREWORKS_API_KEY='your_api_key'`
    Note: To make this environment variable persistent across terminal sessions, add the above line to your `~/.bashrc`, `~/.bash_profile`, or `~/.zshrc` file.
  - Windows: For Command Prompt, use:
    `set FIREWORKS_API_KEY=your_api_key`
- Set up your model using a model id. If the model is not set, the default model is `fireworks-llama-v2-7b-chat`. See the full, most up-to-date model list on fireworks.ai; a short setup sketch follows this list.
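Putting these steps together, here is a minimal sketch. It assumes the API key is already exported as described above and uses `ChatFireworks` (the class this page documents); the model id and prompt are illustrative:

```python
import os

from langchain_fireworks import ChatFireworks

# Fail fast if the key wasn't exported as described above.
assert "FIREWORKS_API_KEY" in os.environ, "Set FIREWORKS_API_KEY first"

# Illustrative model id — any chat model listed on fireworks.ai works here.
chat = ChatFireworks(model="accounts/fireworks/models/llama-v3p1-8b-instruct")
print(chat.invoke("Say hello in French.").content)
```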