Providers
Using any LLM provider in opencode.
opencode uses the AI SDK and Models.dev to support 75+ LLM providers, and it also supports running local models.
To add a provider you need to:

- Add the API keys for the provider using `opencode auth login`.
- Configure the provider in your opencode config.
Credentials
When you add a provider's API keys with `opencode auth login`, they are stored in `~/.local/share/opencode/auth.json`.
Config
You can customize the providers through the `provider` section in your opencode config.
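As a sketch of the shape, the `provider` section is a map of provider IDs to their settings; the empty `options` object here is just a placeholder (Anthropic is used as an example ID):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "anthropic": {
      "options": {}
    }
  }
}
```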
Base URL
You can customize the base URL for any provider by setting the `baseURL` option. This is useful when using proxy services or custom endpoints.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "anthropic": {
      "options": {
        "baseURL": "https://api.anthropic.com/v1"
      }
    }
  }
}
```
Custom provider
To add any OpenAI-compatible provider that's not listed in `opencode auth login`:
1. Run `opencode auth login` and scroll down to Other.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◆  Select provider
   │  ...
   │  ● Other
   └
   ```
2. Enter a unique ID for the provider.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Enter provider id
   │  myprovider
   └
   ```
3. Enter your API key for the provider.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ▲  This only stores a credential for myprovider - you will need configure it in opencode.json, check the docs for examples.
   │
   ◇  Enter your API key
   │  sk-...
   └
   ```
4. Create or update your `opencode.json` file in your project directory:

   ```json
   {
     "$schema": "https://opencode.ai/config.json",
     "provider": {
       "myprovider": {
         "npm": "@ai-sdk/openai-compatible",
         "name": "My AI Provider",
         "options": {
           "baseURL": "https://api.myprovider.com/v1"
         },
         "models": {
           "my-model-name": {
             "name": "My Model Display Name"
           }
         }
       }
     }
   }
   ```

   Here are the configuration options:

   - `npm`: AI SDK package to use, `@ai-sdk/openai-compatible` for OpenAI-compatible providers.
   - `name`: Display name in the UI.
   - `models`: Available models.
   - `options.baseURL`: API endpoint URL.
   - `options.apiKey`: Optionally set the API key, if not using auth.
   - `options.headers`: Optionally set custom headers.

   More on the advanced options in the example below.
5. Run the `/models` command and your custom provider and models will appear in the selection list.
Example
Here's an example setting the `apiKey` and `headers` options.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "myprovider": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My AI Provider",
      "options": {
        "baseURL": "https://api.myprovider.com/v1",
        "apiKey": "{env:ANTHROPIC_API_KEY}",
        "headers": {
          "Authorization": "Bearer custom-token"
        }
      },
      "models": {
        "my-model-name": {
          "name": "My Model Display Name"
        }
      }
    }
  }
}
```
We are setting the `apiKey` using the env variable syntax. Learn more.
Directory
Let’s look at some of the providers in detail. If you’d like to add a provider to the list, feel free to open a PR.
Amazon Bedrock
To use Amazon Bedrock with opencode:
1. Head over to the Model catalog in the Amazon Bedrock console and request access to the models you want.

2. You'll need to set one of the following environment variables:

   - `AWS_ACCESS_KEY_ID`: You can get this by creating an IAM user and generating an access key for it.
   - `AWS_PROFILE`: First log in through AWS IAM Identity Center (or AWS SSO) using `aws sso login`. Then get the name of the profile you want to use.
   - `AWS_BEARER_TOKEN_BEDROCK`: You can generate a long-term API key from the Amazon Bedrock console.

   Once you have one of the above, set it while running opencode.

   ```
   AWS_ACCESS_KEY_ID=XXX opencode
   ```

   Or add it to a `.env` file in the project root.

   ```
   AWS_ACCESS_KEY_ID=XXX
   ```

   Or add it to your bash profile.

   ```
   export AWS_ACCESS_KEY_ID=XXX
   ```

3. Run the `/models` command to select the model you want.
Anthropic
We recommend signing up for Claude Pro or Max; it's the most cost-effective way to use opencode.

Once you've signed up, run `opencode auth login` and select Anthropic.
```
$ opencode auth login

┌  Add credential
│
◆  Select provider
│  ● Anthropic (recommended)
│  ○ OpenAI
│  ○ Google
│  ...
└
```
Here you can select the Claude Pro/Max option and it’ll open your browser and ask you to authenticate.
```
$ opencode auth login

┌  Add credential
│
◇  Select provider
│  Anthropic
│
◆  Login method
│  ● Claude Pro/Max
│  ○ Create API Key
│  ○ Manually enter API Key
└
```
Now all the Anthropic models should be available when you use the `/models` command.
Using API keys
You can also select Create API Key if you don't have a Pro/Max subscription. It'll also open your browser, ask you to log in to Anthropic, and give you a code you can paste in your terminal.
Or if you already have an API key, you can select Manually enter API Key and paste it in your terminal.
Azure OpenAI
1. Head over to the Azure portal and create an Azure OpenAI resource. You'll need:

   - Resource name: This becomes part of your API endpoint (`https://RESOURCE_NAME.openai.azure.com/`)
   - API key: Either `KEY 1` or `KEY 2` from your resource
2. Go to Azure AI Foundry and deploy a model.

3. Run `opencode auth login` and select Azure.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◆  Select provider
   │  ● Azure
   │  ...
   └
   ```

4. Enter your API key.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Select provider
   │  Azure
   │
   ◇  Enter your API key
   │  _
   └
   ```

5. Set your resource name as an environment variable:

   ```
   AZURE_RESOURCE_NAME=XXX opencode
   ```

   Or add it to a `.env` file in the project root:

   ```
   AZURE_RESOURCE_NAME=XXX
   ```

   Or add it to your bash profile:

   ```
   export AZURE_RESOURCE_NAME=XXX
   ```

6. Run the `/models` command to select your deployed model.
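If your deployment doesn't show up, you can list it explicitly under the provider's `models` map in your opencode config, using the same pattern shown for custom providers. Note this is a hedged sketch: the `azure` provider ID and the `gpt-4o` model ID are assumptions here; the model ID must match the deployment name you chose in Azure AI Foundry.

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "azure": {
      "models": {
        "gpt-4o": {
          "name": "GPT-4o (my Azure deployment)"
        }
      }
    }
  }
}
```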
Cerebras
1. Head over to the Cerebras console, create an account, and generate an API key.

2. Run `opencode auth login` and select Cerebras.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◆  Select provider
   │  ● Cerebras
   │  ...
   └
   ```

3. Enter your Cerebras API key.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Select provider
   │  Cerebras
   │
   ◇  Enter your API key
   │  _
   └
   ```

4. Run the `/models` command to select a model like Qwen 3 Coder 480B.
DeepSeek
1. Head over to the DeepSeek console, create an account, and click Create new API key.

2. Run `opencode auth login` and select DeepSeek.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◆  Select provider
   │  ● DeepSeek
   │  ...
   └
   ```

3. Enter your DeepSeek API key.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Select provider
   │  DeepSeek
   │
   ◇  Enter your API key
   │  _
   └
   ```

4. Run the `/models` command to select a DeepSeek model like DeepSeek Reasoner.
Fireworks AI
1. Head over to the Fireworks AI console, create an account, and click Create API Key.

2. Run `opencode auth login` and select Fireworks AI.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◆  Select provider
   │  ● Fireworks AI
   │  ...
   └
   ```

3. Enter your Fireworks AI API key.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Select provider
   │  Fireworks AI
   │
   ◇  Enter your API key
   │  _
   └
   ```

4. Run the `/models` command to select a model like Kimi K2 Instruct.
GitHub Copilot
To use your GitHub Copilot subscription with opencode:
1. Run `opencode auth login` and select GitHub Copilot.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Select provider
   │  GitHub Copilot
   │
   ◇  ─────────────────────────────────────────────╮
   │                                               │
   │  Please visit: https://github.com/login/device│
   │  Enter code: 8F43-6FCF                        │
   │                                               │
   ├───────────────────────────────────────────────╯
   │
   ◓  Waiting for authorization...
   ```

2. Navigate to github.com/login/device and enter the code.

3. Now run the `/models` command to select the model you want.
Groq
1. Head over to the Groq console, click Create API Key, and copy the key.

2. Run `opencode auth login` and select Groq.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◆  Select provider
   │  ● Groq
   │  ...
   └
   ```

3. Enter the API key for the provider.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Select provider
   │  Groq
   │
   ◇  Enter your API key
   │  _
   └
   ```

4. Run the `/models` command to select the one you want.
LM Studio
You can configure opencode to use local models through LM Studio.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (local)",
      "options": {
        "baseURL": "http://127.0.0.1:1234/v1"
      },
      "models": {
        "google/gemma-3n-e4b": {
          "name": "Gemma 3n-e4b (local)"
        }
      }
    }
  }
}
```
In this example:
- `lmstudio` is the custom provider ID. This can be any string you want.
- `npm` specifies the package to use for this provider. Here, `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API.
- `name` is the display name for the provider in the UI.
- `options.baseURL` is the endpoint for the local server.
- `models` is a map of model IDs to their configurations. The model name will be displayed in the model selection list.
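Before pointing opencode at a local server, it can help to confirm the server is actually listening. The sketch below is not part of opencode; it simply probes the `/models` route that OpenAI-compatible servers expose under the base URL, assuming LM Studio's default address from the config above.

```python
import urllib.error
import urllib.request

def endpoint_reachable(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an OpenAI-compatible server answers at base_url/models."""
    try:
        with urllib.request.urlopen(f"{base_url.rstrip('/')}/models", timeout=timeout):
            return True
    except (urllib.error.URLError, OSError, ValueError):
        return False

# LM Studio's default local server address
print(endpoint_reachable("http://127.0.0.1:1234/v1"))
```

If this prints `False`, start the LM Studio local server (or adjust `baseURL`) before retrying in opencode.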
Moonshot AI
To use Kimi K2 from Moonshot AI:
1. Head over to the Moonshot AI console, create an account, and click Create API key.

2. Run `opencode auth login` and select Other.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◆  Select provider
   │  ...
   │  ● Other
   └
   ```

3. Enter `moonshot` as the provider ID.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Select provider
   │  Other
   │
   ◇  Enter provider id
   │  moonshot
   └
   ```

4. Enter your Moonshot API key.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Enter your API key
   │  sk-...
   └
   ```

5. Configure Moonshot in your `opencode.json`.

   ```json
   {
     "$schema": "https://opencode.ai/config.json",
     "provider": {
       "moonshot": {
         "npm": "@ai-sdk/openai-compatible",
         "name": "Moonshot AI",
         "options": {
           "baseURL": "https://api.moonshot.ai/v1"
         },
         "models": {
           "kimi-k2-0711-preview": {
             "name": "Kimi K2"
           }
         }
       }
     }
   }
   ```

6. Run the `/models` command to select Kimi K2.
Ollama
You can configure opencode to use local models through Ollama.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama2": {
          "name": "Llama 2"
        }
      }
    }
  }
}
```
In this example:
- `ollama` is the custom provider ID. This can be any string you want.
- `npm` specifies the package to use for this provider. Here, `@ai-sdk/openai-compatible` is used for any OpenAI-compatible API.
- `name` is the display name for the provider in the UI.
- `options.baseURL` is the endpoint for the local server.
- `models` is a map of model IDs to their configurations. The model name will be displayed in the model selection list.
OpenAI
1. Head over to the OpenAI Platform console, click Create new secret key, and copy the key.

2. Run `opencode auth login` and select OpenAI.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◆  Select provider
   │  ● OpenAI
   │  ...
   └
   ```

3. Enter the API key for the provider.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Select provider
   │  OpenAI
   │
   ◇  Enter your API key
   │  _
   └
   ```

4. Run the `/models` command to select the one you want.
OpenRouter
1. Head over to the OpenRouter dashboard, click Create API Key, and copy the key.

2. Run `opencode auth login` and select OpenRouter.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◆  Select provider
   │  ● OpenRouter
   │  ○ Anthropic
   │  ○ Google
   │  ...
   └
   ```

3. Enter the API key for the provider.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Select provider
   │  OpenRouter
   │
   ◇  Enter your API key
   │  _
   └
   ```

4. Many OpenRouter models are preloaded by default; run the `/models` command to select the one you want.

   You can also add additional models through your opencode config.

   ```json
   {
     "$schema": "https://opencode.ai/config.json",
     "provider": {
       "openrouter": {
         "models": {
           "somecoolnewmodel": {}
         }
       }
     }
   }
   ```

5. You can also customize models through your opencode config. Here's an example of specifying provider routing options for a model:

   ```json
   {
     "$schema": "https://opencode.ai/config.json",
     "provider": {
       "openrouter": {
         "models": {
           "moonshotai/kimi-k2": {
             "options": {
               "provider": {
                 "order": ["baseten"],
                 "allow_fallbacks": false
               }
             }
           }
         }
       }
     }
   }
   ```
Together AI
1. Head over to the Together AI console, create an account, and click Add Key.

2. Run `opencode auth login` and select Together AI.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◆  Select provider
   │  ● Together AI
   │  ...
   └
   ```

3. Enter your Together AI API key.

   ```
   $ opencode auth login

   ┌  Add credential
   │
   ◇  Select provider
   │  Together AI
   │
   ◇  Enter your API key
   │  _
   └
   ```

4. Run the `/models` command to select a model like Kimi K2 Instruct.
Troubleshooting
If you are having trouble with configuring a provider, check the following:
1. Check the auth setup: Run `opencode auth list` to see if the credentials for the provider are added to your config.

   This doesn't apply to providers like Amazon Bedrock that rely on environment variables for their auth.

2. For custom providers, check the opencode config and:

   - Make sure the provider ID used in `opencode auth login` matches the ID in your opencode config.
   - Make sure the right npm package is used for the provider. For example, use `@ai-sdk/cerebras` for Cerebras, and for all other OpenAI-compatible providers, use `@ai-sdk/openai-compatible`.
   - Check that the correct API endpoint is set in the `options.baseURL` field.