DeepSeek provides powerful AI models with an OpenAI-compatible API.
Property  Value
Provider  deepseek
Auth      DEEPSEEK_API_KEY
API       OpenAI-compatible
Base URL  https://api.deepseek.com
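Because the API is OpenAI-compatible, any OpenAI-style client can be pointed at the base URL above. A minimal sketch of what that means, assuming the conventional /chat/completions route (no request is actually sent here; the helper name is illustrative):

```python
import json

BASE_URL = "https://api.deepseek.com"

def build_chat_request(model: str, prompt: str, api_key: str):
    """Assemble the URL, headers, and JSON body for an OpenAI-style chat completion."""
    url = f"{BASE_URL}/chat/completions"  # conventional OpenAI-compatible route
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        # the provider-side id, without the fluffbuzz "deepseek/" ref prefix
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = build_chat_request("deepseek-chat", "Hello", api_key="sk-...")
```

When traffic is routed through the Gateway this request shaping is handled for you; the sketch only illustrates what "OpenAI-compatible" implies about the wire format.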

Getting started

1. Get your API key

   Create an API key at platform.deepseek.com.

2. Run onboarding

   fluffbuzz onboard --auth-choice deepseek-api-key

   This will prompt for your API key and set deepseek/deepseek-chat as the default model.

3. Verify models are available

   fluffbuzz models list --provider deepseek
For scripted or headless installations, pass all flags directly:
fluffbuzz onboard --non-interactive \
  --mode local \
  --auth-choice deepseek-api-key \
  --deepseek-api-key "$DEEPSEEK_API_KEY" \
  --skip-health \
  --accept-risk
If the Gateway runs as a daemon (launchd/systemd), make sure DEEPSEEK_API_KEY is available to that process (for example, in ~/.fluffbuzz/.env or via env.shellEnv).
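For example, a ~/.fluffbuzz/.env entry would look like the following (key value elided; the exact file location is the one named above):

```shell
# ~/.fluffbuzz/.env — read by the Gateway daemon at startup
DEEPSEEK_API_KEY=sk-...
```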

Built-in catalog

Model ref                   Name               Input  Context  Max output  Notes
deepseek/deepseek-chat      DeepSeek Chat      text   131,072  8,192       Default model; DeepSeek V3.2 non-thinking surface
deepseek/deepseek-reasoner  DeepSeek Reasoner  text   131,072  65,536      Reasoning-enabled V3.2 surface
Both bundled models currently advertise compatibility with usage reporting in streamed responses, per the source catalog.
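The context and max-output figures in the catalog imply how much room is left for the prompt. A quick sketch of the arithmetic, using the numbers from the table (the helper name is illustrative, not part of fluffbuzz):

```python
CATALOG = {
    # model ref: (context window, max output tokens), from the catalog above
    "deepseek/deepseek-chat": (131_072, 8_192),
    "deepseek/deepseek-reasoner": (131_072, 65_536),
}

def prompt_budget(model_ref: str) -> int:
    """Tokens left for the prompt once the full output reservation is held back."""
    context, max_output = CATALOG[model_ref]
    return context - max_output

# deepseek-chat leaves far more room for input than deepseek-reasoner:
# 131,072 - 8,192 = 122,880 vs 131,072 - 65,536 = 65,536
```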

Config example

{
  env: { DEEPSEEK_API_KEY: "sk-..." },
  agents: {
    defaults: {
      model: { primary: "deepseek/deepseek-chat" },
    },
  },
}
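Assuming the same schema, pointing the default at the reasoning model instead is a one-line change:

```json5
{
  env: { DEEPSEEK_API_KEY: "sk-..." },
  agents: {
    defaults: {
      model: { primary: "deepseek/deepseek-reasoner" },
    },
  },
}
```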

Model selection

Choosing providers, model refs, and failover behavior.

Configuration reference

Full config reference for agents, models, and providers.