Configure DeepSeek in WebStorm (Official / OpenRouter / SiliconFlow)

Jan 7, 2026

To use DeepSeek in WebStorm, install an AI plugin (or use a built-in settings page) that supports OpenAI-compatible endpoints, then point it at the provider's Base URL and API key. The parameters and example configs for the three common providers are below.

Quick Parameters

Provider          | Base URL                       | Recommended Model
DeepSeek Official | https://api.deepseek.com       | deepseek-chat (V3) / deepseek-reasoner (R1)
OpenRouter        | https://openrouter.ai/api/v1   | deepseek/deepseek-chat
SiliconFlow       | https://api.siliconflow.cn/v1  | deepseek-ai/DeepSeek-V3

Field names vary by plugin (Base URL / API Base / apiBase). Use the equivalent field in your UI.

Where to Configure (WebStorm)

  • Open Settings/Preferences → Plugins and install an OpenAI-compatible plugin.
  • Search the plugin name in Settings and open its configuration.
  • Choose an OpenAI-compatible provider and fill in the Base URL and API key.
  • Set the default model, save, and test in the AI panel.
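Under the hood, an OpenAI-compatible plugin turns these settings into a POST request against the provider's chat-completions endpoint. The sketch below (standard library only) builds that request for the DeepSeek Official row; the prompt text and the `YOUR_API_KEY` placeholder are illustrative, and the exact request a given plugin sends may differ slightly.

```python
import json
import urllib.request

# Values from the DeepSeek Official row above; substitute your real key.
BASE_URL = "https://api.deepseek.com"
API_KEY = "YOUR_API_KEY"

payload = {
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Hello from WebStorm"}],
}

# Build (but do not send) the same kind of request the plugin issues.
req = urllib.request.Request(
    BASE_URL + "/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would actually send it.
print(req.full_url)  # https://api.deepseek.com/chat/completions
```

If this request succeeds with your key outside the IDE, any remaining failure is in the plugin configuration rather than the credentials.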

Get an API Key

Sign up with the provider you chose and create a key in its console: the DeepSeek Platform, OpenRouter's Keys page, or SiliconFlow's cloud console. Keep the key secret and never commit it to a repository.

DeepSeek Official Example

{
  "baseUrl": "https://api.deepseek.com",
  "apiKey": "YOUR_API_KEY",
  "model": "deepseek-chat"
}

OpenRouter Example

{
  "baseUrl": "https://openrouter.ai/api/v1",
  "apiKey": "YOUR_API_KEY",
  "model": "deepseek/deepseek-chat"
}

SiliconFlow Example

{
  "baseUrl": "https://api.siliconflow.cn/v1",
  "apiKey": "YOUR_API_KEY",
  "model": "deepseek-ai/DeepSeek-V3"
}
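The three configs differ only in `baseUrl` and `model`, so they can be generated from one lookup table. This is a hypothetical helper for illustration (`render_config` is not part of any plugin); the field names `baseUrl`, `apiKey`, and `model` mirror the JSON snippets above, but your plugin's keys may differ.

```python
import json

# Provider name -> (baseUrl, recommended model), per the table above.
PROVIDERS = {
    "deepseek": ("https://api.deepseek.com", "deepseek-chat"),
    "openrouter": ("https://openrouter.ai/api/v1", "deepseek/deepseek-chat"),
    "siliconflow": ("https://api.siliconflow.cn/v1", "deepseek-ai/DeepSeek-V3"),
}

def render_config(provider: str, api_key: str) -> str:
    """Return the JSON config string for the chosen provider."""
    base_url, model = PROVIDERS[provider]
    return json.dumps(
        {"baseUrl": base_url, "apiKey": api_key, "model": model}, indent=2
    )

print(render_config("openrouter", "YOUR_API_KEY"))
```

Switching providers is then a matter of changing one argument rather than editing three fields by hand.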

FAQ

  • 401 / 403 errors: Check that your API key is correct and your balance is sufficient.
  • Slow responses: Try switching to OpenRouter or SiliconFlow.
  • Model not found: Verify the model name matches the provider's latest list.
DeepSeekHubs
