LangChain is the de facto standard framework for building LLM applications. Since DeepSeek V3 is fully compatible with the OpenAI API, we can use the langchain-openai package for seamless integration.
Quick Start (Python)
1. Install Dependencies
pip install -U langchain-openai

2. Standard Integration
We use the ChatOpenAI class, pointing the base_url to DeepSeek.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model='deepseek-chat',
    api_key='sk-xxxx',                      # your DeepSeek key, not an OpenAI key
    base_url='https://api.deepseek.com',
    max_tokens=1024,
)
response = llm.invoke("Hello DeepSeek! Write a Python bubble sort function.")
print(response.content)

Environment Variables (Recommended)
For production, store your secrets in a .env file:
OPENAI_API_KEY=sk-xxxx # Your DeepSeek Key
OPENAI_API_BASE=https://api.deepseek.com

Then simplify your code:
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
load_dotenv()
# LangChain automatically picks up the env vars
llm = ChatOpenAI(model='deepseek-chat')
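A quick sanity check that the environment-based client works (the prompt here is just an illustration):

response = llm.invoke("Say hello in one sentence.")
print(response.content)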
Advanced: Streaming

for chunk in llm.stream("Tell me a joke about Python"):
    print(chunk.content, end="", flush=True)

FAQ
Q: AuthenticationError?
A: First confirm the key is your DeepSeek key (sk-...), not an OpenAI key. Then check the base URL: https://api.deepseek.com works with current langchain-openai releases, and DeepSeek also accepts https://api.deepseek.com/v1 for OpenAI compatibility, so try the /v1 form if the bare domain fails (some older client versions expect it).
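To rule out the URL, you can pin the versioned endpoint explicitly. A minimal sketch (sk-xxxx again stands in for your own DeepSeek key):

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model='deepseek-chat',
    api_key='sk-xxxx',
    base_url='https://api.deepseek.com/v1',  # explicit /v1 endpoint
)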
Q: Function Calling?
A: Yes, DeepSeek V3 supports function calling. In LangChain, simply use bind_tools and it works out of the box.
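A minimal sketch of what that looks like; the get_weather tool and its city argument are made up for illustration, and it assumes OPENAI_API_KEY and OPENAI_API_BASE are already set in the environment (e.g. via the .env setup above):

from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

@tool
def get_weather(city: str) -> str:
    """Return a short weather report for the given city."""
    return f"It is sunny in {city}."  # stub body for the demo

llm = ChatOpenAI(model='deepseek-chat')          # reads the env vars set above
llm_with_tools = llm.bind_tools([get_weather])   # attach the tool schema to the model

msg = llm_with_tools.invoke("What's the weather in Hangzhou?")
print(msg.tool_calls)  # e.g. [{'name': 'get_weather', 'args': {'city': 'Hangzhou'}, ...}]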

