
Call Responses API models on Claude Code

This tutorial shows how to call Responses API models such as codex-mini and o3-pro through LiteLLM's Claude Code (`/v1/messages`) endpoint.

Prerequisites: a LiteLLM installation and an OpenAI API key.

1. Set up config.yaml

```yaml
model_list:
  - model_name: codex-mini
    litellm_params:
      model: codex-mini
      api_key: sk-proj-1234567890 # your OpenAI API key
      api_base: https://api.openai.com/v1
```
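If you prefer not to hard-code the key, LiteLLM's config.yaml can reference environment variables with the `os.environ/` prefix. A variant of the entry above, assuming `OPENAI_API_KEY` is exported in the proxy's environment:

```yaml
model_list:
  - model_name: codex-mini
    litellm_params:
      model: codex-mini
      api_key: os.environ/OPENAI_API_KEY # read from the proxy's environment
      api_base: https://api.openai.com/v1
```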

2. Start proxy

```shell
litellm --config /path/to/config.yaml

# RUNNING on http://0.0.0.0:4000
```

3. Test it! (Curl)

```shell
curl -X POST http://0.0.0.0:4000/v1/messages \
  -H "Authorization: Bearer sk-1234" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "codex-mini",
    "max_tokens": 100,
    "messages": [{"role": "user", "content": "What is the capital of France?"}]
  }'
```
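The same request can be sent from Python with only the standard library. A minimal sketch, assuming the proxy from step 2 is running locally and `sk-1234` stands in for your LiteLLM key:

```python
import json
import urllib.request

PROXY_BASE = "http://0.0.0.0:4000"  # LiteLLM proxy from step 2

# Same body as the curl example: an Anthropic-style /v1/messages request.
payload = {
    "model": "codex-mini",
    "max_tokens": 100,
    "messages": [{"role": "user", "content": "What is the capital of France?"}],
}

request = urllib.request.Request(
    f"{PROXY_BASE}/v1/messages",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer sk-1234",  # your LiteLLM key (placeholder)
        "Content-Type": "application/json",
    },
    method="POST",
)

# Uncomment once the proxy is running:
# with urllib.request.urlopen(request) as response:
#     print(json.loads(response.read()))
```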

4. Test it! (Claude Code)

  • Set the environment variables:

```shell
export ANTHROPIC_API_BASE="http://0.0.0.0:4000"
export ANTHROPIC_API_KEY="sk-1234" # replace with your LiteLLM key
```

  • Start a Claude Code session with the model name from config.yaml:

```shell
claude --model codex-mini
```

  • Send a message; Claude Code routes it through the LiteLLM proxy to the Responses API model.
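Because the endpoint speaks the Anthropic Messages API, the official `anthropic` Python SDK can also point at the proxy. A sketch, assuming the proxy from step 2 is running and the `anthropic` package is installed (`pip install anthropic`):

```python
import anthropic

# base_url and api_key point at the LiteLLM proxy, not api.anthropic.com.
client = anthropic.Anthropic(
    base_url="http://0.0.0.0:4000",
    api_key="sk-1234",  # your LiteLLM key (placeholder)
)

message = client.messages.create(
    model="codex-mini",
    max_tokens=100,
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)
print(message.content[0].text)
```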