English | 简体中文
A high-performance AI API relay service built with Rust. Supports multi-platform account management and intelligent scheduling for Claude, Gemini, and OpenAI Responses (Codex).
| Platform | Authentication | Description |
|---|---|---|
| Claude | OAuth / API Key | Supports Claude Code CLI OAuth and standard API Key |
| Gemini | Google OAuth | Supports Google OAuth authentication |
| OpenAI Responses | API Key | Supports OpenAI Responses API (Codex CLI) |
- Smart Account Scheduling - Priority-based automatic account switching
- Sticky Sessions - Same session bound to same account for context continuity
- Auto Token Refresh - OAuth token auto-renewal with 10-second advance refresh
- Proxy Support - Independent SOCKS5/HTTP proxy per account
- Custom API URL - Configurable API endpoints (mirrors/proxies)
- Streaming Response - Full SSE streaming support
- Error Failover - Intelligent error detection and automatic account switching
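The priority-based scheduling can be sketched with the account fields used throughout this README. Assuming, as the examples below suggest, that higher `priority` values are preferred, a two-account failover setup might look like this (ids and keys are placeholders):

```toml
# The relay serves requests from "primary" first and switches to
# "backup" on errors, rate limits, or cooldowns.
[[accounts]]
type = "claude-api"
id = "primary"
name = "Primary Account"
priority = 100
enabled = true
api_key = "sk-ant-api03-xxxx"

[[accounts]]
type = "claude-api"
id = "backup"
name = "Backup Account"
priority = 90
enabled = true
api_key = "sk-ant-api03-yyyy"
```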
```bash
mkdir cc-relay && cd cc-relay
curl -O https://raw.githubusercontent.com/wakaka6/claude-code-relay/main/config.example.toml
curl -O https://raw.githubusercontent.com/wakaka6/claude-code-relay/main/docker-compose.yml
mv config.example.toml config.toml
# Edit config.toml with your account info
docker compose up -d
```

Or use docker run:
```bash
docker run -d \
  --name cc-relay-server \
  -p 3000:3000 \
  -v ./config.toml:/app/config.toml:ro \
  -v ./data:/app/data \
  wakaka6/claude-code-relay:latest
```

Note: Docker deployment requires setting `host` to `0.0.0.0` in the config. See FAQ.
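For reference, a minimal docker-compose.yml equivalent to the `docker run` command above might look like the following sketch (the service layout is an assumption; the file fetched from the repository is authoritative):

```yaml
services:
  cc-relay-server:
    image: wakaka6/claude-code-relay:latest
    container_name: cc-relay-server
    ports:
      - "3000:3000"                         # host:container
    volumes:
      - ./config.toml:/app/config.toml:ro   # read-only config
      - ./data:/app/data                    # persistent database
```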
Arch Linux (AUR):

```bash
paru -S claude-code-relay
sudo vim /etc/cc-relay-server/config.toml  # Configure account info
sudo systemctl enable --now cc-relay-server
```

Homebrew:

```bash
brew tap wakaka6/tap
brew install claude-code-relay
vim $(brew --prefix)/etc/cc-relay-server/config.toml  # Configure account info
brew services start claude-code-relay
```

Download the binary for your platform from Releases:
```bash
./cc-relay-server --config config.toml
```

Or build from source:

```bash
git clone https://github.com/wakaka6/claude-code-relay.git
cd claude-code-relay
cargo build --release
cp config.example.toml config.toml
# Edit config.toml
./target/release/cc-relay-server --config config.toml
```

Verify that the service is running:

```bash
curl http://localhost:3000/health
```

Minimal configuration:

```toml
api_keys = ["your-relay-key"]  # Must be before [server]

[server]
host = "127.0.0.1"  # Use "0.0.0.0" for Docker deployment
port = 3000

[[accounts]]
type = "claude-api"
id = "main"
name = "Main Account"
priority = 100
enabled = true
api_key = "sk-ant-api03-xxxx"
```

Full server configuration:

```toml
[server]
host = "127.0.0.1"
port = 3000
database_path = "data/relay.db"
log_level = "info"  # trace, debug, info, warn, error
```

API keys authenticate clients against the relay service and are independent of upstream API keys. Clients must include one of these keys in the `Authorization` or `x-api-key` header.
```toml
api_keys = [
  "your-api-key-1",
  "your-api-key-2",
]
```

Set `api_keys = []` to disable authentication. Any key will then be accepted, and usage will be tracked as anonymous.
```toml
[session]
sticky_ttl_seconds = 3600            # Session TTL (default: 1 hour)
renewal_threshold_seconds = 300      # Renew when less than 5 minutes remain
unavailable_cooldown_seconds = 3600  # Account-unavailable cooldown (default: 1 hour)
rate_limit_cooldown_seconds = 60     # Rate-limit cooldown (default: 60 seconds)
# Uses the Retry-After header from the API if available, falls back to this default
```

Only configure the platforms you need.
Claude OAuth Account
```toml
[[accounts]]
type = "claude-oauth"
id = "claude-1"
name = "Claude OAuth Account"
priority = 100
enabled = true
refresh_token = "your-refresh-token"
api_url = "https://api.anthropic.com"  # Optional
```

Claude API Key Account
```toml
[[accounts]]
type = "claude-api"
id = "claude-api-1"
name = "Claude API Account"
priority = 90
enabled = true
api_key = "sk-ant-api03-xxxx"
```

Gemini Account
```toml
[[accounts]]
type = "gemini"
id = "gemini-1"
name = "Gemini Account"
priority = 100
enabled = true
refresh_token = "your-google-refresh-token"
```

OpenAI Responses Account
```toml
[[accounts]]
type = "openai-responses"
id = "codex-1"
name = "OpenAI Responses Account"
priority = 100
enabled = true
api_key = "sk-your-openai-api-key"
```

Proxy Configuration
```toml
[accounts.proxy]
type = "socks5"  # or "http"
host = "127.0.0.1"
port = 1080
username = "user"  # Optional
password = "pass"  # Optional
```

| Service | Endpoint | Description |
|---|---|---|
| Claude | `POST /api/v1/messages` | Claude Messages API |
| | `POST /claude/v1/messages` | Alias route |
| Gemini | `POST /gemini/v1/models/:model:generateContent` | Standard generation |
| | `POST /gemini/v1/models/:model:streamGenerateContent` | Streaming generation |
| OpenAI Compatible | `POST /openai/v1/chat/completions` | Convert to Claude |
| OpenAI Responses | `POST /openai/v1/responses` | Responses API |
| System | `GET /health` | Health check |
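The Claude route can be exercised with nothing but the Python standard library. A minimal sketch, assuming the relay runs at `localhost:3000` with a configured client key (the relay key and model id below are placeholders):

```python
import json
import urllib.request

# POST /api/v1/messages is the Claude Messages route from the table above.
url = "http://localhost:3000/api/v1/messages"
payload = {
    "model": "claude-sonnet-4-20250514",  # placeholder model id
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello through the relay!"}],
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"x-api-key": "your-relay-api-key", "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to send once the relay is running
print(req.get_method())  # POST
```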
Claude Code CLI

```bash
export ANTHROPIC_BASE_URL=http://localhost:3000
export ANTHROPIC_API_KEY=your-relay-api-key
claude
```

Gemini CLI

```bash
export GEMINI_API_BASE=http://localhost:3000/gemini
export GEMINI_API_KEY=your-relay-api-key
gemini
```

OpenAI Codex CLI

```bash
export OPENAI_BASE_URL=http://localhost:3000/openai/v1
export OPENAI_API_KEY=your-relay-api-key
codex
```

Python / Node.js SDK
Python:

```python
import anthropic

client = anthropic.Anthropic(base_url="http://localhost:3000", api_key="your-key")
```

Node.js:

```javascript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  baseURL: "http://localhost:3000",
  apiKey: "your-key",
});
```

Development:

```bash
cargo test    # Run tests
cargo clippy  # Lint code
cargo fmt     # Format code
```

FAQ

Cannot connect to the service after starting with Docker Compose
Problem: The client cannot connect to `localhost:3000`.
Cause: The config sets `host = "127.0.0.1"`, which only listens on the container's internal network.
Solution: Change it to `host = "0.0.0.0"`:

```toml
[server]
host = "0.0.0.0"
port = 3000
```

Contributions are welcome! Feel free to submit issues and pull requests.