epicflare ships with:
- D1 (`APP_DB`) for relational storage
- KV (`OAUTH_KV`) for OAuth/session state (owned by `@cloudflare/workers-oauth-provider`)
- Durable Objects (`MCP_OBJECT`) for MCP server state
This guide covers how to add common Cloudflare offerings on top of the starter:
- R2 (object storage)
- Workers AI
- AI Gateway
- An additional KV namespace for app data (separate from `OAUTH_KV`)
All examples assume you are using the template's `wrangler.jsonc` and that you
run commands from the repo root.
For local, interactive development:
- Log in once with `bunx wrangler login`
- Wrangler uses browser-based OAuth, and commands run as your user
For CI / GitHub Actions / automation:
- Create a Cloudflare API token and provide it as `CLOUDFLARE_API_TOKEN`
- Prefer least-privilege, account-scoped tokens (avoid "All accounts")
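In GitHub Actions, that token is typically exposed as a repository secret. A minimal sketch of a deploy step (the step name is illustrative, and it assumes `CLOUDFLARE_API_TOKEN` is configured as a repository secret):

```yaml
# Sketch of a GitHub Actions deploy step; assumes CLOUDFLARE_API_TOKEN
# is stored as a repository secret.
- name: Deploy Worker
  run: bunx wrangler deploy
  env:
    CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }}
```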
Cloudflare's API token UI changes over time, but the shape is stable:
- Token permissions are Account permissions (not Zone permissions)
- You grant Read/Edit per product area
- Wrangler deploys, creates resources, and sets secrets via the API token
Recommended baseline permissions for this template (deploy + existing D1/KV):
- Workers Scripts: Edit (deploy, update, delete preview Workers)
- Workers KV Storage: Edit (OAuth/session KV)
- D1: Edit (migrations, database operations)
Add these permissions when you add the corresponding offering:
- R2: R2 Storage: Edit
- Workers AI: Workers AI: Read (for `wrangler ai models`; deploy still uses Workers Scripts)
- AI Gateway (only if you want to manage gateways via API): AI Gateway: Edit
If you use `bunx wrangler secret put ...` in CI, your token must also be able to
edit Worker secrets (covered by Workers Scripts: Edit).
Use R2 for file uploads, user-generated media, and other blob/object storage.
Token permission: R2 Storage: Edit (for creating/listing buckets via Wrangler)
Create separate buckets for production vs preview/testing:
```sh
bunx wrangler r2 bucket create <app-name>-uploads
bunx wrangler r2 bucket create <app-name>-uploads-preview
bunx wrangler r2 bucket list
```
Wrangler can also update wrangler.jsonc for you:
```sh
bunx wrangler r2 bucket create <app-name>-uploads --binding UPLOADS_BUCKET --env production --update-config
bunx wrangler r2 bucket create <app-name>-uploads-preview --binding UPLOADS_BUCKET --env preview --update-config
```
Add an `r2_buckets` binding in each environment where you want to use it:
- `env.production`: bind the production bucket
- `env.preview` / `env.test`: bind the preview bucket
Example (production):
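Assuming the bucket names created above, the production entry mirrors the preview one:

```jsonc
"r2_buckets": [
  {
    "binding": "UPLOADS_BUCKET",
    "bucket_name": "<app-name>-uploads"
  }
],
```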
Example (preview/test):
"r2_buckets": [
{
"binding": "UPLOADS_BUCKET",
"bucket_name": "<app-name>-uploads-preview"
}
],bun run dev runs wrangler dev --local, so R2 is emulated locally. To hit
remote R2 from dev, run Wrangler with --remote (for example
bun ./wrangler-env.ts dev --remote).
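To make the binding concrete, here is a sketch of writing and reading objects. The route and key scheme are assumptions, and the minimal interface below stands in for Cloudflare's `R2Bucket` type so the logic is clear:

```ts
// Minimal structural stand-in for the parts of R2Bucket this sketch uses.
interface BucketLike {
  put(key: string, value: string): Promise<unknown>
  get(key: string): Promise<{ text(): Promise<string> } | null>
}

// Store an upload under a per-user key (hypothetical key scheme) and return the key.
export async function saveUpload(
  bucket: BucketLike,
  userId: string,
  name: string,
  body: string,
): Promise<string> {
  const key = `uploads/${userId}/${name}`
  await bucket.put(key, body)
  return key
}

// Read an upload back, or null if the key does not exist.
export async function readUpload(bucket: BucketLike, key: string): Promise<string | null> {
  const object = await bucket.get(key)
  return object ? await object.text() : null
}
```

In a real handler you would pass `env.UPLOADS_BUCKET` where `BucketLike` appears.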
Workers AI lets your Worker call Cloudflare-hosted AI models.
Token permissions:
- Workers Scripts: Edit (deploy)
- Workers AI: Read (optional; for `wrangler ai models`)
```sh
bunx wrangler ai models
bunx wrangler ai models --json
```
Add the AI binding per environment:
"ai": { "binding": "AI" }If you want the Ai helper client, install it:
bun add @cloudflare/ai
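Independent of the helper client, the binding itself exposes `run()`. A minimal sketch; the model name and prompt shape are assumptions, and the small interface stands in for the real binding type:

```ts
// Structural stand-in for the Workers AI binding's run() method.
interface AiLike {
  run(model: string, input: { prompt: string }): Promise<{ response?: string }>
}

// Ask a text model a question and return its text response.
export async function askModel(ai: AiLike, prompt: string): Promise<string> {
  // '@cf/meta/llama-3.1-8b-instruct' is an assumed model name; list the
  // real catalog with `bunx wrangler ai models`.
  const result = await ai.run('@cf/meta/llama-3.1-8b-instruct', { prompt })
  return result.response ?? ''
}
```

In a Worker you would pass `env.AI` where `AiLike` appears.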
In your Worker code, you typically construct a client like:

```ts
import { Ai } from '@cloudflare/ai'

const ai = new Ai(env.AI)
```

AI Gateway provides analytics, caching, retries, and controls for calling third-party AI providers (for example OpenAI or Anthropic) from your Worker.
If you create/configure gateways via the dashboard, you do not need any extra token permissions beyond what you already use to deploy Workers.
If you want to manage gateways via API automation, add:
AI Gateway: Edit
There is no wrangler subcommand for AI Gateway in Wrangler v4. Create the
gateway in the Cloudflare dashboard:
- Cloudflare dashboard -> AI -> AI Gateway
- Create a gateway and copy:
  - your `account_id`
  - the `gateway_id` you chose/received
AI Gateway uses provider-specific upstream paths. For OpenAI-compatible traffic, the URL looks like:
https://gateway.ai.cloudflare.com/v1/<account_id>/<gateway_id>/openai
You can do a quick smoke test with curl (example uses `OPENAI_API_KEY`):

```sh
curl -sS "https://gateway.ai.cloudflare.com/v1/<account_id>/<gateway_id>/openai/v1/models" \
  -H "Authorization: Bearer $OPENAI_API_KEY" | head
```
Most provider SDKs support overriding the base URL:
- Keep the provider API key as a Worker secret (for example `OPENAI_API_KEY`)
- Point the SDK's base URL at the gateway URL
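The base-URL override can be as simple as string construction. A sketch (the helper name is ours, following the URL shape shown above):

```ts
// Build the OpenAI-compatible AI Gateway base URL from the account_id and
// gateway_id copied out of the dashboard.
export function gatewayBaseUrl(accountId: string, gatewayId: string): string {
  return `https://gateway.ai.cloudflare.com/v1/${accountId}/${gatewayId}/openai`
}

// Example: pass gatewayBaseUrl(...) as the base-URL option of an
// OpenAI-compatible SDK, keeping the provider key as a Worker secret.
```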
For this starter's chat implementation, remote AI mode is expected to use a
gateway. Set `AI_GATEWAY_ID` in local `.env` when opting into `AI_MODE=remote`,
and configure GitHub Actions secrets so deploy workflows can sync it into the
Worker secrets. Use `AI_GATEWAY_ID` for production deploys and
`AI_GATEWAY_ID_PREVIEW` for preview deploys if you want preview traffic routed
through a different gateway.
This template already binds `OAUTH_KV` for OAuth/session state. Treat `OAUTH_KV`
as "owned" by `@cloudflare/workers-oauth-provider`:
- Avoid mixing app data into `OAUTH_KV`
- Avoid key prefix collisions with the OAuth library
- Keep quotas/evictions isolated from auth state
Token permission: Workers KV Storage: Edit
Create a dedicated namespace for app data, for example `APP_KV`:

```sh
bunx wrangler kv namespace create <app-name>-app --binding APP_KV --env production --update-config
bunx wrangler kv namespace create <app-name>-app-preview --binding APP_KV --env production --preview --update-config
```
If you are not using `--update-config`, record the namespace IDs and add them to
`wrangler.jsonc` manually (see next section).
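Once the binding exists, app reads and writes go through `env.APP_KV`. A sketch with a namespaced key scheme (the `profile:` prefix is an assumption), using a minimal interface in place of Cloudflare's `KVNamespace` type:

```ts
// Structural stand-in for the KV methods this sketch uses.
interface KvLike {
  get(key: string): Promise<string | null>
  put(key: string, value: string): Promise<void>
}

// Cache a user's profile JSON under a prefixed key in APP_KV
// (never in OAUTH_KV, which the OAuth library owns).
export async function cacheProfile(kv: KvLike, userId: string, profile: object): Promise<void> {
  await kv.put(`profile:${userId}`, JSON.stringify(profile))
}

// Read a cached profile back, or null if it was never stored.
export async function readProfile(kv: KvLike, userId: string): Promise<object | null> {
  const raw = await kv.get(`profile:${userId}`)
  return raw === null ? null : JSON.parse(raw)
}
```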
Add a second entry in `kv_namespaces` alongside `OAUTH_KV`:
"kv_namespaces": [
{
"binding": "OAUTH_KV",
"id": "<oauth-kv-id>",
"preview_id": "<oauth-kv-preview-id>"
},
{
"binding": "APP_KV",
"id": "<app-kv-id>",
"preview_id": "<app-kv-preview-id>"
}
],If you're looking for a higher-level way to integrate AI (streaming responses, tool/function calling, typed outputs, and a clean client/server contract), consider TanStack AI.
This starter is not a React app, so the main thing to know is: TanStack AI is framework-agnostic at its core. You can use:
- `@tanstack/ai` on the server (Workers) to run models and tools
- `@tanstack/ai-client` in any UI (headless) to manage chat state + streaming
You only need the React/Solid packages if your UI framework benefits from their hooks.
Cloudflare maintains an official integration package that makes TanStack AI work well with both:
- Workers AI (`env.AI` binding)
- AI Gateway (`env.AI.gateway("<gateway-id>")`)
It includes ready-to-use adapters for chat, image generation, transcription, text-to-speech, and summarization for Workers AI models, plus AI Gateway routing for OpenAI/Anthropic/Gemini/Grok/OpenRouter.

Install the core packages:

```sh
bun add @tanstack/ai @tanstack/ai-client @cloudflare/tanstack-ai
```
If you want to route to third-party providers through AI Gateway, also install the TanStack provider packages you use (for example OpenAI or Anthropic):
```sh
bun add @tanstack/ai-openai
bun add @tanstack/ai-anthropic
```
Add the Workers AI binding in each environment you want:
"ai": { "binding": "AI" }Then your Worker receives env.AI at runtime.
TanStack AI supports multiple streaming formats. A common pattern is:
- A Worker endpoint accepts `{ messages, conversationId? }` as JSON
- Server runs `chat(...)` and returns a streaming `Response`
- Your UI uses `@tanstack/ai-client` with either:
  - `fetchHttpStream("/api/chat")` (newline-delimited JSON), or
  - `fetchServerSentEvents("/api/chat")` (SSE)
Workers AI example (direct binding, no third-party API keys):
```ts
import { createWorkersAiChat } from '@cloudflare/tanstack-ai'
import { chat, toHttpResponse } from '@tanstack/ai'

const adapter = createWorkersAiChat('@cf/meta/llama-4-scout-17b-16e-instruct', {
  binding: env.AI,
})

const stream = chat({
  adapter,
  stream: true,
  messages,
  conversationId,
})

return toHttpResponse(stream)
```

AI Gateway example (route OpenAI requests through your gateway):
```ts
import { createOpenAiChat } from '@cloudflare/tanstack-ai'
import { chat, toHttpResponse } from '@tanstack/ai'

const adapter = createOpenAiChat('gpt-4o', {
  binding: env.AI.gateway('my-gateway-id'),
  // Depending on your gateway mode, you may not need to pass an API key here.
  // apiKey: env.OPENAI_API_KEY,
})

const stream = chat({ adapter, stream: true, messages, conversationId })

return toHttpResponse(stream)
```

Client usage (headless, works with any UI toolkit) looks like:
```ts
import { ChatClient, fetchHttpStream } from '@tanstack/ai-client'

const client = new ChatClient({
  connection: fetchHttpStream('/api/chat'),
  onMessagesChange: (next) => {
    // Render however your app renders (not tied to React)
    console.log(next)
  },
})
```