Python FastAPI AI agent with OpenAI integration demonstrating AI workloads with Aspire.
This sample demonstrates Aspire 13's Python support combined with AI workloads, showcasing a Python-based AI chat agent powered by OpenAI.
- Aspire CLI
- Docker
- Python 3.8+
- uv
- OpenAI API key
```shell
aspire run                        # Run locally
aspire deploy                     # Deploy to Docker Compose
aspire do docker-compose-down-dc  # Teardown deployment
```

When you run the app for the first time, Aspire will prompt you for your OpenAI API key.
The application consists of:
- Aspire AppHost - Orchestrates the Python AI agent
- Python AI Agent - FastAPI service with web UI and REST API powered by OpenAI
- Chat UI - Simple web interface for chatting with the AI agent
The apphost.ts configuration demonstrates AI workloads with Aspire:
```typescript
import { createBuilder } from "./.modules/aspire.js";

const builder = await createBuilder();

await builder.addDockerComposeEnvironment("dc");

const openAiApiKey = await builder.addParameter("openai-api-key", { secret: true });

await builder.addOpenAI("openai")
    .withApiKey(openAiApiKey);

await builder.addUvicornApp("ai-agent", "./agent", "main:app")
    .withUv()
    .withExternalHttpEndpoints()
    .withEnvironment("OPENAI_API_KEY", openAiApiKey);

await builder.build().run();
```

Key features:
- Python AI Integration: Uses `addUvicornApp` to run FastAPI with the OpenAI SDK
- OpenAI Integration: `addOpenAI` prompts for the API key on first run and securely stores it
- Direct Key Injection: The OpenAI API key is passed directly to the Python app as `OPENAI_API_KEY`
- uv Package Manager: Uses `.withUv()` for fast dependency installation from `pyproject.toml`
- Web UI: Clean, modern chat interface for interacting with the AI
- Automatic Virtual Environment: Aspire creates `.venv` and installs dependencies with uv
- External HTTP Endpoints: AI agent accessible externally for testing
- Session Management: Maintains conversation history per session for context-aware responses
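Context-aware responses come from prepending the stored session history to each new request before calling OpenAI. A minimal sketch of that pattern (the system prompt, history cap, and function names below are illustrative, not taken from the sample's source):

```python
# Illustrative values; the sample's actual prompt and caps may differ.
SYSTEM_PROMPT = "You are a helpful assistant."
MAX_HISTORY = 20  # keep only the most recent turns

def build_messages(history, user_message):
    """Combine the system prompt, recent session history, and the new message
    into the messages list expected by the OpenAI chat API."""
    recent = history[-MAX_HISTORY:]
    return (
        [{"role": "system", "content": SYSTEM_PROMPT}]
        + recent
        + [{"role": "user", "content": user_message}]
    )

# The agent would then pass this list to the OpenAI client created from the
# OPENAI_API_KEY that Aspire injects, e.g.:
#   client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
#   client.chat.completions.create(model=..., messages=build_messages(...))
```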
Once you run `aspire run`, you can access the AI agent in two ways:
Open your browser and navigate to the AI agent's endpoint shown in the Aspire Dashboard. You'll see a clean chat interface where you can:
- Type messages and get AI responses
- See conversation history
- Watch typing indicators while the AI thinks
- Get automatic session management
The Python AI agent also provides REST endpoints:
- `GET /` - Serves the chat UI
- `GET /api` - API information
- `GET /health` - Health check (shows OpenAI availability)
- `POST /chat` - Send a message and get an AI response
- `GET /sessions` - List active conversation sessions
- `DELETE /sessions/{id}` - Clear a conversation session
This sample is instructional and is not production-ready; see the repository's security disclaimer. By default, `/chat` is anonymous. If you configure `AGENT_API_KEY`, then `/chat`, `GET /sessions`, and `DELETE /sessions/{id}` require the same value in the `X-API-Key` header.
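A header check like this is usually done with a constant-time comparison so the key can't be probed via response timing. A minimal sketch, assuming the same anonymous-by-default behavior described above (function name is illustrative, not the sample's exact code):

```python
import hmac
import os

def is_authorized(provided_key):
    """Allow the request if no AGENT_API_KEY is configured (anonymous mode),
    or if the X-API-Key header value matches it exactly."""
    expected = os.environ.get("AGENT_API_KEY")
    if expected is None:
        return True  # anonymous access, the default
    if provided_key is None:
        return False
    # hmac.compare_digest avoids leaking information through timing differences.
    return hmac.compare_digest(provided_key, expected)
```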
The code includes basic OpenAI quota and cost controls: message length, session/history caps, response token caps, per-client rate limiting, and a model allow-list. Production deployments should add real authentication and authorization (see FastAPI security), abuse monitoring, persistent session storage if conversations must survive process restarts, and prompt/content safety controls aligned with OpenAI production best practices, OpenAI safety best practices, and OWASP LLM Prompt Injection guidance.
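A per-client rate limit of the kind listed above can be sketched as a sliding-window counter keyed by client ID. The window size and request budget below are illustrative, not the sample's actual values:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_REQUESTS = 20  # illustrative per-minute budget per client

_recent = defaultdict(deque)  # client_id -> timestamps of recent requests

def allow_request(client_id, now=None):
    """Sliding-window limiter: allow at most MAX_REQUESTS per WINDOW_SECONDS."""
    now = time.monotonic() if now is None else now
    window = _recent[client_id]
    # Drop timestamps that have fallen out of the window.
    while window and now - window[0] >= WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_REQUESTS:
        return False
    window.append(now)
    return True
```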
You can also interact with the AI agent programmatically via its REST API:
```shell
# Send a chat message
curl -X POST http://localhost:<port>/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "What is the fastest land animal?", "session_id": "my-session"}'

# Check health
curl http://localhost:<port>/health

# List active sessions
curl http://localhost:<port>/sessions
```

- Virtual Environment: Aspire automatically creates a `.venv` directory for the Python agent
- Fast Dependency Installation: uv installs dependencies from `pyproject.toml` (much faster than pip)
- AI Agent Startup: The Python FastAPI app initializes the OpenAI client with the provided API key
- Web UI: The root endpoint (`/`) serves a static HTML chat interface
- REST API: Additional endpoints for programmatic access
- Session Management: The agent maintains conversation history per session for context-aware responses
This sample includes VS Code configuration for Python development:
- `.vscode/settings.json`: Configures the Python interpreter to use the Aspire-created virtual environment
- After running `aspire run`, open the sample in VS Code for full IntelliSense and debugging support
- The virtual environment at `agent/.venv` will be automatically detected
Deploy to Docker Compose:
```shell
aspire deploy
```

This will:
- Generate a Dockerfile for the Python application
- Install Python dependencies and build container image
- Generate Docker Compose configuration
- Deploy the application stack
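The generated Compose configuration ends up with a service for the agent plus the secret wired through the environment. A rough sketch of the shape only; the file `aspire deploy` actually generates will differ in names and details:

```yaml
services:
  ai-agent:
    build: ./agent            # built from the generated Dockerfile
    ports:
      - "8000:8000"           # illustrative port mapping
    environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY}  # injected from the stored parameter
```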
- The Python app uses automatic dependency installation via `pyproject.toml` with uv (much faster than pip)
- Hot reload is enabled for local development
- OpenAI responses are stored in session history for contextual conversations
- Use the Aspire Dashboard to view logs and monitor the agent
- uv automatically creates a `uv.lock` file for reproducible builds
