# AWS HealthLake MCP Server

A Model Context Protocol (MCP) server for AWS HealthLake FHIR operations. Provides 11 tools for comprehensive FHIR resource management with automatic datastore discovery.
- Features
- Prerequisites
- Quick Start
- MCP Client Configuration
- Read-Only Mode
- Available Tools
- Usage Examples
- Authentication
- Error Handling
- Troubleshooting
- Development
- Contributing
- License
- Support
## Features

- 11 FHIR Tools: Complete CRUD operations (6 read-only, 5 write), advanced search, patient-everything, job management
- Read-Only Mode: Security-focused mode that blocks all mutating operations while preserving read access
- MCP Resources: Automatic datastore discovery - no manual datastore IDs needed
- Advanced Search: Chained parameters, includes, revIncludes, modifiers, and date/number prefixes with pagination
- AWS Integration: SigV4 authentication with automatic credential handling and region support
- Comprehensive Testing: 235 tests with 96% coverage ensuring reliability
- Task Automation: Poethepoet integration for streamlined development workflow
- Error Handling: Structured error responses with specific error types and helpful messages
- Docker Support: Containerized deployment with flexible authentication options
## Prerequisites

- Python 3.10+ (required by the MCP framework)
- AWS credentials configured
- AWS HealthLake access with appropriate permissions
## Quick Start

Choose your preferred installation method:
# Install and run latest version automatically
uvx awslabs.healthlake-mcp-server@latest

# Or install as a tool
uv tool install awslabs.healthlake-mcp-server
awslabs.healthlake-mcp-server

# Build and run with Docker
docker build -t healthlake-mcp-server .
docker run -e AWS_ACCESS_KEY_ID=xxx -e AWS_SECRET_ACCESS_KEY=yyy healthlake-mcp-server
# Or use pre-built image with environment variables
docker run -e AWS_ACCESS_KEY_ID=your_key -e AWS_SECRET_ACCESS_KEY=your_secret -e AWS_REGION=us-east-1 awslabs/healthlake-mcp-server
# With AWS profile (mount credentials)
docker run -v ~/.aws:/root/.aws -e AWS_PROFILE=your-profile awslabs/healthlake-mcp-server
# Read-only mode
docker run -e AWS_ACCESS_KEY_ID=your_key -e AWS_SECRET_ACCESS_KEY=your_secret -e AWS_REGION=us-east-1 awslabs/healthlake-mcp-server --readonly

## MCP Client Configuration

See the Kiro IDE documentation or the Kiro CLI documentation for details.
For global configuration, edit ~/.kiro/settings/mcp.json. For project-specific configuration, edit .kiro/settings/mcp.json in your project directory.
Configuration:
{
"mcpServers": {
"healthlake": {
"command": "uvx",
"args": ["awslabs.healthlake-mcp-server@latest"],
"env": {
"AWS_REGION": "us-east-1",
"AWS_PROFILE": "your-profile-name",
"MCP_LOG_LEVEL": "INFO"
}
}
}
}

Read-Only Configuration:
{
"mcpServers": {
"healthlake-readonly": {
"command": "uvx",
"args": ["awslabs.healthlake-mcp-server@latest", "--readonly"],
"env": {
"AWS_REGION": "us-east-1",
"AWS_PROFILE": "your-profile-name",
"MCP_LOG_LEVEL": "INFO"
}
}
}
}

With environment variables:
{
"mcpServers": {
"healthlake": {
"command": "docker",
"args": [
"run", "--rm",
"-e", "AWS_ACCESS_KEY_ID=your_key",
"-e", "AWS_SECRET_ACCESS_KEY=your_secret",
"-e", "AWS_REGION=us-east-1",
"-e", "MCP_LOG_LEVEL=INFO",
"awslabs/healthlake-mcp-server"
]
}
}
}

With AWS credentials mounted:
{
"mcpServers": {
"healthlake": {
"command": "docker",
"args": [
"run", "--rm",
"-v", "~/.aws:/root/.aws",
"-e", "AWS_PROFILE=your-profile",
"-e", "MCP_LOG_LEVEL=INFO",
"awslabs/healthlake-mcp-server"
]
}
}
}

Read-Only Mode with Docker:
{
"mcpServers": {
"healthlake-readonly": {
"command": "docker",
"args": [
"run", "--rm",
"-e", "AWS_ACCESS_KEY_ID=your_key",
"-e", "AWS_SECRET_ACCESS_KEY=your_secret",
"-e", "AWS_REGION=us-east-1",
"-e", "MCP_LOG_LEVEL=INFO",
"awslabs/healthlake-mcp-server",
"--readonly"
]
}
}
}

See examples/mcp_config.json for additional configuration examples.
## Read-Only Mode

The server supports a read-only mode that prevents all mutating operations while still allowing read operations. This is useful for:
- Safety: Preventing accidental modifications in production environments
- Testing: Allowing safe exploration of FHIR resources without risk of changes
- Auditing: Running the server in environments where only read access should be allowed
- Compliance: Meeting security requirements for read-only access to healthcare data
Add the --readonly flag when starting the server:
# Using uvx
uvx awslabs.healthlake-mcp-server@latest --readonly
# Or if installed locally
python -m awslabs.healthlake_mcp_server.main --readonly

| Operation | Available | Description |
|---|---|---|
| list_datastores | ✅ | List all HealthLake datastores |
| get_datastore_details | ✅ | Get detailed datastore information |
| read_fhir_resource | ✅ | Retrieve specific FHIR resources |
| search_fhir_resources | ✅ | Advanced FHIR search operations |
| patient_everything | ✅ | Comprehensive patient record retrieval |
| list_fhir_jobs | ✅ | Monitor import/export job status |
| Operation | Blocked | Description |
|---|---|---|
| create_fhir_resource | ❌ | Create new FHIR resources |
| update_fhir_resource | ❌ | Update existing FHIR resources |
| delete_fhir_resource | ❌ | Delete FHIR resources |
| start_fhir_import_job | ❌ | Start FHIR data import jobs |
| start_fhir_export_job | ❌ | Start FHIR data export jobs |
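The allowed/blocked split above can be expressed as a simple gate over the tool set. The following is an illustrative sketch only, not the server's actual implementation: the tool names come from the tables above, but `is_tool_allowed` is a hypothetical helper.

```python
# Illustrative sketch of a read-only gate; tool names match the tables
# above, but this helper is hypothetical, not the server's real API.
READ_ONLY_TOOLS = {
    "list_datastores",
    "get_datastore_details",
    "read_fhir_resource",
    "search_fhir_resources",
    "patient_everything",
    "list_fhir_jobs",
}
MUTATING_TOOLS = {
    "create_fhir_resource",
    "update_fhir_resource",
    "delete_fhir_resource",
    "start_fhir_import_job",
    "start_fhir_export_job",
}

def is_tool_allowed(tool_name: str, readonly: bool) -> bool:
    """Return True if the named tool may run under the current mode."""
    allowed = READ_ONLY_TOOLS if readonly else READ_ONLY_TOOLS | MUTATING_TOOLS
    return tool_name in allowed
```

The point of gating at the tool layer is that in `--readonly` mode a mutating call is rejected before any request reaches AWS, which is what makes the mode safe for auditing and compliance use.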
## Available Tools

The server provides 11 comprehensive FHIR tools organized into four categories:

Datastore Management:
- list_datastores - List all HealthLake datastores with optional status filtering
- get_datastore_details - Get detailed datastore information including endpoints and metadata

Resource Operations:
- create_fhir_resource - Create new FHIR resources with validation
- read_fhir_resource - Retrieve specific FHIR resources by ID
- update_fhir_resource - Update existing FHIR resources with versioning
- delete_fhir_resource - Delete FHIR resources from datastores

Search Operations:
- search_fhir_resources - Advanced FHIR search with modifiers, chaining, includes, and pagination
- patient_everything - Comprehensive patient record retrieval using FHIR $patient-everything operation

Job Management:
- start_fhir_import_job - Start FHIR data import jobs from S3
- start_fhir_export_job - Start FHIR data export jobs to S3
- list_fhir_jobs - List and monitor import/export jobs with status filtering
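The search tool's arguments (search parameters with modifiers like `name:contains`, plus includes and revIncludes) map directly onto standard FHIR query syntax. A sketch of that mapping follows; `build_search_query` is a hypothetical helper for illustration, not part of the server's API.

```python
from urllib.parse import urlencode

# Hypothetical helper showing how search arguments translate into a
# standard FHIR query string (modifiers, _include, _revinclude).
def build_search_query(resource_type, search_params=None,
                       include_params=None, revinclude_params=None):
    pairs = list((search_params or {}).items())
    pairs += [("_include", value) for value in (include_params or [])]
    pairs += [("_revinclude", value) for value in (revinclude_params or [])]
    return f"{resource_type}?{urlencode(pairs)}"

query = build_search_query(
    "Patient",
    search_params={"name:contains": "smith", "birthdate": "ge1990-01-01"},
    include_params=["Patient:general-practitioner"],
    revinclude_params=["Observation:subject"],
)
```

Note that `urlencode` percent-encodes the `:` in modifier and include names, which is valid FHIR query syntax.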
The server automatically exposes HealthLake datastores as MCP resources, enabling:
- Automatic discovery of available datastores
- No manual datastore ID entry required
- Status visibility (ACTIVE, CREATING, etc.)
- Metadata access (creation date, endpoints, etc.)
## Usage Examples

// Create a patient (datastore discovered automatically)
{
"datastore_id": "discovered-from-resources",
"resource_type": "Patient",
"resource_data": {
"resourceType": "Patient",
"name": [{"family": "Smith", "given": ["John"]}],
"gender": "male"
}
}

// Search with modifiers and includes
{
"datastore_id": "discovered-from-resources",
"resource_type": "Patient",
"search_params": {
"name:contains": "smith",
"birthdate": "ge1990-01-01"
},
"include_params": ["Patient:general-practitioner"],
"revinclude_params": ["Observation:subject"]
}

// Get all resources for a patient
{
"datastore_id": "discovered-from-resources",
"patient_id": "patient-123",
"start": "2023-01-01",
"end": "2023-12-31"
}

## Authentication

Configure AWS credentials using any of these methods:
- AWS CLI: aws configure
- Environment variables: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY
- IAM roles (for EC2/Lambda)
- AWS profiles: set the AWS_PROFILE environment variable
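As a rough illustration of the first two methods, a script can at least hint at which credential source is in play. `credentials_hint` is a hypothetical helper; boto3's real credential chain also resolves IAM roles and shared config files, which environment inspection alone cannot see.

```python
import os

# Illustrative sketch only: guess which credential source appears to
# be configured from the environment. boto3's actual resolution order
# is more thorough (env vars, shared config, IAM roles, and more).
def credentials_hint() -> str:
    if os.environ.get("AWS_ACCESS_KEY_ID") and os.environ.get("AWS_SECRET_ACCESS_KEY"):
        return "environment variables"
    if os.environ.get("AWS_PROFILE"):
        return "AWS profile"
    return "unknown (IAM role, shared config, or not configured)"
```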
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"healthlake:ListFHIRDatastores",
"healthlake:DescribeFHIRDatastore",
"healthlake:CreateResource",
"healthlake:ReadResource",
"healthlake:UpdateResource",
"healthlake:DeleteResource",
"healthlake:SearchWithGet",
"healthlake:SearchWithPost",
"healthlake:StartFHIRImportJob",
"healthlake:StartFHIRExportJob",
"healthlake:ListFHIRImportJobs",
"healthlake:ListFHIRExportJobs"
],
"Resource": "*"
}
]
}

## Error Handling

All tools return structured error responses:
{
"error": true,
"type": "validation_error",
"message": "Datastore ID must be 32 characters"
}

Error Types:

- validation_error - Invalid input parameters
- not_found - Resource or datastore not found
- auth_error - AWS credentials not configured
- service_error - AWS HealthLake service error
- server_error - Internal server error
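The error shape above is plain JSON, so clients can pattern-match on it directly. A minimal sketch of producing and inspecting that shape; `make_error` is a hypothetical helper, not the server's internal error builder.

```python
# Hypothetical sketch of the structured error shape shown above.
def make_error(error_type: str, message: str) -> dict:
    return {"error": True, "type": error_type, "message": message}

# A client can branch on the "type" field to decide how to recover.
response = make_error("validation_error", "Datastore ID must be 32 characters")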
"AWS credentials not configured"
- Run
aws configureor set environment variables - Verify
AWS_REGIONis set correctly
"Resource not found"
- Ensure datastore exists and is ACTIVE
- Check datastore ID is correct (32 characters)
- Verify you have access to the datastore
"Validation error"
- Check required parameters are provided
- Ensure datastore ID format is correct
- Verify count parameters are within 1-100 range
Set environment variable for detailed logging:
export PYTHONPATH=.
export MCP_LOG_LEVEL=DEBUG
awslabs.healthlake-mcp-servergit clone <repository-url>
cd healthlake-mcp-server
uv sync --dev
source .venv/bin/activate # On Windows: .venv\Scripts\activategit clone <repository-url>
cd healthlake-mcp-server
# Create virtual environment
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install dependencies
pip install -e ".[dev]"git clone <repository-url>
cd healthlake-mcp-server
# Create conda environment
conda create -n healthlake-mcp python=3.10
conda activate healthlake-mcp
# Install dependencies
pip install -e ".[dev]"# After activating your virtual environment
python -m awslabs.healthlake_mcp_server.main
# Or using the installed script
awslabs.healthlake-mcp-server# Run tests
poe test
# Run tests with coverage
poe test-cov
# Format code
poe format
# Lint code
poe lint
# Run all quality checks
poe check
# Clean build artifacts
poe clean
# Build package
poe build
# Run server
poe runThe project uses Poethepoet for task automation. Run poe --help to see all available tasks:
- Testing:
test,test-cov - Code Quality:
lint,format,check,security - Build & Run:
build,run - Cleanup:
clean
# Run all checks
poe check- Install Python extension
- Select the virtual environment:
Ctrl+Shift+P→ "Python: Select Interpreter" - Choose
.venv/bin/python
- File → Settings → Project → Python Interpreter
- Add Interpreter → Existing Environment
- Select
.venv/bin/python
# Run unit tests (fast, no AWS dependencies)
poe test
# Run with coverage
poe test-cov
# Format code
poe format
# Lint code
poe lintTest Results: 235 tests pass, 96% coverage
awslabs/healthlake_mcp_server/
├── server.py # MCP server with tool handlers
├── fhir_operations.py # AWS HealthLake client operations
├── models.py # Pydantic validation models
├── main.py # Entry point
└── __init__.py # Package initialization
- Fork the repository
- Create a feature branch:
git checkout -b feature-name - Make changes and add tests
- Run tests:
poe test - Format code:
poe format - Submit a pull request
Licensed under the Apache License, Version 2.0. See LICENSE file for details.
For issues and questions:
- Check the troubleshooting section above
- Review AWS HealthLake documentation
- Open an issue in the repository