Merged
56 changes: 56 additions & 0 deletions .dockerignore
@@ -0,0 +1,56 @@
# Build artifacts
**/bin/
**/obj/
**/publish/
**/.vs/
**/.vscode/

# Git
.git/
.gitignore
.gitattributes
.github/

# Documentation
*.md
!src/**/*.md
docs/

# Database files
**/*.sqlite
**/*.sqlite-shm
**/*.sqlite-wal
**/*.mdf
**/*.ldf

# Node/Frontend
node_modules/
**/node_modules/

# IDE
.vs/
.vscode/
*.suo
*.user
*.userosscache
*.sln.docstates

# Tests
**/Tests/

# Docker
Dockerfile*
docker-compose*.yml
.dockerignore
.env
.env.*

# CI/CD
.github/

# Claude
.claude/

# Misc
*.log
*.tmp
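A quick way to sanity-check what the ignore rules above actually exclude from the build context is to copy the context into a throwaway image and list it. This is a common inspection trick, not an official Docker feature; the `busybox` base and `ctx-check` tag are arbitrary:

```sh
# Build a trivial image that copies the (filtered) build context into /ctx,
# then list what survived .dockerignore filtering.
printf 'FROM busybox\nCOPY . /ctx\n' | docker build -f - -t ctx-check .
docker run --rm ctx-check find /ctx -maxdepth 2
```

Anything listed here will be sent to the Docker daemon on every build, so large stray entries are worth adding to `.dockerignore`.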
64 changes: 64 additions & 0 deletions .env.example
@@ -0,0 +1,64 @@
# ===================================================================
# Purdue.io Docker Configuration
# ===================================================================
# Copy this file to .env and update with your configuration
#
# IMPORTANT: Never commit .env file to version control!
# ===================================================================

# -------------------------------------------------------------------
# PostgreSQL Database Configuration
# -------------------------------------------------------------------
POSTGRES_DB=purdueio
POSTGRES_USER=purdueio
POSTGRES_PASSWORD=changeme_in_production

# -------------------------------------------------------------------
# API Configuration
# -------------------------------------------------------------------
# Port to expose the API on the host machine
API_PORT=8080

# ASP.NET Core environment (Development or Production)
ASPNETCORE_ENVIRONMENT=Production

# -------------------------------------------------------------------
# CatalogSync Configuration
# -------------------------------------------------------------------

# Sync Schedule (cron expression)
# Determines when the catalog sync runs automatically
#
# Default: 0 2 * * * (Daily at 2:00 AM)
#
# Common Examples:
# - "0 2 * * *" = Daily at 2:00 AM (DEFAULT)
# - "0 */6 * * *" = Every 6 hours
# - "0 */2 * * *" = Every 2 hours
# - "*/30 * * * *" = Every 30 minutes
# - "0 0 * * 0" = Weekly on Sunday at midnight
#
SYNC_SCHEDULE=0 2 * * *

# Terms to Sync (optional)
# Comma-separated list of term codes to sync
# Example: SYNC_TERMS=202410,202510,202520
# Leave empty to let the SYNC_ALL_TERMS setting below determine which terms are synced
SYNC_TERMS=

# Subjects to Sync (optional)
# Comma-separated list of subject codes to sync
# Example: SYNC_SUBJECTS=CS,MA,PHYS,ECE,ENGL
# Leave empty to sync all subjects
SYNC_SUBJECTS=

# Sync All Terms
# Controls whether to sync all historical terms or only current/future terms
# - false (default): Only sync current and future terms
# - true: Sync all available terms including historical data
SYNC_ALL_TERMS=false

# Run Once Mode (for testing/initialization)
# Set to "true" to run the sync once and exit (useful for testing)
# Set to "false" for continuous cron scheduling (production mode)
RUN_ONCE=false
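A malformed `SYNC_SCHEDULE` would otherwise only surface when the container starts. Assuming supercronic is available locally (the CatalogSync image bundles it), its documented `-test` flag can validate a candidate expression up front:

```sh
# Write a one-line crontab with the candidate schedule and parse it without
# running any jobs; a parse error exits non-zero.
echo '0 2 * * * echo sync' > /tmp/sync-crontab
supercronic -test /tmp/sync-crontab
```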
114 changes: 114 additions & 0 deletions .github/workflows/docker-publish.yml
@@ -0,0 +1,114 @@
name: Docker Build and Publish

on:
pull_request:
branches: [ main ]
paths:
- 'src/**'
- 'Dockerfile.*'
- 'docker/**'
- '.github/workflows/docker-publish.yml'
release:
types: [published]
workflow_dispatch:
inputs:
tag:
description: 'Tag to build and push (e.g., v1.0.0 or latest)'
required: true
default: 'latest'

env:
REGISTRY: ghcr.io
IMAGE_NAME_API: ${{ github.repository }}/api
IMAGE_NAME_CATALOGSYNC: ${{ github.repository }}/catalogsync

jobs:
# Verify builds on PRs (no push)
verify-build:
name: Verify Docker Builds
runs-on: ubuntu-latest
if: github.event_name == 'pull_request'

strategy:
matrix:
service: [api, catalogsync]

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3

- name: Build ${{ matrix.service }} (verification only)
uses: docker/build-push-action@v5
with:
context: .
file: ./Dockerfile.${{ matrix.service }}
push: false
tags: purdueio-${{ matrix.service }}:test
cache-from: type=gha
cache-to: type=gha,mode=max

# Build and push on releases
build-and-push:
name: Build and Push Docker Images
runs-on: ubuntu-latest
if: github.event_name == 'release' || github.event_name == 'workflow_dispatch'
permissions:
contents: read
packages: write

strategy:
matrix:
include:
- service: api
image_name_var: IMAGE_NAME_API
- service: catalogsync
image_name_var: IMAGE_NAME_CATALOGSYNC

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3

- name: Log in to Container Registry
uses: docker/login-action@v3
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}

- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY }}/${{ matrix.service == 'api' && env.IMAGE_NAME_API || env.IMAGE_NAME_CATALOGSYNC }}
tags: |
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{major}}
type=raw,value=latest,enable={{is_default_branch}}
type=raw,value=${{ github.event.inputs.tag }},enable=${{ github.event_name == 'workflow_dispatch' }}

- name: Build and push ${{ matrix.service }}
uses: docker/build-push-action@v5
with:
context: .
file: ./Dockerfile.${{ matrix.service }}
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha
cache-to: type=gha,mode=max

- name: Output image tags
run: |
echo "### Published ${{ matrix.service }} image :rocket:" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "**Tags:**" >> $GITHUB_STEP_SUMMARY
echo '```' >> $GITHUB_STEP_SUMMARY
echo "${{ steps.meta.outputs.tags }}" >> $GITHUB_STEP_SUMMARY
echo '```' >> $GITHUB_STEP_SUMMARY
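Given the triggers above, images are published either by publishing a GitHub release or by dispatching the workflow manually, for example with the GitHub CLI:

```sh
# Manually dispatch the workflow with a specific tag input
gh workflow run docker-publish.yml -f tag=v1.0.0

# Or publish a release, which also triggers the build-and-push job
gh release create v1.0.0 --generate-notes
```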
10 changes: 10 additions & 0 deletions .gitignore
@@ -194,3 +194,13 @@ $RECYCLE.BIN/
.DS_Store
*.mdf
*.ldf

# =========================
# Docker
# =========================

# Environment files (contain secrets)
.env

# Docker temporary files
.docker/
40 changes: 40 additions & 0 deletions Dockerfile.api
@@ -0,0 +1,40 @@
# Stage 1: Build
FROM mcr.microsoft.com/dotnet/sdk:9.0 AS build
WORKDIR /src

# Copy only the project files needed for API (and its dependencies)
COPY src/Api/Api.csproj ./Api/
COPY src/Database/Database.csproj ./Database/
COPY src/Database.Migrations.Sqlite/Database.Migrations.Sqlite.csproj ./Database.Migrations.Sqlite/
COPY src/Database.Migrations.Npgsql/Database.Migrations.Npgsql.csproj ./Database.Migrations.Npgsql/

# Restore dependencies for API project only (not entire solution)
RUN dotnet restore Api/Api.csproj

# Copy all source files
COPY src/ ./

# Publish the API project
RUN dotnet publish Api/Api.csproj -c Release -r linux-x64 \
--self-contained=true \
-p:PublishReadyToRun=true \
-o /app/publish

# Stage 2: Runtime
FROM mcr.microsoft.com/dotnet/aspnet:9.0
WORKDIR /app

# Copy published application (includes wwwroot if present)
COPY --from=build /app/publish .

# Expose port 8080 (HTTP only for internal Docker network)
EXPOSE 8080

# Configure ASP.NET Core to listen on port 8080
ENV ASPNETCORE_URLS=http://+:8080 \
ASPNETCORE_ENVIRONMENT=Production

# Run as non-root user for security
USER app

ENTRYPOINT ["./Api"]
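To try this image on its own, outside of Compose (the image name and port mapping here are arbitrary choices):

```sh
# Build from the repo root so the src/ paths in the COPY lines resolve
docker build -f Dockerfile.api -t purdueio-api .

# Run, mapping the container's port 8080 to the host
docker run --rm -p 8080:8080 purdueio-api
```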
51 changes: 51 additions & 0 deletions Dockerfile.catalogsync
@@ -0,0 +1,51 @@
# Stage 1: Build
FROM mcr.microsoft.com/dotnet/sdk:9.0 AS build
WORKDIR /src

# Copy only the project files needed for CatalogSync (and its dependencies)
COPY src/CatalogSync/CatalogSync.csproj ./CatalogSync/
COPY src/Database/Database.csproj ./Database/
COPY src/Scraper/Scraper.csproj ./Scraper/
COPY src/Database.Migrations.Sqlite/Database.Migrations.Sqlite.csproj ./Database.Migrations.Sqlite/
COPY src/Database.Migrations.Npgsql/Database.Migrations.Npgsql.csproj ./Database.Migrations.Npgsql/

# Restore dependencies for CatalogSync project only (not entire solution)
RUN dotnet restore CatalogSync/CatalogSync.csproj

# Copy all source files
COPY src/ ./

# Publish the CatalogSync project
RUN dotnet publish CatalogSync/CatalogSync.csproj -c Release -r linux-x64 \
--self-contained=true \
-p:PublishReadyToRun=true \
-o /app/publish

# Stage 2: Runtime with supercronic
FROM mcr.microsoft.com/dotnet/aspnet:9.0
WORKDIR /app

# Install supercronic for Docker-friendly cron scheduling
ENV SUPERCRONIC_URL=https://github.com/aptible/supercronic/releases/download/v0.2.29/supercronic-linux-amd64 \
SUPERCRONIC=supercronic-linux-amd64 \
SUPERCRONIC_SHA1SUM=cd48d45c4b10f3f0bfdd3a57d054cd05ac96812b

RUN apt-get update && apt-get install -y --no-install-recommends curl \
&& curl -fsSLO "$SUPERCRONIC_URL" \
&& echo "${SUPERCRONIC_SHA1SUM} ${SUPERCRONIC}" | sha1sum -c - \
&& chmod +x "$SUPERCRONIC" \
&& mv "$SUPERCRONIC" /usr/local/bin/supercronic \
&& apt-get purge -y --auto-remove curl \
&& rm -rf /var/lib/apt/lists/*

# Copy published application
COPY --from=build /app/publish .

# Copy entrypoint script and ensure it's executable by all users
COPY docker/catalogsync-entrypoint.sh /app/entrypoint.sh
RUN chmod 755 /app/entrypoint.sh

# Run as non-root user for security
USER app

ENTRYPOINT ["/app/entrypoint.sh"]
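The entrypoint script itself (`docker/catalogsync-entrypoint.sh`) is not part of this excerpt. Based on the `RUN_ONCE` and `SYNC_SCHEDULE` variables in `.env.example`, its logic plausibly looks something like the following sketch (hypothetical, not the actual script):

```sh
#!/bin/sh
# HYPOTHETICAL sketch of docker/catalogsync-entrypoint.sh; the real script
# is not shown in this diff.
set -eu

if [ "${RUN_ONCE:-false}" = "true" ]; then
    exec /app/CatalogSync              # one-shot run, then exit
fi

# Continuous mode: build a one-line crontab and hand off to supercronic
echo "${SYNC_SCHEDULE:-0 2 * * *} /app/CatalogSync" > /tmp/crontab
exec supercronic /tmp/crontab
```

Using `exec` keeps supercronic as PID 1 so it receives container stop signals directly.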
42 changes: 40 additions & 2 deletions README.md
@@ -52,9 +52,47 @@ there through the query tester at [api.purdue.io](http://api.purdue.io/).

# Building and Running

## Tools
## Docker (Recommended)

Purdue.io is written in C# on .NET 8. It will run natively on most major
The easiest way to run Purdue.io is with Docker Compose, which starts all of its services: PostgreSQL, the API, and the CatalogSync scheduler.

### Prerequisites

- [Docker](https://docs.docker.com/get-docker/)
- [Docker Compose](https://docs.docker.com/compose/install/)

### Quick Start (Using Pre-built Images)

```sh
# 1. Download the production docker-compose file
curl -O https://raw.githubusercontent.com/Purdue-io/PurdueApi/main/docker-compose.production.yml
curl -O https://raw.githubusercontent.com/Purdue-io/PurdueApi/main/.env.example

# 2. Copy the example environment file
cp .env.example .env

# 3. (Optional) Edit .env to configure sync schedule, database credentials, etc.

# 4. Pull and start all services
docker compose -f docker-compose.production.yml up -d

# 5. Watch the logs
docker compose -f docker-compose.production.yml logs -f

# 6. Access the API at http://localhost:8080/odata
```

**Pre-built images are available at:**
- `ghcr.io/purdue-io/purdueapi/api:latest`
- `ghcr.io/purdue-io/purdueapi/catalogsync:latest`

For building from source, detailed configuration options, and troubleshooting, see [docs/DOCKER.md](docs/DOCKER.md).

## Local Development

### Tools

Purdue.io is written in C# on .NET 9. It will run natively on most major
architectures and operating systems (Windows, Linux, Mac OS).

Entity Framework is used to communicate with an underlying database provider. Currently,