Commit 4bb9b10

Merge pull request #13 from cicorias/feedback-day1
update for docker run

2 parents: 1472386 + 3cb7697

1 file changed (README.md): 121 additions & 6 deletions
@@ -67,7 +67,7 @@ Output files are written to `./output/data/output/` (`my_scene.ply`, `my_scene.s

If you prefer running via Docker (e.g., for production or Azure Blob Storage integration):

<details>
<summary>Local Docker Watch Mode</summary>

```bash
# Create directory structure
...
ls output/scene_001/
```

</details>

<details>
<summary>Docker Batch Mode (Azurite / Azure Blob Storage)</summary>

Batch mode runs the processor as a one-shot container that downloads blobs from Azure Storage (or the Azurite emulator), processes them, uploads outputs, and exits. No file watching, no FUSE mounts, no privileged mode needed.

#### Prerequisites

* Docker installed and running
* [uv](https://docs.astral.sh/uv/) and Python 3 (for the Azurite helper script)
* Test videos downloaded (`./scripts/e2e/01-download-testdata.sh`)

#### Step 1: Build the CPU image

```bash
docker build --target cpu -t 3dgs-processor:cpu .
```

#### Step 2: Start Azurite (Azure Storage emulator)

```bash
docker network create 3dgs-e2e-net

docker run -d --rm --name azurite-e2e \
  --network 3dgs-e2e-net \
  -p 10000:10000 \
  mcr.microsoft.com/azure-storage/azurite \
  azurite-blob --blobHost 0.0.0.0 --blobPort 10000 --skipApiVersionCheck

# Wait for Azurite to be ready
curl -s http://127.0.0.1:10000/ > /dev/null && echo "Azurite ready"
```

#### Step 3: Upload test videos and generate a SAS token

```bash
# Create a Python venv and install dependencies (one-time setup)
uv venv output/.e2e-venv
source output/.e2e-venv/bin/activate
uv pip install azure-storage-blob

# Create containers (input, output, processed, error) and upload videos
python3 scripts/e2e/azurite_helper.py setup testdata/south_building_videos "my_scene/"

# Generate a SAS token for the processor
SAS_TOKEN=$(python3 scripts/e2e/azurite_helper.py sas)
```
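The internals of `azurite_helper.py sas` are not shown here. As an illustration of what a container SAS for Azurite involves, the stdlib-only sketch below signs one with Azurite's well-known (public, non-secret) development account key; the field order follows the Azure service-SAS string-to-sign for `sv=2020-12-06`, and the real helper presumably uses `generate_container_sas` from `azure-storage-blob` instead:

```python
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

# Azurite's well-known development-storage account and key (public, not a secret).
ACCOUNT = "devstoreaccount1"
KEY = base64.b64decode(
    "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="
)

def container_sas(container: str, permissions: str = "racwdl",
                  hours: int = 2, sv: str = "2020-12-06") -> str:
    """Illustrative service SAS for one blob container."""
    now = datetime.now(timezone.utc)
    st = now.strftime("%Y-%m-%dT%H:%M:%SZ")
    se = (now + timedelta(hours=hours)).strftime("%Y-%m-%dT%H:%M:%SZ")
    resource = f"/blob/{ACCOUNT}/{container}"
    # String-to-sign field order per the service-SAS spec for sv >= 2020-12-06.
    to_sign = "\n".join([
        permissions, st, se, resource,
        "",             # signed identifier
        "",             # signed IP range
        "https,http",   # signed protocol
        sv,
        "c",            # signed resource: container
        "",             # snapshot time
        "",             # encryption scope
        "", "", "", "", "",  # rscc, rscd, rsce, rscl, rsct overrides
    ])
    sig = base64.b64encode(
        hmac.new(KEY, to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    return urlencode({"sv": sv, "st": st, "se": se, "sr": "c",
                      "sp": permissions, "spr": "https,http", "sig": sig})

token = container_sas("3dgs-input")
```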

#### Step 4: Run the processor container in batch mode

```bash
docker run --rm --name 3dgs-e2e-batch \
  --network 3dgs-e2e-net \
  -v $(pwd)/container-test/config.yaml:/config/config.yaml:ro \
  -e RUN_MODE=batch \
  -e AZURE_STORAGE_ACCOUNT=devstoreaccount1 \
  -e AZURE_STORAGE_ENDPOINT=http://azurite-e2e:10000/devstoreaccount1 \
  -e "AZURE_STORAGE_SAS_TOKEN=$SAS_TOKEN" \
  -e BATCH_INPUT_PREFIX=my_scene/ \
  -e BACKEND=mock \
  -e FORCE_CPU_BACKEND=1 \
  -e COLMAP_USE_CPU=1 \
  -e COLMAP_MATCHER=sequential \
  -e COLMAP_MAX_NUM_FEATURES=2048 \
  -e FRAME_RATE=2 \
  -e MIN_VIDEO_FRAMES=5 \
  -e MIN_VIDEO_DURATION=0.5 \
  -e MIN_RECONSTRUCTION_POINTS=100 \
  -e RECONSTRUCTION_BACKEND=colmap \
  -e MAX_RETRIES=1 \
  -e LOG_LEVEL=info \
  -e TEMP_PATH=/tmp/3dgs-work \
  3dgs-processor:cpu
```

The container will: download videos from Azurite → extract frames (FFmpeg) → reconstruct with COLMAP → mock-train → export PLY + SPLAT → upload outputs → move inputs to `processed` → exit 0.
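The one-shot contract — every input ends up in `processed` or `error`, and the container's exit code reflects whether anything failed — can be sketched in Python (all names below are invented for illustration; the processor's actual implementation is not part of this README):

```python
from dataclasses import dataclass

@dataclass
class Blob:
    name: str
    ok: bool = True  # whether processing succeeds (for illustration only)

def run_batch(blobs: list[Blob]) -> tuple[int, list[str], list[str]]:
    """One-shot batch semantics: partition inputs into processed/errored
    and return (exit_code, processed, errored)."""
    processed, errored = [], []
    for blob in blobs:
        # Real per-blob steps elided: download -> frames -> COLMAP -> train -> upload.
        (processed if blob.ok else errored).append(blob.name)
    exit_code = 0 if not errored else 1  # exit 0 only if no input failed
    return exit_code, processed, errored

code, done, failed = run_batch([Blob("my_scene/a.mp4"), Blob("my_scene/b.mp4")])
```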

#### Step 5: Verify outputs

```bash
python3 scripts/e2e/azurite_helper.py verify "my_scene/"
```

Expected:

```
✅ PLY: my_scene/my_scene.ply (42443 bytes)
✅ SPLAT: my_scene/my_scene.splat (32000 bytes)
✅ manifest: present
✅ processed: 3 input video(s) archived
✅ input: cleaned (all blobs moved)
✅ error: empty (no failures)
```
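The helper's verify logic is not shown in this README; as a hypothetical sketch, the checks above amount to inspecting the blob names in each container (container and file names taken from the expected output):

```python
def verify(containers: dict[str, list[str]], prefix: str) -> list[str]:
    """Illustrative pass/fail report over per-container blob listings."""
    out = containers.get("output", [])
    checks = [
        ("PLY", any(b.startswith(prefix) and b.endswith(".ply") for b in out)),
        ("SPLAT", any(b.startswith(prefix) and b.endswith(".splat") for b in out)),
        ("processed", bool(containers.get("processed"))),   # inputs archived
        ("input cleaned", not containers.get("input")),     # all blobs moved
        ("error empty", not containers.get("error")),       # no failures
    ]
    return [f"{'✅' if ok else '❌'} {name}" for name, ok in checks]

report = verify(
    {
        "output": ["my_scene/my_scene.ply", "my_scene/my_scene.splat"],
        "processed": ["my_scene/a.mp4", "my_scene/b.mp4", "my_scene/c.mp4"],
        "input": [],
        "error": [],
    },
    "my_scene/",
)
```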

#### Step 6: Cleanup

```bash
docker stop azurite-e2e
docker network rm 3dgs-e2e-net
deactivate  # exit the Python venv
```

#### Fully automated alternative

The E2E script runs all of the above automatically:

```bash
./scripts/e2e/04-run-e2e.sh --mode batch
```

#### Using real Azure Blob Storage (production)

Replace Azurite with your Azure account. Authentication options (in priority order):

1. **SAS token**: `-e AZURE_STORAGE_SAS_TOKEN="?sv=2022-..."`
2. **Managed identity**: `-e AZURE_USE_MANAGED_IDENTITY=true` (Azure VMs/AKS)
3. **Azure CLI**: the default; requires `az login` on the host
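The selection implied by that priority order can be sketched as a pure function of the container's environment (illustrative only; the processor's real credential resolution is not shown here):

```python
def pick_auth(env: dict[str, str]) -> str:
    """Return which auth method the stated priority order would select."""
    if env.get("AZURE_STORAGE_SAS_TOKEN"):          # 1. SAS token wins if set
        return "sas"
    if env.get("AZURE_USE_MANAGED_IDENTITY", "").lower() == "true":
        return "managed-identity"                   # 2. managed identity
    return "azure-cli"                              # 3. default: `az login` on host

method = pick_auth({"AZURE_STORAGE_SAS_TOKEN": "?sv=2022-..."})
```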

```bash
docker run --rm \
  -e RUN_MODE=batch \
  -e AZURE_STORAGE_ACCOUNT=youraccount \
  -e "AZURE_STORAGE_SAS_TOKEN=?sv=2022-..." \
  -e BATCH_INPUT_PREFIX=scene_001/ \
  -e BACKEND=gsplat \
  --gpus all \
  youracr.azurecr.io/3dgs-processor:gpu
```

</details>