- _load_cforch_config() falls back to CF_ORCH_URL / CF_LICENSE_KEY / OLLAMA_HOST / OLLAMA_MODEL env vars when the label_tool.yaml cforch: key is absent or empty (yaml wins when both are present)
- CF_LICENSE_KEY forwarded to the benchmark subprocess env so the cf-orch agent can authenticate without it appearing in command args
- GET /api/cforch/config endpoint — returns resolved connection state; redacts the license key (returns a license_key_set bool only)
- SettingsView: connection status pill (cf-orch / Ollama / unconfigured) loaded from /api/cforch/config on mount; shows env vs yaml source
- .env.example documenting all relevant vars
- config/label_tool.yaml.example: full cforch: section with all keys
- environment.yml: add circuitforge-core>=0.9.0 dependency
- .gitignore: add .env
- 4 new tests (17 total in test_cforch.py); 136 passing overall

Closes #10
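The fallback order above (yaml key wins, env vars fill gaps) can be sketched roughly as follows. This is a minimal illustration, not the actual implementation — the config key names (`url`, `license_key`, `ollama_host`, `ollama_model`) are assumptions; only the env var names come from the changelog.

```python
import os

# Env var used when the corresponding cforch: yaml key is absent or empty.
# The dict keys here are hypothetical config field names.
ENV_FALLBACKS = {
    "url": "CF_ORCH_URL",
    "license_key": "CF_LICENSE_KEY",
    "ollama_host": "OLLAMA_HOST",
    "ollama_model": "OLLAMA_MODEL",
}

def load_cforch_config(yaml_cfg=None):
    """Merge the cforch: yaml section with env var fallbacks (yaml wins)."""
    cfg = dict(yaml_cfg or {})  # cforch: section may be absent or empty
    for key, env_var in ENV_FALLBACKS.items():
        if not cfg.get(key):  # yaml value wins when both are present
            cfg[key] = os.environ.get(env_var)
    return cfg
```

An empty or missing yaml section thus resolves entirely from the environment, while a populated yaml key is never overridden by an env var.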
# Avocet — environment variable configuration
# Copy to .env and fill in values. All keys are optional.
# label_tool.yaml takes precedence over env vars where both exist.

# ── Local inference (Ollama) ───────────────────────────────────────────────────
# OLLAMA_HOST defaults to http://localhost:11434 if unset.
OLLAMA_HOST=http://localhost:11434
OLLAMA_MODEL=llama3.2:3b

# ── cf-orch coordinator (paid/premium tiers) ───────────────────────────────────
# Required for multi-GPU LLM benchmarking via the cf-orch benchmark harness.
# Free-tier users can leave these unset and use Ollama only.
CF_ORCH_URL=http://localhost:7700
CF_LICENSE_KEY=CFG-AVCT-xxxx-xxxx-xxxx

# ── Cloud LLM backends (optional — paid/premium) ──────────────────────────────
# Set one of these to use a cloud LLM instead of a local model.
# ANTHROPIC_API_KEY=sk-ant-...
# OPENAI_API_KEY=sk-...