Pick up cf-core env-var LLM config + coordinator auth (local-first arch) #67

Closed
opened 2026-04-03 08:50:37 -07:00 by pyr0ball · 0 comments
Owner

Context

cf-core now supports two new capabilities that Peregrine should adopt:

1. LLM env-var auto-config (no llm.yaml required)

LLMRouter now auto-configures from env vars when ~/.config/circuitforge/llm.yaml is absent. Bare-metal self-hosters no longer need to copy a config file.

Env vars picked up automatically:

ANTHROPIC_API_KEY    → anthropic backend (claude-haiku-4-5 default)
OPENAI_API_KEY       → openai-compat → api.openai.com (gpt-4o-mini default)
OLLAMA_HOST          → openai-compat → local Ollama (default: http://localhost:11434)
OLLAMA_MODEL         → model name to request from Ollama (default: llama3.2:3b)
OPENAI_MODEL         → model override for OpenAI backend
ANTHROPIC_MODEL      → model override for Anthropic backend
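
The detection logic above can be sketched roughly as follows. This is a hypothetical illustration, not cf-core's actual code; in particular, the precedence order (Anthropic, then OpenAI, then Ollama) is an assumption taken from the order the variables are listed here:

```python
import os

def detect_backend():
    """Hypothetical sketch of env-var backend selection.

    Returns a (backend, model) pair, or None when no LLM env vars
    are set. Precedence order is an assumption, not confirmed by
    cf-core.
    """
    if os.environ.get("ANTHROPIC_API_KEY"):
        return ("anthropic",
                os.environ.get("ANTHROPIC_MODEL", "claude-haiku-4-5"))
    if os.environ.get("OPENAI_API_KEY"):
        return ("openai-compat",
                os.environ.get("OPENAI_MODEL", "gpt-4o-mini"))
    if os.environ.get("OLLAMA_HOST"):
        return ("openai-compat",
                os.environ.get("OLLAMA_MODEL", "llama3.2:3b"))
    return None
```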

2. CF-hosted coordinator auth (CF_LICENSE_KEY)

CFOrchClient now reads CF_LICENSE_KEY and sends it as Authorization: Bearer on all coordinator requests. Paid+ users can point CF_ORCH_URL at the hosted coordinator (https://orch.circuitforge.tech) using their license key.
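
A minimal sketch of what attaching the license key looks like, assuming `CF_ORCH_URL` falls back to the hosted coordinator when unset (the function name and fallback behavior are illustrative, not `CFOrchClient`'s actual API):

```python
import os
import urllib.request

def coordinator_request(path: str) -> urllib.request.Request:
    """Build a coordinator request with CF_LICENSE_KEY as a Bearer token.

    Hypothetical sketch mirroring the behavior described above; the
    default-URL fallback is an assumption.
    """
    base = os.environ.get("CF_ORCH_URL", "https://orch.circuitforge.tech")
    req = urllib.request.Request(base + path)
    key = os.environ.get("CF_LICENSE_KEY")
    if key:
        # Sent on all coordinator requests when a license key is present
        req.add_header("Authorization", f"Bearer {key}")
    return req
```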

Tasks

  • Add env vars to .env.example with comments explaining each
  • Update first-run wizard / onboarding to surface env-var LLM setup as the simple path
  • Verify LLMRouter fallback works end-to-end with just OLLAMA_HOST set
  • Bump circuitforge-core dependency to >= 0.6.0 once released
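
For the first task, the `.env.example` additions might look something like this (a draft sketch; comments and defaults are taken from the variable list above):

```shell
# --- LLM auto-config (used when ~/.config/circuitforge/llm.yaml is absent) ---
# Anthropic backend (default model: claude-haiku-4-5)
#ANTHROPIC_API_KEY=
#ANTHROPIC_MODEL=

# OpenAI backend via openai-compat (default model: gpt-4o-mini)
#OPENAI_API_KEY=
#OPENAI_MODEL=

# Local Ollama via openai-compat
#OLLAMA_HOST=http://localhost:11434
#OLLAMA_MODEL=llama3.2:3b

# --- CF-hosted coordinator (Paid+ plans) ---
#CF_ORCH_URL=https://orch.circuitforge.tech
#CF_LICENSE_KEY=
```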

Refs

cf-core commit: 3deae05 — feat: local-first LLM config + hosted coordinator auth

pyr0ball added the enhancement label 2026-04-03 08:50:37 -07:00
Reference: Circuit-Forge/peregrine#67