snipe/config/llm.cloud.yaml
pyr0ball af1ffa1d94
feat: wire Search with AI to cf-orch → Ollama (llama3.1:8b)
- Add app/llm/router.py shim — tri-level config lookup:
  repo config/llm.yaml → ~/.config/circuitforge/llm.yaml → env vars
- Add config/llm.cloud.yaml — ollama via cf-orch, llama3.1:8b
- Add config/llm.yaml.example — self-hosted reference config
- compose.cloud.yml: mount llm.cloud.yaml, set CF_ORCH_URL,
  add host.docker.internal:host-gateway (required on Linux Docker)
- api/main.py: use app.llm.router.LLMRouter (shim) instead of core directly
- .env.example: update LLM section to reference config/llm.yaml.example
- .gitignore: exclude config/llm.yaml (keep example + cloud yaml)

End-to-end tested: 3.2s for "used RTX 3080 under $400, no mining cards"
via cloud container → host.docker.internal:11434 → Ollama llama3.1:8b
2026-04-14 13:23:44 -07:00

# config/llm.cloud.yaml
# Snipe — LLM config for the managed cloud instance (menagerie)
#
# Mounted read-only into the cloud API container at /app/config/llm.yaml
# (see compose.cloud.yml). Personal fine-tunes and local-only backends
# (claude_code, copilot) are intentionally excluded here.
#
# CF Orchestrator routes both ollama and vllm allocations for VRAM-aware
# scheduling. CF_ORCH_URL must be set in .env for allocations to resolve;
# if cf-orch is unreachable the backend falls back to its static base_url.
#
# Model choice for query builder: llama3.1:8b
# - Reliable instruction following and JSON output
# - No creative fine-tuning drift (unlike writer models in the pool)
# - Fits comfortably in 8 GB VRAM alongside other services
backends:
  ollama:
    type: openai_compat
    base_url: http://host.docker.internal:11434/v1
    api_key: ollama
    model: llama3.1:8b
    enabled: true
    supports_images: false
    cf_orch:
      service: ollama
      ttl_s: 300
  anthropic:
    type: anthropic
    api_key_env: ANTHROPIC_API_KEY
    model: claude-haiku-4-5-20251001
    enabled: false
    supports_images: false
fallback_order:
  - ollama
  - anthropic
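
The `fallback_order` list above can be resolved with logic along these lines. This is a hypothetical sketch, not the actual Snipe router code: it simply walks the ordered list and picks the first backend whose `enabled` flag is true.

```python
def pick_backend(config: dict) -> str:
    """Return the name of the first enabled backend in fallback_order."""
    backends = config["backends"]
    for name in config["fallback_order"]:
        if backends.get(name, {}).get("enabled"):
            return name
    raise RuntimeError("no enabled LLM backend in fallback_order")


# Minimal mirror of the cloud config: ollama enabled, anthropic a disabled standby.
cfg = {
    "backends": {
        "ollama": {"enabled": True},
        "anthropic": {"enabled": False},
    },
    "fallback_order": ["ollama", "anthropic"],
}
```

With this config, `pick_backend(cfg)` selects `ollama`; flipping `anthropic.enabled` to true (and setting `ANTHROPIC_API_KEY`) would make it the standby that takes over when ollama is disabled.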