- Lower vue_ui_beta gate to "free" so all licensed users can access the new UI without a paid subscription
- Remove "Paid tier" wording from the Try New UI banner
- Fix Vue SPA navigation in cloud/demo deployments: add VITE_BASE_PATH build arg so Vite sets the correct subpath base, and pass import.meta.env.BASE_URL to createWebHistory() so router links emit /peregrine/... paths that Caddy can match
- Fix feedback button missing on cloud instance by passing FORGEJO_API_TOKEN through compose.cloud.yml
- Remove vLLM container from compose.yml (vLLM dropped from stack; cf-research service in cfcore covers the use case)
- Fix cloud config path in Apply page (use get_config_dir() so per-user cloud data roots resolve correctly for user.yaml and resume YAML)
- Refactor generate_cover_letter._build_system_context and _build_mission_notes to accept explicit profile arg (enables per-user cover letter generation in cloud multi-tenant mode)
- Add API proxy block to nginx.conf (Vue web container can now call /api/ directly without Vite dev proxy)
- Update .env.example: remove vLLM vars, add research model + tuning vars for external vLLM deployments
- Update llm.yaml: switch vllm base_url to host.docker.internal (vLLM now runs outside Docker stack)

Closes #63 (feedback button)
Related: #8 (Vue SPA), #50–#62 (parity milestone)
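The nginx API proxy block described above could look roughly like this sketch. The upstream name and port (`backend:8000`) are assumptions for illustration, not taken from the repo's actual nginx.conf:

```nginx
# Forward /api/ requests from the Vue web container to the backend service,
# replacing the Vite dev-server proxy that only exists in development.
location /api/ {
    proxy_pass http://backend:8000;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
}
```

With a block like this in place, the built SPA can issue same-origin `/api/` requests in production instead of relying on Vite's dev proxy configuration.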
41 lines
1.7 KiB
Text
# .env.example — copy to .env
# Auto-generated by the setup wizard, or fill in manually.
# NEVER commit .env to git.

STREAMLIT_PORT=8501
OLLAMA_PORT=11434
VLLM_PORT=8000
SEARXNG_PORT=8888
VISION_PORT=8002
VISION_MODEL=vikhyatk/moondream2
VISION_REVISION=2025-01-09

DOCS_DIR=~/Documents/JobSearch
OLLAMA_MODELS_DIR=~/models/ollama
VLLM_MODELS_DIR=~/models/vllm # override with full path to your model dir
VLLM_MODEL=Ouro-1.4B # cover letters — fast 1.4B model
VLLM_RESEARCH_MODEL=Ouro-2.6B-Thinking # research — reasoning 2.6B model; restart vllm to switch
VLLM_MAX_MODEL_LEN=4096 # increase to 8192 for Thinking models with long CoT
VLLM_GPU_MEM_UTIL=0.75 # lower to 0.6 if sharing GPU with other services
OLLAMA_DEFAULT_MODEL=llama3.2:3b

# API keys (required for remote profile)
ANTHROPIC_API_KEY=
OPENAI_COMPAT_URL=
OPENAI_COMPAT_KEY=

# Feedback button — Forgejo issue filing
FORGEJO_API_TOKEN=
FORGEJO_REPO=pyr0ball/peregrine
FORGEJO_API_URL=https://git.opensourcesolarpunk.com/api/v1
# GITHUB_TOKEN= # future — enable when public mirror is active
# GITHUB_REPO= # future

# Cloud multi-tenancy (compose.cloud.yml only — do not set for local installs)
CLOUD_MODE=false
CLOUD_DATA_ROOT=/devl/menagerie-data
DIRECTUS_JWT_SECRET= # must match website/.env DIRECTUS_SECRET value
CF_SERVER_SECRET= # random 64-char hex — generate: openssl rand -hex 32
PLATFORM_DB_URL=postgresql://cf_platform:<password>@host.docker.internal:5433/circuitforge_platform
HEIMDALL_URL=http://cf-license:8000 # internal Docker URL; override for external access
HEIMDALL_ADMIN_TOKEN= # must match ADMIN_TOKEN in circuitforge-license .env