Compare commits
No commits in common. "main" and "feature/vue-wizard" have entirely different histories.
main...feature/vue-wizard
44 changed files with 402 additions and 3603 deletions
.env.example (14 changed lines)

@@ -19,14 +19,6 @@ VLLM_MAX_MODEL_LEN=4096 # increase to 8192 for Thinking models with
VLLM_GPU_MEM_UTIL=0.75 # lower to 0.6 if sharing GPU with other services
OLLAMA_DEFAULT_MODEL=llama3.2:3b

# ── LLM env-var auto-config (alternative to config/llm.yaml) ─────────────────
# Set any of these to configure LLM backends without needing a config/llm.yaml.
# Priority: Anthropic > OpenAI-compat > Ollama (always tried as local fallback).
OLLAMA_HOST=http://localhost:11434 # Ollama host; override if on a different machine
OLLAMA_MODEL=llama3.2:3b # model to request from Ollama
OPENAI_MODEL=gpt-4o-mini # model override for OpenAI-compat backend
ANTHROPIC_MODEL=claude-haiku-4-5-20251001 # model override for Anthropic backend

# API keys (required for remote profile)
ANTHROPIC_API_KEY=
OPENAI_COMPAT_URL=

@@ -39,12 +31,6 @@ FORGEJO_API_URL=https://git.opensourcesolarpunk.com/api/v1
# GITHUB_TOKEN= # future — enable when public mirror is active
# GITHUB_REPO= # future

# ── CF-hosted coordinator (Paid+ tier) ───────────────────────────────────────
# Set CF_LICENSE_KEY to authenticate with the hosted coordinator.
# Leave both blank for local self-hosted cf-orch or bare-metal inference.
CF_LICENSE_KEY=
CF_ORCH_URL=https://orch.circuitforge.tech

# Cloud multi-tenancy (compose.cloud.yml only — do not set for local installs)
CLOUD_MODE=false
CLOUD_DATA_ROOT=/devl/menagerie-data
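The priority rule stated in the comments above (Anthropic > OpenAI-compat > Ollama, with Ollama always appended as the local fallback) can be sketched as follows; the function name and dict-based interface are illustrative, not Peregrine's actual API.

```python
def backend_chain(env: dict[str, str]) -> list[str]:
    """Order LLM backends by the documented priority: Anthropic first if an
    API key is set, then an OpenAI-compatible endpoint if a URL is set, and
    Ollama always last as the local fallback."""
    chain = []
    if env.get("ANTHROPIC_API_KEY"):
        chain.append("anthropic")
    if env.get("OPENAI_COMPAT_URL"):
        chain.append("openai-compat")
    chain.append("ollama")  # always tried as local fallback
    return chain

print(backend_chain({"ANTHROPIC_API_KEY": "sk-..."}))  # ['anthropic', 'ollama']
print(backend_chain({}))                               # ['ollama']
```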
CHANGELOG.md (38 changed lines)

@@ -9,44 +9,6 @@ Format follows [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).

---

## [0.8.5] — 2026-04-02

### Added

- **Vue onboarding wizard** — 7-step first-run setup replaces the Streamlit wizard
  in the Vue SPA: Hardware detection → Tier → Resume upload/build → Identity →
  Inference & API keys → Search preferences → Integrations. Progress saves to
  `user.yaml` on every step; crash-recovery resumes from the last completed step.
- **Wizard API endpoints** — `GET /api/wizard/status`, `POST /api/wizard/step`,
  `GET /api/wizard/hardware`, `POST /api/wizard/inference/test`,
  `POST /api/wizard/complete`. Inference test always soft-fails so Ollama being
  unreachable never blocks setup completion.
- **Cloud auto-skip** — cloud instances automatically complete steps 1 (hardware),
  2 (tier), and 5 (inference) and drop the user directly on the Resume step.
- **`wizardGuard` router gate** — all Vue routes require wizard completion; completed
  users are bounced away from `/setup` to `/`.
- **Chip-input search step** — job titles and locations entered as press-Enter/comma
  chips; validates at least one title before advancing.
- **Integrations tile grid** — optional step 7 shows Notion, Calendar, Slack, Discord,
  Drive with paid-tier badges; skippable on Finish.

### Fixed

- **User config isolation: dangerous fallback removed** — `_user_yaml_path()` fell
  back to `/devl/job-seeker/config/user.yaml` (legacy profile) when `user.yaml`
  didn't exist at the expected path; new users now get an empty dict instead of
  another user's data. Affects profile, resume, search, and all wizard endpoints.
- **Resume path not user-isolated** — `RESUME_PATH = Path("config/plain_text_resume.yaml")`
  was a relative CWD path shared across all users. Replaced with `_resume_path()`
  derived from `_user_yaml_path()` / `STAGING_DB`.
- **Resume upload silently returned empty data** — `upload_resume` was passing a
  file path string to `structure_resume()` which expects raw text; now reads bytes
  and dispatches to the correct extractor (`extract_text_from_pdf` / `_docx` / `_odt`).
- **Wizard resume step read wrong envelope field** — `WizardResumeStep.vue` read
  `data.experience` but the upload response wraps parsed data under `data.data`.

---

## [0.8.4] — 2026-04-02

### Fixed
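The extractor-dispatch fix described under "Resume upload silently returned empty data" amounts to reading the uploaded bytes and choosing a parser by file suffix, rather than handing a path string to a function that expects text. A minimal sketch; the extractor bodies here are placeholders, not Peregrine's real implementations.

```python
from pathlib import Path

# Placeholder extractors: the real ones parse PDF/DOCX/ODT bytes into text.
def extract_text_from_pdf(data: bytes) -> str: return "pdf text"
def extract_text_from_docx(data: bytes) -> str: return "docx text"
def extract_text_from_odt(data: bytes) -> str: return "odt text"

_EXTRACTORS = {
    ".pdf": extract_text_from_pdf,
    ".docx": extract_text_from_docx,
    ".odt": extract_text_from_odt,
}

def extract_resume_text(filename: str, data: bytes) -> str:
    """Dispatch on suffix and return raw text; never pass a path string downstream."""
    suffix = Path(filename).suffix.lower()
    if suffix not in _EXTRACTORS:
        raise ValueError(f"Unsupported resume format: {suffix}")
    return _EXTRACTORS[suffix](data)

print(extract_resume_text("cv.pdf", b"%PDF..."))  # pdf text
```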
@@ -154,7 +154,7 @@ Re-enter the wizard any time via **Settings → Developer → Reset wizard**.
| Calendar sync (Google, Apple) | Paid |
| Slack notifications | Paid |
| CircuitForge shared cover-letter model | Paid |
| Vue 3 SPA — full UI with onboarding wizard, job board, apply workspace, sort/filter, research modal, draft cover letter | Free |
| Vue 3 SPA beta UI | Paid |
| **Voice guidelines** (custom writing style & tone) | Premium with LLM¹ ² |
| Cover letter model fine-tuning (your writing, your model) | Premium |
| Multi-user support | Premium |
app/Home.py (271 changed lines)

@@ -19,8 +19,8 @@ _profile = UserProfile(_USER_YAML) if UserProfile.exists(_USER_YAML) else None
_name = _profile.name if _profile else "Job Seeker"

from scripts.db import init_db, get_job_counts, purge_jobs, purge_email_data, \
purge_non_remote, archive_jobs, kill_stuck_tasks, cancel_task, \
get_task_for_job, get_active_tasks, insert_job, get_existing_urls
purge_non_remote, archive_jobs, kill_stuck_tasks, get_task_for_job, get_active_tasks, \
insert_job, get_existing_urls
from scripts.task_runner import submit_task
from app.cloud_session import resolve_session, get_db_path
@@ -376,144 +376,177 @@ _scrape_status()

st.divider()

# ── Danger zone ───────────────────────────────────────────────────────────────
# ── Danger zone: purge + re-scrape ────────────────────────────────────────────
with st.expander("⚠️ Danger Zone", expanded=False):

# ── Queue reset (the common case) ─────────────────────────────────────────
st.markdown("**Queue reset**")
st.caption(
"Archive clears your review queue while keeping job URLs for dedup, "
"so the same listings won't resurface on the next discovery run. "
"Use hard purge only if you want a full clean slate including dedup history."
"**Purge** permanently deletes jobs from the local database. "
"Applied and synced jobs are never touched."
)

_scope = st.radio(
"Clear scope",
["Pending only", "Pending + approved (stale search)"],
horizontal=True,
label_visibility="collapsed",
)
_scope_statuses = (
["pending"] if _scope == "Pending only" else ["pending", "approved"]
)
purge_col, rescrape_col, email_col, tasks_col = st.columns(4)

_qc1, _qc2, _qc3 = st.columns([2, 2, 4])
if _qc1.button("📦 Archive & reset", use_container_width=True, type="primary"):
st.session_state["confirm_dz"] = "archive"
if _qc2.button("🗑 Hard purge (delete)", use_container_width=True):
st.session_state["confirm_dz"] = "purge"
with purge_col:
st.markdown("**Purge pending & rejected**")
st.caption("Removes all _pending_ and _rejected_ listings so the next discovery starts fresh.")
if st.button("🗑 Purge Pending + Rejected", use_container_width=True):
st.session_state["confirm_purge"] = "partial"

if st.session_state.get("confirm_dz") == "archive":
st.info(
f"Archive **{', '.join(_scope_statuses)}** jobs? "
"URLs are kept for dedup — nothing is permanently deleted."
)
_dc1, _dc2 = st.columns(2)
if _dc1.button("Yes, archive", type="primary", use_container_width=True, key="dz_archive_confirm"):
n = archive_jobs(get_db_path(), statuses=_scope_statuses)
st.success(f"Archived {n} jobs.")
st.session_state.pop("confirm_dz", None)
if st.session_state.get("confirm_purge") == "partial":
st.warning("Are you sure? This cannot be undone.")
c1, c2 = st.columns(2)
if c1.button("Yes, purge", type="primary", use_container_width=True):
deleted = purge_jobs(get_db_path(), statuses=["pending", "rejected"])
st.success(f"Purged {deleted} jobs.")
st.session_state.pop("confirm_purge", None)
st.rerun()
if _dc2.button("Cancel", use_container_width=True, key="dz_archive_cancel"):
st.session_state.pop("confirm_dz", None)
if c2.button("Cancel", use_container_width=True):
st.session_state.pop("confirm_purge", None)
st.rerun()

if st.session_state.get("confirm_dz") == "purge":
st.warning(
f"Permanently delete **{', '.join(_scope_statuses)}** jobs? "
"This removes the URLs from dedup history too. Cannot be undone."
)
_dc1, _dc2 = st.columns(2)
if _dc1.button("Yes, delete", type="primary", use_container_width=True, key="dz_purge_confirm"):
n = purge_jobs(get_db_path(), statuses=_scope_statuses)
st.success(f"Deleted {n} jobs.")
st.session_state.pop("confirm_dz", None)
with email_col:
st.markdown("**Purge email data**")
st.caption("Clears all email thread logs and email-sourced pending jobs so the next sync starts fresh.")
if st.button("📧 Purge Email Data", use_container_width=True):
st.session_state["confirm_purge"] = "email"

if st.session_state.get("confirm_purge") == "email":
st.warning("This deletes all email contacts and email-sourced jobs. Cannot be undone.")
c1, c2 = st.columns(2)
if c1.button("Yes, purge emails", type="primary", use_container_width=True):
contacts, jobs = purge_email_data(get_db_path())
st.success(f"Purged {contacts} email contacts, {jobs} email jobs.")
st.session_state.pop("confirm_purge", None)
st.rerun()
if _dc2.button("Cancel", use_container_width=True, key="dz_purge_cancel"):
st.session_state.pop("confirm_dz", None)
if c2.button("Cancel ", use_container_width=True):
st.session_state.pop("confirm_purge", None)
st.rerun()

st.divider()

# ── Background tasks ──────────────────────────────────────────────────────
with tasks_col:
_active = get_active_tasks(get_db_path())
st.markdown(f"**Background tasks** — {len(_active)} active")

if _active:
_task_icons = {"cover_letter": "✉️", "research": "🔍", "discovery": "🌐", "enrich_descriptions": "📝"}
for _t in _active:
_tc1, _tc2, _tc3 = st.columns([3, 4, 2])
_icon = _task_icons.get(_t["task_type"], "⚙️")
_tc1.caption(f"{_icon} `{_t['task_type']}`")
_job_label = f"{_t['title']} @ {_t['company']}" if _t.get("title") else f"job #{_t['job_id']}"
_tc2.caption(_job_label)
_tc3.caption(f"_{_t['status']}_")
if st.button("✕ Cancel", key=f"dz_cancel_task_{_t['id']}", use_container_width=True):
cancel_task(get_db_path(), _t["id"])
st.rerun()
st.caption("")

_kill_col, _ = st.columns([2, 6])
if _kill_col.button("⏹ Kill all stuck", use_container_width=True, disabled=len(_active) == 0):
st.markdown("**Kill stuck tasks**")
st.caption(f"Force-fail all queued/running background tasks. Currently **{len(_active)}** active.")
if st.button("⏹ Kill All Tasks", use_container_width=True, disabled=len(_active) == 0):
killed = kill_stuck_tasks(get_db_path())
st.success(f"Killed {killed} task(s).")
st.rerun()

st.divider()
with rescrape_col:
st.markdown("**Purge all & re-scrape**")
st.caption("Wipes _all_ non-applied, non-synced jobs then immediately runs a fresh discovery.")
if st.button("🔄 Purge All + Re-scrape", use_container_width=True):
st.session_state["confirm_purge"] = "full"

# ── Rarely needed (collapsed) ─────────────────────────────────────────────
with st.expander("More options", expanded=False):
_rare1, _rare2, _rare3 = st.columns(3)

with _rare1:
st.markdown("**Purge email data**")
st.caption("Clears all email thread logs and email-sourced pending jobs.")
if st.button("📧 Purge Email Data", use_container_width=True):
st.session_state["confirm_dz"] = "email"
if st.session_state.get("confirm_dz") == "email":
st.warning("Deletes all email contacts and email-sourced jobs. Cannot be undone.")
_ec1, _ec2 = st.columns(2)
if _ec1.button("Yes, purge emails", type="primary", use_container_width=True, key="dz_email_confirm"):
contacts, jobs = purge_email_data(get_db_path())
st.success(f"Purged {contacts} email contacts, {jobs} email jobs.")
st.session_state.pop("confirm_dz", None)
st.rerun()
if _ec2.button("Cancel", use_container_width=True, key="dz_email_cancel"):
st.session_state.pop("confirm_dz", None)
st.rerun()

with _rare2:
st.markdown("**Purge non-remote**")
st.caption("Removes pending/approved/rejected on-site listings from the DB.")
if st.button("🏢 Purge On-site Jobs", use_container_width=True):
st.session_state["confirm_dz"] = "non_remote"
if st.session_state.get("confirm_dz") == "non_remote":
st.warning("Deletes all non-remote jobs not yet applied to. Cannot be undone.")
_rc1, _rc2 = st.columns(2)
if _rc1.button("Yes, purge on-site", type="primary", use_container_width=True, key="dz_nonremote_confirm"):
deleted = purge_non_remote(get_db_path())
st.success(f"Purged {deleted} non-remote jobs.")
st.session_state.pop("confirm_dz", None)
st.rerun()
if _rc2.button("Cancel", use_container_width=True, key="dz_nonremote_cancel"):
st.session_state.pop("confirm_dz", None)
st.rerun()

with _rare3:
st.markdown("**Wipe all + re-scrape**")
st.caption("Deletes all non-applied jobs then immediately runs a fresh discovery.")
if st.button("🔄 Wipe + Re-scrape", use_container_width=True):
st.session_state["confirm_dz"] = "rescrape"
if st.session_state.get("confirm_dz") == "rescrape":
st.warning("Wipes ALL pending, approved, and rejected jobs, then re-scrapes. Applied and synced records are kept.")
_wc1, _wc2 = st.columns(2)
if _wc1.button("Yes, wipe + scrape", type="primary", use_container_width=True, key="dz_rescrape_confirm"):
if st.session_state.get("confirm_purge") == "full":
st.warning("This will delete ALL pending, approved, and rejected jobs, then re-scrape. Applied and synced records are kept.")
c1, c2 = st.columns(2)
if c1.button("Yes, wipe + scrape", type="primary", use_container_width=True):
purge_jobs(get_db_path(), statuses=["pending", "approved", "rejected"])
submit_task(get_db_path(), "discovery", 0)
st.session_state.pop("confirm_dz", None)
st.session_state.pop("confirm_purge", None)
st.rerun()
if _wc2.button("Cancel", use_container_width=True, key="dz_rescrape_cancel"):
st.session_state.pop("confirm_dz", None)
if c2.button("Cancel ", use_container_width=True):
st.session_state.pop("confirm_purge", None)
st.rerun()

st.divider()

pending_col, nonremote_col, approved_col, _ = st.columns(4)

with pending_col:
st.markdown("**Purge pending review**")
st.caption("Removes only _pending_ listings, keeping your rejected history intact.")
if st.button("🗑 Purge Pending Only", use_container_width=True):
st.session_state["confirm_purge"] = "pending_only"

if st.session_state.get("confirm_purge") == "pending_only":
st.warning("Deletes all pending jobs. Rejected jobs are kept. Cannot be undone.")
c1, c2 = st.columns(2)
if c1.button("Yes, purge pending", type="primary", use_container_width=True):
deleted = purge_jobs(get_db_path(), statuses=["pending"])
st.success(f"Purged {deleted} pending jobs.")
st.session_state.pop("confirm_purge", None)
st.rerun()
if c2.button("Cancel ", use_container_width=True):
st.session_state.pop("confirm_purge", None)
st.rerun()

with nonremote_col:
st.markdown("**Purge non-remote**")
st.caption("Removes pending/approved/rejected jobs where remote is not set. Keeps anything already in the pipeline.")
if st.button("🏢 Purge On-site Jobs", use_container_width=True):
st.session_state["confirm_purge"] = "non_remote"

if st.session_state.get("confirm_purge") == "non_remote":
st.warning("Deletes all non-remote jobs not yet applied to. Cannot be undone.")
c1, c2 = st.columns(2)
if c1.button("Yes, purge on-site", type="primary", use_container_width=True):
deleted = purge_non_remote(get_db_path())
st.success(f"Purged {deleted} non-remote jobs.")
st.session_state.pop("confirm_purge", None)
st.rerun()
if c2.button("Cancel ", use_container_width=True):
st.session_state.pop("confirm_purge", None)
st.rerun()

with approved_col:
st.markdown("**Purge approved (unapplied)**")
st.caption("Removes _approved_ jobs you haven't applied to yet — e.g. to reset after a review pass.")
if st.button("🗑 Purge Approved", use_container_width=True):
st.session_state["confirm_purge"] = "approved_only"

if st.session_state.get("confirm_purge") == "approved_only":
st.warning("Deletes all approved-but-not-applied jobs. Cannot be undone.")
c1, c2 = st.columns(2)
if c1.button("Yes, purge approved", type="primary", use_container_width=True):
deleted = purge_jobs(get_db_path(), statuses=["approved"])
st.success(f"Purged {deleted} approved jobs.")
st.session_state.pop("confirm_purge", None)
st.rerun()
if c2.button("Cancel ", use_container_width=True):
st.session_state.pop("confirm_purge", None)
st.rerun()

st.divider()

archive_col1, archive_col2, _, _ = st.columns(4)

with archive_col1:
st.markdown("**Archive remaining**")
st.caption(
"Move all _pending_ and _rejected_ jobs to archived status. "
"Archived jobs stay in the DB for dedup — they just won't appear in Job Review."
)
if st.button("📦 Archive Pending + Rejected", use_container_width=True):
st.session_state["confirm_purge"] = "archive_remaining"

if st.session_state.get("confirm_purge") == "archive_remaining":
st.info("Jobs will be archived (not deleted) — URLs are kept for dedup.")
c1, c2 = st.columns(2)
if c1.button("Yes, archive", type="primary", use_container_width=True):
archived = archive_jobs(get_db_path(), statuses=["pending", "rejected"])
st.success(f"Archived {archived} jobs.")
st.session_state.pop("confirm_purge", None)
st.rerun()
if c2.button("Cancel ", use_container_width=True):
st.session_state.pop("confirm_purge", None)
st.rerun()

with archive_col2:
st.markdown("**Archive approved (unapplied)**")
st.caption("Archive _approved_ listings you decided to skip — keeps history without cluttering the apply queue.")
if st.button("📦 Archive Approved", use_container_width=True):
st.session_state["confirm_purge"] = "archive_approved"

if st.session_state.get("confirm_purge") == "archive_approved":
st.info("Approved jobs will be archived (not deleted).")
c1, c2 = st.columns(2)
if c1.button("Yes, archive approved", type="primary", use_container_width=True):
archived = archive_jobs(get_db_path(), statuses=["approved"])
st.success(f"Archived {archived} approved jobs.")
st.session_state.pop("confirm_purge", None)
st.rerun()
if c2.button("Cancel ", use_container_width=True):
st.session_state.pop("confirm_purge", None)
st.rerun()

# ── Setup banners ─────────────────────────────────────────────────────────────
@@ -17,16 +17,10 @@ sys.path.insert(0, str(Path(__file__).parent.parent))

logging.basicConfig(level=logging.WARNING, format="%(name)s %(levelname)s: %(message)s")

# Load .env before any os.environ reads — safe to call inside Docker too
# (uses setdefault, so Docker-injected vars take precedence over .env values)
from circuitforge_core.config.settings import load_env as _load_env
_load_env(Path(__file__).parent.parent / ".env")

IS_DEMO = os.environ.get("DEMO_MODE", "").lower() in ("1", "true", "yes")

import streamlit as st
from scripts.db import DEFAULT_DB, init_db, get_active_tasks
from scripts.db_migrate import migrate_db
from app.feedback import inject_feedback_button
from app.cloud_session import resolve_session, get_db_path, get_config_dir, get_cloud_tier
import sqlite3

@@ -42,7 +36,6 @@ st.set_page_config(

resolve_session("peregrine")
init_db(get_db_path())
migrate_db(Path(get_db_path()))

# Demo tier — initialize once per session (cookie persistence handled client-side)
if IS_DEMO and "simulated_tier" not in st.session_state:
@@ -457,11 +457,6 @@ elif step == 5:
from app.wizard.step_inference import validate

st.subheader("Step 5 \u2014 Inference & API Keys")
st.info(
"**Simplest setup:** set `OLLAMA_HOST` in your `.env` file — "
"Peregrine auto-detects it, no config file needed. "
"Or use the fields below to configure API keys and endpoints."
)
profile = saved_yaml.get("inference_profile", "remote")

if profile == "remote":

@@ -471,18 +466,8 @@ elif step == 5:
placeholder="https://api.together.xyz/v1")
openai_key = st.text_input("Endpoint API Key (optional)", type="password",
key="oai_key") if openai_url else ""
ollama_host = st.text_input("Ollama host (optional \u2014 local fallback)",
placeholder="http://localhost:11434",
key="ollama_host_input")
ollama_model = st.text_input("Ollama model (optional)",
value="llama3.2:3b",
key="ollama_model_input")
else:
st.info(f"Local mode ({profile}): Ollama provides inference.")
import os
_ollama_host_env = os.environ.get("OLLAMA_HOST", "")
if _ollama_host_env:
st.caption(f"OLLAMA_HOST from .env: `{_ollama_host_env}`")
anthropic_key = openai_url = openai_key = ""

with st.expander("Advanced \u2014 Service Ports & Hosts"):

@@ -561,14 +546,6 @@ elif step == 5:
if anthropic_key or openai_url:
env_path.write_text("\n".join(env_lines) + "\n")

if profile == "remote":
if ollama_host:
env_lines = _set_env(env_lines, "OLLAMA_HOST", ollama_host)
if ollama_model:
env_lines = _set_env(env_lines, "OLLAMA_MODEL", ollama_model)
if ollama_host or ollama_model:
env_path.write_text("\n".join(env_lines) + "\n")

_save_yaml({"services": svc, "wizard_step": 5})
st.session_state.wizard_step = 6
st.rerun()
@@ -45,30 +45,6 @@ services:
      - "host.docker.internal:host-gateway"
    restart: unless-stopped

  api:
    build:
      context: ..
      dockerfile: peregrine/Dockerfile.cfcore
    command: >
      bash -c "uvicorn dev_api:app --host 0.0.0.0 --port 8601"
    volumes:
      - /devl/menagerie-data:/devl/menagerie-data
      - ./config/llm.cloud.yaml:/app/config/llm.yaml:ro
    environment:
      - CLOUD_MODE=true
      - CLOUD_DATA_ROOT=/devl/menagerie-data
      - STAGING_DB=/devl/menagerie-data/cloud-default.db
      - DIRECTUS_JWT_SECRET=${DIRECTUS_JWT_SECRET}
      - CF_SERVER_SECRET=${CF_SERVER_SECRET}
      - PLATFORM_DB_URL=${PLATFORM_DB_URL}
      - HEIMDALL_URL=${HEIMDALL_URL:-http://cf-license:8000}
      - HEIMDALL_ADMIN_TOKEN=${HEIMDALL_ADMIN_TOKEN}
      - PYTHONUNBUFFERED=1
      - FORGEJO_API_TOKEN=${FORGEJO_API_TOKEN:-}
    extra_hosts:
      - "host.docker.internal:host-gateway"
    restart: unless-stopped

  web:
    build:
      context: .

@@ -77,8 +53,6 @@ services:
      VITE_BASE_PATH: /peregrine/
    ports:
      - "8508:80"
    depends_on:
      - api
    restart: unless-stopped

  searxng:
dev-api.py (823 changed lines)
File diff suppressed because it is too large
@@ -2,8 +2,6 @@ server {
    listen 80;
    server_name _;

    client_max_body_size 20m;

    root /usr/share/nginx/html;
    index index.html;
@@ -102,23 +102,6 @@ Before opening a pull request:

---

## Database Migrations

Peregrine uses a numbered SQL migration system (Rails-style). Each migration is a `.sql` file in the `migrations/` directory at the repo root, named `NNN_description.sql` (e.g. `002_add_foo_column.sql`). Applied migrations are tracked in a `schema_migrations` table in each user database.

### Adding a migration

1. Create `migrations/NNN_description.sql` where `NNN` is the next sequential number (zero-padded to 3 digits).
2. Write standard SQL — `CREATE TABLE IF NOT EXISTS`, `ALTER TABLE ADD COLUMN`, etc. Keep each migration idempotent where possible.
3. Do **not** modify `scripts/db.py`'s legacy `_MIGRATIONS` lists — those are superseded and will be removed once all active databases have been bootstrapped by the migration runner.
4. The runner (`scripts/db_migrate.py`) applies pending migrations at startup automatically (both FastAPI and Streamlit paths call `migrate_db(db_path)`).

### Rollbacks

SQLite does not support transactional DDL for all statement types. Write forward-only migrations. If you need to undo a schema change, add a new migration that reverses it.

---

## What NOT to Do

- Do not commit `config/user.yaml`, `config/notion.yaml`, `config/email.yaml`, `config/adzuna.yaml`, or any `config/integrations/*.yaml` — all are gitignored
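The migration workflow described in the contributing notes above can be exercised end-to-end against a throwaway database. The two migration bodies below are made-up examples in the `NNN_description` naming scheme; the tracking logic mirrors what the runner does, using a `schema_migrations` table so each version runs exactly once.

```python
import sqlite3

# Hypothetical migrations, keyed by version stem (sorted order matters).
MIGRATIONS = {
    "001_baseline": "CREATE TABLE IF NOT EXISTS jobs (id INTEGER PRIMARY KEY, title TEXT);",
    "002_add_notes_column": "ALTER TABLE jobs ADD COLUMN notes TEXT;",
}

def apply_pending(con: sqlite3.Connection) -> list[str]:
    """Apply migrations in sorted order, recording each in schema_migrations."""
    con.execute(
        "CREATE TABLE IF NOT EXISTS schema_migrations ("
        " version TEXT PRIMARY KEY,"
        " applied_at TEXT NOT NULL DEFAULT (datetime('now')))"
    )
    done = {row[0] for row in con.execute("SELECT version FROM schema_migrations")}
    applied = []
    for version in sorted(MIGRATIONS):
        if version in done:
            continue  # already recorded: skip
        con.executescript(MIGRATIONS[version])
        con.execute("INSERT INTO schema_migrations (version) VALUES (?)", (version,))
        con.commit()
        applied.append(version)
    return applied

con = sqlite3.connect(":memory:")
print(apply_pending(con))  # ['001_baseline', '002_add_notes_column']
print(apply_pending(con))  # []  (second run is a no-op)
```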
@@ -1,97 +0,0 @@
-- Migration 001: Baseline schema
-- Captures the full schema as of v0.8.5 (all columns including those added via ALTER TABLE)

CREATE TABLE IF NOT EXISTS jobs (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    title TEXT,
    company TEXT,
    url TEXT UNIQUE,
    source TEXT,
    location TEXT,
    is_remote INTEGER DEFAULT 0,
    salary TEXT,
    description TEXT,
    match_score REAL,
    keyword_gaps TEXT,
    date_found TEXT,
    status TEXT DEFAULT 'pending',
    notion_page_id TEXT,
    cover_letter TEXT,
    applied_at TEXT,
    interview_date TEXT,
    rejection_stage TEXT,
    phone_screen_at TEXT,
    interviewing_at TEXT,
    offer_at TEXT,
    hired_at TEXT,
    survey_at TEXT,
    calendar_event_id TEXT,
    optimized_resume TEXT,
    ats_gap_report TEXT
);

CREATE TABLE IF NOT EXISTS job_contacts (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    job_id INTEGER,
    direction TEXT,
    subject TEXT,
    from_addr TEXT,
    to_addr TEXT,
    body TEXT,
    received_at TEXT,
    is_response_needed INTEGER DEFAULT 0,
    responded_at TEXT,
    message_id TEXT,
    stage_signal TEXT,
    suggestion_dismissed INTEGER DEFAULT 0
);

CREATE TABLE IF NOT EXISTS company_research (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    job_id INTEGER UNIQUE,
    generated_at TEXT,
    company_brief TEXT,
    ceo_brief TEXT,
    talking_points TEXT,
    raw_output TEXT,
    tech_brief TEXT,
    funding_brief TEXT,
    competitors_brief TEXT,
    red_flags TEXT,
    scrape_used INTEGER DEFAULT 0,
    accessibility_brief TEXT
);

CREATE TABLE IF NOT EXISTS background_tasks (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    task_type TEXT,
    job_id INTEGER,
    params TEXT,
    status TEXT DEFAULT 'pending',
    error TEXT,
    created_at TEXT,
    started_at TEXT,
    finished_at TEXT,
    stage TEXT,
    updated_at TEXT
);

CREATE TABLE IF NOT EXISTS survey_responses (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    job_id INTEGER,
    survey_name TEXT,
    received_at TEXT,
    source TEXT,
    raw_input TEXT,
    image_path TEXT,
    mode TEXT,
    llm_output TEXT,
    reported_score REAL,
    created_at TEXT
);

CREATE TABLE IF NOT EXISTS digest_queue (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    job_contact_id INTEGER UNIQUE,
    created_at TEXT
);
@@ -3,12 +3,10 @@
# Keep in sync with environment.yml

# ── CircuitForge shared core ───────────────────────────────────────────────
# Requires circuitforge-core >= 0.8.0 (config.load_env, db, tasks; resources moved to circuitforge-orch).
# Local dev / Docker (parent-context build): path install works because
# circuitforge-core/ is a sibling directory.
# CI / fresh checkouts: falls back to the Forgejo VCS URL below.
# To use local editable install run: pip install -e ../circuitforge-core
# TODO: pin to @v0.7.0 tag once cf-core cuts a release tag.
git+https://git.opensourcesolarpunk.com/Circuit-Forge/circuitforge-core.git@main

# ── Web UI ────────────────────────────────────────────────────────────────
@@ -383,19 +383,6 @@ def mark_applied(db_path: Path = DEFAULT_DB, ids: list[int] = None) -> None:
    conn.close()


def cancel_task(db_path: Path = DEFAULT_DB, task_id: int = 0) -> bool:
    """Cancel a single queued/running task by id. Returns True if a row was updated."""
    conn = sqlite3.connect(db_path)
    count = conn.execute(
        "UPDATE background_tasks SET status='failed', error='Cancelled by user',"
        " finished_at=datetime('now') WHERE id=? AND status IN ('queued','running')",
        (task_id,),
    ).rowcount
    conn.commit()
    conn.close()
    return count > 0


def kill_stuck_tasks(db_path: Path = DEFAULT_DB) -> int:
    """Mark all queued/running background tasks as failed. Returns count killed."""
    conn = sqlite3.connect(db_path)
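The `cancel_task` hunk above only flips rows that are still queued or running, and reports success via the UPDATE's `rowcount`. A quick check of that guard on a throwaway database; this sketch takes a connection instead of a path, and trims `background_tasks` to just the columns the UPDATE touches.

```python
import sqlite3

def cancel_task(conn: sqlite3.Connection, task_id: int) -> bool:
    # Same guard as scripts/db.py: only queued/running rows are cancellable.
    count = conn.execute(
        "UPDATE background_tasks SET status='failed', error='Cancelled by user',"
        " finished_at=datetime('now') WHERE id=? AND status IN ('queued','running')",
        (task_id,),
    ).rowcount
    conn.commit()
    return count > 0

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE background_tasks (id INTEGER PRIMARY KEY, status TEXT, error TEXT, finished_at TEXT)")
conn.execute("INSERT INTO background_tasks (id, status) VALUES (1, 'running'), (2, 'finished')")
print(cancel_task(conn, 1))  # True: running task was cancelled
print(cancel_task(conn, 2))  # False: finished task left untouched
```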
@ -1,73 +0,0 @@
"""
db_migrate.py — Rails-style numbered SQL migration runner for Peregrine user DBs.

Migration files live in migrations/ (sibling to this script's parent directory),
named NNN_description.sql (e.g. 001_baseline.sql). They are applied in sorted
order and tracked in the schema_migrations table so each runs exactly once.

Usage:
    from scripts.db_migrate import migrate_db
    migrate_db(Path("/path/to/user.db"))
"""

import logging
import sqlite3
from pathlib import Path

log = logging.getLogger(__name__)

# Resolved at import time: peregrine repo root / migrations/
_MIGRATIONS_DIR = Path(__file__).parent.parent / "migrations"

_CREATE_MIGRATIONS_TABLE = """
CREATE TABLE IF NOT EXISTS schema_migrations (
    version TEXT PRIMARY KEY,
    applied_at TEXT NOT NULL DEFAULT (datetime('now'))
)
"""


def migrate_db(db_path: Path) -> list[str]:
    """Apply any pending migrations to db_path. Returns list of applied versions."""
    applied: list[str] = []

    con = sqlite3.connect(db_path)
    try:
        con.execute(_CREATE_MIGRATIONS_TABLE)
        con.commit()

        if not _MIGRATIONS_DIR.is_dir():
            log.warning("migrations/ directory not found at %s — skipping", _MIGRATIONS_DIR)
            return applied

        migration_files = sorted(_MIGRATIONS_DIR.glob("*.sql"))
        if not migration_files:
            return applied

        already_applied = {
            row[0] for row in con.execute("SELECT version FROM schema_migrations")
        }

        for path in migration_files:
            version = path.stem  # e.g. "001_baseline"
            if version in already_applied:
                continue

            sql = path.read_text(encoding="utf-8")
            log.info("Applying migration %s to %s", version, db_path.name)
            try:
                con.executescript(sql)
                con.execute(
                    "INSERT INTO schema_migrations (version) VALUES (?)", (version,)
                )
                con.commit()
                applied.append(version)
                log.info("Migration %s applied successfully", version)
            except Exception as exc:
                con.rollback()
                log.error("Migration %s failed: %s", version, exc)
                raise RuntimeError(f"Migration {version} failed: {exc}") from exc
    finally:
        con.close()

    return applied
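The runner deleted above reduces to a small pattern: ensure a tracking table, skip versions already recorded, apply the rest in sorted filename order. A minimal sketch of that pattern; the in-memory DB and throwaway migrations directory here are illustrative, not the real Peregrine layout:

```python
import sqlite3
import tempfile
from pathlib import Path

# Throwaway migrations dir with a single numbered .sql file
mdir = Path(tempfile.mkdtemp()) / "migrations"
mdir.mkdir()
(mdir / "001_baseline.sql").write_text(
    "CREATE TABLE IF NOT EXISTS widgets (id INTEGER PRIMARY KEY);"
)

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE IF NOT EXISTS schema_migrations ("
    "version TEXT PRIMARY KEY, applied_at TEXT NOT NULL DEFAULT (datetime('now')))"
)

applied = []
done = {r[0] for r in con.execute("SELECT version FROM schema_migrations")}
for path in sorted(mdir.glob("*.sql")):      # sorted => 001, 002, ... order
    if path.stem in done:                    # version already recorded: skip
        continue
    con.executescript(path.read_text(encoding="utf-8"))
    con.execute("INSERT INTO schema_migrations (version) VALUES (?)", (path.stem,))
    con.commit()
    applied.append(path.stem)

print(applied)  # ['001_baseline']
```

Because each applied version lands in `schema_migrations`, running the same loop again over the same directory is a no-op.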
@ -1,46 +1,19 @@
"""
LLM abstraction layer with priority fallback chain.
Config lookup order:
1. <repo>/config/llm.yaml — per-install local config
2. ~/.config/circuitforge/llm.yaml — user-level config (circuitforge-core default)
3. env-var auto-config (ANTHROPIC_API_KEY, OPENAI_API_KEY, OLLAMA_HOST, …)
Reads config/llm.yaml. Tries backends in order; falls back on any error.
"""
from pathlib import Path

from circuitforge_core.llm import LLMRouter as _CoreLLMRouter

# Kept for backwards-compatibility — external callers that import CONFIG_PATH
# from this module continue to work.
CONFIG_PATH = Path(__file__).parent.parent / "config" / "llm.yaml"


class LLMRouter(_CoreLLMRouter):
    """Peregrine-specific LLMRouter — tri-level config path priority.
    """Peregrine-specific LLMRouter — defaults to Peregrine's config/llm.yaml."""

    When ``config_path`` is supplied (e.g. in tests) it is passed straight
    through to the core. When omitted, the lookup order is:
    1. <repo>/config/llm.yaml (per-install local config)
    2. ~/.config/circuitforge/llm.yaml (user-level, circuitforge-core default)
    3. env-var auto-config (ANTHROPIC_API_KEY, OPENAI_API_KEY, OLLAMA_HOST …)
    """

    def __init__(self, config_path: Path | None = None) -> None:
        if config_path is not None:
            # Explicit path supplied — use it directly (e.g. tests, CLI override).
    def __init__(self, config_path: Path = CONFIG_PATH):
            super().__init__(config_path)
            return

        local = Path(__file__).parent.parent / "config" / "llm.yaml"
        user_level = Path.home() / ".config" / "circuitforge" / "llm.yaml"
        if local.exists():
            super().__init__(local)
        elif user_level.exists():
            super().__init__(user_level)
        else:
            # No yaml found — let circuitforge-core's env-var auto-config run.
            # The core default CONFIG_PATH (~/.config/circuitforge/llm.yaml)
            # won't exist either, so _auto_config_from_env() will be triggered.
            super().__init__()


# Module-level singleton for convenience
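The "tries backends in order; falls back on any error" behaviour the shim's docstring describes can be sketched independently of circuitforge-core. The backend names and the `complete(prompt)` signature below are illustrative assumptions, not the core API:

```python
from typing import Callable

def make_router(backends: list[tuple[str, Callable[[str], str]]]) -> Callable[[str], str]:
    """Return a complete(prompt) that tries each backend in priority order."""
    def complete(prompt: str) -> str:
        errors = []
        for name, fn in backends:
            try:
                return fn(prompt)          # first backend that succeeds wins
            except Exception as exc:       # any failure triggers fallback
                errors.append(f"{name}: {exc}")
        raise RuntimeError("all backends failed: " + "; ".join(errors))
    return complete

def flaky_remote(prompt: str) -> str:
    raise ConnectionError("remote unreachable")

def local_ollama(prompt: str) -> str:
    return f"echo: {prompt}"

# Remote backend fails, so the call falls through to the local one.
complete = make_router([("anthropic", flaky_remote), ("ollama", local_ollama)])
print(complete("hi"))  # echo: hi
```

Collecting per-backend errors and only raising after the whole chain is exhausted keeps a single flaky remote from taking down local inference.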
@ -492,12 +492,6 @@ def main() -> None:
    # binds a harmless free port instead of conflicting with the external service.
    env_updates: dict[str, str] = {i["env_var"]: str(i["stub_port"]) for i in ports.values()}
    env_updates["RECOMMENDED_PROFILE"] = profile
    # When Ollama is adopted from the host process, write OLLAMA_HOST so
    # LLMRouter's env-var auto-config finds it without needing config/llm.yaml.
    ollama_info = ports.get("ollama")
    if ollama_info and ollama_info.get("external"):
        env_updates["OLLAMA_HOST"] = f"http://host.docker.internal:{ollama_info['resolved']}"

    if offload_gb > 0:
        env_updates["CPU_OFFLOAD_GB"] = str(offload_gb)
    # GPU info for the app container (which lacks nvidia-smi access)
@ -22,7 +22,7 @@ from typing import Callable, Optional

from circuitforge_core.tasks.scheduler import (
    TaskSpec,  # re-export unchanged
    LocalScheduler as _CoreTaskScheduler,
    TaskScheduler as _CoreTaskScheduler,
)

logger = logging.getLogger(__name__)
@ -94,6 +94,15 @@ class TaskScheduler(_CoreTaskScheduler):
    def __init__(self, db_path: Path, run_task_fn: Callable) -> None:
        budgets, max_depth = _load_config_overrides(db_path)

        # Resolve VRAM using module-level _get_gpus so tests can monkeypatch it
        try:
            gpus = _get_gpus()
            available_vram: float = (
                sum(g["vram_total_gb"] for g in gpus) if gpus else 999.0
            )
        except Exception:
            available_vram = 999.0

        # Warn under this module's logger for any task types with no VRAM budget
        # (mirrors the core warning but captures under scripts.task_scheduler
        # so existing tests using caplog.at_level(logger="scripts.task_scheduler") pass)
@ -104,12 +113,19 @@ class TaskScheduler(_CoreTaskScheduler):
                "defaulting to 0.0 GB (unlimited concurrency for this type)", t
            )

        coordinator_url = os.environ.get(
            "CF_ORCH_URL", "http://localhost:7700"
        ).rstrip("/")

        super().__init__(
            db_path=db_path,
            run_task_fn=run_task_fn,
            task_types=LLM_TASK_TYPES,
            vram_budgets=budgets,
            available_vram_gb=available_vram,
            max_queue_depth=max_depth,
            coordinator_url=coordinator_url,
            service_name="peregrine",
        )

    def enqueue(
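The VRAM resolution in the `__init__` above is a defensive sum-with-fallback: total `vram_total_gb` across detected GPUs, or 999.0 GB (effectively unlimited) when detection fails or finds nothing. A standalone sketch; the `get_gpus` callables below are stand-ins for the real probe:

```python
def resolve_available_vram(get_gpus) -> float:
    """Sum vram_total_gb across GPUs; 999.0 means 'no VRAM gating'."""
    try:
        gpus = get_gpus()
        # Empty list (CPU-only box) also maps to the unlimited sentinel.
        return sum(g["vram_total_gb"] for g in gpus) if gpus else 999.0
    except Exception:
        # Probe failure (no nvidia-smi, driver error) must not block startup.
        return 999.0

print(resolve_available_vram(lambda: []))                             # 999.0
print(resolve_available_vram(lambda: [{"vram_total_gb": 24.0}] * 2))  # 48.0
```

Using one sentinel for both "no GPUs" and "probe failed" means a broken driver degrades to unbounded concurrency rather than a refused start.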
@ -1,148 +0,0 @@
"""Tests for scripts/db_migrate.py — numbered SQL migration runner."""

import sqlite3
import textwrap
from pathlib import Path

import pytest

from scripts.db_migrate import migrate_db


# ── helpers ───────────────────────────────────────────────────────────────────

def _applied(db_path: Path) -> list[str]:
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute("SELECT version FROM schema_migrations ORDER BY version").fetchall()
        return [r[0] for r in rows]
    finally:
        con.close()


def _tables(db_path: Path) -> set[str]:
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT name FROM sqlite_master WHERE type='table' AND name NOT LIKE 'sqlite_%'"
        ).fetchall()
        return {r[0] for r in rows}
    finally:
        con.close()


# ── tests ──────────────────────────────────────────────────────────────────────

def test_creates_schema_migrations_table(tmp_path):
    """Running against an empty DB creates the tracking table."""
    db = tmp_path / "test.db"
    (tmp_path / "migrations").mkdir()  # empty migrations dir
    # Patch the module-level _MIGRATIONS_DIR
    import scripts.db_migrate as m
    orig = m._MIGRATIONS_DIR
    m._MIGRATIONS_DIR = tmp_path / "migrations"
    try:
        migrate_db(db)
        assert "schema_migrations" in _tables(db)
    finally:
        m._MIGRATIONS_DIR = orig


def test_applies_migration_file(tmp_path):
    """A .sql file in migrations/ is applied and recorded."""
    db = tmp_path / "test.db"
    mdir = tmp_path / "migrations"
    mdir.mkdir()
    (mdir / "001_test.sql").write_text(
        "CREATE TABLE IF NOT EXISTS widgets (id INTEGER PRIMARY KEY, name TEXT);"
    )

    import scripts.db_migrate as m
    orig = m._MIGRATIONS_DIR
    m._MIGRATIONS_DIR = mdir
    try:
        applied = migrate_db(db)
        assert applied == ["001_test"]
        assert "widgets" in _tables(db)
        assert _applied(db) == ["001_test"]
    finally:
        m._MIGRATIONS_DIR = orig


def test_idempotent_second_run(tmp_path):
    """Running migrate_db twice does not re-apply migrations."""
    db = tmp_path / "test.db"
    mdir = tmp_path / "migrations"
    mdir.mkdir()
    (mdir / "001_test.sql").write_text(
        "CREATE TABLE IF NOT EXISTS widgets (id INTEGER PRIMARY KEY, name TEXT);"
    )

    import scripts.db_migrate as m
    orig = m._MIGRATIONS_DIR
    m._MIGRATIONS_DIR = mdir
    try:
        migrate_db(db)
        applied = migrate_db(db)  # second run
        assert applied == []
        assert _applied(db) == ["001_test"]
    finally:
        m._MIGRATIONS_DIR = orig


def test_applies_only_new_migrations(tmp_path):
    """Migrations already in schema_migrations are skipped; only new ones run."""
    db = tmp_path / "test.db"
    mdir = tmp_path / "migrations"
    mdir.mkdir()
    (mdir / "001_first.sql").write_text(
        "CREATE TABLE IF NOT EXISTS first_table (id INTEGER PRIMARY KEY);"
    )

    import scripts.db_migrate as m
    orig = m._MIGRATIONS_DIR
    m._MIGRATIONS_DIR = mdir
    try:
        migrate_db(db)

        # Add a second migration
        (mdir / "002_second.sql").write_text(
            "CREATE TABLE IF NOT EXISTS second_table (id INTEGER PRIMARY KEY);"
        )
        applied = migrate_db(db)
        assert applied == ["002_second"]
        assert set(_applied(db)) == {"001_first", "002_second"}
        assert "second_table" in _tables(db)
    finally:
        m._MIGRATIONS_DIR = orig


def test_migration_failure_raises(tmp_path):
    """A bad migration raises RuntimeError and does not record the version."""
    db = tmp_path / "test.db"
    mdir = tmp_path / "migrations"
    mdir.mkdir()
    (mdir / "001_bad.sql").write_text("THIS IS NOT VALID SQL !!!")

    import scripts.db_migrate as m
    orig = m._MIGRATIONS_DIR
    m._MIGRATIONS_DIR = mdir
    try:
        with pytest.raises(RuntimeError, match="001_bad"):
            migrate_db(db)
        assert _applied(db) == []
    finally:
        m._MIGRATIONS_DIR = orig


def test_baseline_migration_runs(tmp_path):
    """The real 001_baseline.sql applies cleanly to a fresh database."""
    db = tmp_path / "test.db"
    applied = migrate_db(db)
    assert "001_baseline" in applied
    expected_tables = {
        "jobs", "job_contacts", "company_research",
        "background_tasks", "survey_responses", "digest_queue",
        "schema_migrations",
    }
    assert expected_tables <= _tables(db)
@ -145,7 +145,7 @@ def test_get_resume_missing_returns_not_exists(tmp_path, monkeypatch):
    """GET /api/settings/resume when file missing returns {exists: false}."""
    fake_path = tmp_path / "config" / "plain_text_resume.yaml"
    # Ensure the path doesn't exist
    monkeypatch.setattr("dev_api._resume_path", lambda: fake_path)
    monkeypatch.setattr("dev_api.RESUME_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)

@ -157,7 +157,7 @@ def test_get_resume_missing_returns_not_exists(tmp_path, monkeypatch):
def test_post_resume_blank_creates_file(tmp_path, monkeypatch):
    """POST /api/settings/resume/blank creates the file."""
    fake_path = tmp_path / "config" / "plain_text_resume.yaml"
    monkeypatch.setattr("dev_api._resume_path", lambda: fake_path)
    monkeypatch.setattr("dev_api.RESUME_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)

@ -170,7 +170,7 @@ def test_post_resume_blank_creates_file(tmp_path, monkeypatch):
def test_get_resume_after_blank_returns_exists(tmp_path, monkeypatch):
    """GET /api/settings/resume after blank creation returns {exists: true}."""
    fake_path = tmp_path / "config" / "plain_text_resume.yaml"
    monkeypatch.setattr("dev_api._resume_path", lambda: fake_path)
    monkeypatch.setattr("dev_api.RESUME_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)

@ -212,7 +212,7 @@ def test_get_search_prefs_returns_dict(tmp_path, monkeypatch):
    fake_path.parent.mkdir(parents=True, exist_ok=True)
    with open(fake_path, "w") as f:
        yaml.dump({"default": {"remote_preference": "remote", "job_boards": []}}, f)
    monkeypatch.setattr("dev_api._search_prefs_path", lambda: fake_path)
    monkeypatch.setattr("dev_api.SEARCH_PREFS_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)

@ -227,7 +227,7 @@ def test_put_get_search_roundtrip(tmp_path, monkeypatch):
    """PUT then GET search prefs round-trip: saved field is returned."""
    fake_path = tmp_path / "config" / "search_profiles.yaml"
    fake_path.parent.mkdir(parents=True, exist_ok=True)
    monkeypatch.setattr("dev_api._search_prefs_path", lambda: fake_path)
    monkeypatch.setattr("dev_api.SEARCH_PREFS_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)

@ -253,7 +253,7 @@ def test_put_get_search_roundtrip(tmp_path, monkeypatch):
def test_get_search_missing_file_returns_empty(tmp_path, monkeypatch):
    """GET /api/settings/search when file missing returns empty dict."""
    fake_path = tmp_path / "config" / "search_profiles.yaml"
    monkeypatch.setattr("dev_api._search_prefs_path", lambda: fake_path)
    monkeypatch.setattr("dev_api.SEARCH_PREFS_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)

@ -363,7 +363,7 @@ def test_get_services_cpu_profile(client):
def test_get_email_has_password_set_bool(tmp_path, monkeypatch):
    """GET /api/settings/system/email has password_set (bool) and no password key."""
    fake_email_path = tmp_path / "email.yaml"
    monkeypatch.setattr("dev_api._config_dir", lambda: fake_email_path.parent)
    monkeypatch.setattr("dev_api.EMAIL_PATH", fake_email_path)
    with patch("dev_api.get_credential", return_value=None):
        from dev_api import app
        c = TestClient(app)

@ -378,7 +378,7 @@ def test_get_email_has_password_set_bool(tmp_path, monkeypatch):
def test_get_email_password_set_true_when_stored(tmp_path, monkeypatch):
    """password_set is True when credential is stored."""
    fake_email_path = tmp_path / "email.yaml"
    monkeypatch.setattr("dev_api._config_dir", lambda: fake_email_path.parent)
    monkeypatch.setattr("dev_api.EMAIL_PATH", fake_email_path)
    with patch("dev_api.get_credential", return_value="secret"):
        from dev_api import app
        c = TestClient(app)

@ -426,14 +426,10 @@ def test_finetune_status_returns_status_and_pairs_count(client):
    assert "pairs_count" in data


def test_finetune_status_idle_when_no_task(tmp_path, monkeypatch):
def test_finetune_status_idle_when_no_task(client):
    """Status is 'idle' and pairs_count is 0 when no task exists."""
    fake_jsonl = tmp_path / "cover_letters.jsonl"  # does not exist -> 0 pairs
    monkeypatch.setattr("dev_api._TRAINING_JSONL", fake_jsonl)
    with patch("scripts.task_runner.get_task_status", return_value=None, create=True):
        from dev_api import app
        c = TestClient(app)
        resp = c.get("/api/settings/fine-tune/status")
    resp = client.get("/api/settings/fine-tune/status")
    assert resp.status_code == 200
    data = resp.json()
    assert data["status"] == "idle"

@ -445,7 +441,7 @@ def test_finetune_status_idle_when_no_task(tmp_path, monkeypatch):
def test_get_license_returns_tier_and_active(tmp_path, monkeypatch):
    """GET /api/settings/license returns tier and active fields."""
    fake_license = tmp_path / "license.yaml"
    monkeypatch.setattr("dev_api._license_path", lambda: fake_license)
    monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)

    from dev_api import app
    c = TestClient(app)

@ -459,7 +455,7 @@ def test_get_license_returns_tier_and_active(tmp_path, monkeypatch):
def test_get_license_defaults_to_free(tmp_path, monkeypatch):
    """GET /api/settings/license defaults to free tier when no file."""
    fake_license = tmp_path / "license.yaml"
    monkeypatch.setattr("dev_api._license_path", lambda: fake_license)
    monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)

    from dev_api import app
    c = TestClient(app)

@ -473,7 +469,8 @@ def test_get_license_defaults_to_free(tmp_path, monkeypatch):
def test_activate_license_valid_key_returns_ok(tmp_path, monkeypatch):
    """POST activate with valid key format returns {ok: true}."""
    fake_license = tmp_path / "license.yaml"
    monkeypatch.setattr("dev_api._license_path", lambda: fake_license)
    monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)
    monkeypatch.setattr("dev_api.CONFIG_DIR", tmp_path)

    from dev_api import app
    c = TestClient(app)

@ -485,7 +482,8 @@ def test_activate_license_valid_key_returns_ok(tmp_path, monkeypatch):
def test_activate_license_invalid_key_returns_ok_false(tmp_path, monkeypatch):
    """POST activate with bad key format returns {ok: false}."""
    fake_license = tmp_path / "license.yaml"
    monkeypatch.setattr("dev_api._license_path", lambda: fake_license)
    monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)
    monkeypatch.setattr("dev_api.CONFIG_DIR", tmp_path)

    from dev_api import app
    c = TestClient(app)

@ -497,7 +495,8 @@ def test_activate_license_invalid_key_returns_ok_false(tmp_path, monkeypatch):
def test_deactivate_license_returns_ok(tmp_path, monkeypatch):
    """POST /api/settings/license/deactivate returns 200 with ok."""
    fake_license = tmp_path / "license.yaml"
    monkeypatch.setattr("dev_api._license_path", lambda: fake_license)
    monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)
    monkeypatch.setattr("dev_api.CONFIG_DIR", tmp_path)

    from dev_api import app
    c = TestClient(app)

@ -509,7 +508,8 @@ def test_deactivate_license_returns_ok(tmp_path, monkeypatch):
def test_activate_then_deactivate(tmp_path, monkeypatch):
    """Activate then deactivate: active goes False."""
    fake_license = tmp_path / "license.yaml"
    monkeypatch.setattr("dev_api._license_path", lambda: fake_license)
    monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)
    monkeypatch.setattr("dev_api.CONFIG_DIR", tmp_path)

    from dev_api import app
    c = TestClient(app)

@ -580,7 +580,7 @@ def test_get_developer_returns_expected_fields(tmp_path, monkeypatch):
    _write_user_yaml(user_yaml)
    monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))
    fake_tokens = tmp_path / "tokens.yaml"
    monkeypatch.setattr("dev_api._tokens_path", lambda: fake_tokens)
    monkeypatch.setattr("dev_api.TOKENS_PATH", fake_tokens)

    from dev_api import app
    c = TestClient(app)

@ -602,7 +602,7 @@ def test_put_dev_tier_then_get(tmp_path, monkeypatch):
    _write_user_yaml(user_yaml)
    monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))
    fake_tokens = tmp_path / "tokens.yaml"
    monkeypatch.setattr("dev_api._tokens_path", lambda: fake_tokens)
    monkeypatch.setattr("dev_api.TOKENS_PATH", fake_tokens)

    from dev_api import app
    c = TestClient(app)
@ -1,132 +0,0 @@
"""Tests for Peregrine's LLMRouter shim — priority fallback logic."""
import sys
from pathlib import Path
from unittest.mock import patch, MagicMock, call

sys.path.insert(0, str(Path(__file__).parent.parent))


def _import_fresh():
    """Import scripts.llm_router fresh (bypass module cache)."""
    import importlib
    import scripts.llm_router as mod
    importlib.reload(mod)
    return mod


# ---------------------------------------------------------------------------
# Test 1: local config/llm.yaml takes priority when it exists
# ---------------------------------------------------------------------------

def test_uses_local_yaml_when_present():
    """When config/llm.yaml exists locally, super().__init__ is called with that path."""
    import scripts.llm_router as shim_mod
    from circuitforge_core.llm import LLMRouter as _CoreLLMRouter

    local_path = Path(shim_mod.__file__).parent.parent / "config" / "llm.yaml"
    user_path = Path.home() / ".config" / "circuitforge" / "llm.yaml"

    def fake_exists(self):
        return self == local_path  # only the local path "exists"

    captured = {}

    def fake_core_init(self, config_path=None):
        captured["config_path"] = config_path
        self.config = {}

    with patch.object(Path, "exists", fake_exists), \
         patch.object(_CoreLLMRouter, "__init__", fake_core_init):
        import importlib
        import scripts.llm_router as mod
        importlib.reload(mod)
        mod.LLMRouter()

    assert captured.get("config_path") == local_path, (
        f"Expected super().__init__ to be called with local path {local_path}, "
        f"got {captured.get('config_path')}"
    )


# ---------------------------------------------------------------------------
# Test 2: falls through to env-var auto-config when neither yaml exists
# ---------------------------------------------------------------------------

def test_falls_through_to_env_when_no_yamls():
    """When no yaml files exist, super().__init__ is called with no args (env-var path)."""
    import scripts.llm_router as shim_mod
    from circuitforge_core.llm import LLMRouter as _CoreLLMRouter

    captured = {}

    def fake_exists(self):
        return False  # no yaml files exist anywhere

    def fake_core_init(self, config_path=None):
        # Record whether a path was passed
        captured["config_path"] = config_path
        captured["called"] = True
        self.config = {}

    with patch.object(Path, "exists", fake_exists), \
         patch.object(_CoreLLMRouter, "__init__", fake_core_init):
        import importlib
        import scripts.llm_router as mod
        importlib.reload(mod)
        mod.LLMRouter()

    assert captured.get("called"), "super().__init__ was never called"
    # When called with no args, config_path defaults to None in our mock,
    # meaning the shim correctly fell through to env-var auto-config
    assert captured.get("config_path") is None, (
        f"Expected super().__init__ to be called with no explicit path (None), "
        f"got {captured.get('config_path')}"
    )


# ---------------------------------------------------------------------------
# Test 3: module-level complete() singleton is only instantiated once
# ---------------------------------------------------------------------------

def test_complete_singleton_is_reused():
    """complete() reuses the same LLMRouter instance across multiple calls."""
    import importlib
    import scripts.llm_router as mod
    importlib.reload(mod)

    # Reset singleton
    mod._router = None

    instantiation_count = [0]
    original_init = mod.LLMRouter.__init__

    mock_router = MagicMock()
    mock_router.complete.return_value = "OK"

    original_class = mod.LLMRouter

    class CountingRouter(original_class):
        def __init__(self):
            instantiation_count[0] += 1
            # Bypass real __init__ to avoid needing config files
            self.config = {}

        def complete(self, prompt, system=None):
            return "OK"

    # Patch the class in the module
    mod.LLMRouter = CountingRouter
    mod._router = None

    result1 = mod.complete("first call")
    result2 = mod.complete("second call")

    assert result1 == "OK"
    assert result2 == "OK"
    assert instantiation_count[0] == 1, (
        f"Expected LLMRouter to be instantiated exactly once, "
        f"got {instantiation_count[0]} instantiation(s)"
    )

    # Restore
    mod.LLMRouter = original_class
@ -1,80 +0,0 @@
"""Tests: preflight writes OLLAMA_HOST to .env when Ollama is adopted from host."""
import sys
from pathlib import Path
from unittest.mock import patch, call

sys.path.insert(0, str(Path(__file__).parent.parent))

import scripts.preflight as pf


def _make_ports(ollama_external: bool = True, ollama_port: int = 11434) -> dict:
    """Build a minimal ports dict as returned by preflight's port-scanning logic."""
    return {
        "ollama": {
            "resolved": ollama_port,
            "external": ollama_external,
            "stub_port": 54321,
            "env_var": "OLLAMA_PORT",
            "adoptable": True,
        },
        "streamlit": {
            "resolved": 8502,
            "external": False,
            "stub_port": 8502,
            "env_var": "STREAMLIT_PORT",
            "adoptable": False,
        },
    }


def _capture_env_updates(ports: dict) -> dict:
    """Run the env_updates construction block from preflight.main() and return the result.

    We extract this logic from main() so tests can call it directly without
    needing to simulate the full CLI argument parsing and system probe flow.
    The block under test is the `if not args.check_only:` section.
    """
    captured = {}

    def fake_write_env(updates: dict) -> None:
        captured.update(updates)

    with patch.object(pf, "write_env", side_effect=fake_write_env), \
         patch.object(pf, "update_llm_yaml"), \
         patch.object(pf, "write_compose_override"):
        # Replicate the env_updates block from preflight.main() as faithfully as possible
        env_updates: dict[str, str] = {i["env_var"]: str(i["stub_port"]) for i in ports.values()}
        env_updates["RECOMMENDED_PROFILE"] = "single-gpu"

        # ---- Code under test: the OLLAMA_HOST adoption block ----
        ollama_info = ports.get("ollama")
        if ollama_info and ollama_info.get("external"):
            env_updates["OLLAMA_HOST"] = f"http://host.docker.internal:{ollama_info['resolved']}"
        # ---------------------------------------------------------

        pf.write_env(env_updates)

    return captured


def test_ollama_host_written_when_adopted():
    """OLLAMA_HOST is added when Ollama is adopted from the host (external=True)."""
    ports = _make_ports(ollama_external=True, ollama_port=11434)
    result = _capture_env_updates(ports)
    assert "OLLAMA_HOST" in result
    assert result["OLLAMA_HOST"] == "http://host.docker.internal:11434"


def test_ollama_host_not_written_when_docker_managed():
    """OLLAMA_HOST is NOT added when Ollama runs in Docker (external=False)."""
    ports = _make_ports(ollama_external=False)
    result = _capture_env_updates(ports)
    assert "OLLAMA_HOST" not in result


def test_ollama_host_reflects_adopted_port():
    """OLLAMA_HOST uses the actual adopted port, not the default."""
    ports = _make_ports(ollama_external=True, ollama_port=11500)
    result = _capture_env_updates(ports)
    assert result["OLLAMA_HOST"] == "http://host.docker.internal:11500"
@ -109,33 +109,24 @@ def test_missing_budget_logs_warning(tmp_db, caplog):
    ts.LLM_TASK_TYPES = frozenset(original)


def test_cpu_only_system_creates_scheduler(tmp_db, monkeypatch):
    """Scheduler constructs without error when _get_gpus() returns empty list.

    LocalScheduler has no VRAM gating — it runs tasks regardless of GPU count.
    VRAM-aware scheduling is handled by circuitforge_orch's coordinator.
    """
def test_cpu_only_system_gets_unlimited_vram(tmp_db, monkeypatch):
    """_available_vram is 999.0 when _get_gpus() returns empty list."""
    # Patch the module-level _get_gpus in task_scheduler (not preflight)
    # so __init__'s _ts_mod._get_gpus() call picks up the mock.
    monkeypatch.setattr("scripts.task_scheduler._get_gpus", lambda: [])
    s = TaskScheduler(tmp_db, _noop_run_task)
    # Scheduler still has correct budgets configured; no VRAM attribute expected
    # Scheduler constructed successfully; budgets contain all LLM task types.
    # Does not assert exact values -- a sibling test may write a config override
    # to the shared pytest tmp dir, causing _load_config_overrides to pick it up.
    assert set(s._budgets.keys()) >= LLM_TASK_TYPES
    assert s._available_vram == 999.0


def test_gpu_detection_does_not_affect_local_scheduler(tmp_db, monkeypatch):
    """LocalScheduler ignores GPU VRAM — it has no _available_vram attribute.

    VRAM-gated concurrency requires circuitforge_orch (Paid tier).
    """
def test_gpu_vram_summed_across_all_gpus(tmp_db, monkeypatch):
    """_available_vram sums vram_total_gb across all detected GPUs."""
    fake_gpus = [
        {"name": "RTX 3090", "vram_total_gb": 24.0, "vram_free_gb": 20.0},
        {"name": "RTX 3090", "vram_total_gb": 24.0, "vram_free_gb": 18.0},
    ]
    monkeypatch.setattr("scripts.task_scheduler._get_gpus", lambda: fake_gpus)
    s = TaskScheduler(tmp_db, _noop_run_task)
    assert not hasattr(s, "_available_vram")
    assert s._available_vram == 48.0


def test_enqueue_adds_taskspec_to_deque(tmp_db):
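The property these scheduler tests exercise, that the type with the deeper queue wins the first slot when VRAM only fits one model, can be sketched as a greedy selection rule. The budgets and queue depths below are illustrative, and this sketch only picks which types may start, whereas the real scheduler also runs the batches:

```python
def pick_first_types(queues: dict[str, int], budgets: dict[str, float], vram: float) -> list[str]:
    """Greedily admit type batches, deepest queue first, while VRAM allows."""
    started, remaining = [], vram
    for t in sorted(queues, key=queues.get, reverse=True):  # deepest queue first
        if queues[t] > 0 and budgets[t] <= remaining:
            started.append(t)
            remaining -= budgets[t]
    return started

budgets = {"cover_letter": 2.5, "company_research": 5.0}

# 3.0 GB fits cover_letter (2.5) but not also company_research (5.0):
print(pick_first_types({"cover_letter": 3, "company_research": 1}, budgets, 3.0))
# ['cover_letter']

# 10.0 GB fits both, so both batches are admitted together:
print(pick_first_types({"cover_letter": 1, "company_research": 1}, budgets, 10.0))
# ['cover_letter', 'company_research']
```

Sorting by queue depth before admitting explains both test outcomes: the deep queue drains first under tight VRAM, and everything starts at once when the budget is generous.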
@ -215,37 +206,40 @@ def _make_recording_run_task(log: list, done_event: threading.Event, expected: i
    return _run


def _start_scheduler(tmp_db, run_task_fn):
def _start_scheduler(tmp_db, run_task_fn, available_vram=999.0):
    s = TaskScheduler(tmp_db, run_task_fn)
    s._available_vram = available_vram
    s.start()
    return s


# ── Tests ─────────────────────────────────────────────────────────────────────

def test_all_task_types_complete(tmp_db):
    """Scheduler runs tasks from multiple types; all complete.

    LocalScheduler runs type batches concurrently (no VRAM gating).
    VRAM-gated sequential scheduling requires circuitforge_orch.
    """
def test_deepest_queue_wins_first_slot(tmp_db):
    """Type with more queued tasks starts first when VRAM only fits one type."""
    log, done = [], threading.Event()

    # Build scheduler but DO NOT start it yet — enqueue all tasks first
    # so the scheduler sees the full picture on its very first wake.
    run_task_fn = _make_recording_run_task(log, done, 4)
    s = TaskScheduler(tmp_db, run_task_fn)
    s._available_vram = 3.0  # fits cover_letter (2.5) but not +company_research (5.0)

    # Enqueue cover_letter (3 tasks) and company_research (1 task) before start.
    # cover_letter has the deeper queue and must win the first batch slot.
    for i in range(3):
        s.enqueue(i + 1, "cover_letter", i + 1, None)
    s.enqueue(4, "company_research", 4, None)

    s.start()
    s.start()  # scheduler now sees all tasks atomically on its first iteration
    assert done.wait(timeout=5.0), "timed out — not all 4 tasks completed"
    s.shutdown()

    assert len(log) == 4
    cl = [t for _, t in log if t == "cover_letter"]
    cr = [t for _, t in log if t == "company_research"]
    cl = [i for i, (_, t) in enumerate(log) if t == "cover_letter"]
    cr = [i for i, (_, t) in enumerate(log) if t == "company_research"]
    assert len(cl) == 3 and len(cr) == 1
    assert max(cl) < min(cr), "All cover_letter tasks must finish before company_research starts"


def test_fifo_within_type(tmp_db):
@ -262,8 +256,8 @@ def test_fifo_within_type(tmp_db):
|
|||
assert [task_id for task_id, _ in log] == [10, 20, 30]
|
||||
|
||||
|
||||
def test_concurrent_batches_different_types(tmp_db):
|
||||
"""Two type batches run concurrently (LocalScheduler has no VRAM gating)."""
|
||||
def test_concurrent_batches_when_vram_allows(tmp_db):
|
||||
"""Two type batches start simultaneously when VRAM fits both."""
|
||||
started = {"cover_letter": threading.Event(), "company_research": threading.Event()}
|
||||
all_done = threading.Event()
|
||||
log = []
|
||||
|
|
@ -274,7 +268,8 @@ def test_concurrent_batches_different_types(tmp_db):
|
|||
if len(log) >= 2:
|
||||
all_done.set()
|
||||
|
||||
s = _start_scheduler(tmp_db, run_task)
|
||||
# VRAM=10.0 fits both cover_letter (2.5) and company_research (5.0) simultaneously
|
||||
s = _start_scheduler(tmp_db, run_task, available_vram=10.0)
|
||||
s.enqueue(1, "cover_letter", 1, None)
|
||||
s.enqueue(2, "company_research", 2, None)
|
||||
|
||||
|
|
@ -312,15 +307,8 @@ def test_new_tasks_picked_up_mid_batch(tmp_db):
|
|||
assert log == [1, 2]
|
||||
|
||||
|
||||
@pytest.mark.filterwarnings("ignore::pytest.PytestUnhandledThreadExceptionWarning")
|
||||
def test_worker_crash_does_not_stall_scheduler(tmp_db):
|
||||
"""If _run_task raises, the scheduler continues processing the next task.
|
||||
|
||||
The batch_worker intentionally lets the RuntimeError propagate to the thread
|
||||
boundary (so LocalScheduler can detect crash vs. normal exit). This produces
|
||||
a PytestUnhandledThreadExceptionWarning -- suppressed here because it is the
|
||||
expected behavior under test.
|
||||
"""
|
||||
def test_worker_crash_releases_vram(tmp_db):
|
||||
"""If _run_task raises, _reserved_vram returns to 0 and scheduler continues."""
|
||||
log, done = [], threading.Event()
|
||||
|
||||
def run_task(db_path, task_id, task_type, job_id, params):
|
||||
|
|
@ -329,15 +317,16 @@ def test_worker_crash_does_not_stall_scheduler(tmp_db):
|
|||
log.append(task_id)
|
||||
done.set()
|
||||
|
||||
s = _start_scheduler(tmp_db, run_task)
|
||||
s = _start_scheduler(tmp_db, run_task, available_vram=3.0)
|
||||
s.enqueue(1, "cover_letter", 1, None)
|
||||
s.enqueue(2, "cover_letter", 2, None)
|
||||
|
||||
assert done.wait(timeout=5.0), "timed out — task 2 never completed after task 1 crash"
|
||||
s.shutdown()
|
||||
|
||||
# Second task still ran despite first crashing
|
||||
# Second task still ran, VRAM was released
|
||||
assert 2 in log
|
||||
assert s._reserved_vram == 0.0
|
||||
|
||||
|
||||
def test_get_scheduler_returns_singleton(tmp_db):
|
||||
|
|
|
|||
|
|
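The deepest-queue-wins, VRAM-gated batch selection these tests assert can be sketched as follows. This is a hypothetical helper, not the TaskScheduler API: the function name, the `VRAM_COST` table, and the per-type costs (2.5 / 5.0) are taken from the test comments above purely for illustration.

```python
# Hypothetical sketch of the selection rule the tests above assert:
# among task types with queued work, the deepest queue is considered first,
# and a type only starts while its VRAM cost still fits the remaining budget.
VRAM_COST = {"cover_letter": 2.5, "company_research": 5.0}  # illustrative costs

def pick_batches(queues: dict[str, list], available_vram: float) -> list[str]:
    """Return the task types to start, deepest queue first, VRAM-gated."""
    started: list[str] = []
    reserved = 0.0
    # Sort candidate types by queue depth, deepest first.
    for task_type in sorted(queues, key=lambda t: len(queues[t]), reverse=True):
        cost = VRAM_COST.get(task_type, 0.0)
        if queues[task_type] and reserved + cost <= available_vram:
            started.append(task_type)
            reserved += cost
    return started

# VRAM=3.0 fits cover_letter (2.5) but not company_research on top (7.5 total).
print(pick_batches({"cover_letter": [1, 2, 3], "company_research": [4]}, 3.0))
# → ['cover_letter']
# VRAM=10.0 fits both types simultaneously.
print(pick_batches({"cover_letter": [1], "company_research": [2]}, 10.0))
# → ['cover_letter', 'company_research']
```

With 3.0 GB this yields exactly the ordering `test_deepest_queue_wins_first_slot` checks: all cover_letter indices precede the company_research index.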
@@ -66,12 +66,8 @@ def test_sync_cookie_prgn_switch_param_overrides_yaml(profile_yaml, monkeypatch)
    assert any("prgn_ui=streamlit" in s for s in injected)


def test_sync_cookie_free_tier_keeps_vue(profile_yaml, monkeypatch):
    """Free-tier user with vue preference keeps vue (vue_ui_beta is free tier).

    Previously this test verified a downgrade to streamlit. Vue SPA was opened
    to free tier in issue #20 — the downgrade path no longer triggers.
    """
def test_sync_cookie_downgrades_tier_resets_to_streamlit(profile_yaml, monkeypatch):
    """Free-tier user with vue preference gets reset to streamlit."""
    import yaml as _yaml
    profile_yaml.write_text(_yaml.dump({"name": "T", "ui_preference": "vue"}))

@@ -84,8 +80,8 @@ def test_sync_cookie_free_tier_keeps_vue(profile_yaml, monkeypatch):
    sync_ui_cookie(profile_yaml, tier="free")

    saved = _yaml.safe_load(profile_yaml.read_text())
    assert saved["ui_preference"] == "vue"
    assert any("prgn_ui=vue" in s for s in injected)
    assert saved["ui_preference"] == "streamlit"
    assert any("prgn_ui=streamlit" in s for s in injected)


def test_switch_ui_writes_yaml_and_calls_sync(profile_yaml, monkeypatch):
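The downgrade rule the rewritten test expects can be sketched in isolation. This is illustrative only: `resolve_ui_preference` and the paid-tier set are hypothetical names, and the real `sync_ui_cookie` additionally writes the preference back to user.yaml and injects the `prgn_ui` cookie header.

```python
# Illustrative sketch of the tier-downgrade rule tested above: a free-tier
# user whose saved preference is "vue" is reset to "streamlit".
# Function name and tier set are assumptions, not the real sync_ui_cookie.
def resolve_ui_preference(saved_pref: str, tier: str) -> str:
    vue_allowed = tier in {"paid", "pro"}  # assumption: vue gated to paid tiers
    if saved_pref == "vue" and not vue_allowed:
        return "streamlit"  # downgrade path the test asserts
    return saved_pref

print(resolve_ui_preference("vue", "free"))  # free tier falls back to streamlit
print(resolve_ui_preference("vue", "paid"))  # paid tier keeps vue
```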
@@ -236,7 +236,7 @@ class TestWizardStep:
        search_path = tmp_path / "config" / "search_profiles.yaml"
        _write_user_yaml(yaml_path, {})
        with patch("dev_api._wizard_yaml_path", return_value=str(yaml_path)):
            with patch("dev_api._search_prefs_path", return_value=search_path):
            with patch("dev_api.SEARCH_PREFS_PATH", search_path):
                r = client.post("/api/wizard/step",
                    json={"step": 6, "data": {
                        "titles": ["Software Engineer", "Backend Developer"],
@@ -121,8 +121,7 @@ def test_byok_false_preserves_original_gating():
# ── Vue UI Beta & Demo Tier tests ──────────────────────────────────────────────

def test_vue_ui_beta_free_tier():
    # Vue SPA is open to all tiers (issue #20 — beta restriction removed)
    assert can_use("free", "vue_ui_beta") is True
    assert can_use("free", "vue_ui_beta") is False


def test_vue_ui_beta_paid_tier():
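The gating change under test can be sketched as a feature-to-minimum-tier table. Everything here is an assumption for illustration: the tier ladder, the table, and this `can_use` body are not the project's implementation — only the `vue_ui_beta`/free-tier result is implied by the assertions above.

```python
# Hypothetical feature-gating table consistent with the rewritten assertions:
# vue_ui_beta is unavailable on the free tier and available from paid upward.
TIER_ORDER = ["free", "paid", "pro"]        # illustrative tier ladder
FEATURE_MIN_TIER = {"vue_ui_beta": "paid"}  # illustrative gate

def can_use(tier: str, feature: str) -> bool:
    # Unlisted features default to the lowest tier (everyone may use them).
    min_tier = FEATURE_MIN_TIER.get(feature, "free")
    return TIER_ORDER.index(tier) >= TIER_ORDER.index(min_tier)

print(can_use("free", "vue_ui_beta"))  # False — matches the updated test
print(can_use("paid", "vue_ui_beta"))  # True
```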
@@ -16,14 +16,12 @@ import { computed, onMounted } from 'vue'
import { RouterView, useRoute } from 'vue-router'
import { useMotion } from './composables/useMotion'
import { useHackerMode, useKonamiCode } from './composables/useEasterEgg'
import { useTheme } from './composables/useTheme'
import AppNav from './components/AppNav.vue'
import { useDigestStore } from './stores/digest'

const motion = useMotion()
const route = useRoute()
const { toggle, restore } = useHackerMode()
const { initTheme } = useTheme()
const digestStore = useDigestStore()

const isWizard = computed(() => route.path.startsWith('/setup'))

@@ -31,8 +29,7 @@ const isWizard = computed(() => route.path.startsWith('/setup'))
useKonamiCode(toggle)

onMounted(() => {
  initTheme() // apply persisted theme (hacker mode takes priority inside initTheme)
  restore() // kept for hacker mode re-entry on hard reload (initTheme handles it, belt+suspenders)
  restore() // re-apply hacker mode from localStorage on hard reload
  digestStore.fetchAll() // populate badge immediately, before user visits Digest tab
})
</script>
@@ -73,11 +73,11 @@
}

/* ── Accessible Solarpunk — dark (system dark mode) ─
   Activates when OS/browser is in dark mode AND no
   explicit theme is selected. Explicit [data-theme="*"]
   always wins over the system preference. */
   Activates when OS/browser is in dark mode.
   Uses :not([data-theme="hacker"]) so the Konami easter
   egg always wins over the system preference. */
@media (prefers-color-scheme: dark) {
  :root:not([data-theme]) {
  :root:not([data-theme="hacker"]) {
    /* Brand — lighter greens readable on dark surfaces */
    --color-primary: #6ab870;
    --color-primary-hover: #7ecb84;

@@ -161,153 +161,6 @@
  --color-accent-glow-lg: rgba(0, 255, 65, 0.6);
}

/* ── Explicit light — forces light even on dark-OS ─ */
[data-theme="light"] {
  --color-primary: #2d5a27;
  --color-primary-hover: #234820;
  --color-primary-light: #e8f2e7;
  --color-surface: #eaeff8;
  --color-surface-alt: #dde4f0;
  --color-surface-raised: #f5f7fc;
  --color-border: #a8b8d0;
  --color-border-light: #ccd5e6;
  --color-text: #1a2338;
  --color-text-muted: #4a5c7a;
  --color-text-inverse: #eaeff8;
  --color-accent: #c4732a;
  --color-accent-hover: #a85c1f;
  --color-accent-light: #fdf0e4;
  --color-success: #3a7a32;
  --color-error: #c0392b;
  --color-warning: #d4891a;
  --color-info: #1e6091;
  --shadow-sm: 0 1px 3px rgba(26, 35, 56, 0.08), 0 1px 2px rgba(26, 35, 56, 0.04);
  --shadow-md: 0 4px 12px rgba(26, 35, 56, 0.1), 0 2px 4px rgba(26, 35, 56, 0.06);
  --shadow-lg: 0 10px 30px rgba(26, 35, 56, 0.12), 0 4px 8px rgba(26, 35, 56, 0.06);
}

/* ── Explicit dark — forces dark even on light-OS ── */
[data-theme="dark"] {
  --color-primary: #6ab870;
  --color-primary-hover: #7ecb84;
  --color-primary-light: #162616;
  --color-surface: #16202e;
  --color-surface-alt: #1e2a3a;
  --color-surface-raised: #263547;
  --color-border: #2d4060;
  --color-border-light: #233352;
  --color-text: #e4eaf5;
  --color-text-muted: #8da0bc;
  --color-text-inverse: #16202e;
  --color-accent: #e8a84a;
  --color-accent-hover: #f5bc60;
  --color-accent-light: #2d1e0a;
  --color-success: #5eb85e;
  --color-error: #e05252;
  --color-warning: #e8a84a;
  --color-info: #4da6e8;
  --shadow-sm: 0 1px 3px rgba(0, 0, 0, 0.3), 0 1px 2px rgba(0, 0, 0, 0.2);
  --shadow-md: 0 4px 12px rgba(0, 0, 0, 0.35), 0 2px 4px rgba(0, 0, 0, 0.2);
  --shadow-lg: 0 10px 30px rgba(0, 0, 0, 0.4), 0 4px 8px rgba(0, 0, 0, 0.2);
}

/* ── Solarized Dark ──────────────────────────────── */
/* Ethan Schoonover's Solarized palette (dark variant) */
[data-theme="solarized-dark"] {
  --color-primary: #2aa198;        /* cyan — used as primary brand color */
  --color-primary-hover: #35b8ad;
  --color-primary-light: #002b36;

  --color-surface: #002b36;        /* base03 */
  --color-surface-alt: #073642;    /* base02 */
  --color-surface-raised: #0d4352;

  --color-border: #073642;
  --color-border-light: #0a4a5a;

  --color-text: #839496;           /* base0 */
  --color-text-muted: #657b83;     /* base00 */
  --color-text-inverse: #002b36;

  --color-accent: #b58900;         /* yellow */
  --color-accent-hover: #cb9f10;
  --color-accent-light: #1a1300;

  --color-success: #859900;        /* green */
  --color-error: #dc322f;          /* red */
  --color-warning: #b58900;        /* yellow */
  --color-info: #268bd2;           /* blue */

  --shadow-sm: 0 1px 3px rgba(0, 0, 0, 0.4), 0 1px 2px rgba(0, 0, 0, 0.3);
  --shadow-md: 0 4px 12px rgba(0, 0, 0, 0.45), 0 2px 4px rgba(0, 0, 0, 0.3);
  --shadow-lg: 0 10px 30px rgba(0, 0, 0, 0.5), 0 4px 8px rgba(0, 0, 0, 0.3);
}

/* ── Solarized Light ─────────────────────────────── */
[data-theme="solarized-light"] {
  --color-primary: #2aa198;        /* cyan */
  --color-primary-hover: #1e8a82;
  --color-primary-light: #eee8d5;

  --color-surface: #fdf6e3;        /* base3 */
  --color-surface-alt: #eee8d5;    /* base2 */
  --color-surface-raised: #fffdf7;

  --color-border: #d3c9b0;
  --color-border-light: #e4dacc;

  --color-text: #657b83;           /* base00 */
  --color-text-muted: #839496;     /* base0 */
  --color-text-inverse: #fdf6e3;

  --color-accent: #b58900;         /* yellow */
  --color-accent-hover: #9a7300;
  --color-accent-light: #fdf0c0;

  --color-success: #859900;        /* green */
  --color-error: #dc322f;          /* red */
  --color-warning: #b58900;        /* yellow */
  --color-info: #268bd2;           /* blue */

  --shadow-sm: 0 1px 3px rgba(101, 123, 131, 0.12), 0 1px 2px rgba(101, 123, 131, 0.08);
  --shadow-md: 0 4px 12px rgba(101, 123, 131, 0.15), 0 2px 4px rgba(101, 123, 131, 0.08);
  --shadow-lg: 0 10px 30px rgba(101, 123, 131, 0.18), 0 4px 8px rgba(101, 123, 131, 0.08);
}

/* ── Colorblind-safe (deuteranopia/protanopia) ────── */
/* Avoids red/green confusion. Uses blue+orange as the
   primary pair; cyan+magenta as semantic differentiators.
   Based on Wong (2011) 8-color colorblind-safe palette. */
[data-theme="colorblind"] {
  --color-primary: #0072B2;        /* blue — safe primary */
  --color-primary-hover: #005a8e;
  --color-primary-light: #e0f0fa;

  --color-surface: #f4f6fb;
  --color-surface-alt: #e6eaf4;
  --color-surface-raised: #fafbfe;

  --color-border: #b0bcd8;
  --color-border-light: #cdd5e8;

  --color-text: #1a2338;
  --color-text-muted: #4a5c7a;
  --color-text-inverse: #f4f6fb;

  --color-accent: #E69F00;         /* orange — safe secondary */
  --color-accent-hover: #c98900;
  --color-accent-light: #fdf4dc;

  --color-success: #009E73;        /* teal-green — distinct from red/green confusion zone */
  --color-error: #CC0066;          /* magenta-red — distinguishable from green */
  --color-warning: #E69F00;        /* orange */
  --color-info: #56B4E9;           /* sky blue */

  --shadow-sm: 0 1px 3px rgba(26, 35, 56, 0.08), 0 1px 2px rgba(26, 35, 56, 0.04);
  --shadow-md: 0 4px 12px rgba(26, 35, 56, 0.1), 0 2px 4px rgba(26, 35, 56, 0.06);
  --shadow-lg: 0 10px 30px rgba(26, 35, 56, 0.12), 0 4px 8px rgba(26, 35, 56, 0.06);
}

/* ── Base resets ─────────────────────────────────── */
*, *::before, *::after { box-sizing: border-box; }
@@ -34,31 +34,12 @@
      </button>
    </div>

    <!-- Theme picker -->
    <div class="sidebar__theme" v-if="!isHackerMode">
      <label class="sidebar__theme-label" for="theme-select">Theme</label>
      <select
        id="theme-select"
        class="sidebar__theme-select"
        :value="currentTheme"
        @change="setTheme(($event.target as HTMLSelectElement).value as Theme)"
        aria-label="Select theme"
      >
        <option v-for="opt in THEME_OPTIONS" :key="opt.value" :value="opt.value">
          {{ opt.icon }} {{ opt.label }}
        </option>
      </select>
    </div>

    <!-- Settings at bottom -->
    <div class="sidebar__footer">
      <RouterLink to="/settings" class="sidebar__link sidebar__link--footer" active-class="sidebar__link--active">
        <Cog6ToothIcon class="sidebar__icon" aria-hidden="true" />
        <span class="sidebar__label">Settings</span>
      </RouterLink>
      <button class="sidebar__classic-btn" @click="switchToClassic" title="Switch to Classic (Streamlit) UI">
        ⚡ Classic
      </button>
    </div>
  </nav>

@@ -95,10 +76,7 @@ import {
} from '@heroicons/vue/24/outline'

import { useDigestStore } from '../stores/digest'
import { useTheme, THEME_OPTIONS, type Theme } from '../composables/useTheme'

const digestStore = useDigestStore()
const { currentTheme, setTheme, restoreTheme } = useTheme()

// Logo click easter egg — 9.6: Click the Bird 5× rapidly
const logoClickCount = ref(0)

@@ -123,25 +101,8 @@ const isHackerMode = computed(() =>
)

function exitHackerMode() {
  delete document.documentElement.dataset.theme
  localStorage.removeItem('cf-hacker-mode')
  restoreTheme()
}

const _apiBase = import.meta.env.BASE_URL.replace(/\/$/, '')

async function switchToClassic() {
  // Persist preference via API so Streamlit reads streamlit from user.yaml
  // and won't re-set the cookie back to vue (avoids the ?prgn_switch rerun cycle)
  try {
    await fetch(_apiBase + '/api/settings/ui-preference', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ preference: 'streamlit' }),
    })
  } catch { /* non-fatal — cookie below is enough for immediate redirect */ }
  document.cookie = 'prgn_ui=streamlit; path=/; SameSite=Lax'
  // Navigate to root (no query params) — Caddy routes to Streamlit based on cookie
  window.location.href = window.location.origin + '/'
}

const navLinks = computed(() => [

@@ -311,70 +272,6 @@ const mobileLinks = [
  margin: 0;
}

.sidebar__classic-btn {
  display: flex;
  align-items: center;
  width: 100%;
  padding: var(--space-2) var(--space-3);
  margin-top: var(--space-1);
  background: none;
  border: none;
  border-radius: var(--radius-md);
  color: var(--color-text-muted);
  font-size: var(--text-xs);
  font-weight: 500;
  cursor: pointer;
  opacity: 0.6;
  transition: opacity 150ms, background 150ms;
  white-space: nowrap;
}

.sidebar__classic-btn:hover {
  opacity: 1;
  background: var(--color-surface-alt);
}

/* ── Theme picker ───────────────────────────────────── */
.sidebar__theme {
  padding: var(--space-2) var(--space-3);
  border-top: 1px solid var(--color-border-light);
  display: flex;
  flex-direction: column;
  gap: var(--space-1);
}

.sidebar__theme-label {
  font-size: var(--text-xs);
  color: var(--color-text-muted);
  font-weight: 500;
  text-transform: uppercase;
  letter-spacing: 0.05em;
}

.sidebar__theme-select {
  width: 100%;
  padding: var(--space-2) var(--space-3);
  background: var(--color-surface-alt);
  border: 1px solid var(--color-border);
  border-radius: var(--radius-md);
  color: var(--color-text);
  font-size: var(--text-sm);
  font-family: var(--font-body);
  cursor: pointer;
  appearance: auto;
  transition: border-color 150ms ease, background 150ms ease;
}

.sidebar__theme-select:hover {
  border-color: var(--color-primary);
  background: var(--color-surface-raised);
}

.sidebar__theme-select:focus-visible {
  outline: 2px solid var(--color-accent);
  outline-offset: 2px;
}

/* ── Mobile tab bar (<1024px) ───────────────────────── */
.app-tabbar {
  display: none; /* hidden on desktop */
@@ -56,49 +56,6 @@
          <span v-if="gaps.length > 6" class="gaps-more">+{{ gaps.length - 6 }}</span>
        </div>

        <!-- Resume Highlights -->
        <div
          v-if="resumeSkills.length || resumeDomains.length || resumeKeywords.length"
          class="resume-highlights"
        >
          <button class="section-toggle" @click="highlightsExpanded = !highlightsExpanded">
            <span class="section-toggle__label">My Resume Highlights</span>
            <span class="section-toggle__icon" aria-hidden="true">{{ highlightsExpanded ? '▲' : '▼' }}</span>
          </button>
          <div v-if="highlightsExpanded" class="highlights-body">
            <div v-if="resumeSkills.length" class="chips-group">
              <span class="chips-group__label">Skills</span>
              <div class="chips-wrap">
                <span
                  v-for="s in resumeSkills" :key="s"
                  class="hl-chip"
                  :class="{ 'hl-chip--match': jobMatchSet.has(s.toLowerCase()) }"
                >{{ s }}</span>
              </div>
            </div>
            <div v-if="resumeDomains.length" class="chips-group">
              <span class="chips-group__label">Domains</span>
              <div class="chips-wrap">
                <span
                  v-for="d in resumeDomains" :key="d"
                  class="hl-chip"
                  :class="{ 'hl-chip--match': jobMatchSet.has(d.toLowerCase()) }"
                >{{ d }}</span>
              </div>
            </div>
            <div v-if="resumeKeywords.length" class="chips-group">
              <span class="chips-group__label">Keywords</span>
              <div class="chips-wrap">
                <span
                  v-for="k in resumeKeywords" :key="k"
                  class="hl-chip"
                  :class="{ 'hl-chip--match': jobMatchSet.has(k.toLowerCase()) }"
                >{{ k }}</span>
              </div>
            </div>
          </div>
        </div>

        <a v-if="job.url" :href="job.url" target="_blank" rel="noopener noreferrer" class="job-details__link">
          View listing ↗
        </a>

@@ -194,61 +151,6 @@
      <!-- ── ATS Resume Optimizer ──────────────────────────────── -->
      <ResumeOptimizerPanel :job-id="props.jobId" />

      <!-- ── Application Q&A ───────────────────────────────────── -->
      <div class="qa-section">
        <button class="section-toggle" @click="qaExpanded = !qaExpanded">
          <span class="section-toggle__label">Application Q&A</span>
          <span v-if="qaItems.length" class="qa-count">{{ qaItems.length }}</span>
          <span class="section-toggle__icon" aria-hidden="true">{{ qaExpanded ? '▲' : '▼' }}</span>
        </button>

        <div v-if="qaExpanded" class="qa-body">
          <p v-if="!qaItems.length" class="qa-empty">
            No questions yet — add one below to get LLM-suggested answers.
          </p>

          <div v-for="(item, i) in qaItems" :key="item.id" class="qa-item">
            <div class="qa-item__header">
              <span class="qa-item__q">{{ item.question }}</span>
              <button class="qa-item__del" aria-label="Remove question" @click="removeQA(i)">✕</button>
            </div>
            <textarea
              class="qa-item__answer"
              :value="item.answer"
              placeholder="Your answer…"
              rows="3"
              @input="updateAnswer(item.id, ($event.target as HTMLTextAreaElement).value)"
            />
            <button
              class="btn-ghost btn-ghost--sm qa-suggest-btn"
              :disabled="suggesting === item.id"
              @click="suggestAnswer(item)"
            >
              {{ suggesting === item.id ? '✨ Thinking…' : '✨ Suggest' }}
            </button>
          </div>

          <div class="qa-add">
            <input
              v-model="newQuestion"
              class="qa-add__input"
              placeholder="Add a question from the application…"
              @keydown.enter.prevent="addQA"
            />
            <button class="btn-ghost btn-ghost--sm" :disabled="!newQuestion.trim()" @click="addQA">Add</button>
          </div>

          <button
            v-if="qaItems.length"
            class="btn-ghost qa-save-btn"
            :disabled="qaSaved || qaSaving"
            @click="saveQA"
          >
            {{ qaSaving ? 'Saving…' : (qaSaved ? '✓ Saved' : 'Save All') }}
          </button>
        </div>
      </div>

      <!-- ── Bottom action bar ──────────────────────────────────── -->
      <div class="workspace__actions">
        <button

@@ -457,96 +359,6 @@ async function rejectListing() {
  setTimeout(() => emit('job-removed'), 1000)
}

// ─── Resume highlights ────────────────────────────────────────────────────────

const resumeSkills = ref<string[]>([])
const resumeDomains = ref<string[]>([])
const resumeKeywords = ref<string[]>([])
const highlightsExpanded = ref(false)

// Words from the resume that also appear in the job description text
const jobMatchSet = computed<Set<string>>(() => {
  const desc = (job.value?.description ?? '').toLowerCase()
  const all = [...resumeSkills.value, ...resumeDomains.value, ...resumeKeywords.value]
  return new Set(all.filter(t => desc.includes(t.toLowerCase())))
})

async function fetchResume() {
  const { data } = await useApiFetch<{ skills?: string[]; domains?: string[]; keywords?: string[] }>(
    '/api/settings/resume',
  )
  if (!data) return
  resumeSkills.value = data.skills ?? []
  resumeDomains.value = data.domains ?? []
  resumeKeywords.value = data.keywords ?? []
  if (resumeSkills.value.length || resumeDomains.value.length || resumeKeywords.value.length) {
    highlightsExpanded.value = true
  }
}

// ─── Application Q&A ─────────────────────────────────────────────────────────

interface QAItem { id: string; question: string; answer: string }

const qaItems = ref<QAItem[]>([])
const qaExpanded = ref(false)
const qaSaved = ref(true)
const qaSaving = ref(false)
const newQuestion = ref('')
const suggesting = ref<string | null>(null)

function addQA() {
  const q = newQuestion.value.trim()
  if (!q) return
  qaItems.value = [...qaItems.value, { id: crypto.randomUUID(), question: q, answer: '' }]
  newQuestion.value = ''
  qaSaved.value = false
  qaExpanded.value = true
}

function removeQA(index: number) {
  qaItems.value = qaItems.value.filter((_, i) => i !== index)
  qaSaved.value = false
}

function updateAnswer(id: string, value: string) {
  qaItems.value = qaItems.value.map(q => q.id === id ? { ...q, answer: value } : q)
  qaSaved.value = false
}

async function saveQA() {
  qaSaving.value = true
  const { error } = await useApiFetch(`/api/jobs/${props.jobId}/qa`, {
    method: 'PATCH',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ items: qaItems.value }),
  })
  qaSaving.value = false
  if (error) { showToast('Save failed — please try again'); return }
  qaSaved.value = true
}

async function suggestAnswer(item: QAItem) {
  suggesting.value = item.id
  const { data, error } = await useApiFetch<{ answer: string }>(`/api/jobs/${props.jobId}/qa/suggest`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ question: item.question }),
  })
  suggesting.value = null
  if (error || !data?.answer) { showToast('Suggestion failed — check your LLM backend'); return }
  qaItems.value = qaItems.value.map(q => q.id === item.id ? { ...q, answer: data.answer } : q)
  qaSaved.value = false
}

async function fetchQA() {
  const { data } = await useApiFetch<{ items: QAItem[] }>(`/api/jobs/${props.jobId}/qa`)
  if (data?.items?.length) {
    qaItems.value = data.items
    qaExpanded.value = true
  }
}

// ─── Toast ────────────────────────────────────────────────────────────────────

const toast = ref<string | null>(null)

@@ -594,10 +406,6 @@ onMounted(async () => {
  await fetchJob()
  loadingJob.value = false

  // Load resume highlights and saved Q&A in parallel
  fetchResume()
  fetchQA()

  // Check if a generation task is already in flight
  if (clState.value === 'none') {
    const { data } = await useApiFetch<{ status: string; stage: string | null }>(`/api/jobs/${props.jobId}/cover_letter/task`)

@@ -1035,205 +843,6 @@ declare module '../stores/review' {
.toast-enter-active, .toast-leave-active { transition: opacity 250ms ease, transform 250ms ease; }
.toast-enter-from, .toast-leave-to { opacity: 0; transform: translateX(-50%) translateY(8px); }

/* ── Resume Highlights ───────────────────────────────────────────────── */

.resume-highlights {
  border-top: 1px solid var(--color-border-light);
  padding-top: var(--space-3);
}

.section-toggle {
  display: flex;
  align-items: center;
  gap: var(--space-2);
  width: 100%;
  background: none;
  border: none;
  cursor: pointer;
  padding: 0;
  text-align: left;
  color: var(--color-text-muted);
}

.section-toggle__label {
  font-size: var(--text-xs);
  font-weight: 700;
  text-transform: uppercase;
  letter-spacing: 0.04em;
  flex: 1;
}

.section-toggle__icon {
  font-size: var(--text-xs);
}

.highlights-body {
  display: flex;
  flex-direction: column;
  gap: var(--space-2);
  margin-top: var(--space-2);
}

.chips-group { display: flex; flex-direction: column; gap: 4px; }

.chips-group__label {
  font-size: 10px;
  font-weight: 700;
  text-transform: uppercase;
  letter-spacing: 0.06em;
  color: var(--color-text-muted);
  opacity: 0.7;
}

.chips-wrap { display: flex; flex-wrap: wrap; gap: 4px; }

.hl-chip {
  padding: 2px var(--space-2);
  border-radius: 999px;
  font-size: 11px;
  background: var(--color-surface-alt);
  border: 1px solid var(--color-border-light);
  color: var(--color-text-muted);
}

.hl-chip--match {
  background: rgba(39, 174, 96, 0.10);
  border-color: rgba(39, 174, 96, 0.35);
  color: var(--color-success);
  font-weight: 600;
}

/* ── Application Q&A ─────────────────────────────────────────────────── */

.qa-section {
  background: var(--color-surface-raised);
  border: 1px solid var(--color-border-light);
  border-radius: var(--radius-lg);
  overflow: hidden;
}

.qa-section > .section-toggle {
  padding: var(--space-3) var(--space-4);
  color: var(--color-text);
}

.qa-section > .section-toggle:hover { background: var(--color-surface-alt); }

.qa-count {
  display: inline-flex;
  align-items: center;
  justify-content: center;
  width: 18px;
  height: 18px;
  border-radius: 50%;
  background: var(--app-primary-light);
  color: var(--app-primary);
  font-size: 10px;
  font-weight: 700;
}

.qa-body {
  display: flex;
  flex-direction: column;
  gap: var(--space-3);
  padding: var(--space-4);
  border-top: 1px solid var(--color-border-light);
}

.qa-empty {
  font-size: var(--text-xs);
  color: var(--color-text-muted);
  text-align: center;
  padding: var(--space-2) 0;
}

.qa-item {
  display: flex;
  flex-direction: column;
  gap: var(--space-1);
  padding-bottom: var(--space-3);
  border-bottom: 1px solid var(--color-border-light);
}

.qa-item:last-of-type { border-bottom: none; }

.qa-item__header {
  display: flex;
  align-items: flex-start;
  justify-content: space-between;
  gap: var(--space-2);
}

.qa-item__q {
  font-size: var(--text-sm);
  font-weight: 600;
  color: var(--color-text);
  line-height: 1.4;
  flex: 1;
}

.qa-item__del {
  background: none;
  border: none;
  cursor: pointer;
  font-size: var(--text-xs);
  color: var(--color-text-muted);
  padding: 2px 4px;
  flex-shrink: 0;
  opacity: 0.5;
  transition: opacity 150ms;
}

.qa-item__del:hover { opacity: 1; color: var(--color-error); }

.qa-item__answer {
  width: 100%;
  padding: var(--space-2) var(--space-3);
  border: 1px solid var(--color-border-light);
  border-radius: var(--radius-md);
  background: var(--color-surface-alt);
  color: var(--color-text);
  font-family: var(--font-body);
  font-size: var(--text-sm);
  line-height: 1.5;
  resize: vertical;
  min-height: 72px;
}

.qa-item__answer:focus {
  outline: none;
  border-color: var(--app-primary);
}

.qa-suggest-btn { align-self: flex-end; }

.qa-add {
  display: flex;
  gap: var(--space-2);
  align-items: center;
}

.qa-add__input {
  flex: 1;
  padding: var(--space-2) var(--space-3);
  border: 1px solid var(--color-border-light);
  border-radius: var(--radius-md);
  background: var(--color-surface-alt);
  color: var(--color-text);
  font-family: var(--font-body);
  font-size: var(--text-sm);
  min-height: 36px;
}

.qa-add__input:focus {
  outline: none;
  border-color: var(--app-primary);
}

.qa-add__input::placeholder { color: var(--color-text-muted); }

.qa-save-btn { align-self: flex-end; }

/* ── Responsive ──────────────────────────────────────────────────────── */

@media (max-width: 900px) {
@@ -1,412 +0,0 @@
<template>
  <Teleport to="body">
    <div class="modal-backdrop" role="dialog" aria-modal="true" :aria-labelledby="`research-title-${jobId}`" @click.self="emit('close')">
      <div class="modal-card">
        <!-- Header -->
        <div class="modal-header">
          <h2 :id="`research-title-${jobId}`" class="modal-title">
            🔍 {{ jobTitle }} — Company Research
          </h2>
          <div class="modal-header-actions">
            <button v-if="state === 'ready'" class="btn-regen" @click="generate" title="Refresh research">↺ Refresh</button>
            <button class="btn-close" @click="emit('close')" aria-label="Close">✕</button>
          </div>
        </div>

        <!-- Generating state -->
        <div v-if="state === 'generating'" class="modal-body modal-body--loading">
          <div class="research-spinner" aria-hidden="true" />
          <p class="generating-msg">{{ stage ?? 'Researching…' }}</p>
          <p class="generating-sub">This takes 30–90 seconds depending on your LLM backend.</p>
        </div>

        <!-- Error state -->
        <div v-else-if="state === 'error'" class="modal-body modal-body--error">
          <p>Research generation failed.</p>
          <p v-if="errorMsg" class="error-detail">{{ errorMsg }}</p>
          <button class="btn-primary-sm" @click="generate">Retry</button>
        </div>

        <!-- Ready state -->
        <div v-else-if="state === 'ready' && brief" class="modal-body">
          <p v-if="brief.generated_at" class="generated-at">
            Updated {{ fmtDate(brief.generated_at) }}
          </p>

          <section v-if="brief.company_brief" class="research-section">
            <h3 class="section-title">🏢 Company</h3>
            <p class="section-body">{{ brief.company_brief }}</p>
          </section>

          <section v-if="brief.ceo_brief" class="research-section">
            <h3 class="section-title">👤 Leadership</h3>
            <p class="section-body">{{ brief.ceo_brief }}</p>
          </section>

          <section v-if="brief.talking_points" class="research-section">
            <div class="section-title-row">
              <h3 class="section-title">💬 Talking Points</h3>
              <button class="btn-copy" @click="copy(brief.talking_points!)" :aria-label="copied ? 'Copied!' : 'Copy talking points'">
                {{ copied ? '✓ Copied' : '⎘ Copy' }}
              </button>
            </div>
            <p class="section-body">{{ brief.talking_points }}</p>
          </section>

          <section v-if="brief.tech_brief" class="research-section">
            <h3 class="section-title">⚙️ Tech Stack</h3>
            <p class="section-body">{{ brief.tech_brief }}</p>
          </section>

          <section v-if="brief.funding_brief" class="research-section">
            <h3 class="section-title">💰 Funding & Stage</h3>
            <p class="section-body">{{ brief.funding_brief }}</p>
          </section>

          <section v-if="brief.red_flags" class="research-section research-section--warn">
            <h3 class="section-title">⚠️ Red Flags</h3>
            <p class="section-body">{{ brief.red_flags }}</p>
          </section>

          <section v-if="brief.accessibility_brief" class="research-section">
            <h3 class="section-title">♿ Inclusion & Accessibility</h3>
            <p class="section-body section-body--private">{{ brief.accessibility_brief }}</p>
            <p class="private-note">For your decision-making only — not disclosed in applications.</p>
          </section>
        </div>

        <!-- Empty state (no research, not generating) -->
        <div v-else class="modal-body modal-body--empty">
          <p>No research yet for this company.</p>
          <button class="btn-primary-sm" @click="generate">🔍 Generate Research</button>
        </div>
      </div>
    </div>
  </Teleport>
</template>

<script setup lang="ts">
import { ref, onMounted, onUnmounted } from 'vue'
import { useApiFetch } from '../composables/useApi'

const props = defineProps<{
  jobId: number
  jobTitle: string
  autoGenerate?: boolean
}>()

const emit = defineEmits<{ close: [] }>()

interface ResearchBrief {
  company_brief: string | null
  ceo_brief: string | null
  talking_points: string | null
  tech_brief: string | null
  funding_brief: string | null
  red_flags: string | null
  accessibility_brief: string | null
  generated_at: string | null
}

type ModalState = 'loading' | 'generating' | 'ready' | 'empty' | 'error'

const state = ref<ModalState>('loading')
const brief = ref<ResearchBrief | null>(null)
const stage = ref<string | null>(null)
const errorMsg = ref<string | null>(null)
const copied = ref(false)
let pollId: ReturnType<typeof setInterval> | null = null

function fmtDate(iso: string) {
  const d = new Date(iso)
  const diffH = Math.round((Date.now() - d.getTime()) / 3600000)
  if (diffH < 1) return 'just now'
  if (diffH < 24) return `${diffH}h ago`
  if (diffH < 168) return `${Math.floor(diffH / 24)}d ago`
  return d.toLocaleDateString([], { month: 'short', day: 'numeric' })
}

async function copy(text: string) {
  await navigator.clipboard.writeText(text)
  copied.value = true
  setTimeout(() => { copied.value = false }, 2000)
}

function stopPoll() {
  if (pollId) { clearInterval(pollId); pollId = null }
}

async function pollTask() {
  const { data } = await useApiFetch<{ status: string; stage: string | null; message: string | null }>(
    `/api/jobs/${props.jobId}/research/task`,
  )
  if (!data) return
  stage.value = data.stage

  if (data.status === 'completed') {
    stopPoll()
    await load()
  } else if (data.status === 'failed') {
    stopPoll()
    state.value = 'error'
    errorMsg.value = data.message ?? 'Unknown error'
  }
}

async function load() {
  const { data, error } = await useApiFetch<ResearchBrief>(`/api/jobs/${props.jobId}/research`)
  if (error) {
    if (error.kind === 'http' && error.status === 404) {
      // Check if a task is running
      const { data: task } = await useApiFetch<{ status: string; stage: string | null; message: string | null }>(
        `/api/jobs/${props.jobId}/research/task`,
      )
      if (task && (task.status === 'queued' || task.status === 'running')) {
        state.value = 'generating'
        stage.value = task.stage
        pollId = setInterval(pollTask, 3000)
      } else if (props.autoGenerate) {
        await generate()
      } else {
        state.value = 'empty'
      }
    } else {
      state.value = 'error'
      errorMsg.value = error.kind === 'http' ? error.detail : error.message
    }
    return
  }
  brief.value = data
  state.value = 'ready'
}

async function generate() {
  state.value = 'generating'
  stage.value = null
  errorMsg.value = null
  stopPoll()
  const { error } = await useApiFetch(`/api/jobs/${props.jobId}/research/generate`, { method: 'POST' })
  if (error) {
    state.value = 'error'
    errorMsg.value = error.kind === 'http' ? error.detail : error.message
    return
  }
  pollId = setInterval(pollTask, 3000)
}

function onEsc(e: KeyboardEvent) {
  if (e.key === 'Escape') emit('close')
}

onMounted(async () => {
  document.addEventListener('keydown', onEsc)
  await load()
})

onUnmounted(() => {
  document.removeEventListener('keydown', onEsc)
  stopPoll()
})
</script>

<style scoped>
.modal-backdrop {
  position: fixed;
  inset: 0;
  background: rgba(0, 0, 0, 0.55);
  z-index: 500;
  display: flex;
  align-items: flex-start;
  justify-content: center;
  padding: var(--space-8) var(--space-4);
  overflow-y: auto;
}

.modal-card {
  background: var(--color-surface-raised);
  border-radius: var(--radius-lg);
  box-shadow: 0 8px 40px rgba(0, 0, 0, 0.3);
  width: 100%;
  max-width: 620px;
  overflow: hidden;
}

.modal-header {
  display: flex;
  align-items: flex-start;
  justify-content: space-between;
  gap: var(--space-3);
  padding: var(--space-5) var(--space-6);
  border-bottom: 1px solid var(--color-border-light);
}

.modal-title {
  font-size: 1rem;
  font-weight: 700;
  color: var(--color-text);
  margin: 0;
  line-height: 1.3;
}

.modal-header-actions {
  display: flex;
  align-items: center;
  gap: var(--space-2);
  flex-shrink: 0;
}

.btn-close {
  background: none;
  border: none;
  cursor: pointer;
  font-size: 1rem;
  color: var(--color-text-muted);
  padding: 2px 6px;
}

.btn-regen {
  background: none;
  border: 1px solid var(--color-border);
  border-radius: var(--radius-sm);
  cursor: pointer;
  font-size: 0.78rem;
  color: var(--color-text-muted);
  padding: 2px 8px;
}

.modal-body {
  padding: var(--space-6);
  display: flex;
  flex-direction: column;
  gap: var(--space-5);
  max-height: 70vh;
  overflow-y: auto;
}

.modal-body--loading {
  align-items: center;
  text-align: center;
  padding: var(--space-10) var(--space-6);
  gap: var(--space-4);
}

.modal-body--empty {
  align-items: center;
  text-align: center;
  padding: var(--space-10) var(--space-6);
  gap: var(--space-4);
  color: var(--color-text-muted);
}

.modal-body--error {
  align-items: center;
  text-align: center;
  padding: var(--space-8) var(--space-6);
  gap: var(--space-3);
  color: var(--color-error);
}

.error-detail {
  font-size: 0.8rem;
  opacity: 0.8;
}

.research-spinner {
  width: 36px;
  height: 36px;
  border: 3px solid var(--color-border);
  border-top-color: var(--color-primary);
  border-radius: 50%;
  animation: spin 0.8s linear infinite;
}

@keyframes spin { to { transform: rotate(360deg); } }

.generating-msg {
  font-weight: 600;
  color: var(--color-text);
}

.generating-sub {
  font-size: 0.8rem;
  color: var(--color-text-muted);
}

.generated-at {
  font-size: 0.75rem;
  color: var(--color-text-muted);
  margin-bottom: calc(-1 * var(--space-2));
}

.research-section {
  display: flex;
  flex-direction: column;
  gap: var(--space-2);
  padding-bottom: var(--space-4);
  border-bottom: 1px solid var(--color-border-light);
}

.research-section:last-child {
  border-bottom: none;
  padding-bottom: 0;
}

.research-section--warn .section-title {
  color: var(--color-warning);
}

.section-title-row {
  display: flex;
  align-items: center;
  justify-content: space-between;
}

.section-title {
  font-size: 0.8rem;
  font-weight: 700;
  text-transform: uppercase;
  letter-spacing: 0.04em;
  color: var(--color-text-muted);
  margin: 0;
}

.section-body {
  font-size: 0.875rem;
  color: var(--color-text);
  line-height: 1.6;
  white-space: pre-wrap;
}

.section-body--private {
  font-style: italic;
}

.private-note {
  font-size: 0.7rem;
  color: var(--color-text-muted);
}

.btn-copy {
  background: none;
  border: 1px solid var(--color-border);
  border-radius: var(--radius-sm);
  cursor: pointer;
  font-size: 0.72rem;
  color: var(--color-text-muted);
  padding: 2px 8px;
  transition: color 150ms, border-color 150ms;
}

.btn-copy:hover { color: var(--color-primary); border-color: var(--color-primary); }

.btn-primary-sm {
  background: var(--color-primary);
  color: #fff;
  border: none;
  border-radius: var(--radius-md);
  padding: var(--space-2) var(--space-5);
  font-size: 0.875rem;
  font-weight: 600;
  cursor: pointer;
}
</style>
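The deleted modal's `fmtDate` helper buckets an ISO timestamp into a relative label ("just now", hours, days, then a calendar date). The same logic as a standalone, testable function — `relativeLabel` and the injected `now` parameter are illustrative, not part of the codebase:

```typescript
// Relative-time bucketing, mirroring the component's fmtDate thresholds.
// `now` is injected (defaulting to Date.now()) so the function stays pure.
function relativeLabel(iso: string, now: number = Date.now()): string {
  const diffH = Math.round((now - new Date(iso).getTime()) / 3600000)
  if (diffH < 1) return 'just now'
  if (diffH < 24) return `${diffH}h ago`                     // under a day: hours
  if (diffH < 168) return `${Math.floor(diffH / 24)}d ago`   // under a week: days
  return new Date(iso).toLocaleDateString([], { month: 'short', day: 'numeric' })
}
```

Note that `Math.round` means 23.6 hours lands in the day bucket as "1d ago", which matches the original's behavior.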
@@ -13,7 +13,6 @@ const emit = defineEmits<{
  move: [jobId: number, preSelectedStage?: PipelineStage]
  prep: [jobId: number]
  survey: [jobId: number]
  research: [jobId: number]
}>()

// Signal state
@@ -181,7 +180,6 @@ const columnColor = computed(() => {
    </div>
    <footer class="card-footer">
      <button class="card-action" @click.stop="emit('move', job.id)">Move to… ›</button>
      <button v-if="['phone_screen', 'interviewing', 'offer'].includes(job.status)" class="card-action" @click.stop="emit('research', job.id)">🔍 Research</button>
      <button v-if="['phone_screen', 'interviewing', 'offer'].includes(job.status)" class="card-action" @click.stop="emit('prep', job.id)">Prep →</button>
      <button
        v-if="['survey', 'phone_screen', 'interviewing', 'offer'].includes(job.status)"
@@ -2,15 +2,12 @@ export type ApiError =
  | { kind: 'network'; message: string }
  | { kind: 'http'; status: number; detail: string }

// Strip trailing slash so '/peregrine/' + '/api/...' → '/peregrine/api/...'
const _apiBase = import.meta.env.BASE_URL.replace(/\/$/, '')

export async function useApiFetch<T>(
  url: string,
  opts?: RequestInit,
): Promise<{ data: T | null; error: ApiError | null }> {
  try {
    const res = await fetch(_apiBase + url, opts)
    const res = await fetch(url, opts)
    if (!res.ok) {
      const detail = await res.text().catch(() => '')
      return { data: null, error: { kind: 'http', status: res.status, detail } }
@@ -34,7 +31,7 @@ export function useApiSSE(
  onComplete?: () => void,
  onError?: (e: Event) => void,
): () => void {
  const es = new EventSource(_apiBase + url)
  const es = new EventSource(url)
  es.onmessage = (e) => {
    try {
      const data = JSON.parse(e.data) as Record<string, unknown>
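The trailing-slash handling this hunk removes (strip the slash from `BASE_URL`, then concatenate the API path) is worth checking in isolation, since `'/peregrine/' + '/api/...'` would otherwise yield a double slash. A minimal sketch — `joinBase` is an illustrative name, not an export of the real module:

```typescript
// Strip one trailing slash from the base so base + '/api/...' never produces '//'.
// Mirrors the `_apiBase` computation in useApi.
function joinBase(base: string, url: string): string {
  return base.replace(/\/$/, '') + url
}
```

This also covers the root deployment case: a base of `'/'` reduces to the empty string, leaving the path unchanged.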
@@ -1,5 +1,4 @@
import { onMounted, onUnmounted } from 'vue'
import { useTheme } from './useTheme'

const KONAMI = ['ArrowUp','ArrowUp','ArrowDown','ArrowDown','ArrowLeft','ArrowRight','ArrowLeft','ArrowRight','b','a']
const KONAMI_AB = ['ArrowUp','ArrowUp','ArrowDown','ArrowDown','ArrowLeft','ArrowRight','ArrowLeft','ArrowRight','a','b']
@@ -32,10 +31,8 @@ export function useHackerMode() {
  function toggle() {
    const root = document.documentElement
    if (root.dataset.theme === 'hacker') {
      delete root.dataset.theme
      localStorage.removeItem('cf-hacker-mode')
      // Let useTheme restore the user's chosen theme rather than just deleting data-theme
      const { restoreTheme } = useTheme()
      restoreTheme()
    } else {
      root.dataset.theme = 'hacker'
      localStorage.setItem('cf-hacker-mode', 'true')
@@ -1,82 +0,0 @@
/**
 * useTheme — manual theme picker for Peregrine.
 *
 * Themes: 'auto' | 'light' | 'dark' | 'solarized-dark' | 'solarized-light' | 'colorblind'
 * Persisted in localStorage under 'cf-theme'.
 * Applied via document.documentElement.dataset.theme.
 * 'auto' removes the attribute so the @media prefers-color-scheme rule takes effect.
 *
 * Hacker mode sits on top of this system — toggling it off calls restoreTheme()
 * so the user's chosen theme is reinstated rather than dropping back to auto.
 */

import { ref, readonly } from 'vue'
import { useApiFetch } from './useApi'

export type Theme = 'auto' | 'light' | 'dark' | 'solarized-dark' | 'solarized-light' | 'colorblind'

const STORAGE_KEY = 'cf-theme'
const HACKER_KEY = 'cf-hacker-mode'

export const THEME_OPTIONS: { value: Theme; label: string; icon: string }[] = [
  { value: 'auto', label: 'Auto', icon: '⬡' },
  { value: 'light', label: 'Light', icon: '☀' },
  { value: 'dark', label: 'Dark', icon: '🌙' },
  { value: 'solarized-light', label: 'Solarized Light', icon: '🌤' },
  { value: 'solarized-dark', label: 'Solarized Dark', icon: '🌃' },
  { value: 'colorblind', label: 'Colorblind Safe', icon: '♿' },
]

// Module-level singleton so all consumers share the same reactive state.
const _current = ref<Theme>(_load())

function _load(): Theme {
  return (localStorage.getItem(STORAGE_KEY) as Theme | null) ?? 'auto'
}

function _apply(theme: Theme) {
  const root = document.documentElement
  if (theme === 'auto') {
    delete root.dataset.theme
  } else {
    root.dataset.theme = theme
  }
}

export function useTheme() {
  function setTheme(theme: Theme) {
    _current.value = theme
    localStorage.setItem(STORAGE_KEY, theme)
    _apply(theme)
    // Best-effort persist to server; ignore failures (works offline / local LLM)
    useApiFetch('/api/settings/theme', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ theme }),
    }).catch(() => {})
  }

  /** Restore user's chosen theme — called when hacker mode or other overlays exit. */
  function restoreTheme() {
    // Hacker mode clears itself; we only restore if it's actually off.
    if (localStorage.getItem(HACKER_KEY) === 'true') return
    _apply(_current.value)
  }

  /** Call once at app boot to apply persisted theme before first render. */
  function initTheme() {
    // Hacker mode takes priority on restore.
    if (localStorage.getItem(HACKER_KEY) === 'true') {
      document.documentElement.dataset.theme = 'hacker'
    } else {
      _apply(_current.value)
    }
  }

  return {
    currentTheme: readonly(_current),
    setTheme,
    restoreTheme,
    initTheme,
  }
}
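The restore priority in the deleted `useTheme` composable (hacker mode wins; otherwise the stored theme; `'auto'` means no `data-theme` attribute, so the `prefers-color-scheme` media rule decides) can be expressed as one pure function. `effectiveTheme` is a sketch for illustration, not an export of the real module:

```typescript
type Theme = 'auto' | 'light' | 'dark' | 'solarized-dark' | 'solarized-light' | 'colorblind'

// Returns the value data-theme should take, or null to remove the attribute
// entirely (letting the @media prefers-color-scheme rule apply).
function effectiveTheme(hackerMode: boolean, stored: Theme): string | null {
  if (hackerMode) return 'hacker'     // hacker mode overrides everything
  if (stored === 'auto') return null  // no attribute → browser preference wins
  return stored
}
```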
@@ -2,12 +2,6 @@ import { ref } from 'vue'
import { defineStore } from 'pinia'
import { useApiFetch } from '../../composables/useApi'

export interface TrainingPair {
  index: number
  instruction: string
  source_file: string
}

export const useFineTuneStore = defineStore('settings/fineTune', () => {
  const step = ref(1)
  const inFlightJob = ref(false)
@@ -16,8 +10,6 @@ export const useFineTuneStore = defineStore('settings/fineTune', () => {
  const quotaRemaining = ref<number | null>(null)
  const uploading = ref(false)
  const loading = ref(false)
  const pairs = ref<TrainingPair[]>([])
  const pairsLoading = ref(false)
  let _pollTimer: ReturnType<typeof setInterval> | null = null

  function resetStep() { step.value = 1 }
@@ -45,26 +37,6 @@ export const useFineTuneStore = defineStore('settings/fineTune', () => {
    if (!error && data) { inFlightJob.value = true; jobStatus.value = 'queued' }
  }

  async function loadPairs() {
    pairsLoading.value = true
    const { data } = await useApiFetch<{ pairs: TrainingPair[]; total: number }>('/api/settings/fine-tune/pairs')
    pairsLoading.value = false
    if (data) {
      pairs.value = data.pairs
      pairsCount.value = data.total
    }
  }

  async function deletePair(index: number) {
    const { data } = await useApiFetch<{ ok: boolean; remaining: number }>(
      `/api/settings/fine-tune/pairs/${index}`, { method: 'DELETE' }
    )
    if (data?.ok) {
      pairs.value = pairs.value.filter(p => p.index !== index).map((p, i) => ({ ...p, index: i }))
      pairsCount.value = data.remaining
    }
  }

  return {
    step,
    inFlightJob,
@@ -73,14 +45,10 @@ export const useFineTuneStore = defineStore('settings/fineTune', () => {
    quotaRemaining,
    uploading,
    loading,
    pairs,
    pairsLoading,
    resetStep,
    loadStatus,
    startPolling,
    stopPolling,
    submitJob,
    loadPairs,
    deletePair,
  }
})
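The optimistic update in the fine-tune store's `deletePair` (drop the deleted pair, then renumber survivors so indices stay contiguous and match the server's file positions) is the kind of re-indexing that is easy to get wrong. As a standalone sketch — `removePair` is an illustrative name:

```typescript
interface TrainingPair { index: number; instruction: string; source_file: string }

// Remove the pair at `index` and renumber the rest 0..n-1, matching the
// store's optimistic update after a successful DELETE request.
function removePair(pairs: TrainingPair[], index: number): TrainingPair[] {
  return pairs.filter(p => p.index !== index).map((p, i) => ({ ...p, index: i }))
}
```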
@@ -18,7 +18,6 @@ export const useSearchStore = defineStore('settings/search', () => {

  const titleSuggestions = ref<string[]>([])
  const locationSuggestions = ref<string[]>([])
  const excludeSuggestions = ref<string[]>([])

  const loading = ref(false)
  const saving = ref(false)
@@ -100,24 +99,10 @@ export const useSearchStore = defineStore('settings/search', () => {
    arr.value = arr.value.filter(v => v !== value)
  }

  async function suggestExcludeKeywords() {
    const { data } = await useApiFetch<{ suggestions: string[] }>('/api/settings/search/suggest', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ type: 'exclude_keywords', current: exclude_keywords.value }),
    })
    if (data?.suggestions) {
      excludeSuggestions.value = data.suggestions.filter(s => !exclude_keywords.value.includes(s))
    }
  }

  function acceptSuggestion(type: 'title' | 'location' | 'exclude', value: string) {
  function acceptSuggestion(type: 'title' | 'location', value: string) {
    if (type === 'title') {
      if (!job_titles.value.includes(value)) job_titles.value = [...job_titles.value, value]
      titleSuggestions.value = titleSuggestions.value.filter(s => s !== value)
    } else if (type === 'exclude') {
      if (!exclude_keywords.value.includes(value)) exclude_keywords.value = [...exclude_keywords.value, value]
      excludeSuggestions.value = excludeSuggestions.value.filter(s => s !== value)
    } else {
      if (!locations.value.includes(value)) locations.value = [...locations.value, value]
      locationSuggestions.value = locationSuggestions.value.filter(s => s !== value)
@@ -133,9 +118,8 @@ export const useSearchStore = defineStore('settings/search', () => {
  return {
    remote_preference, job_titles, locations, exclude_keywords, job_boards,
    custom_board_urls, blocklist_companies, blocklist_industries, blocklist_locations,
    titleSuggestions, locationSuggestions, excludeSuggestions,
    titleSuggestions, locationSuggestions,
    loading, saving, saveError, loadError,
    load, save, suggestTitles, suggestLocations, suggestExcludeKeywords,
    addTag, removeTag, acceptSuggestion, toggleBoard,
    load, save, suggestTitles, suggestLocations, addTag, removeTag, acceptSuggestion, toggleBoard,
  }
})
@@ -53,13 +53,6 @@
        :loading="taskRunning === 'score'"
        @click="scoreUnscored"
      />
      <WorkflowButton
        emoji="🔍"
        label="Fill Missing Descriptions"
        description="Re-fetch truncated job descriptions"
        :loading="taskRunning === 'enrich'"
        @click="runEnrich"
      />
    </div>

    <button
@@ -87,6 +80,7 @@
          ? `Last enriched ${formatRelative(store.status.enrichment_last_run)}`
          : 'Auto-enrichment active' }}
      </span>
      <button class="btn-ghost btn-ghost--sm" @click="runEnrich">Run Now</button>
    </div>
  </section>

@@ -168,192 +162,22 @@
    </div>
  </section>

  <!-- Danger Zone -->
  <!-- Advanced -->
  <section class="home__section">
    <details class="danger-zone">
      <summary class="danger-zone__summary">⚠️ Danger Zone</summary>
      <div class="danger-zone__body">

        <!-- Queue reset -->
        <div class="dz-block">
          <p class="dz-block__title">Queue reset</p>
          <p class="dz-block__desc">
            Archive clears your review queue while keeping job URLs for dedup — same listings
            won't resurface on the next discovery run. Use hard purge only for a full clean slate
            including dedup history.
          </p>

          <fieldset class="dz-scope" aria-label="Clear scope">
            <legend class="dz-scope__legend">Clear scope</legend>
            <label class="dz-scope__option">
              <input type="radio" v-model="dangerScope" value="pending" />
              Pending only
            </label>
            <label class="dz-scope__option">
              <input type="radio" v-model="dangerScope" value="pending_approved" />
              Pending + approved (stale search)
            </label>
          </fieldset>

          <div class="dz-actions">
            <button
              class="action-btn action-btn--primary"
              :disabled="!!confirmAction"
              @click="beginConfirm('archive')"
            >
              📦 Archive & reset
    <details class="advanced">
      <summary class="advanced__summary">Advanced</summary>
      <div class="advanced__body">
        <p class="advanced__warning">⚠️ These actions are destructive and cannot be undone.</p>
        <div class="home__actions home__actions--danger">
          <button class="action-btn action-btn--danger" @click="confirmPurge">
            🗑️ Purge Pending + Rejected
            </button>
            <button
              class="action-btn action-btn--secondary"
              :disabled="!!confirmAction"
              @click="beginConfirm('purge')"
            >
              🗑 Hard purge (delete)
          <button class="action-btn action-btn--danger" @click="killTasks">
            🛑 Kill Stuck Tasks
            </button>
          </div>

          <!-- Inline confirm -->
          <div v-if="confirmAction" class="dz-confirm" role="alertdialog" aria-live="assertive">
            <p v-if="confirmAction.type === 'archive'" class="dz-confirm__msg dz-confirm__msg--info">
              Archive <strong>{{ confirmAction.statuses.join(' + ') }}</strong> jobs?
              URLs are kept for dedup — nothing is permanently deleted.
            </p>
            <p v-else class="dz-confirm__msg dz-confirm__msg--warn">
              Permanently delete <strong>{{ confirmAction.statuses.join(' + ') }}</strong> jobs?
              This removes URLs from dedup history too. Cannot be undone.
            </p>
            <div class="dz-confirm__actions">
              <button class="action-btn action-btn--primary" @click="executeConfirm">
                {{ confirmAction.type === 'archive' ? 'Yes, archive' : 'Yes, delete' }}
              </button>
              <button class="action-btn action-btn--secondary" @click="confirmAction = null">
                Cancel
              </button>
            </div>
          </div>
        </div>

        <hr class="dz-divider" />

        <!-- Background tasks -->
        <div class="dz-block">
          <p class="dz-block__title">Background tasks — {{ activeTasks.length }} active</p>
          <template v-if="activeTasks.length > 0">
            <div
              v-for="task in activeTasks"
              :key="task.id"
              class="dz-task"
            >
              <span class="dz-task__icon">{{ taskIcon(task.task_type) }}</span>
              <span class="dz-task__type">{{ task.task_type.replace(/_/g, ' ') }}</span>
              <span class="dz-task__label">
                {{ task.title ? `${task.title}${task.company ? ' @ ' + task.company : ''}` : `job #${task.job_id}` }}
              </span>
              <span class="dz-task__status">{{ task.status }}</span>
              <button
                class="btn-ghost btn-ghost--sm dz-task__cancel"
                @click="cancelTaskById(task.id)"
                :aria-label="`Cancel ${task.task_type} task`"
              >
                ✕
              </button>
            </div>
          </template>
          <button
            class="action-btn action-btn--secondary dz-kill"
            :disabled="activeTasks.length === 0"
            @click="killAll"
          >
            ⏹ Kill all stuck
          </button>
        </div>

        <hr class="dz-divider" />

        <!-- More options -->
        <details class="dz-more">
          <summary class="dz-more__summary">More options</summary>
          <div class="dz-more__body">

            <!-- Email purge -->
            <div class="dz-more__item">
              <p class="dz-block__title">Purge email data</p>
              <p class="dz-block__desc">Clears all email thread logs and email-sourced pending jobs.</p>
              <template v-if="moreConfirm === 'email'">
                <p class="dz-confirm__msg dz-confirm__msg--warn">
                  Deletes all email contacts and email-sourced jobs. Cannot be undone.
                </p>
                <div class="dz-confirm__actions">
                  <button class="action-btn action-btn--primary" @click="executePurgeTarget('email')">Yes, purge emails</button>
                  <button class="action-btn action-btn--secondary" @click="moreConfirm = null">Cancel</button>
                </div>
              </template>
              <button v-else class="action-btn action-btn--secondary" @click="moreConfirm = 'email'">
                📧 Purge Email Data
              </button>
            </div>

            <!-- Non-remote purge -->
            <div class="dz-more__item">
              <p class="dz-block__title">Purge non-remote</p>
              <p class="dz-block__desc">Removes pending/approved/rejected on-site listings from the DB.</p>
              <template v-if="moreConfirm === 'non_remote'">
                <p class="dz-confirm__msg dz-confirm__msg--warn">
                  Deletes all non-remote jobs not yet applied to. Cannot be undone.
                </p>
                <div class="dz-confirm__actions">
                  <button class="action-btn action-btn--primary" @click="executePurgeTarget('non_remote')">Yes, purge on-site</button>
                  <button class="action-btn action-btn--secondary" @click="moreConfirm = null">Cancel</button>
                </div>
              </template>
              <button v-else class="action-btn action-btn--secondary" @click="moreConfirm = 'non_remote'">
                🏢 Purge On-site Jobs
              </button>
            </div>

            <!-- Wipe + re-scrape -->
            <div class="dz-more__item">
              <p class="dz-block__title">Wipe all + re-scrape</p>
              <p class="dz-block__desc">Deletes all non-applied jobs then immediately runs a fresh discovery.</p>
              <template v-if="moreConfirm === 'rescrape'">
                <p class="dz-confirm__msg dz-confirm__msg--warn">
                  Wipes ALL pending, approved, and rejected jobs, then re-scrapes.
                  Applied and synced records are kept.
                </p>
                <div class="dz-confirm__actions">
                  <button class="action-btn action-btn--primary" @click="executePurgeTarget('rescrape')">Yes, wipe + scrape</button>
                  <button class="action-btn action-btn--secondary" @click="moreConfirm = null">Cancel</button>
                </div>
              </template>
              <button v-else class="action-btn action-btn--secondary" @click="moreConfirm = 'rescrape'">
                🔄 Wipe + Re-scrape
              </button>
            </div>

          </div>
        </details>

      </div>
    </details>
  </section>

  <!-- Setup banners -->
  <section v-if="banners.length > 0" class="home__section" aria-labelledby="setup-heading">
    <h2 id="setup-heading" class="home__section-title">Finish setting up Peregrine</h2>
    <div class="banners">
      <div v-for="banner in banners" :key="banner.key" class="banner">
        <span class="banner__icon" aria-hidden="true">💡</span>
        <span class="banner__text">{{ banner.text }}</span>
        <RouterLink :to="banner.link" class="banner__link">Go to settings →</RouterLink>
        <button
          class="btn-ghost btn-ghost--sm banner__dismiss"
          @click="dismissBanner(banner.key)"
          :aria-label="`Dismiss: ${banner.text}`"
        >
          ✕
        </button>
      </div>
    </div>
  </section>

  <!-- Stoop speed toast — easter egg 9.2 -->
@@ -366,7 +190,7 @@
</template>

<script setup lang="ts">
import { ref, computed, onMounted, onUnmounted } from 'vue'
import { ref, computed, onMounted } from 'vue'
import { RouterLink } from 'vue-router'
import { useJobsStore } from '../stores/jobs'
import { useApiFetch } from '../composables/useApi'

@@ -407,8 +231,6 @@ function formatRelative(isoStr: string) {
return hrs === 1 ? '1 hour ago' : `${hrs} hours ago`
}

// ── Task execution ─────────────────────────────────────────────────────────

const taskRunning = ref<string | null>(null)
const stoopToast = ref(false)

@@ -417,16 +239,13 @@ async function runTask(key: string, endpoint: string) {
await useApiFetch(endpoint, { method: 'POST' })
taskRunning.value = null
store.refresh()
fetchActiveTasks()
}

const runDiscovery = () => runTask('discovery', '/api/tasks/discovery')
const syncEmails = () => runTask('email', '/api/tasks/email-sync')
const scoreUnscored = () => runTask('score', '/api/tasks/score')
const syncIntegration = () => runTask('sync', '/api/tasks/sync')
const runEnrich = () => runTask('enrich', '/api/tasks/enrich')

// ── Add jobs ───────────────────────────────────────────────────────────────
const runEnrich = () => useApiFetch('/api/tasks/enrich', { method: 'POST' })

const addTab = ref<'url' | 'csv'>('url')
const urlInput = ref('')

@@ -450,8 +269,6 @@ function handleCsvUpload(e: Event) {
useApiFetch('/api/jobs/upload-csv', { method: 'POST', body: form })
}

// ── Backlog archive ────────────────────────────────────────────────────────

async function archiveByStatus(statuses: string[]) {
await useApiFetch('/api/jobs/archive', {
method: 'POST',

@@ -461,100 +278,26 @@ async function archiveByStatus(statuses: string[]) {
store.refresh()
}

// ── Danger Zone ────────────────────────────────────────────────────────────

interface TaskRow { id: number; task_type: string; status: string; title?: string; company?: string; job_id: number }
interface Banner { key: string; text: string; link: string }
interface ConfirmAction { type: 'archive' | 'purge'; statuses: string[] }

const activeTasks = ref<TaskRow[]>([])
const dangerScope = ref<'pending' | 'pending_approved'>('pending')
const confirmAction = ref<ConfirmAction | null>(null)
const moreConfirm = ref<string | null>(null)
const banners = ref<Banner[]>([])

let taskPollInterval: ReturnType<typeof setInterval> | null = null

async function fetchActiveTasks() {
const { data } = await useApiFetch<TaskRow[]>('/api/tasks')
activeTasks.value = data ?? []
}

async function fetchBanners() {
const { data } = await useApiFetch<Banner[]>('/api/config/setup-banners')
banners.value = data ?? []
}

function scopeStatuses(): string[] {
return dangerScope.value === 'pending' ? ['pending'] : ['pending', 'approved']
}

function beginConfirm(type: 'archive' | 'purge') {
moreConfirm.value = null
confirmAction.value = { type, statuses: scopeStatuses() }
}

async function executeConfirm() {
const action = confirmAction.value
confirmAction.value = null
if (!action) return
const endpoint = action.type === 'archive' ? '/api/jobs/archive' : '/api/jobs/purge'
const key = action.type === 'archive' ? 'statuses' : 'statuses'
await useApiFetch(endpoint, {
function confirmPurge() {
// TODO: replace with ConfirmModal component
if (confirm('Permanently delete all pending and rejected jobs? This cannot be undone.')) {
useApiFetch('/api/jobs/purge', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ [key]: action.statuses }),
body: JSON.stringify({ target: 'pending_rejected' }),
})
store.refresh()
fetchActiveTasks()
}
}
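Review note: in the removed `executeConfirm`, the line `const key = action.type === 'archive' ? 'statuses' : 'statuses'` picks `'statuses'` in both branches, so the ternary is dead code. A minimal sketch of what that body boils down to — names and endpoint paths mirror the hunk above, but the request-building helper itself is hypothetical, not repo code:

```typescript
// Sketch only: both branches of the removed ternary resolve to 'statuses',
// so the body key can be written literally. The returned pair stands in
// for the useApiFetch call in the component.
type ConfirmAction = { type: 'archive' | 'purge'; statuses: string[] }

function buildConfirmRequest(action: ConfirmAction): { endpoint: string; body: string } {
  const endpoint = action.type === 'archive' ? '/api/jobs/archive' : '/api/jobs/purge'
  return { endpoint, body: JSON.stringify({ statuses: action.statuses }) }
}
```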

async function cancelTaskById(id: number) {
await useApiFetch(`/api/tasks/${id}`, { method: 'DELETE' })
fetchActiveTasks()
}

async function killAll() {
async function killTasks() {
await useApiFetch('/api/tasks/kill', { method: 'POST' })
fetchActiveTasks()
}

async function executePurgeTarget(target: string) {
moreConfirm.value = null
await useApiFetch('/api/jobs/purge', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ target }),
})
store.refresh()
fetchActiveTasks()
}

async function dismissBanner(key: string) {
await useApiFetch(`/api/config/setup-banners/${key}/dismiss`, { method: 'POST' })
banners.value = banners.value.filter(b => b.key !== key)
}

function taskIcon(taskType: string): string {
const icons: Record<string, string> = {
cover_letter: '✉️', company_research: '🔍', discovery: '🌐',
enrich_descriptions: '📝', email_sync: '📧', score: '📊',
scrape_url: '🔗',
}
return icons[taskType] ?? '⚙️'
}

onMounted(async () => {
store.refresh()
const { data } = await useApiFetch<{ name: string }>('/api/config/user')
if (data?.name) userName.value = data.name
fetchActiveTasks()
fetchBanners()
taskPollInterval = setInterval(fetchActiveTasks, 5000)
})

onUnmounted(() => {
if (taskPollInterval) clearInterval(taskPollInterval)
})
</script>
@@ -649,11 +392,12 @@ onUnmounted(() => {

.home__actions {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(180px, 1fr));
grid-template-columns: repeat(auto-fit, minmax(200px, 1fr));
gap: var(--space-3);
}

.home__actions--secondary { grid-template-columns: repeat(auto-fit, minmax(240px, 1fr)); }
.home__actions--danger { grid-template-columns: repeat(auto-fit, minmax(220px, 1fr)); }

.sync-banner {
display: flex;

@@ -707,7 +451,9 @@ onUnmounted(() => {

.action-btn--secondary { background: var(--color-surface-alt); color: var(--color-text); border: 1px solid var(--color-border); }
.action-btn--secondary:hover { background: var(--color-border-light); }
.action-btn--secondary:disabled { opacity: 0.4; cursor: not-allowed; }

.action-btn--danger { background: transparent; color: var(--color-error); border: 1px solid var(--color-error); }
.action-btn--danger:hover { background: rgba(192, 57, 43, 0.08); }

.enrichment-row {
display: flex;

@@ -782,15 +528,13 @@ onUnmounted(() => {

.add-jobs__textarea:focus { outline: 2px solid var(--app-primary); outline-offset: 1px; }

/* ── Danger Zone ──────────────────────────────────────── */

.danger-zone {
.advanced {
background: var(--color-surface-raised);
border: 1px solid var(--color-border-light);
border-radius: var(--radius-md);
}

.danger-zone__summary {
.advanced__summary {
padding: var(--space-3) var(--space-4);
cursor: pointer;
font-size: var(--text-sm);

@@ -800,172 +544,21 @@ onUnmounted(() => {
user-select: none;
}

.danger-zone__summary::-webkit-details-marker { display: none; }
.danger-zone__summary::before { content: '▶ '; font-size: 0.7em; }
details[open] > .danger-zone__summary::before { content: '▼ '; }
.advanced__summary::-webkit-details-marker { display: none; }
.advanced__summary::before { content: '▶ '; font-size: 0.7em; }
details[open] > .advanced__summary::before { content: '▼ '; }

.danger-zone__body {
padding: 0 var(--space-4) var(--space-4);
display: flex;
flex-direction: column;
gap: var(--space-5);
}
.advanced__body { padding: 0 var(--space-4) var(--space-4); display: flex; flex-direction: column; gap: var(--space-4); }

.dz-block { display: flex; flex-direction: column; gap: var(--space-3); }

.dz-block__title {
.advanced__warning {
font-size: var(--text-sm);
font-weight: 600;
color: var(--color-text);
}

.dz-block__desc {
font-size: var(--text-sm);
color: var(--color-text-muted);
}

.dz-scope {
border: none;
padding: 0;
margin: 0;
display: flex;
gap: var(--space-5);
flex-wrap: wrap;
}

.dz-scope__legend {
font-size: var(--text-xs);
color: var(--color-text-muted);
margin-bottom: var(--space-2);
float: left;
width: 100%;
}

.dz-scope__option {
display: flex;
align-items: center;
gap: var(--space-2);
font-size: var(--text-sm);
cursor: pointer;
}

.dz-actions {
display: flex;
gap: var(--space-3);
flex-wrap: wrap;
}

.dz-confirm {
color: var(--color-warning);
background: rgba(212, 137, 26, 0.08);
padding: var(--space-3) var(--space-4);
border-radius: var(--radius-md);
display: flex;
flex-direction: column;
gap: var(--space-3);
border-left: 3px solid var(--color-warning);
}

.dz-confirm__msg {
font-size: var(--text-sm);
padding: var(--space-3) var(--space-4);
border-radius: var(--radius-md);
border-left: 3px solid;
}

.dz-confirm__msg--info {
background: rgba(52, 152, 219, 0.1);
border-color: var(--app-primary);
color: var(--color-text);
}

.dz-confirm__msg--warn {
background: rgba(192, 57, 43, 0.08);
border-color: var(--color-error);
color: var(--color-text);
}

.dz-confirm__actions {
display: flex;
gap: var(--space-3);
}

.dz-divider {
border: none;
border-top: 1px solid var(--color-border-light);
margin: 0;
}

.dz-task {
display: flex;
align-items: center;
gap: var(--space-2);
padding: var(--space-2) var(--space-3);
background: var(--color-surface-alt);
border-radius: var(--radius-md);
font-size: var(--text-xs);
}

.dz-task__icon { flex-shrink: 0; }
.dz-task__type { font-family: var(--font-mono); color: var(--color-text-muted); min-width: 120px; }
.dz-task__label { flex: 1; color: var(--color-text); overflow: hidden; text-overflow: ellipsis; white-space: nowrap; }
.dz-task__status { color: var(--color-text-muted); font-style: italic; }
.dz-task__cancel { margin-left: var(--space-2); }

.dz-kill { align-self: flex-start; }

.dz-more {
background: transparent;
border: none;
}

.dz-more__summary {
cursor: pointer;
font-size: var(--text-sm);
font-weight: 600;
color: var(--color-text-muted);
list-style: none;
user-select: none;
padding: var(--space-1) 0;
}

.dz-more__summary::-webkit-details-marker { display: none; }
.dz-more__summary::before { content: '▶ '; font-size: 0.7em; }
details[open] > .dz-more__summary::before { content: '▼ '; }

.dz-more__body {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(220px, 1fr));
gap: var(--space-5);
margin-top: var(--space-4);
}

.dz-more__item { display: flex; flex-direction: column; gap: var(--space-2); }

/* ── Setup banners ────────────────────────────────────── */

.banners {
display: flex;
flex-direction: column;
gap: var(--space-2);
}

.banner {
display: flex;
align-items: center;
gap: var(--space-3);
padding: var(--space-3) var(--space-4);
background: var(--color-surface-raised);
border: 1px solid var(--color-border-light);
border-radius: var(--radius-md);
font-size: var(--text-sm);
}

.banner__icon { flex-shrink: 0; }
.banner__text { flex: 1; color: var(--color-text); }
.banner__link { color: var(--app-primary); text-decoration: none; white-space: nowrap; font-weight: 500; }
.banner__link:hover { text-decoration: underline; }
.banner__dismiss { margin-left: var(--space-1); }

/* ── Toast ────────────────────────────────────────────── */

.stoop-toast {
position: fixed;
bottom: var(--space-6);

@@ -995,7 +588,6 @@ details[open] > .dz-more__summary::before { content: '▼ '; }
.home { padding: var(--space-4); gap: var(--space-6); }
.home__greeting { font-size: var(--text-2xl); }
.home__metrics { grid-template-columns: repeat(3, 1fr); }
.dz-more__body { grid-template-columns: 1fr; }
}

@media (max-width: 480px) {

@@ -7,7 +7,6 @@ import type { StageSignal } from '../stores/interviews'
import { useApiFetch } from '../composables/useApi'
import InterviewCard from '../components/InterviewCard.vue'
import MoveToSheet from '../components/MoveToSheet.vue'
import CompanyResearchModal from '../components/CompanyResearchModal.vue'

const router = useRouter()
const store = useInterviewsStore()

@@ -23,29 +22,10 @@ function openMove(jobId: number, preSelectedStage?: PipelineStage) {

async function onMove(stage: PipelineStage, opts: { interview_date?: string; rejection_stage?: string }) {
if (!moveTarget.value) return
const movedJob = moveTarget.value
const wasHired = stage === 'hired'
await store.move(movedJob.id, stage, opts)
await store.move(moveTarget.value.id, stage, opts)
moveTarget.value = null
if (wasHired) triggerConfetti()
// Auto-open research modal when moving to phone_screen (mirrors Streamlit behaviour)
if (stage === 'phone_screen') openResearch(movedJob.id, `${movedJob.title} at ${movedJob.company}`)
}

// ── Company research modal ─────────────────────────────────────────────────────
const researchJobId = ref<number | null>(null)
const researchJobTitle = ref('')
const researchAutoGen = ref(false)

function openResearch(jobId: number, jobTitle: string, autoGenerate = true) {
researchJobId.value = jobId
researchJobTitle.value = jobTitle
researchAutoGen.value = autoGenerate
}

function onInterviewCardResearch(jobId: number) {
const job = store.jobs.find(j => j.id === jobId)
if (job) openResearch(jobId, `${job.title} at ${job.company}`, false)
}

// ── Collapsible Applied section ────────────────────────────────────────────

@@ -486,8 +466,7 @@ function daysSince(dateStr: string | null) {
</div>
<InterviewCard v-for="(job, i) in store.phoneScreen" :key="job.id" :job="job"
:focused="focusedCol === 0 && focusedCard === i"
@move="openMove" @prep="router.push(`/prep/${$event}`)" @survey="router.push('/survey/' + $event)"
@research="onInterviewCardResearch" />
@move="openMove" @prep="router.push(`/prep/${$event}`)" @survey="router.push('/survey/' + $event)" />
</div>

<div class="kanban-col" :class="{ 'kanban-col--focused': focusedCol === 1 }" aria-label="Interviewing">

@@ -500,8 +479,7 @@ function daysSince(dateStr: string | null) {
</div>
<InterviewCard v-for="(job, i) in store.interviewing" :key="job.id" :job="job"
:focused="focusedCol === 1 && focusedCard === i"
@move="openMove" @prep="router.push(`/prep/${$event}`)" @survey="router.push('/survey/' + $event)"
@research="onInterviewCardResearch" />
@move="openMove" @prep="router.push(`/prep/${$event}`)" @survey="router.push('/survey/' + $event)" />
</div>

<div class="kanban-col" :class="{ 'kanban-col--focused': focusedCol === 2 }" aria-label="Offer and Hired">

@@ -514,8 +492,7 @@ function daysSince(dateStr: string | null) {
</div>
<InterviewCard v-for="(job, i) in store.offerHired" :key="job.id" :job="job"
:focused="focusedCol === 2 && focusedCard === i"
@move="openMove" @prep="router.push(`/prep/${$event}`)" @survey="router.push('/survey/' + $event)"
@research="onInterviewCardResearch" />
@move="openMove" @prep="router.push(`/prep/${$event}`)" @survey="router.push('/survey/' + $event)" />
</div>
</section>

@@ -548,14 +525,6 @@ function daysSince(dateStr: string | null) {
@move="onMove"
@close="moveTarget = null; movePreSelected = undefined"
/>

<CompanyResearchModal
v-if="researchJobId !== null"
:jobId="researchJobId"
:jobTitle="researchJobTitle"
:autoGenerate="researchAutoGen"
@close="researchJobId = null"
/>
</div>
</template>

@@ -98,50 +98,25 @@
<span class="spinner" aria-hidden="true" />
<span>Loading…</span>
</div>
<template v-else>
<!-- Sort + filter bar -->
<div class="list-controls" aria-label="Sort and filter">
<select v-model="sortBy" class="list-sort" aria-label="Sort by">
<option value="match_score">Best match</option>
<option value="date_found">Newest first</option>
<option value="company">Company A–Z</option>
</select>
<label class="list-filter-remote">
<input type="checkbox" v-model="filterRemote" />
Remote only
</label>
<span class="list-count">{{ sortedFilteredJobs.length }} job{{ sortedFilteredJobs.length !== 1 ? 's' : '' }}</span>
</div>

<div v-if="sortedFilteredJobs.length === 0" class="review__empty" role="status">
<p class="empty-desc">No {{ activeTab }} jobs{{ filterRemote ? ' (remote only)' : '' }}.</p>
<div v-else-if="store.listJobs.length === 0" class="review__empty" role="status">
<p class="empty-desc">No {{ activeTab }} jobs.</p>
</div>
<ul v-else class="job-list" role="list">
<li v-for="job in sortedFilteredJobs" :key="job.id" class="job-list__item">
<li v-for="job in store.listJobs" :key="job.id" class="job-list__item">
<div class="job-list__info">
<span class="job-list__title">{{ job.title }}</span>
<span class="job-list__company">
{{ job.company }}
<span v-if="job.is_remote" class="remote-tag">Remote</span>
</span>
<span class="job-list__company">{{ job.company }}</span>
</div>
<div class="job-list__meta">
<span v-if="job.match_score !== null" class="score-pill" :class="scorePillClass(job.match_score)">
{{ job.match_score }}%
</span>
<button
v-if="activeTab === 'approved'"
class="job-list__action"
@click="router.push(`/apply/${job.id}`)"
:aria-label="`Draft cover letter for ${job.title}`"
>✨ Draft</button>
<a :href="job.url" target="_blank" rel="noopener noreferrer" class="job-list__link">
View ↗
</a>
</div>
</li>
</ul>
</template>
</div>

<!-- ── Help overlay ─────────────────────────────────────────────────── -->

@@ -211,13 +186,12 @@

<script setup lang="ts">
import { ref, computed, watch, onMounted, onUnmounted } from 'vue'
import { useRoute, useRouter } from 'vue-router'
import { useRoute } from 'vue-router'
import { useReviewStore } from '../stores/review'
import JobCardStack from '../components/JobCardStack.vue'

const store = useReviewStore()
const route = useRoute()
const router = useRouter()
const stackRef = ref<InstanceType<typeof JobCardStack> | null>(null)

// ─── Tabs ──────────────────────────────────────────────────────────────────────

@@ -341,30 +315,6 @@ function onKeyDown(e: KeyboardEvent) {
}
}

// ─── List view: sort + filter ─────────────────────────────────────────────────

type SortKey = 'match_score' | 'date_found' | 'company'
const sortBy = ref<SortKey>('match_score')
const filterRemote = ref(false)

const sortedFilteredJobs = computed(() => {
let jobs = [...store.listJobs]
if (filterRemote.value) jobs = jobs.filter(j => j.is_remote)
jobs.sort((a, b) => {
if (sortBy.value === 'match_score') return (b.match_score ?? -1) - (a.match_score ?? -1)
if (sortBy.value === 'date_found') return new Date(b.date_found).getTime() - new Date(a.date_found).getTime()
if (sortBy.value === 'company') return (a.company ?? '').localeCompare(b.company ?? '')
return 0
})
return jobs
})

// Reset filters when switching tabs
watch(activeTab, () => {
filterRemote.value = false
sortBy.value = 'match_score'
})
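The removed list-view sort above can be sketched as a plain function, which makes the comparator behaviour easy to check in isolation. The `ListJob` shape below is a minimal stand-in for the store's type (field names mirror the hunk); unscored jobs (`match_score: null`) are coerced to `-1` so they sort last under the score ordering:

```typescript
// Sketch only, assuming the field names shown in the hunk above.
interface ListJob { match_score: number | null; date_found: string; company: string | null }

function sortJobs(jobs: ListJob[], key: 'match_score' | 'date_found' | 'company'): ListJob[] {
  return [...jobs].sort((a, b) => {
    if (key === 'match_score') return (b.match_score ?? -1) - (a.match_score ?? -1)   // highest first, nulls last
    if (key === 'date_found') return new Date(b.date_found).getTime() - new Date(a.date_found).getTime() // newest first
    return (a.company ?? '').localeCompare(b.company ?? '')                           // A–Z
  })
}
```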

// ─── List view score pill ─────────────────────────────────────────────────────

function scorePillClass(score: number) {

@@ -709,69 +659,6 @@ kbd {
font-weight: 600;
}

.job-list__action {
font-size: var(--text-xs);
font-weight: 600;
color: var(--app-primary);
background: color-mix(in srgb, var(--app-primary) 10%, transparent);
border: 1px solid color-mix(in srgb, var(--app-primary) 25%, transparent);
border-radius: var(--radius-sm);
padding: 2px 8px;
cursor: pointer;
transition: background 150ms;
white-space: nowrap;
}

.job-list__action:hover {
background: color-mix(in srgb, var(--app-primary) 18%, transparent);
}

.remote-tag {
font-size: 0.65rem;
font-weight: 700;
color: var(--color-info);
background: color-mix(in srgb, var(--color-info) 12%, transparent);
border-radius: var(--radius-full);
padding: 1px 5px;
margin-left: 4px;
}

/* ── List controls (sort + filter) ──────────────────────────────────── */

.list-controls {
display: flex;
align-items: center;
gap: var(--space-3);
flex-wrap: wrap;
margin-bottom: var(--space-3);
}

.list-sort {
font-size: var(--text-xs);
border: 1px solid var(--color-border);
border-radius: var(--radius-sm);
background: var(--color-surface-raised);
color: var(--color-text);
padding: 3px 8px;
cursor: pointer;
}

.list-filter-remote {
display: flex;
align-items: center;
gap: var(--space-1);
font-size: var(--text-xs);
color: var(--color-text-muted);
cursor: pointer;
user-select: none;
}

.list-count {
font-size: var(--text-xs);
color: var(--color-text-muted);
margin-left: auto;
}

/* ── Help overlay ────────────────────────────────────────────────────── */

.help-overlay {

@@ -6,7 +6,7 @@ import { useAppConfigStore } from '../../stores/appConfig'

const store = useFineTuneStore()
const config = useAppConfigStore()
const { step, inFlightJob, jobStatus, pairsCount, quotaRemaining, pairs, pairsLoading } = storeToRefs(store)
const { step, inFlightJob, jobStatus, pairsCount, quotaRemaining } = storeToRefs(store)

const fileInput = ref<HTMLInputElement | null>(null)
const selectedFiles = ref<File[]>([])

@@ -45,7 +45,6 @@ async function checkLocalModel() {

onMounted(async () => {
store.startPolling()
await store.loadPairs()
if (store.step === 3 && !config.isCloud) await checkLocalModel()
})
onUnmounted(() => { store.stopPolling(); store.resetStep() })

@@ -100,22 +99,6 @@ onUnmounted(() => { store.stopPolling(); store.resetStep() })
</button>
<button @click="store.step = 3" class="btn-secondary">Skip → Train</button>
</div>

<!-- Training pairs list -->
<div v-if="pairs.length > 0" class="pairs-list">
<h4>Training Pairs <span class="pairs-badge">{{ pairs.length }}</span></h4>
<p class="section-note">Review and remove any low-quality pairs before training.</p>
<div v-if="pairsLoading" class="pairs-loading">Loading…</div>
<ul v-else class="pairs-items">
<li v-for="pair in pairs" :key="pair.index" class="pair-item">
<div class="pair-info">
<span class="pair-instruction">{{ pair.instruction }}</span>
<span class="pair-source">{{ pair.source_file }}</span>
</div>
<button class="pair-delete" @click="store.deletePair(pair.index)" title="Remove this pair">✕</button>
</li>
</ul>
</div>
</section>

<!-- Step 3: Train -->

@@ -177,16 +160,4 @@ onUnmounted(() => { store.stopPolling(); store.resetStep() })
.status-running { background: var(--color-warning-bg, #fef3c7); color: var(--color-warning-fg, #92400e); }
.status-ok { color: var(--color-success, #16a34a); }
.status-fail { color: var(--color-error, #dc2626); }

.pairs-list { margin-top: var(--space-6, 1.5rem); }
.pairs-list h4 { font-size: 0.95rem; font-weight: 600; margin: 0 0 var(--space-2, 0.5rem); display: flex; align-items: center; gap: 0.5rem; }
.pairs-badge { background: var(--color-primary, #2d5a27); color: #fff; font-size: 0.75rem; padding: 1px 7px; border-radius: var(--radius-full, 9999px); }
.pairs-loading { color: var(--color-text-muted); font-size: 0.875rem; padding: var(--space-2, 0.5rem) 0; }
.pairs-items { list-style: none; margin: 0; padding: 0; display: flex; flex-direction: column; gap: var(--space-2, 0.5rem); max-height: 280px; overflow-y: auto; }
.pair-item { display: flex; align-items: center; gap: var(--space-3, 0.75rem); padding: var(--space-2, 0.5rem) var(--space-3, 0.75rem); background: var(--color-surface-alt); border: 1px solid var(--color-border-light); border-radius: var(--radius-md); }
.pair-info { flex: 1; min-width: 0; display: flex; flex-direction: column; gap: 2px; }
.pair-instruction { font-size: 0.85rem; color: var(--color-text); white-space: nowrap; overflow: hidden; text-overflow: ellipsis; }
.pair-source { font-size: 0.75rem; color: var(--color-text-muted); }
.pair-delete { flex-shrink: 0; background: none; border: none; color: var(--color-error); cursor: pointer; font-size: 0.9rem; padding: 2px 4px; border-radius: var(--radius-sm); transition: background 150ms; }
.pair-delete:hover { background: var(--color-error); color: #fff; }
</style>

@@ -62,16 +62,9 @@
rows="3"
placeholder="How you write and communicate — used to shape cover letter voice."
/>
<button
v-if="config.tier !== 'free'"
class="btn-generate"
type="button"
@click="generateVoice"
:disabled="generatingVoice"
>{{ generatingVoice ? 'Generating…' : 'Generate ✦' }}</button>
</div>

<div v-if="!config.isCloud" class="field-row">
<div class="field-row">
<label class="field-label" for="profile-inference">Inference profile</label>
<select id="profile-inference" v-model="store.inference_profile" class="select-input">
<option value="remote">Remote</option>

@@ -217,7 +210,6 @@ const config = useAppConfigStore()
const newNdaCompany = ref('')
const generatingSummary = ref(false)
const generatingMissions = ref(false)
const generatingVoice = ref(false)

onMounted(() => { store.load() })

@@ -273,15 +265,6 @@ async function generateMissions() {
}))
}
}

async function generateVoice() {
generatingVoice.value = true
const { data, error } = await useApiFetch<{ voice?: string }>(
'/api/settings/profile/generate-voice', { method: 'POST' }
)
generatingVoice.value = false
if (!error && data?.voice) store.candidate_voice = data.voice
}
</script>

<style scoped>

@@ -15,13 +7,7 @@
<div class="empty-card">
<h3>Upload & Parse</h3>
<p>Upload a PDF, DOCX, or ODT and we'll extract your info automatically.</p>
<input type="file" accept=".pdf,.docx,.odt" @change="handleFileSelect" ref="fileInput" />
<button
v-if="pendingFile"
@click="handleUpload"
:disabled="uploading"
style="margin-top:10px"
>{{ uploading ? 'Parsing…' : `Parse "${pendingFile.name}"` }}</button>
<input type="file" accept=".pdf,.docx,.odt" @change="handleUpload" ref="fileInput" />
<p v-if="uploadError" class="error">{{ uploadError }}</p>
</div>
<!-- Blank -->

@@ -30,8 +24,8 @@
<p>Start with a blank form and fill in your details.</p>
<button @click="store.createBlank()" :disabled="store.loading">Start from Scratch</button>
</div>
<!-- Wizard — self-hosted only -->
<div v-if="!config.isCloud" class="empty-card">
<!-- Wizard -->
<div class="empty-card">
<h3>Run Setup Wizard</h3>
<p>Walk through the onboarding wizard to set up your profile step by step.</p>
<RouterLink to="/setup">Open Setup Wizard →</RouterLink>

@@ -41,21 +35,6 @@

<!-- Full form (when resume exists) -->
<template v-else-if="store.hasResume">
<!-- Replace resume via upload -->
<section class="form-section replace-section">
<h3>Replace Resume</h3>
<p class="section-note">Upload a new PDF, DOCX, or ODT to re-parse and overwrite the current data.</p>
<input type="file" accept=".pdf,.docx,.odt" @change="handleFileSelect" ref="replaceFileInput" />
<button
v-if="pendingFile"
@click="handleUpload"
:disabled="uploading"
class="btn-primary"
style="margin-top:10px"
>{{ uploading ? 'Parsing…' : `Parse "${pendingFile.name}"` }}</button>
<p v-if="uploadError" class="error">{{ uploadError }}</p>
</section>

<!-- Personal Information -->
<section class="form-section">
<h3>Personal Information</h3>

@@ -242,22 +221,17 @@ import { ref, onMounted } from 'vue'
import { storeToRefs } from 'pinia'
import { useResumeStore } from '../../stores/settings/resume'
import { useProfileStore } from '../../stores/settings/profile'
import { useAppConfigStore } from '../../stores/appConfig'
import { useApiFetch } from '../../composables/useApi'

const store = useResumeStore()
const profileStore = useProfileStore()
const config = useAppConfigStore()
const { loadError } = storeToRefs(store)
const showSelfId = ref(false)
const skillInput = ref('')
const domainInput = ref('')
const kwInput = ref('')
const uploadError = ref<string | null>(null)
const uploading = ref(false)
const pendingFile = ref<File | null>(null)
const fileInput = ref<HTMLInputElement | null>(null)
const replaceFileInput = ref<HTMLInputElement | null>(null)

onMounted(async () => {
await store.load()

@@ -272,16 +246,9 @@ onMounted(async () => {
}
})

function handleFileSelect(event: Event) {
async function handleUpload(event: Event) {
const file = (event.target as HTMLInputElement).files?.[0]
pendingFile.value = file ?? null
uploadError.value = null
}
|
||||
|
||||
async function handleUpload() {
|
||||
const file = pendingFile.value
|
||||
if (!file) return
|
||||
uploading.value = true
|
||||
uploadError.value = null
|
||||
const formData = new FormData()
|
||||
formData.append('file', file)
|
||||
|
|
@ -289,14 +256,10 @@ async function handleUpload() {
|
|||
'/api/settings/resume/upload',
|
||||
{ method: 'POST', body: formData }
|
||||
)
|
||||
uploading.value = false
|
||||
if (error || !data?.ok) {
|
||||
uploadError.value = data?.error ?? (typeof error === 'string' ? error : (error?.kind === 'network' ? error.message : error?.detail ?? 'Upload failed'))
|
||||
return
|
||||
}
|
||||
pendingFile.value = null
|
||||
if (fileInput.value) fileInput.value.value = ''
|
||||
if (replaceFileInput.value) replaceFileInput.value.value = ''
|
||||
if (data.data) {
|
||||
await store.load()
|
||||
}
|
||||
|
|
@ -344,5 +307,4 @@ h3 { font-size: 1rem; font-weight: 600; margin-bottom: var(--space-3, 16px); col
|
|||
.section-note { font-size: 0.8rem; color: var(--color-text-secondary, #94a3b8); margin-bottom: 16px; }
|
||||
.toggle-btn { margin-left: 10px; padding: 2px 10px; background: transparent; border: 1px solid var(--color-border, rgba(255,255,255,0.15)); border-radius: 4px; color: var(--color-text-secondary, #94a3b8); cursor: pointer; font-size: 0.78rem; }
|
||||
.loading { text-align: center; padding: var(--space-8, 48px); color: var(--color-text-secondary, #94a3b8); }
|
||||
.replace-section { background: var(--color-surface-2, rgba(255,255,255,0.03)); border-radius: 8px; padding: var(--space-4, 24px); }
|
||||
</style>
|
||||
@@ -69,18 +69,7 @@
{{ kw }} <button @click="store.removeTag('exclude_keywords', kw)">×</button>
</span>
</div>
<div class="tag-input-row">
<input v-model="excludeInput" @keydown.enter.prevent="store.addTag('exclude_keywords', excludeInput); excludeInput = ''" placeholder="Add keyword, press Enter" />
<button @click="store.suggestExcludeKeywords()" class="btn-suggest">Suggest</button>
</div>
<div v-if="store.excludeSuggestions.length > 0" class="suggestions">
<span
v-for="s in store.excludeSuggestions"
:key="s"
class="suggestion-chip"
@click="store.acceptSuggestion('exclude', s)"
>+ {{ s }}</span>
</div>
</section>

<!-- Job Boards -->
@@ -42,7 +42,6 @@ const devOverride = computed(() => !!config.devTierOverride)
const gpuProfiles = ['single-gpu', 'dual-gpu']

const showSystem = computed(() => !config.isCloud)
const showData = computed(() => !config.isCloud)
const showFineTune = computed(() => {
if (config.isCloud) return config.tier === 'premium'
return gpuProfiles.includes(config.inferenceProfile)
@@ -66,7 +65,7 @@ const allGroups = [
]},
{ label: 'Account', items: [
{ key: 'license', path: '/settings/license', label: 'License', show: true },
{ key: 'data', path: '/settings/data', label: 'Data', show: showData },
{ key: 'data', path: '/settings/data', label: 'Data', show: true },
{ key: 'privacy', path: '/settings/privacy', label: 'Privacy', show: true },
]},
{ label: 'Dev', items: [