Compare commits

...

2 commits

065c02feb7 feat(vue): Home dashboard parity — Enrich button, Danger Zone, setup banners (closes #57)
Some checks failed: CI / test (push) failing after 20s
API additions (dev-api.py):
- GET /api/tasks — list active background tasks
- DELETE /api/tasks/{task_id} — per-task cancel
- POST /api/tasks/kill — kill all stuck tasks
- POST /api/tasks/discovery|email-sync|enrich|score|sync — queue/trigger each workflow
- POST /api/jobs/archive — archive by statuses array
- POST /api/jobs/purge — hard delete by statuses or target (email/non_remote/rescrape)
- POST /api/jobs/add — queue URL imports
- POST /api/jobs/upload-csv — upload CSV with URL column
- GET  /api/config/setup-banners — list undismissed onboarding hints
- POST /api/config/setup-banners/{key}/dismiss — dismiss a banner
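
The setup-banner endpoints reduce to filtering a static banner list by keys the user has dismissed (stored in user.yaml). A minimal sketch of that logic — the banner entries and helper names here are illustrative, not the API's actual code:

```python
# Sketch of the filtering behind GET /api/config/setup-banners and
# POST /api/config/setup-banners/{key}/dismiss (banner list abbreviated;
# the real one lives in dev-api.py as _SETUP_BANNERS).
SETUP_BANNERS = [
    {"key": "setup_email", "text": "Set up email sync", "link": "/settings?tab=email"},
    {"key": "tune_mission", "text": "Tune your mission preferences", "link": "/settings?tab=profile"},
]

def undismissed_banners(user_cfg: dict) -> list[dict]:
    """Return banners the user hasn't dismissed; none before wizard completion."""
    if not user_cfg.get("wizard_complete"):
        return []
    dismissed = set(user_cfg.get("dismissed_banners", []))
    return [b for b in SETUP_BANNERS if b["key"] not in dismissed]

def dismiss_banner(user_cfg: dict, key: str) -> dict:
    """Record a dismissal, as the dismiss endpoint would persist to user.yaml."""
    dismissed = set(user_cfg.get("dismissed_banners", []))
    dismissed.add(key)
    user_cfg["dismissed_banners"] = sorted(dismissed)
    return user_cfg
```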

HomeView.vue:
- 4th WorkflowButton: "Fill Missing Descriptions" (always visible, not gated on enrichment_enabled)
- Danger Zone redesign: scope radio (pending-only vs pending+approved), Archive & reset (primary)
  vs Hard purge (secondary), inline confirm dialogs, active task list with per-task cancel,
  Kill all stuck button, More Options (email purge / non-remote / wipe+rescrape)
- Setup banners: dismissible onboarding hints pulled from /api/config/setup-banners,
  5-second polling for active task list to stay live

app/Home.py:
- Danger Zone redesign: same scope radio + archive/purge with confirm steps
- Background task list with per-task cancel and Kill all stuck button
- More options expander (email purge, non-remote, wipe+rescrape)
- Setup banners section at page bottom
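
Both UIs gate each destructive action behind the same two-step confirm pattern keyed on a single session-state slot: the first click arms an action, the second click confirms or cancels it. Sketched here without Streamlit, with illustrative helper names:

```python
# Two-step confirm pattern used by the Danger Zone dialogs: one session-state
# slot ("confirm_dz") holds the pending action between reruns.
def arm(state: dict, action: str) -> None:
    """First button click: remember which destructive action is pending."""
    state["confirm_dz"] = action

def resolve(state: dict, confirmed: bool):
    """Second click: return the armed action if confirmed, then disarm."""
    action = state.pop("confirm_dz", None)
    return action if confirmed else None
```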
2026-04-04 22:05:06 -07:00
53b07568d9 feat(vue): accumulated parity work — Q&A, Apply highlights, AppNav switcher, cloud API
API additions (dev-api.py split across this and next commit):
- GET/PATCH /api/jobs/{job_id}/qa + POST /api/jobs/{job_id}/qa/suggest — Interview Prep answer storage + LLM suggestions
- /api/settings/ui-preference POST — persist streamlit/vue preference to user.yaml
- cancel_task() added to scripts/db.py (per-task cancel for Danger Zone)
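
The Q&A storage scheme stores answers as a JSON array in a lazily added `application_qa` TEXT column. A self-contained sketch against an in-memory SQLite database — the table shape is simplified here relative to the real schema:

```python
import json
import sqlite3

# Sketch of the Q&A storage: a JSON blob in a lazily added TEXT column.
def ensure_qa_column(db: sqlite3.Connection) -> None:
    try:
        db.execute("ALTER TABLE jobs ADD COLUMN application_qa TEXT")
    except sqlite3.OperationalError:
        pass  # column already exists — re-running the ALTER is harmless

def save_qa(db: sqlite3.Connection, job_id: int, items: list[dict]) -> None:
    ensure_qa_column(db)
    db.execute("UPDATE jobs SET application_qa = ? WHERE id = ?",
               (json.dumps(items), job_id))

def load_qa(db: sqlite3.Connection, job_id: int) -> list[dict]:
    ensure_qa_column(db)
    row = db.execute("SELECT application_qa FROM jobs WHERE id = ?",
                     (job_id,)).fetchone()
    return json.loads(row[0] or "[]") if row else []

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, title TEXT)")
db.execute("INSERT INTO jobs (id, title) VALUES (1, 'Backend Engineer')")
save_qa(db, 1, [{"id": "q1", "question": "Why us?", "answer": "Because..."}])
```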

Vue / UI:
- AppNav: " Classic" button to switch back to Streamlit UI (writes cookie + persists to user.yaml)
- ApplyWorkspace: Resume Highlights panel (collapsible skills/domains/keywords with job-match highlighting)
- SettingsView: hide Data tab in cloud mode (showData guard)
- ResumeProfileView: minor improvements
- useApi.ts: error handling improvements

Infra:
- compose.cloud.yml: add api service (uvicorn dev_api running in cloud container)
- docker/web/nginx.conf: proxy /api/* to api service in cloud mode
- README.md: Vue SPA now listed as Free tier feature
2026-04-04 22:04:51 -07:00
13 changed files with 1630 additions and 261 deletions

README.md

@@ -154,7 +154,7 @@ Re-enter the wizard any time via **Settings → Developer → Reset wizard**.
 | Calendar sync (Google, Apple) | Paid |
 | Slack notifications | Paid |
 | CircuitForge shared cover-letter model | Paid |
-| Vue 3 SPA beta UI | Paid |
+| Vue 3 SPA — full UI with onboarding wizard, job board, apply workspace, sort/filter, research modal, draft cover letter | Free |
 | **Voice guidelines** (custom writing style & tone) | Premium with LLM¹ ² |
 | Cover letter model fine-tuning (your writing, your model) | Premium |
 | Multi-user support | Premium |

app/Home.py

@@ -19,8 +19,8 @@ _profile = UserProfile(_USER_YAML) if UserProfile.exists(_USER_YAML) else None
 _name = _profile.name if _profile else "Job Seeker"

 from scripts.db import init_db, get_job_counts, purge_jobs, purge_email_data, \
-    purge_non_remote, archive_jobs, kill_stuck_tasks, get_task_for_job, get_active_tasks, \
-    insert_job, get_existing_urls
+    purge_non_remote, archive_jobs, kill_stuck_tasks, cancel_task, \
+    get_task_for_job, get_active_tasks, insert_job, get_existing_urls
 from scripts.task_runner import submit_task
 from app.cloud_session import resolve_session, get_db_path
@@ -376,178 +376,145 @@ _scrape_status()
 st.divider()

-# ── Danger zone: purge + re-scrape ────────────────────────────────────────────
+# ── Danger zone ───────────────────────────────────────────────────────────────
 with st.expander("⚠️ Danger Zone", expanded=False):
-    st.caption(
-        "**Purge** permanently deletes jobs from the local database. "
-        "Applied and synced jobs are never touched."
-    )
-    purge_col, rescrape_col, email_col, tasks_col = st.columns(4)
-
-    with purge_col:
-        st.markdown("**Purge pending & rejected**")
-        st.caption("Removes all _pending_ and _rejected_ listings so the next discovery starts fresh.")
-        if st.button("🗑 Purge Pending + Rejected", use_container_width=True):
-            st.session_state["confirm_purge"] = "partial"
-        if st.session_state.get("confirm_purge") == "partial":
-            st.warning("Are you sure? This cannot be undone.")
-            c1, c2 = st.columns(2)
-            if c1.button("Yes, purge", type="primary", use_container_width=True):
-                deleted = purge_jobs(get_db_path(), statuses=["pending", "rejected"])
-                st.success(f"Purged {deleted} jobs.")
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-            if c2.button("Cancel", use_container_width=True):
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-
-    with email_col:
-        st.markdown("**Purge email data**")
-        st.caption("Clears all email thread logs and email-sourced pending jobs so the next sync starts fresh.")
-        if st.button("📧 Purge Email Data", use_container_width=True):
-            st.session_state["confirm_purge"] = "email"
-        if st.session_state.get("confirm_purge") == "email":
-            st.warning("This deletes all email contacts and email-sourced jobs. Cannot be undone.")
-            c1, c2 = st.columns(2)
-            if c1.button("Yes, purge emails", type="primary", use_container_width=True):
-                contacts, jobs = purge_email_data(get_db_path())
-                st.success(f"Purged {contacts} email contacts, {jobs} email jobs.")
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-            if c2.button("Cancel ", use_container_width=True):
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-
-    with tasks_col:
-        _active = get_active_tasks(get_db_path())
-        st.markdown("**Kill stuck tasks**")
-        st.caption(f"Force-fail all queued/running background tasks. Currently **{len(_active)}** active.")
-        if st.button("⏹ Kill All Tasks", use_container_width=True, disabled=len(_active) == 0):
-            killed = kill_stuck_tasks(get_db_path())
-            st.success(f"Killed {killed} task(s).")
-            st.rerun()
-
-    with rescrape_col:
-        st.markdown("**Purge all & re-scrape**")
-        st.caption("Wipes _all_ non-applied, non-synced jobs then immediately runs a fresh discovery.")
-        if st.button("🔄 Purge All + Re-scrape", use_container_width=True):
-            st.session_state["confirm_purge"] = "full"
-        if st.session_state.get("confirm_purge") == "full":
-            st.warning("This will delete ALL pending, approved, and rejected jobs, then re-scrape. Applied and synced records are kept.")
-            c1, c2 = st.columns(2)
-            if c1.button("Yes, wipe + scrape", type="primary", use_container_width=True):
-                purge_jobs(get_db_path(), statuses=["pending", "approved", "rejected"])
-                submit_task(get_db_path(), "discovery", 0)
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-            if c2.button("Cancel ", use_container_width=True):
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-
-    st.divider()
-    pending_col, nonremote_col, approved_col, _ = st.columns(4)
-
-    with pending_col:
-        st.markdown("**Purge pending review**")
-        st.caption("Removes only _pending_ listings, keeping your rejected history intact.")
-        if st.button("🗑 Purge Pending Only", use_container_width=True):
-            st.session_state["confirm_purge"] = "pending_only"
-        if st.session_state.get("confirm_purge") == "pending_only":
-            st.warning("Deletes all pending jobs. Rejected jobs are kept. Cannot be undone.")
-            c1, c2 = st.columns(2)
-            if c1.button("Yes, purge pending", type="primary", use_container_width=True):
-                deleted = purge_jobs(get_db_path(), statuses=["pending"])
-                st.success(f"Purged {deleted} pending jobs.")
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-            if c2.button("Cancel ", use_container_width=True):
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-
-    with nonremote_col:
-        st.markdown("**Purge non-remote**")
-        st.caption("Removes pending/approved/rejected jobs where remote is not set. Keeps anything already in the pipeline.")
-        if st.button("🏢 Purge On-site Jobs", use_container_width=True):
-            st.session_state["confirm_purge"] = "non_remote"
-        if st.session_state.get("confirm_purge") == "non_remote":
-            st.warning("Deletes all non-remote jobs not yet applied to. Cannot be undone.")
-            c1, c2 = st.columns(2)
-            if c1.button("Yes, purge on-site", type="primary", use_container_width=True):
-                deleted = purge_non_remote(get_db_path())
-                st.success(f"Purged {deleted} non-remote jobs.")
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-            if c2.button("Cancel ", use_container_width=True):
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-
-    with approved_col:
-        st.markdown("**Purge approved (unapplied)**")
-        st.caption("Removes _approved_ jobs you haven't applied to yet — e.g. to reset after a review pass.")
-        if st.button("🗑 Purge Approved", use_container_width=True):
-            st.session_state["confirm_purge"] = "approved_only"
-        if st.session_state.get("confirm_purge") == "approved_only":
-            st.warning("Deletes all approved-but-not-applied jobs. Cannot be undone.")
-            c1, c2 = st.columns(2)
-            if c1.button("Yes, purge approved", type="primary", use_container_width=True):
-                deleted = purge_jobs(get_db_path(), statuses=["approved"])
-                st.success(f"Purged {deleted} approved jobs.")
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-            if c2.button("Cancel ", use_container_width=True):
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-
-    st.divider()
-    archive_col1, archive_col2, _, _ = st.columns(4)
-
-    with archive_col1:
-        st.markdown("**Archive remaining**")
-        st.caption(
-            "Move all _pending_ and _rejected_ jobs to archived status. "
-            "Archived jobs stay in the DB for dedup — they just won't appear in Job Review."
-        )
-        if st.button("📦 Archive Pending + Rejected", use_container_width=True):
-            st.session_state["confirm_purge"] = "archive_remaining"
-        if st.session_state.get("confirm_purge") == "archive_remaining":
-            st.info("Jobs will be archived (not deleted) — URLs are kept for dedup.")
-            c1, c2 = st.columns(2)
-            if c1.button("Yes, archive", type="primary", use_container_width=True):
-                archived = archive_jobs(get_db_path(), statuses=["pending", "rejected"])
-                st.success(f"Archived {archived} jobs.")
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-            if c2.button("Cancel ", use_container_width=True):
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-
-    with archive_col2:
-        st.markdown("**Archive approved (unapplied)**")
-        st.caption("Archive _approved_ listings you decided to skip — keeps history without cluttering the apply queue.")
-        if st.button("📦 Archive Approved", use_container_width=True):
-            st.session_state["confirm_purge"] = "archive_approved"
-        if st.session_state.get("confirm_purge") == "archive_approved":
-            st.info("Approved jobs will be archived (not deleted).")
-            c1, c2 = st.columns(2)
-            if c1.button("Yes, archive approved", type="primary", use_container_width=True):
-                archived = archive_jobs(get_db_path(), statuses=["approved"])
-                st.success(f"Archived {archived} approved jobs.")
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
-            if c2.button("Cancel ", use_container_width=True):
-                st.session_state.pop("confirm_purge", None)
-                st.rerun()
+    # ── Queue reset (the common case) ─────────────────────────────────────────
+    st.markdown("**Queue reset**")
+    st.caption(
+        "Archive clears your review queue while keeping job URLs for dedup, "
+        "so the same listings won't resurface on the next discovery run. "
+        "Use hard purge only if you want a full clean slate including dedup history."
+    )
+    _scope = st.radio(
+        "Clear scope",
+        ["Pending only", "Pending + approved (stale search)"],
+        horizontal=True,
+        label_visibility="collapsed",
+    )
+    _scope_statuses = (
+        ["pending"] if _scope == "Pending only" else ["pending", "approved"]
+    )
+
+    _qc1, _qc2, _qc3 = st.columns([2, 2, 4])
+    if _qc1.button("📦 Archive & reset", use_container_width=True, type="primary"):
+        st.session_state["confirm_dz"] = "archive"
+    if _qc2.button("🗑 Hard purge (delete)", use_container_width=True):
+        st.session_state["confirm_dz"] = "purge"
+
+    if st.session_state.get("confirm_dz") == "archive":
+        st.info(
+            f"Archive **{', '.join(_scope_statuses)}** jobs? "
+            "URLs are kept for dedup — nothing is permanently deleted."
+        )
+        _dc1, _dc2 = st.columns(2)
+        if _dc1.button("Yes, archive", type="primary", use_container_width=True, key="dz_archive_confirm"):
+            n = archive_jobs(get_db_path(), statuses=_scope_statuses)
+            st.success(f"Archived {n} jobs.")
+            st.session_state.pop("confirm_dz", None)
+            st.rerun()
+        if _dc2.button("Cancel", use_container_width=True, key="dz_archive_cancel"):
+            st.session_state.pop("confirm_dz", None)
+            st.rerun()
+
+    if st.session_state.get("confirm_dz") == "purge":
+        st.warning(
+            f"Permanently delete **{', '.join(_scope_statuses)}** jobs? "
+            "This removes the URLs from dedup history too. Cannot be undone."
+        )
+        _dc1, _dc2 = st.columns(2)
+        if _dc1.button("Yes, delete", type="primary", use_container_width=True, key="dz_purge_confirm"):
+            n = purge_jobs(get_db_path(), statuses=_scope_statuses)
+            st.success(f"Deleted {n} jobs.")
+            st.session_state.pop("confirm_dz", None)
+            st.rerun()
+        if _dc2.button("Cancel", use_container_width=True, key="dz_purge_cancel"):
+            st.session_state.pop("confirm_dz", None)
+            st.rerun()
+
+    st.divider()
+
+    # ── Background tasks ──────────────────────────────────────────────────────
+    _active = get_active_tasks(get_db_path())
+    st.markdown(f"**Background tasks** — {len(_active)} active")
+    if _active:
+        _task_icons = {"cover_letter": "✉️", "research": "🔍", "discovery": "🌐", "enrich_descriptions": "📝"}
+        for _t in _active:
+            _tc1, _tc2, _tc3 = st.columns([3, 4, 2])
+            _icon = _task_icons.get(_t["task_type"], "⚙️")
+            _tc1.caption(f"{_icon} `{_t['task_type']}`")
+            _job_label = f"{_t['title']} @ {_t['company']}" if _t.get("title") else f"job #{_t['job_id']}"
+            _tc2.caption(_job_label)
+            _tc3.caption(f"_{_t['status']}_")
+            if st.button("✕ Cancel", key=f"dz_cancel_task_{_t['id']}", use_container_width=True):
+                cancel_task(get_db_path(), _t["id"])
+                st.rerun()
+    st.caption("")
+    _kill_col, _ = st.columns([2, 6])
+    if _kill_col.button("⏹ Kill all stuck", use_container_width=True, disabled=len(_active) == 0):
+        killed = kill_stuck_tasks(get_db_path())
+        st.success(f"Killed {killed} task(s).")
+        st.rerun()
+
+    st.divider()
+
+    # ── Rarely needed (collapsed) ─────────────────────────────────────────────
+    with st.expander("More options", expanded=False):
+        _rare1, _rare2, _rare3 = st.columns(3)
+        with _rare1:
+            st.markdown("**Purge email data**")
+            st.caption("Clears all email thread logs and email-sourced pending jobs.")
+            if st.button("📧 Purge Email Data", use_container_width=True):
+                st.session_state["confirm_dz"] = "email"
+            if st.session_state.get("confirm_dz") == "email":
+                st.warning("Deletes all email contacts and email-sourced jobs. Cannot be undone.")
+                _ec1, _ec2 = st.columns(2)
+                if _ec1.button("Yes, purge emails", type="primary", use_container_width=True, key="dz_email_confirm"):
+                    contacts, jobs = purge_email_data(get_db_path())
+                    st.success(f"Purged {contacts} email contacts, {jobs} email jobs.")
+                    st.session_state.pop("confirm_dz", None)
+                    st.rerun()
+                if _ec2.button("Cancel", use_container_width=True, key="dz_email_cancel"):
+                    st.session_state.pop("confirm_dz", None)
+                    st.rerun()
+        with _rare2:
+            st.markdown("**Purge non-remote**")
+            st.caption("Removes pending/approved/rejected on-site listings from the DB.")
+            if st.button("🏢 Purge On-site Jobs", use_container_width=True):
+                st.session_state["confirm_dz"] = "non_remote"
+            if st.session_state.get("confirm_dz") == "non_remote":
+                st.warning("Deletes all non-remote jobs not yet applied to. Cannot be undone.")
+                _rc1, _rc2 = st.columns(2)
+                if _rc1.button("Yes, purge on-site", type="primary", use_container_width=True, key="dz_nonremote_confirm"):
+                    deleted = purge_non_remote(get_db_path())
+                    st.success(f"Purged {deleted} non-remote jobs.")
+                    st.session_state.pop("confirm_dz", None)
+                    st.rerun()
+                if _rc2.button("Cancel", use_container_width=True, key="dz_nonremote_cancel"):
+                    st.session_state.pop("confirm_dz", None)
+                    st.rerun()
+        with _rare3:
+            st.markdown("**Wipe all + re-scrape**")
+            st.caption("Deletes all non-applied jobs then immediately runs a fresh discovery.")
+            if st.button("🔄 Wipe + Re-scrape", use_container_width=True):
+                st.session_state["confirm_dz"] = "rescrape"
+            if st.session_state.get("confirm_dz") == "rescrape":
+                st.warning("Wipes ALL pending, approved, and rejected jobs, then re-scrapes. Applied and synced records are kept.")
+                _wc1, _wc2 = st.columns(2)
+                if _wc1.button("Yes, wipe + scrape", type="primary", use_container_width=True, key="dz_rescrape_confirm"):
+                    purge_jobs(get_db_path(), statuses=["pending", "approved", "rejected"])
+                    submit_task(get_db_path(), "discovery", 0)
+                    st.session_state.pop("confirm_dz", None)
+                    st.rerun()
+                if _wc2.button("Cancel", use_container_width=True, key="dz_rescrape_cancel"):
+                    st.session_state.pop("confirm_dz", None)
+                    st.rerun()

 # ── Setup banners ─────────────────────────────────────────────────────────────
 if _profile and _profile.wizard_complete:

compose.cloud.yml

@@ -45,6 +45,30 @@ services:
       - "host.docker.internal:host-gateway"
     restart: unless-stopped

+  api:
+    build:
+      context: ..
+      dockerfile: peregrine/Dockerfile.cfcore
+    command: >
+      bash -c "uvicorn dev_api:app --host 0.0.0.0 --port 8601"
+    volumes:
+      - /devl/menagerie-data:/devl/menagerie-data
+      - ./config/llm.cloud.yaml:/app/config/llm.yaml:ro
+    environment:
+      - CLOUD_MODE=true
+      - CLOUD_DATA_ROOT=/devl/menagerie-data
+      - STAGING_DB=/devl/menagerie-data/cloud-default.db
+      - DIRECTUS_JWT_SECRET=${DIRECTUS_JWT_SECRET}
+      - CF_SERVER_SECRET=${CF_SERVER_SECRET}
+      - PLATFORM_DB_URL=${PLATFORM_DB_URL}
+      - HEIMDALL_URL=${HEIMDALL_URL:-http://cf-license:8000}
+      - HEIMDALL_ADMIN_TOKEN=${HEIMDALL_ADMIN_TOKEN}
+      - PYTHONUNBUFFERED=1
+      - FORGEJO_API_TOKEN=${FORGEJO_API_TOKEN:-}
+    extra_hosts:
+      - "host.docker.internal:host-gateway"
+    restart: unless-stopped
+
   web:
     build:
       context: .
@@ -53,6 +77,8 @@ services:
       VITE_BASE_PATH: /peregrine/
     ports:
       - "8508:80"
+    depends_on:
+      - api
     restart: unless-stopped

   searxng:

dev-api.py

@ -15,6 +15,7 @@ import ssl as ssl_mod
import subprocess import subprocess
import sys import sys
import threading import threading
from contextvars import ContextVar
from datetime import datetime from datetime import datetime
from pathlib import Path from pathlib import Path
from typing import Optional, List from typing import Optional, List
@ -23,7 +24,7 @@ from urllib.parse import urlparse
import requests import requests
import yaml import yaml
from bs4 import BeautifulSoup from bs4 import BeautifulSoup
from fastapi import FastAPI, HTTPException, Response, UploadFile from fastapi import FastAPI, HTTPException, Request, Response, UploadFile
from fastapi.middleware.cors import CORSMiddleware from fastapi.middleware.cors import CORSMiddleware
from pydantic import BaseModel from pydantic import BaseModel
@ -32,10 +33,18 @@ PEREGRINE_ROOT = Path("/Library/Development/CircuitForge/peregrine")
if str(PEREGRINE_ROOT) not in sys.path: if str(PEREGRINE_ROOT) not in sys.path:
sys.path.insert(0, str(PEREGRINE_ROOT)) sys.path.insert(0, str(PEREGRINE_ROOT))
from circuitforge_core.config.settings import load_env as _load_env # noqa: E402
from scripts.credential_store import get_credential, set_credential, delete_credential # noqa: E402 from scripts.credential_store import get_credential, set_credential, delete_credential # noqa: E402
DB_PATH = os.environ.get("STAGING_DB", "/devl/job-seeker/staging.db") DB_PATH = os.environ.get("STAGING_DB", "/devl/job-seeker/staging.db")
_CLOUD_MODE = os.environ.get("CLOUD_MODE", "").lower() in ("1", "true")
_CLOUD_DATA_ROOT = Path(os.environ.get("CLOUD_DATA_ROOT", "/devl/menagerie-data"))
_DIRECTUS_SECRET = os.environ.get("DIRECTUS_JWT_SECRET", "")
# Per-request DB path — set by cloud_session_middleware; falls back to DB_PATH
_request_db: ContextVar[str | None] = ContextVar("_request_db", default=None)
app = FastAPI(title="Peregrine Dev API") app = FastAPI(title="Peregrine Dev API")
app.add_middleware( app.add_middleware(
@ -46,8 +55,65 @@ app.add_middleware(
) )
_log = logging.getLogger("peregrine.session")
def _resolve_cf_user_id(cookie_str: str) -> str | None:
"""Extract cf_session JWT from Cookie string and return Directus user_id.
Directus signs with the raw bytes of its JWT_SECRET (which is base64-encoded
in env). Try the raw string first, then fall back to base64-decoded bytes.
"""
if not cookie_str:
_log.debug("_resolve_cf_user_id: empty cookie string")
return None
m = re.search(r'(?:^|;)\s*cf_session=([^;]+)', cookie_str)
if not m:
_log.debug("_resolve_cf_user_id: no cf_session in cookie: %s", cookie_str[:80])
return None
token = m.group(1).strip()
import base64
import jwt # PyJWT
secrets_to_try: list[str | bytes] = [_DIRECTUS_SECRET]
try:
secrets_to_try.append(base64.b64decode(_DIRECTUS_SECRET))
except Exception:
pass
# Skip exp verification — we use the token for routing only, not auth.
# Directus manages actual auth; Caddy gates on cookie presence.
decode_opts = {"verify_exp": False}
for secret in secrets_to_try:
try:
payload = jwt.decode(token, secret, algorithms=["HS256"], options=decode_opts)
user_id = payload.get("id") or payload.get("sub")
if user_id:
_log.debug("_resolve_cf_user_id: resolved user_id=%s", user_id)
return user_id
except Exception as exc:
_log.debug("_resolve_cf_user_id: decode failed (%s): %s", type(exc).__name__, exc)
continue
_log.warning("_resolve_cf_user_id: all secrets failed for token prefix %s", token[:20])
return None
@app.middleware("http")
async def cloud_session_middleware(request: Request, call_next):
"""In cloud mode, resolve per-user staging.db from the X-CF-Session header."""
if _CLOUD_MODE and _DIRECTUS_SECRET:
cookie_header = request.headers.get("X-CF-Session", "")
user_id = _resolve_cf_user_id(cookie_header)
if user_id:
user_db = str(_CLOUD_DATA_ROOT / user_id / "peregrine" / "staging.db")
token = _request_db.set(user_db)
try:
return await call_next(request)
finally:
_request_db.reset(token)
return await call_next(request)
def _get_db(): def _get_db():
db = sqlite3.connect(DB_PATH) path = _request_db.get() or DB_PATH
db = sqlite3.connect(path)
db.row_factory = sqlite3.Row db.row_factory = sqlite3.Row
return db return db
@ -66,7 +132,10 @@ def _strip_html(text: str | None) -> str | None:
@app.on_event("startup") @app.on_event("startup")
def _startup(): def _startup():
"""Ensure digest_queue table exists (dev-api may run against an existing DB).""" """Load .env then ensure digest_queue table exists."""
# Load .env before any runtime env reads — safe because startup doesn't run
# when dev_api is imported by tests (only when uvicorn actually starts).
_load_env(PEREGRINE_ROOT / ".env")
db = _get_db() db = _get_db()
try: try:
db.execute(""" db.execute("""
@ -620,6 +689,117 @@ def download_pdf(job_id: int):
raise HTTPException(501, "reportlab not installed — install it to generate PDFs") raise HTTPException(501, "reportlab not installed — install it to generate PDFs")
# ── Application Q&A endpoints ─────────────────────────────────────────────────
def _ensure_qa_column(db) -> None:
"""Add application_qa TEXT column to jobs if not present (idempotent)."""
try:
db.execute("ALTER TABLE jobs ADD COLUMN application_qa TEXT")
db.commit()
except Exception:
pass # Column already exists
class QAItem(BaseModel):
id: str
question: str
answer: str
class QAPayload(BaseModel):
items: List[QAItem]
class QASuggestPayload(BaseModel):
question: str
@app.get("/api/jobs/{job_id}/qa")
def get_qa(job_id: int):
db = _get_db()
_ensure_qa_column(db)
row = db.execute("SELECT application_qa FROM jobs WHERE id = ?", (job_id,)).fetchone()
db.close()
if not row:
raise HTTPException(404, "Job not found")
try:
items = json.loads(row["application_qa"] or "[]")
except Exception:
items = []
return {"items": items}
@app.patch("/api/jobs/{job_id}/qa")
def save_qa(job_id: int, payload: QAPayload):
db = _get_db()
_ensure_qa_column(db)
row = db.execute("SELECT id FROM jobs WHERE id = ?", (job_id,)).fetchone()
if not row:
db.close()
raise HTTPException(404, "Job not found")
db.execute(
"UPDATE jobs SET application_qa = ? WHERE id = ?",
(json.dumps([item.model_dump() for item in payload.items]), job_id),
)
db.commit()
db.close()
return {"ok": True}
@app.post("/api/jobs/{job_id}/qa/suggest")
def suggest_qa_answer(job_id: int, payload: QASuggestPayload):
"""Synchronously generate an LLM answer for an application Q&A question."""
db = _get_db()
job_row = db.execute(
"SELECT title, company, description FROM jobs WHERE id = ?", (job_id,)
).fetchone()
db.close()
if not job_row:
raise HTTPException(404, "Job not found")
# Load resume summary for context
resume_context = ""
try:
resume_path = _resume_path()
if resume_path.exists():
with open(resume_path) as f:
resume_data = yaml.safe_load(f) or {}
parts = []
if resume_data.get("name"):
parts.append(f"Candidate: {resume_data['name']}")
if resume_data.get("skills"):
parts.append(f"Skills: {', '.join(resume_data['skills'][:20])}")
if resume_data.get("experience"):
exp = resume_data["experience"]
if isinstance(exp, list) and exp:
titles = [e.get("title", "") for e in exp[:3] if e.get("title")]
if titles:
parts.append(f"Recent roles: {', '.join(titles)}")
if resume_data.get("career_summary"):
parts.append(f"Summary: {resume_data['career_summary'][:400]}")
resume_context = "\n".join(parts)
except Exception:
pass
prompt = (
f"You are helping a job applicant answer an application question.\n\n"
f"Job: {job_row['title']} at {job_row['company']}\n"
f"Job description excerpt:\n{(job_row['description'] or '')[:800]}\n\n"
f"Candidate background:\n{resume_context or 'Not provided'}\n\n"
f"Application question: {payload.question}\n\n"
"Write a concise, professional answer (24 sentences) in first person. "
"Be specific and genuine. Do not use hollow filler phrases."
)
try:
from scripts.llm_router import LLMRouter
router = LLMRouter()
answer = router.complete(prompt)
return {"answer": answer.strip()}
except Exception as e:
raise HTTPException(500, f"LLM generation failed: {e}")
# ── GET /api/interviews ──────────────────────────────────────────────────────── # ── GET /api/interviews ────────────────────────────────────────────────────────
PIPELINE_STATUSES = { PIPELINE_STATUSES = {
@ -715,6 +895,230 @@ def email_sync_status():
} }
# ── Task management routes ─────────────────────────────────────────────────────
def _db_path() -> Path:
"""Return the effective staging.db path (cloud-aware)."""
return Path(_request_db.get() or DB_PATH)
@app.get("/api/tasks")
def list_active_tasks():
from scripts.db import get_active_tasks
return get_active_tasks(_db_path())
@app.delete("/api/tasks/{task_id}")
def cancel_task_by_id(task_id: int):
from scripts.db import cancel_task
ok = cancel_task(_db_path(), task_id)
return {"ok": ok}
@app.post("/api/tasks/kill")
def kill_stuck():
from scripts.db import kill_stuck_tasks
killed = kill_stuck_tasks(_db_path())
return {"killed": killed}
@app.post("/api/tasks/discovery", status_code=202)
def trigger_discovery():
from scripts.task_runner import submit_task
task_id, is_new = submit_task(_db_path(), "discovery", 0)
return {"task_id": task_id, "is_new": is_new}
@app.post("/api/tasks/email-sync", status_code=202)
def trigger_email_sync_task():
from scripts.task_runner import submit_task
task_id, is_new = submit_task(_db_path(), "email_sync", 0)
return {"task_id": task_id, "is_new": is_new}
@app.post("/api/tasks/enrich", status_code=202)
def trigger_enrich_task():
from scripts.task_runner import submit_task
task_id, is_new = submit_task(_db_path(), "enrich_descriptions", 0)
return {"task_id": task_id, "is_new": is_new}
@app.post("/api/tasks/score")
def trigger_score():
try:
result = subprocess.run(
[sys.executable, "scripts/match.py"],
capture_output=True, text=True, cwd=str(PEREGRINE_ROOT),
)
if result.returncode == 0:
return {"ok": True, "output": result.stdout}
raise HTTPException(status_code=500, detail=result.stderr)
except HTTPException:
raise
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
@app.post("/api/tasks/sync")
def trigger_notion_sync():
try:
from scripts.sync import sync_to_notion
count = sync_to_notion(_db_path())
return {"ok": True, "count": count}
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
# ── Bulk job actions ───────────────────────────────────────────────────────────
class BulkArchiveBody(BaseModel):
statuses: List[str]
@app.post("/api/jobs/archive")
def bulk_archive_jobs(body: BulkArchiveBody):
from scripts.db import archive_jobs
n = archive_jobs(_db_path(), statuses=body.statuses)
return {"archived": n}
class BulkPurgeBody(BaseModel):
statuses: Optional[List[str]] = None
target: Optional[str] = None # "email", "non_remote", "rescrape"
@app.post("/api/jobs/purge")
def bulk_purge_jobs(body: BulkPurgeBody):
from scripts.db import purge_jobs, purge_email_data, purge_non_remote
if body.target == "email":
contacts, jobs = purge_email_data(_db_path())
return {"ok": True, "contacts": contacts, "jobs": jobs}
if body.target == "non_remote":
n = purge_non_remote(_db_path())
return {"ok": True, "deleted": n}
if body.target == "rescrape":
purge_jobs(_db_path(), statuses=["pending", "approved", "rejected"])
from scripts.task_runner import submit_task
submit_task(_db_path(), "discovery", 0)
return {"ok": True}
statuses = body.statuses or ["pending", "rejected"]
n = purge_jobs(_db_path(), statuses=statuses)
return {"ok": True, "deleted": n}
class AddJobsBody(BaseModel):
urls: List[str]
@app.post("/api/jobs/add", status_code=202)
def add_jobs_by_url(body: AddJobsBody):
try:
from datetime import datetime as _dt
from scripts.scrape_url import canonicalize_url
from scripts.db import get_existing_urls, insert_job
from scripts.task_runner import submit_task
db_path = _db_path()
existing = get_existing_urls(db_path)
queued = 0
for raw_url in body.urls:
url = canonicalize_url(raw_url.strip())
if not url.startswith("http") or url in existing:
continue
job_id = insert_job(db_path, {
"title": "Importing...", "company": "", "url": url,
"source": "manual", "location": "", "description": "",
"date_found": _dt.now().isoformat()[:10],
})
if job_id:
submit_task(db_path, "scrape_url", job_id)
queued += 1
return {"queued": queued}
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
@app.post("/api/jobs/upload-csv", status_code=202)
async def upload_jobs_csv(file: UploadFile):
    try:
        import csv as _csv
        import io as _io
        from datetime import datetime as _dt
        from scripts.scrape_url import canonicalize_url
        from scripts.db import get_existing_urls, insert_job
        from scripts.task_runner import submit_task

        content = await file.read()
        reader = _csv.DictReader(_io.StringIO(content.decode("utf-8", errors="replace")))
        urls: list[str] = []
        for row in reader:
            for val in row.values():
                if val and val.strip().startswith("http"):
                    urls.append(val.strip())
                    break
        db_path = _db_path()
        existing = get_existing_urls(db_path)
        queued = 0
        for raw_url in urls:
            url = canonicalize_url(raw_url)
            if not url.startswith("http") or url in existing:
                continue
            job_id = insert_job(db_path, {
                "title": "Importing...", "company": "", "url": url,
                "source": "manual", "location": "", "description": "",
                "date_found": _dt.now().isoformat()[:10],
            })
            if job_id:
                submit_task(db_path, "scrape_url", job_id)
                queued += 1
        return {"queued": queued, "total": len(urls)}
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))
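Both import endpoints share the same canonicalize-then-dedup pattern. A minimal standalone sketch of that filter — the real `canonicalize_url` lives in `scripts.scrape_url` and its exact rules aren't shown here, so the query/fragment-stripping canonicalizer below is an assumption for illustration:

```python
from urllib.parse import urlsplit, urlunsplit


def canonicalize(url: str) -> str:
    # Hypothetical stand-in for scripts.scrape_url.canonicalize_url:
    # drop query string and fragment so tracking params don't defeat dedup.
    parts = urlsplit(url.strip())
    return urlunsplit((parts.scheme, parts.netloc, parts.path.rstrip("/"), "", ""))


def filter_new_urls(raw_urls: list[str], existing: set[str]) -> list[str]:
    # Mirrors the endpoint loop: skip non-http values and already-known URLs.
    out = []
    for raw in raw_urls:
        url = canonicalize(raw)
        if not url.startswith("http") or url in existing:
            continue
        out.append(url)
        existing.add(url)  # extra step: dedup within the batch too (the endpoints rely on DB state)
    return out
```

A batch with a tracking-parameter duplicate and a non-http value collapses to the one genuinely new URL.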
# ── Setup banners ──────────────────────────────────────────────────────────────

_SETUP_BANNERS = [
    {"key": "connect_cloud", "text": "Connect a cloud service for resume/cover letter storage", "link": "/settings?tab=integrations"},
    {"key": "setup_email", "text": "Set up email sync to catch recruiter outreach", "link": "/settings?tab=email"},
    {"key": "setup_email_labels", "text": "Set up email label filters for auto-classification", "link": "/settings?tab=email"},
    {"key": "tune_mission", "text": "Tune your mission preferences for better cover letters", "link": "/settings?tab=profile"},
    {"key": "configure_keywords", "text": "Configure keywords and blocklist for smarter search", "link": "/settings?tab=search"},
    {"key": "upload_corpus", "text": "Upload your cover letter corpus for voice fine-tuning", "link": "/settings?tab=fine-tune"},
    {"key": "configure_linkedin", "text": "Configure LinkedIn Easy Apply automation", "link": "/settings?tab=integrations"},
    {"key": "setup_searxng", "text": "Set up company research with SearXNG", "link": "/settings?tab=system"},
    {"key": "target_companies", "text": "Build a target company list for focused outreach", "link": "/settings?tab=search"},
    {"key": "setup_notifications", "text": "Set up notifications for stage changes", "link": "/settings?tab=integrations"},
    {"key": "tune_model", "text": "Tune a custom cover letter model on your writing", "link": "/settings?tab=fine-tune"},
    {"key": "review_training", "text": "Review and curate training data for model tuning", "link": "/settings?tab=fine-tune"},
    {"key": "setup_calendar", "text": "Set up calendar sync to track interview dates", "link": "/settings?tab=integrations"},
]


@app.get("/api/config/setup-banners")
def get_setup_banners():
    try:
        cfg = _load_user_config()
        if not cfg.get("wizard_complete"):
            return []
        dismissed = set(cfg.get("dismissed_banners", []))
        return [b for b in _SETUP_BANNERS if b["key"] not in dismissed]
    except Exception:
        return []


@app.post("/api/config/setup-banners/{key}/dismiss")
def dismiss_setup_banner(key: str):
    try:
        cfg = _load_user_config()
        dismissed = cfg.get("dismissed_banners", [])
        if key not in dismissed:
            dismissed.append(key)
            cfg["dismissed_banners"] = dismissed
            _save_user_config(cfg)
        return {"ok": True}
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))
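The banner endpoints reduce to a small pure function over the user config: nothing shows until the wizard is complete, and dismissals accumulate in a list. A minimal sketch of that round-trip (a plain dict stands in for user.yaml, and the two-entry banner list is illustrative only):

```python
SETUP_BANNERS = [
    {"key": "connect_cloud", "text": "Connect a cloud service"},
    {"key": "setup_email", "text": "Set up email sync"},
]


def visible_banners(cfg: dict) -> list[dict]:
    # Nothing is shown until onboarding finishes; dismissals persist in cfg.
    if not cfg.get("wizard_complete"):
        return []
    dismissed = set(cfg.get("dismissed_banners", []))
    return [b for b in SETUP_BANNERS if b["key"] not in dismissed]


def dismiss(cfg: dict, key: str) -> dict:
    # Idempotent: dismissing twice leaves cfg unchanged.
    dismissed = cfg.get("dismissed_banners", [])
    if key not in dismissed:
        dismissed.append(key)
        cfg["dismissed_banners"] = dismissed
    return cfg
```

Because the dismissed list lives in the same config file as `wizard_complete`, a config wipe resets both the wizard and the onboarding hints together.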
# ── POST /api/stage-signals/{id}/dismiss ─────────────────────────────────
@app.post("/api/stage-signals/{signal_id}/dismiss")

@@ -948,13 +1352,16 @@ def get_app_config():
     valid_tiers = {"free", "paid", "premium", "ultra"}
     raw_tier = os.environ.get("APP_TIER", "free")
-    # wizard_complete: read from user.yaml so the guard reflects live state
-    wizard_complete = True
-    try:
-        cfg = load_user_profile(_user_yaml_path())
-        wizard_complete = bool(cfg.get("wizard_complete", False))
-    except Exception:
-        wizard_complete = False
+    # Cloud users always bypass the wizard — they configure through Settings
+    is_cloud = os.environ.get("CLOUD_MODE", "").lower() in ("1", "true")
+    if is_cloud:
+        wizard_complete = True
+    else:
+        try:
+            cfg = load_user_profile(_user_yaml_path())
+            wizard_complete = bool(cfg.get("wizard_complete", False))
+        except Exception:
+            wizard_complete = False
     return {
         "isCloud": os.environ.get("CLOUD_MODE", "").lower() in ("1", "true"),
@@ -988,12 +1395,12 @@ from scripts.user_profile import load_user_profile, save_user_profile
 def _user_yaml_path() -> str:
-    """Resolve user.yaml path relative to the current STAGING_DB location.
-    Never falls back to another user's config directory — callers must handle
-    a missing file gracefully (return defaults / empty wizard state).
+    """Resolve user.yaml path relative to the active staging.db.
+    In cloud mode the ContextVar holds the per-user db path; elsewhere
+    falls back to STAGING_DB env var. Never crosses user boundaries.
     """
-    db = os.environ.get("STAGING_DB", "/devl/peregrine/staging.db")
+    db = _request_db.get() or os.environ.get("STAGING_DB", "/devl/peregrine/staging.db")
     return os.path.join(os.path.dirname(db), "config", "user.yaml")
@@ -1061,6 +1468,23 @@ class IdentitySyncPayload(BaseModel):
     phone: str = ""
     linkedin_url: str = ""

+
+class UIPrefPayload(BaseModel):
+    preference: str  # "streamlit" | "vue"
+
+
+@app.post("/api/settings/ui-preference")
+def set_ui_preference(payload: UIPrefPayload):
+    """Persist UI preference to user.yaml so Streamlit doesn't re-set the cookie."""
+    if payload.preference not in ("streamlit", "vue"):
+        raise HTTPException(status_code=400, detail="preference must be 'streamlit' or 'vue'")
+    try:
+        data = load_user_profile(_user_yaml_path())
+        data["ui_preference"] = payload.preference
+        save_user_profile(_user_yaml_path(), data)
+        return {"ok": True}
+    except Exception as e:
+        raise HTTPException(status_code=500, detail=str(e))
+
 @app.post("/api/settings/resume/sync-identity")
 def sync_identity(payload: IdentitySyncPayload):
     """Sync identity fields from profile store back to user.yaml."""
@@ -1117,9 +1541,54 @@ class ResumePayload(BaseModel):
     veteran_status: str = ""; disability: str = ""
     skills: List[str] = []; domains: List[str] = []; keywords: List[str] = []

+
+def _config_dir() -> Path:
+    """Resolve per-user config directory. Always co-located with user.yaml."""
+    return Path(_user_yaml_path()).parent
+
+
 def _resume_path() -> Path:
     """Resolve plain_text_resume.yaml co-located with user.yaml (user-isolated)."""
-    return Path(_user_yaml_path()).parent / "plain_text_resume.yaml"
+    return _config_dir() / "plain_text_resume.yaml"
+
+
+def _search_prefs_path() -> Path:
+    return _config_dir() / "search_profiles.yaml"
+
+
+def _license_path() -> Path:
+    return _config_dir() / "license.yaml"
+
+
+def _tokens_path() -> Path:
+    return _config_dir() / "tokens.yaml"
+
+
+def _normalize_experience(raw: list) -> list:
+    """Normalize AIHawk-style experience entries to the Vue WorkEntry schema.
+
+    Parser / AIHawk stores: bullets (list[str]), start_date, end_date
+    Vue WorkEntry expects: responsibilities (str), period (str)
+    """
+    out = []
+    for e in raw:
+        if not isinstance(e, dict):
+            continue
+        entry = dict(e)
+        # bullets → responsibilities
+        if "responsibilities" not in entry or not entry["responsibilities"]:
+            bullets = entry.pop("bullets", None) or []
+            if isinstance(bullets, list):
+                entry["responsibilities"] = "\n".join(b for b in bullets if b)
+            elif isinstance(bullets, str):
+                entry["responsibilities"] = bullets
+        else:
+            entry.pop("bullets", None)
+        # start_date + end_date → period
+        if "period" not in entry or not entry["period"]:
+            start = entry.pop("start_date", "") or ""
+            end = entry.pop("end_date", "") or ""
+            entry["period"] = f"{start} {end}".strip(" ") if (start or end) else ""
+        else:
+            entry.pop("start_date", None)
+            entry.pop("end_date", None)
+        out.append(entry)
+    return out
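To make the field mapping concrete, here is a self-contained sketch of the same bullets→responsibilities and start/end→period rules applied to a single entry (a simplified mirror of `_normalize_experience`, not the production code itself):

```python
def normalize_entry(e: dict) -> dict:
    """Map one AIHawk-style entry to the Vue WorkEntry shape."""
    entry = dict(e)
    # bullets (list[str]) → responsibilities (newline-joined str)
    if not entry.get("responsibilities"):
        bullets = entry.pop("bullets", None) or []
        entry["responsibilities"] = (
            "\n".join(b for b in bullets if b) if isinstance(bullets, list) else bullets
        )
    else:
        entry.pop("bullets", None)
    # start_date + end_date → single "period" string
    if not entry.get("period"):
        start = entry.pop("start_date", "") or ""
        end = entry.pop("end_date", "") or ""
        entry["period"] = f"{start} {end}".strip()
    else:
        entry.pop("start_date", None)
        entry.pop("end_date", None)
    return entry
```

Empty bullets are dropped during the join, and the source keys are popped so a round-trip through the API never carries both schemas at once.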
 @app.get("/api/settings/resume")
 def get_resume():

@@ -1130,6 +1599,8 @@ def get_resume():
         with open(resume_path) as f:
             data = yaml.safe_load(f) or {}
         data["exists"] = True
+        if "experience" in data and isinstance(data["experience"], list):
+            data["experience"] = _normalize_experience(data["experience"])
         return data
     except Exception as e:
         raise HTTPException(status_code=500, detail=str(e))
@@ -1177,8 +1648,13 @@ async def upload_resume(file: UploadFile):
             raw_text = extract_text_from_docx(file_bytes)
         result, err = structure_resume(raw_text)
-        if err:
-            return {"ok": False, "error": err, "data": result}
+        if err and not result:
+            return {"ok": False, "error": err}
+        # Persist parsed data so store.load() reads the updated file
+        resume_path = _resume_path()
+        resume_path.parent.mkdir(parents=True, exist_ok=True)
+        with open(resume_path, "w") as f:
+            yaml.dump(result, f, allow_unicode=True, default_flow_style=False)
         result["exists"] = True
         return {"ok": True, "data": result}
     except Exception as e:
@@ -1198,14 +1674,13 @@ class SearchPrefsPayload(BaseModel):
     blocklist_industries: List[str] = []
     blocklist_locations: List[str] = []

-SEARCH_PREFS_PATH = Path("config/search_profiles.yaml")

 @app.get("/api/settings/search")
 def get_search_prefs():
     try:
-        if not SEARCH_PREFS_PATH.exists():
+        p = _search_prefs_path()
+        if not p.exists():
             return {}
-        with open(SEARCH_PREFS_PATH) as f:
+        with open(p) as f:
             data = yaml.safe_load(f) or {}
         return data.get("default", {})
     except Exception as e:
@@ -1214,12 +1689,14 @@ def get_search_prefs():
 @app.put("/api/settings/search")
 def save_search_prefs(payload: SearchPrefsPayload):
     try:
+        p = _search_prefs_path()
         data = {}
-        if SEARCH_PREFS_PATH.exists():
-            with open(SEARCH_PREFS_PATH) as f:
+        if p.exists():
+            with open(p) as f:
                 data = yaml.safe_load(f) or {}
         data["default"] = payload.model_dump()
-        with open(SEARCH_PREFS_PATH, "w") as f:
+        p.parent.mkdir(parents=True, exist_ok=True)
+        with open(p, "w") as f:
             yaml.dump(data, f, allow_unicode=True, default_flow_style=False)
         return {"ok": True}
     except Exception as e:
@@ -1347,7 +1824,7 @@ def stop_service(name: str):
 # ── Settings: System — Email ──────────────────────────────────────────────────

-EMAIL_PATH = Path("config/email.yaml")
+# EMAIL_PATH is resolved per-request via _config_dir()
 EMAIL_CRED_SERVICE = "peregrine"
 EMAIL_CRED_KEY = "imap_password"
@@ -1359,8 +1836,9 @@ EMAIL_YAML_FIELDS = ("host", "port", "ssl", "username", "sent_folder", "lookback
 def get_email_config():
     try:
         config = {}
-        if EMAIL_PATH.exists():
-            with open(EMAIL_PATH) as f:
+        ep = _config_dir() / "email.yaml"
+        if ep.exists():
+            with open(ep) as f:
                 config = yaml.safe_load(f) or {}
         # Never return the password — only indicate whether it's set
         password = get_credential(EMAIL_CRED_SERVICE, EMAIL_CRED_KEY)
@@ -1374,7 +1852,8 @@ def get_email_config():
 @app.put("/api/settings/system/email")
 def save_email_config(payload: dict):
     try:
-        EMAIL_PATH.parent.mkdir(parents=True, exist_ok=True)
+        ep = _config_dir() / "email.yaml"
+        ep.parent.mkdir(parents=True, exist_ok=True)
         # Extract password before writing yaml; discard the sentinel boolean regardless
         password = payload.pop("password", None)
         payload.pop("password_set", None)  # always discard — boolean sentinel, not a secret

@@ -1382,7 +1861,7 @@ def save_email_config(payload: dict):
             set_credential(EMAIL_CRED_SERVICE, EMAIL_CRED_KEY, password)
         # Write non-secret fields to yaml (chmod 600 still, contains username)
         safe_config = {k: v for k, v in payload.items() if k in EMAIL_YAML_FIELDS}
-        fd = os.open(str(EMAIL_PATH), os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
+        fd = os.open(str(ep), os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
         with os.fdopen(fd, "w") as f:
             yaml.dump(safe_config, f, allow_unicode=True, default_flow_style=False)
         return {"ok": True}
@@ -1601,12 +2080,7 @@ def finetune_local_status():
 # ── Settings: License ─────────────────────────────────────────────────────────

-# CONFIG_DIR resolves relative to staging.db location (same convention as _user_yaml_path)
-CONFIG_DIR = Path(os.path.dirname(DB_PATH)) / "config"
-if not CONFIG_DIR.exists():
-    CONFIG_DIR = Path("/devl/job-seeker/config")
-LICENSE_PATH = CONFIG_DIR / "license.yaml"
+# _config_dir() / _license_path() / _tokens_path() are per-request (see helpers above)

 def _load_user_config() -> dict:
@@ -1622,8 +2096,9 @@ def _save_user_config(cfg: dict) -> None:
 @app.get("/api/settings/license")
 def get_license():
     try:
-        if LICENSE_PATH.exists():
-            with open(LICENSE_PATH) as f:
+        lp = _license_path()
+        if lp.exists():
+            with open(lp) as f:
                 data = yaml.safe_load(f) or {}
         else:
             data = {}
@@ -1647,9 +2122,10 @@ def activate_license(payload: LicenseActivatePayload):
         key = payload.key.strip()
         if not re.match(r'^CFG-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}$', key):
             return {"ok": False, "error": "Invalid key format"}
+        lp = _license_path()
         data = {"tier": "paid", "key": key, "active": True}
-        CONFIG_DIR.mkdir(parents=True, exist_ok=True)
-        fd = os.open(str(LICENSE_PATH), os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
+        lp.parent.mkdir(parents=True, exist_ok=True)
+        fd = os.open(str(lp), os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
         with os.fdopen(fd, "w") as f:
             yaml.dump(data, f, allow_unicode=True, default_flow_style=False)
         return {"ok": True, "tier": "paid"}
@@ -1660,11 +2136,12 @@ def activate_license(payload: LicenseActivatePayload):
 @app.post("/api/settings/license/deactivate")
 def deactivate_license():
     try:
-        if LICENSE_PATH.exists():
-            with open(LICENSE_PATH) as f:
+        lp = _license_path()
+        if lp.exists():
+            with open(lp) as f:
                 data = yaml.safe_load(f) or {}
             data["active"] = False
-            fd = os.open(str(LICENSE_PATH), os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
+            fd = os.open(str(lp), os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
             with os.fdopen(fd, "w") as f:
                 yaml.dump(data, f, allow_unicode=True, default_flow_style=False)
         return {"ok": True}
@@ -1682,18 +2159,19 @@ def create_backup(payload: BackupCreatePayload):
     try:
         import zipfile
         import datetime
-        backup_dir = Path("data/backups")
+        cfg_dir = _config_dir()
+        backup_dir = cfg_dir.parent / "backups"
         backup_dir.mkdir(parents=True, exist_ok=True)
         ts = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
         dest = backup_dir / f"peregrine_backup_{ts}.zip"
         file_count = 0
         with zipfile.ZipFile(dest, "w", zipfile.ZIP_DEFLATED) as zf:
-            for cfg_file in CONFIG_DIR.glob("*.yaml"):
+            for cfg_file in cfg_dir.glob("*.yaml"):
                 if cfg_file.name not in ("tokens.yaml",):
                     zf.write(cfg_file, f"config/{cfg_file.name}")
                     file_count += 1
             if payload.include_db:
-                db_path = Path(DB_PATH)
+                db_path = Path(_request_db.get() or DB_PATH)
                 if db_path.exists():
                     zf.write(db_path, "data/staging.db")
                     file_count += 1
@@ -1737,15 +2215,14 @@ def save_privacy(payload: dict):
 # ── Settings: Developer ───────────────────────────────────────────────────────

-TOKENS_PATH = CONFIG_DIR / "tokens.yaml"

 @app.get("/api/settings/developer")
 def get_developer():
     try:
         cfg = _load_user_config()
         tokens = {}
-        if TOKENS_PATH.exists():
-            with open(TOKENS_PATH) as f:
+        tp = _tokens_path()
+        if tp.exists():
+            with open(tp) as f:
                 tokens = yaml.safe_load(f) or {}
         return {
             "dev_tier_override": cfg.get("dev_tier_override"),
@@ -1980,7 +2457,7 @@ def wizard_save_step(payload: WizardStepPayload):
     # Persist search preferences to search_profiles.yaml
     titles = data.get("titles", [])
     locations = data.get("locations", [])
-    search_path = SEARCH_PREFS_PATH
+    search_path = _search_prefs_path()
     existing_search: dict = {}
     if search_path.exists():
         with open(search_path) as f:

View file

@@ -2,6 +2,8 @@ server {
     listen 80;
     server_name _;
+
+    client_max_body_size 20m;
     root /usr/share/nginx/html;
     index index.html;

View file

@@ -383,6 +383,19 @@ def mark_applied(db_path: Path = DEFAULT_DB, ids: list[int] = None) -> None:
     conn.close()

+
+def cancel_task(db_path: Path = DEFAULT_DB, task_id: int = 0) -> bool:
+    """Cancel a single queued/running task by id. Returns True if a row was updated."""
+    conn = sqlite3.connect(db_path)
+    count = conn.execute(
+        "UPDATE background_tasks SET status='failed', error='Cancelled by user',"
+        " finished_at=datetime('now') WHERE id=? AND status IN ('queued','running')",
+        (task_id,),
+    ).rowcount
+    conn.commit()
+    conn.close()
+    return count > 0
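The cancel is a guarded UPDATE: only rows still in `queued`/`running` are touched, so a task that already finished can't be retroactively flagged as failed. A quick check of that semantics against a throwaway in-memory table (schema reduced to just the columns the UPDATE uses — the real `background_tasks` table has more):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE background_tasks (id INTEGER PRIMARY KEY, status TEXT,"
    " error TEXT, finished_at TEXT)"
)
conn.executemany(
    "INSERT INTO background_tasks (id, status) VALUES (?, ?)",
    [(1, "running"), (2, "done")],
)


def cancel(task_id: int) -> bool:
    # Same guarded UPDATE as scripts.db.cancel_task, minus connection handling.
    count = conn.execute(
        "UPDATE background_tasks SET status='failed', error='Cancelled by user',"
        " finished_at=datetime('now') WHERE id=? AND status IN ('queued','running')",
        (task_id,),
    ).rowcount
    conn.commit()
    return count > 0


print(cancel(1))  # → True: running task is cancelled
print(cancel(2))  # → False: finished task left untouched
```

The boolean return lets the per-task Danger Zone button distinguish "cancelled" from "already finished, nothing to do" without a second query.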

 def kill_stuck_tasks(db_path: Path = DEFAULT_DB) -> int:
     """Mark all queued/running background tasks as failed. Returns count killed."""
     conn = sqlite3.connect(db_path)

View file

@@ -40,6 +40,9 @@
         <Cog6ToothIcon class="sidebar__icon" aria-hidden="true" />
         <span class="sidebar__label">Settings</span>
       </RouterLink>
+      <button class="sidebar__classic-btn" @click="switchToClassic" title="Switch to Classic (Streamlit) UI">
+        Classic
+      </button>
     </div>
   </nav>
@@ -105,6 +108,23 @@ function exitHackerMode() {
   localStorage.removeItem('cf-hacker-mode')
 }

+const _apiBase = import.meta.env.BASE_URL.replace(/\/$/, '')
+
+async function switchToClassic() {
+  // Persist the preference via the API so Streamlit reads 'streamlit' from user.yaml
+  // and won't re-set the cookie back to vue (avoids the ?prgn_switch rerun cycle)
+  try {
+    await fetch(_apiBase + '/api/settings/ui-preference', {
+      method: 'POST',
+      headers: { 'Content-Type': 'application/json' },
+      body: JSON.stringify({ preference: 'streamlit' }),
+    })
+  } catch { /* non-fatal — the cookie below is enough for an immediate redirect */ }
+  document.cookie = 'prgn_ui=streamlit; path=/; SameSite=Lax'
+  // Navigate to root (no query params); Caddy routes to Streamlit based on the cookie
+  window.location.href = window.location.origin + '/'
+}
+
 const navLinks = computed(() => [
   { to: '/', icon: HomeIcon, label: 'Home' },
   { to: '/review', icon: ClipboardDocumentListIcon, label: 'Job Review' },
@@ -272,6 +292,29 @@ const mobileLinks = [
   margin: 0;
 }

+.sidebar__classic-btn {
+  display: flex;
+  align-items: center;
+  width: 100%;
+  padding: var(--space-2) var(--space-3);
+  margin-top: var(--space-1);
+  background: none;
+  border: none;
+  border-radius: var(--radius-md);
+  color: var(--color-text-muted);
+  font-size: var(--text-xs);
+  font-weight: 500;
+  cursor: pointer;
+  opacity: 0.6;
+  transition: opacity 150ms, background 150ms;
+  white-space: nowrap;
+}
+
+.sidebar__classic-btn:hover {
+  opacity: 1;
+  background: var(--color-surface-alt);
+}
+
 /* ── Mobile tab bar (<1024px) ───────────────────────── */
 .app-tabbar {
   display: none; /* hidden on desktop */

View file

@@ -56,6 +56,49 @@
         <span v-if="gaps.length > 6" class="gaps-more">+{{ gaps.length - 6 }}</span>
       </div>

+      <!-- Resume Highlights -->
+      <div
+        v-if="resumeSkills.length || resumeDomains.length || resumeKeywords.length"
+        class="resume-highlights"
+      >
+        <button class="section-toggle" @click="highlightsExpanded = !highlightsExpanded">
+          <span class="section-toggle__label">My Resume Highlights</span>
+          <span class="section-toggle__icon" aria-hidden="true">{{ highlightsExpanded ? '▲' : '▼' }}</span>
+        </button>
+        <div v-if="highlightsExpanded" class="highlights-body">
+          <div v-if="resumeSkills.length" class="chips-group">
+            <span class="chips-group__label">Skills</span>
+            <div class="chips-wrap">
+              <span
+                v-for="s in resumeSkills" :key="s"
+                class="hl-chip"
+                :class="{ 'hl-chip--match': jobMatchSet.has(s.toLowerCase()) }"
+              >{{ s }}</span>
+            </div>
+          </div>
+          <div v-if="resumeDomains.length" class="chips-group">
+            <span class="chips-group__label">Domains</span>
+            <div class="chips-wrap">
+              <span
+                v-for="d in resumeDomains" :key="d"
+                class="hl-chip"
+                :class="{ 'hl-chip--match': jobMatchSet.has(d.toLowerCase()) }"
+              >{{ d }}</span>
+            </div>
+          </div>
+          <div v-if="resumeKeywords.length" class="chips-group">
+            <span class="chips-group__label">Keywords</span>
+            <div class="chips-wrap">
+              <span
+                v-for="k in resumeKeywords" :key="k"
+                class="hl-chip"
+                :class="{ 'hl-chip--match': jobMatchSet.has(k.toLowerCase()) }"
+              >{{ k }}</span>
+            </div>
+          </div>
+        </div>
+      </div>
+
       <a v-if="job.url" :href="job.url" target="_blank" rel="noopener noreferrer" class="job-details__link">
         View listing
       </a>
@@ -151,6 +194,61 @@
       <!-- ATS Resume Optimizer -->
       <ResumeOptimizerPanel :job-id="props.jobId" />

+      <!-- Application Q&A -->
+      <div class="qa-section">
+        <button class="section-toggle" @click="qaExpanded = !qaExpanded">
+          <span class="section-toggle__label">Application Q&amp;A</span>
+          <span v-if="qaItems.length" class="qa-count">{{ qaItems.length }}</span>
+          <span class="section-toggle__icon" aria-hidden="true">{{ qaExpanded ? '▲' : '▼' }}</span>
+        </button>
+        <div v-if="qaExpanded" class="qa-body">
+          <p v-if="!qaItems.length" class="qa-empty">
+            No questions yet — add one below to get LLM-suggested answers.
+          </p>
+          <div v-for="(item, i) in qaItems" :key="item.id" class="qa-item">
+            <div class="qa-item__header">
+              <span class="qa-item__q">{{ item.question }}</span>
+              <button class="qa-item__del" aria-label="Remove question" @click="removeQA(i)">✕</button>
+            </div>
+            <textarea
+              class="qa-item__answer"
+              :value="item.answer"
+              placeholder="Your answer…"
+              rows="3"
+              @input="updateAnswer(item.id, ($event.target as HTMLTextAreaElement).value)"
+            />
+            <button
+              class="btn-ghost btn-ghost--sm qa-suggest-btn"
+              :disabled="suggesting === item.id"
+              @click="suggestAnswer(item)"
+            >
+              {{ suggesting === item.id ? '✨ Thinking…' : '✨ Suggest' }}
+            </button>
+          </div>
+          <div class="qa-add">
+            <input
+              v-model="newQuestion"
+              class="qa-add__input"
+              placeholder="Add a question from the application…"
+              @keydown.enter.prevent="addQA"
+            />
+            <button class="btn-ghost btn-ghost--sm" :disabled="!newQuestion.trim()" @click="addQA">Add</button>
+          </div>
+          <button
+            v-if="qaItems.length"
+            class="btn-ghost qa-save-btn"
+            :disabled="qaSaved || qaSaving"
+            @click="saveQA"
+          >
+            {{ qaSaving ? 'Saving…' : (qaSaved ? '✓ Saved' : 'Save All') }}
+          </button>
+        </div>
+      </div>
+
       <!-- Bottom action bar -->
       <div class="workspace__actions">
         <button
@@ -359,6 +457,96 @@ async function rejectListing() {
   setTimeout(() => emit('job-removed'), 1000)
 }

+// Resume highlights
+const resumeSkills = ref<string[]>([])
+const resumeDomains = ref<string[]>([])
+const resumeKeywords = ref<string[]>([])
+const highlightsExpanded = ref(false)
+
+// Resume terms that also appear in the job description text. Terms are stored
+// lowercased so the template's `jobMatchSet.has(s.toLowerCase())` lookups match.
+const jobMatchSet = computed<Set<string>>(() => {
+  const desc = (job.value?.description ?? '').toLowerCase()
+  const all = [...resumeSkills.value, ...resumeDomains.value, ...resumeKeywords.value]
+  return new Set(all.map(t => t.toLowerCase()).filter(t => desc.includes(t)))
+})
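The highlight matching is a plain case-insensitive substring test over the description. The same rule as a back-end-style Python sketch (terms returned lowercased, matching the set the computed property builds; note a substring test will also match inside longer words — "go" matches "google"):

```python
def match_set(description: str, terms: list[str]) -> set[str]:
    """Lowercased resume terms that appear anywhere in the job description."""
    desc = description.lower()
    return {t.lower() for t in terms if t.lower() in desc}
```

For short, collision-prone terms a word-boundary regex would be stricter, but substring matching keeps the chip highlighting cheap enough to recompute on every description change.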
+async function fetchResume() {
+  const { data } = await useApiFetch<{ skills?: string[]; domains?: string[]; keywords?: string[] }>(
+    '/api/settings/resume',
+  )
+  if (!data) return
+  resumeSkills.value = data.skills ?? []
+  resumeDomains.value = data.domains ?? []
+  resumeKeywords.value = data.keywords ?? []
+  if (resumeSkills.value.length || resumeDomains.value.length || resumeKeywords.value.length) {
+    highlightsExpanded.value = true
+  }
+}
+
+// Application Q&A
+interface QAItem { id: string; question: string; answer: string }
+
+const qaItems = ref<QAItem[]>([])
+const qaExpanded = ref(false)
+const qaSaved = ref(true)
+const qaSaving = ref(false)
+const newQuestion = ref('')
+const suggesting = ref<string | null>(null)
+
+function addQA() {
+  const q = newQuestion.value.trim()
+  if (!q) return
+  qaItems.value = [...qaItems.value, { id: crypto.randomUUID(), question: q, answer: '' }]
+  newQuestion.value = ''
+  qaSaved.value = false
+  qaExpanded.value = true
+}
+
+function removeQA(index: number) {
+  qaItems.value = qaItems.value.filter((_, i) => i !== index)
+  qaSaved.value = false
+}
+
+function updateAnswer(id: string, value: string) {
+  qaItems.value = qaItems.value.map(q => q.id === id ? { ...q, answer: value } : q)
+  qaSaved.value = false
+}
+
+async function saveQA() {
+  qaSaving.value = true
+  const { error } = await useApiFetch(`/api/jobs/${props.jobId}/qa`, {
+    method: 'PATCH',
+    headers: { 'Content-Type': 'application/json' },
+    body: JSON.stringify({ items: qaItems.value }),
+  })
+  qaSaving.value = false
+  if (error) { showToast('Save failed — please try again'); return }
+  qaSaved.value = true
+}
+
+async function suggestAnswer(item: QAItem) {
+  suggesting.value = item.id
+  const { data, error } = await useApiFetch<{ answer: string }>(`/api/jobs/${props.jobId}/qa/suggest`, {
+    method: 'POST',
+    headers: { 'Content-Type': 'application/json' },
+    body: JSON.stringify({ question: item.question }),
+  })
+  suggesting.value = null
+  if (error || !data?.answer) { showToast('Suggestion failed — check your LLM backend'); return }
+  qaItems.value = qaItems.value.map(q => q.id === item.id ? { ...q, answer: data.answer } : q)
+  qaSaved.value = false
+}
+
+async function fetchQA() {
+  const { data } = await useApiFetch<{ items: QAItem[] }>(`/api/jobs/${props.jobId}/qa`)
+  if (data?.items?.length) {
+    qaItems.value = data.items
+    qaExpanded.value = true
+  }
+}
 // Toast
 const toast = ref<string | null>(null)
@@ -406,6 +594,10 @@ onMounted(async () => {
   await fetchJob()
   loadingJob.value = false

+  // Load resume highlights and saved Q&A in parallel (fire-and-forget)
+  fetchResume()
+  fetchQA()
+
   // Check if a generation task is already in flight
   if (clState.value === 'none') {
     const { data } = await useApiFetch<{ status: string; stage: string | null }>(`/api/jobs/${props.jobId}/cover_letter/task`)
@@ -843,6 +1035,205 @@ declare module '../stores/review' {
 .toast-enter-active, .toast-leave-active { transition: opacity 250ms ease, transform 250ms ease; }
 .toast-enter-from, .toast-leave-to { opacity: 0; transform: translateX(-50%) translateY(8px); }
+/* ── Resume Highlights ───────────────────────────────────────────────── */
+.resume-highlights {
+  border-top: 1px solid var(--color-border-light);
+  padding-top: var(--space-3);
+}
+
+.section-toggle {
+  display: flex;
+  align-items: center;
+  gap: var(--space-2);
+  width: 100%;
+  background: none;
+  border: none;
+  cursor: pointer;
+  padding: 0;
+  text-align: left;
+  color: var(--color-text-muted);
+}
+
+.section-toggle__label {
+  font-size: var(--text-xs);
+  font-weight: 700;
+  text-transform: uppercase;
+  letter-spacing: 0.04em;
+  flex: 1;
+}
+
+.section-toggle__icon {
+  font-size: var(--text-xs);
+}
+
+.highlights-body {
+  display: flex;
+  flex-direction: column;
+  gap: var(--space-2);
+  margin-top: var(--space-2);
+}
+
+.chips-group { display: flex; flex-direction: column; gap: 4px; }
+
+.chips-group__label {
+  font-size: 10px;
+  font-weight: 700;
+  text-transform: uppercase;
+  letter-spacing: 0.06em;
+  color: var(--color-text-muted);
+  opacity: 0.7;
+}
+
+.chips-wrap { display: flex; flex-wrap: wrap; gap: 4px; }
+
+.hl-chip {
+  padding: 2px var(--space-2);
+  border-radius: 999px;
+  font-size: 11px;
+  background: var(--color-surface-alt);
+  border: 1px solid var(--color-border-light);
+  color: var(--color-text-muted);
+}
+
+.hl-chip--match {
+  background: rgba(39, 174, 96, 0.10);
+  border-color: rgba(39, 174, 96, 0.35);
+  color: var(--color-success);
+  font-weight: 600;
+}
+/* ── Application Q&A ─────────────────────────────────────────────────── */
+.qa-section {
+  background: var(--color-surface-raised);
+  border: 1px solid var(--color-border-light);
+  border-radius: var(--radius-lg);
+  overflow: hidden;
+}
+
+.qa-section > .section-toggle {
+  padding: var(--space-3) var(--space-4);
+  color: var(--color-text);
+}
+
+.qa-section > .section-toggle:hover { background: var(--color-surface-alt); }
+
+.qa-count {
+  display: inline-flex;
+  align-items: center;
+  justify-content: center;
+  width: 18px;
+  height: 18px;
+  border-radius: 50%;
+  background: var(--app-primary-light);
+  color: var(--app-primary);
+  font-size: 10px;
+  font-weight: 700;
+}
+
+.qa-body {
+  display: flex;
+  flex-direction: column;
+  gap: var(--space-3);
+  padding: var(--space-4);
+  border-top: 1px solid var(--color-border-light);
+}
+
+.qa-empty {
+  font-size: var(--text-xs);
+  color: var(--color-text-muted);
+  text-align: center;
+  padding: var(--space-2) 0;
+}
+
+.qa-item {
+  display: flex;
+  flex-direction: column;
+  gap: var(--space-1);
+  padding-bottom: var(--space-3);
+  border-bottom: 1px solid var(--color-border-light);
+}
+
+.qa-item:last-of-type { border-bottom: none; }
+
+.qa-item__header {
+  display: flex;
+  align-items: flex-start;
+  justify-content: space-between;
+  gap: var(--space-2);
+}
+
+.qa-item__q {
+  font-size: var(--text-sm);
+  font-weight: 600;
+  color: var(--color-text);
+  line-height: 1.4;
+  flex: 1;
+}
+
+.qa-item__del {
+  background: none;
+  border: none;
+  cursor: pointer;
+  font-size: var(--text-xs);
+  color: var(--color-text-muted);
+  padding: 2px 4px;
+  flex-shrink: 0;
+  opacity: 0.5;
+  transition: opacity 150ms;
+}
+
+.qa-item__del:hover { opacity: 1; color: var(--color-error); }
+
+.qa-item__answer {
+  width: 100%;
+  padding: var(--space-2) var(--space-3);
+  border: 1px solid var(--color-border-light);
+  border-radius: var(--radius-md);
+  background: var(--color-surface-alt);
+  color: var(--color-text);
+  font-family: var(--font-body);
+  font-size: var(--text-sm);
+  line-height: 1.5;
+  resize: vertical;
+  min-height: 72px;
+}
+
+.qa-item__answer:focus {
+  outline: none;
+  border-color: var(--app-primary);
+}
+
+.qa-suggest-btn { align-self: flex-end; }
+
+.qa-add {
+  display: flex;
+  gap: var(--space-2);
+  align-items: center;
+}
+
+.qa-add__input {
+  flex: 1;
+  padding: var(--space-2) var(--space-3);
+  border: 1px solid var(--color-border-light);
+  border-radius: var(--radius-md);
+  background: var(--color-surface-alt);
+  color: var(--color-text);
+  font-family: var(--font-body);
+  font-size: var(--text-sm);
+  min-height: 36px;
+}
+
+.qa-add__input:focus {
+  outline: none;
+  border-color: var(--app-primary);
+}
+
+.qa-add__input::placeholder { color: var(--color-text-muted); }
+
+.qa-save-btn { align-self: flex-end; }
/* ── Responsive ──────────────────────────────────────────────────────── */
@media (max-width: 900px) {

@@ -2,12 +2,15 @@ export type ApiError =
| { kind: 'network'; message: string }
| { kind: 'http'; status: number; detail: string }
// Strip trailing slash so '/peregrine/' + '/api/...' → '/peregrine/api/...'
const _apiBase = import.meta.env.BASE_URL.replace(/\/$/, '')
export async function useApiFetch<T>(
url: string,
opts?: RequestInit,
): Promise<{ data: T | null; error: ApiError | null }> {
try {
-const res = await fetch(url, opts)
const res = await fetch(_apiBase + url, opts)
if (!res.ok) {
const detail = await res.text().catch(() => '')
return { data: null, error: { kind: 'http', status: res.status, detail } }
@@ -31,7 +34,7 @@ export function useApiSSE(
onComplete?: () => void,
onError?: (e: Event) => void,
): () => void {
-const es = new EventSource(url)
const es = new EventSource(_apiBase + url)
es.onmessage = (e) => {
try {
const data = JSON.parse(e.data) as Record<string, unknown>
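The `_apiBase` change above prefixes every request with the app's base path. A minimal sketch of the join, with a hypothetical `joinApiPath` helper that is not part of the diff:

```typescript
// Sketch of the base-path join (assumed behavior): BASE_URL may be '/' or
// '/peregrine/'; stripping its trailing slash avoids a '//' in the joined
// path, because API paths always start with '/'.
function joinApiPath(baseUrl: string, apiPath: string): string {
  const base = baseUrl.replace(/\/$/, '') // '/peregrine/' -> '/peregrine', '/' -> ''
  return base + apiPath
}

console.log(joinApiPath('/peregrine/', '/api/tasks')) // '/peregrine/api/tasks'
console.log(joinApiPath('/', '/api/tasks'))           // '/api/tasks'
```

With the default Vite base of `'/'`, the prefix collapses to the empty string, so existing relative calls keep working unchanged.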

@@ -53,6 +53,13 @@
:loading="taskRunning === 'score'"
@click="scoreUnscored"
/>
<WorkflowButton
emoji="🔍"
label="Fill Missing Descriptions"
description="Re-fetch truncated job descriptions"
:loading="taskRunning === 'enrich'"
@click="runEnrich"
/>
</div>
<button
@@ -80,7 +87,6 @@
? `Last enriched ${formatRelative(store.status.enrichment_last_run)}`
: 'Auto-enrichment active' }}
</span>
-<button class="btn-ghost btn-ghost--sm" @click="runEnrich">Run Now</button>
</div>
</section>
@@ -162,24 +168,194 @@
</div>
</section>
-<!-- Advanced -->
<!-- Danger Zone -->
<section class="home__section">
-<details class="advanced">
-<summary class="advanced__summary">Advanced</summary>
-<div class="advanced__body">
-<p class="advanced__warning">⚠️ These actions are destructive and cannot be undone.</p>
-<div class="home__actions home__actions--danger">
-<button class="action-btn action-btn--danger" @click="confirmPurge">
-🗑 Purge Pending + Rejected
-</button>
-<button class="action-btn action-btn--danger" @click="killTasks">
-🛑 Kill Stuck Tasks
<details class="danger-zone">
<summary class="danger-zone__summary">⚠️ Danger Zone</summary>
<div class="danger-zone__body">
<!-- Queue reset -->
<div class="dz-block">
<p class="dz-block__title">Queue reset</p>
<p class="dz-block__desc">
Archive clears your review queue while keeping job URLs for dedup, so the same listings
won't resurface on the next discovery run. Use hard purge only for a full clean slate
including dedup history.
</p>
<fieldset class="dz-scope" aria-label="Clear scope">
<legend class="dz-scope__legend">Clear scope</legend>
<label class="dz-scope__option">
<input type="radio" v-model="dangerScope" value="pending" />
Pending only
</label>
<label class="dz-scope__option">
<input type="radio" v-model="dangerScope" value="pending_approved" />
Pending + approved (stale search)
</label>
</fieldset>
<div class="dz-actions">
<button
class="action-btn action-btn--primary"
:disabled="!!confirmAction"
@click="beginConfirm('archive')"
>
📦 Archive &amp; reset
</button>
<button
class="action-btn action-btn--secondary"
:disabled="!!confirmAction"
@click="beginConfirm('purge')"
>
🗑 Hard purge (delete)
</button>
</div>
<!-- Inline confirm -->
<div v-if="confirmAction" class="dz-confirm" role="alertdialog" aria-live="assertive">
<p v-if="confirmAction.type === 'archive'" class="dz-confirm__msg dz-confirm__msg--info">
Archive <strong>{{ confirmAction.statuses.join(' + ') }}</strong> jobs?
URLs are kept for dedup; nothing is permanently deleted.
</p>
<p v-else class="dz-confirm__msg dz-confirm__msg--warn">
Permanently delete <strong>{{ confirmAction.statuses.join(' + ') }}</strong> jobs?
This removes URLs from dedup history too. Cannot be undone.
</p>
<div class="dz-confirm__actions">
<button class="action-btn action-btn--primary" @click="executeConfirm">
{{ confirmAction.type === 'archive' ? 'Yes, archive' : 'Yes, delete' }}
</button>
<button class="action-btn action-btn--secondary" @click="confirmAction = null">
Cancel
</button>
</div>
</div>
</div>
<hr class="dz-divider" />
<!-- Background tasks -->
<div class="dz-block">
<p class="dz-block__title">Background tasks · {{ activeTasks.length }} active</p>
<template v-if="activeTasks.length > 0">
<div
v-for="task in activeTasks"
:key="task.id"
class="dz-task"
>
<span class="dz-task__icon">{{ taskIcon(task.task_type) }}</span>
<span class="dz-task__type">{{ task.task_type.replace(/_/g, ' ') }}</span>
<span class="dz-task__label">
{{ task.title ? `${task.title}${task.company ? ' @ ' + task.company : ''}` : `job #${task.job_id}` }}
</span>
<span class="dz-task__status">{{ task.status }}</span>
<button
class="btn-ghost btn-ghost--sm dz-task__cancel"
@click="cancelTaskById(task.id)"
:aria-label="`Cancel ${task.task_type} task`"
>
✕
</button>
</div>
</template>
<button
class="action-btn action-btn--secondary dz-kill"
:disabled="activeTasks.length === 0"
@click="killAll"
>
Kill all stuck
</button>
</div>
<hr class="dz-divider" />
<!-- More options -->
<details class="dz-more">
<summary class="dz-more__summary">More options</summary>
<div class="dz-more__body">
<!-- Email purge -->
<div class="dz-more__item">
<p class="dz-block__title">Purge email data</p>
<p class="dz-block__desc">Clears all email thread logs and email-sourced pending jobs.</p>
<template v-if="moreConfirm === 'email'">
<p class="dz-confirm__msg dz-confirm__msg--warn">
Deletes all email contacts and email-sourced jobs. Cannot be undone.
</p>
<div class="dz-confirm__actions">
<button class="action-btn action-btn--primary" @click="executePurgeTarget('email')">Yes, purge emails</button>
<button class="action-btn action-btn--secondary" @click="moreConfirm = null">Cancel</button>
</div>
</template>
<button v-else class="action-btn action-btn--secondary" @click="moreConfirm = 'email'">
📧 Purge Email Data
</button>
</div>
<!-- Non-remote purge -->
<div class="dz-more__item">
<p class="dz-block__title">Purge non-remote</p>
<p class="dz-block__desc">Removes pending/approved/rejected on-site listings from the DB.</p>
<template v-if="moreConfirm === 'non_remote'">
<p class="dz-confirm__msg dz-confirm__msg--warn">
Deletes all non-remote jobs not yet applied to. Cannot be undone.
</p>
<div class="dz-confirm__actions">
<button class="action-btn action-btn--primary" @click="executePurgeTarget('non_remote')">Yes, purge on-site</button>
<button class="action-btn action-btn--secondary" @click="moreConfirm = null">Cancel</button>
</div>
</template>
<button v-else class="action-btn action-btn--secondary" @click="moreConfirm = 'non_remote'">
🏢 Purge On-site Jobs
</button>
</div>
<!-- Wipe + re-scrape -->
<div class="dz-more__item">
<p class="dz-block__title">Wipe all + re-scrape</p>
<p class="dz-block__desc">Deletes all non-applied jobs then immediately runs a fresh discovery.</p>
<template v-if="moreConfirm === 'rescrape'">
<p class="dz-confirm__msg dz-confirm__msg--warn">
Wipes ALL pending, approved, and rejected jobs, then re-scrapes.
Applied and synced records are kept.
</p>
<div class="dz-confirm__actions">
<button class="action-btn action-btn--primary" @click="executePurgeTarget('rescrape')">Yes, wipe + scrape</button>
<button class="action-btn action-btn--secondary" @click="moreConfirm = null">Cancel</button>
</div>
</template>
<button v-else class="action-btn action-btn--secondary" @click="moreConfirm = 'rescrape'">
🔄 Wipe + Re-scrape
</button>
</div>
</div>
</details>
</div>
</details>
</section>
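The Danger Zone copy above distinguishes archive (URLs kept for dedup) from hard purge (dedup history deleted too). The request shapes, read off the fetch calls in this diff, can be sketched as follows; `buildQueueReset` is a hypothetical helper, not part of the commit:

```typescript
// Assumed request shapes: archive and status-scoped purge both POST a
// `statuses` array; the "More options" purges instead POST a named `target`.
type ResetRequest =
  | { endpoint: '/api/jobs/archive' | '/api/jobs/purge'; body: { statuses: string[] } }
  | { endpoint: '/api/jobs/purge'; body: { target: 'email' | 'non_remote' | 'rescrape' } }

function buildQueueReset(type: 'archive' | 'purge', statuses: string[]): ResetRequest {
  return type === 'archive'
    ? { endpoint: '/api/jobs/archive', body: { statuses } }
    : { endpoint: '/api/jobs/purge', body: { statuses } }
}

const req = buildQueueReset('archive', ['pending', 'approved'])
console.log(req.endpoint) // '/api/jobs/archive'
```

Keeping both endpoints on the same `{ statuses: [...] }` body lets the confirm dialog swap only the URL, which is what `executeConfirm` in the script below does.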
<!-- Setup banners -->
<section v-if="banners.length > 0" class="home__section" aria-labelledby="setup-heading">
<h2 id="setup-heading" class="home__section-title">Finish setting up Peregrine</h2>
<div class="banners">
<div v-for="banner in banners" :key="banner.key" class="banner">
<span class="banner__icon" aria-hidden="true">💡</span>
<span class="banner__text">{{ banner.text }}</span>
<RouterLink :to="banner.link" class="banner__link">Go to settings →</RouterLink>
<button
class="btn-ghost btn-ghost--sm banner__dismiss"
@click="dismissBanner(banner.key)"
:aria-label="`Dismiss: ${banner.text}`"
>
✕
</button>
</div>
</div>
</section>
<!-- Stoop speed toast easter egg 9.2 -->
<Transition name="toast">
<div v-if="stoopToast" class="stoop-toast" role="status" aria-live="polite">
@@ -190,7 +366,7 @@
</template>
<script setup lang="ts">
-import { ref, computed, onMounted } from 'vue'
import { ref, computed, onMounted, onUnmounted } from 'vue'
import { RouterLink } from 'vue-router'
import { useJobsStore } from '../stores/jobs'
import { useApiFetch } from '../composables/useApi'
@@ -231,6 +407,8 @@ function formatRelative(isoStr: string) {
return hrs === 1 ? '1 hour ago' : `${hrs} hours ago`
}
// Task execution
const taskRunning = ref<string | null>(null)
const stoopToast = ref(false)
@@ -239,13 +417,16 @@ async function runTask(key: string, endpoint: string) {
await useApiFetch(endpoint, { method: 'POST' })
taskRunning.value = null
store.refresh()
fetchActiveTasks()
}
const runDiscovery = () => runTask('discovery', '/api/tasks/discovery')
const syncEmails = () => runTask('email', '/api/tasks/email-sync')
const scoreUnscored = () => runTask('score', '/api/tasks/score')
const syncIntegration = () => runTask('sync', '/api/tasks/sync')
-const runEnrich = () => useApiFetch('/api/tasks/enrich', { method: 'POST' })
const runEnrich = () => runTask('enrich', '/api/tasks/enrich')
// Add jobs
const addTab = ref<'url' | 'csv'>('url')
const urlInput = ref('')
@@ -269,6 +450,8 @@ function handleCsvUpload(e: Event) {
useApiFetch('/api/jobs/upload-csv', { method: 'POST', body: form })
}
// Backlog archive
async function archiveByStatus(statuses: string[]) {
await useApiFetch('/api/jobs/archive', {
method: 'POST',
@@ -278,26 +461,100 @@ async function archiveByStatus(statuses: string[]) {
store.refresh()
}
-function confirmPurge() {
-// TODO: replace with ConfirmModal component
-if (confirm('Permanently delete all pending and rejected jobs? This cannot be undone.')) {
-useApiFetch('/api/jobs/purge', {
-method: 'POST',
-headers: { 'Content-Type': 'application/json' },
-body: JSON.stringify({ target: 'pending_rejected' }),
-})
-store.refresh()
-}
-}
// Danger Zone
interface TaskRow { id: number; task_type: string; status: string; title?: string; company?: string; job_id: number }
interface Banner { key: string; text: string; link: string }
interface ConfirmAction { type: 'archive' | 'purge'; statuses: string[] }
const activeTasks = ref<TaskRow[]>([])
const dangerScope = ref<'pending' | 'pending_approved'>('pending')
const confirmAction = ref<ConfirmAction | null>(null)
const moreConfirm = ref<string | null>(null)
const banners = ref<Banner[]>([])
let taskPollInterval: ReturnType<typeof setInterval> | null = null
async function fetchActiveTasks() {
const { data } = await useApiFetch<TaskRow[]>('/api/tasks')
activeTasks.value = data ?? []
} }
-async function killTasks() {
async function fetchBanners() {
const { data } = await useApiFetch<Banner[]>('/api/config/setup-banners')
banners.value = data ?? []
}
function scopeStatuses(): string[] {
return dangerScope.value === 'pending' ? ['pending'] : ['pending', 'approved']
}
function beginConfirm(type: 'archive' | 'purge') {
moreConfirm.value = null
confirmAction.value = { type, statuses: scopeStatuses() }
}
async function executeConfirm() {
const action = confirmAction.value
confirmAction.value = null
if (!action) return
const endpoint = action.type === 'archive' ? '/api/jobs/archive' : '/api/jobs/purge'
await useApiFetch(endpoint, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
// Archive and purge both accept a `statuses` array in the request body.
body: JSON.stringify({ statuses: action.statuses }),
})
store.refresh()
fetchActiveTasks()
}
async function cancelTaskById(id: number) {
await useApiFetch(`/api/tasks/${id}`, { method: 'DELETE' })
fetchActiveTasks()
}
async function killAll() {
await useApiFetch('/api/tasks/kill', { method: 'POST' })
fetchActiveTasks()
}
async function executePurgeTarget(target: string) {
moreConfirm.value = null
await useApiFetch('/api/jobs/purge', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ target }),
})
store.refresh()
fetchActiveTasks()
}
async function dismissBanner(key: string) {
await useApiFetch(`/api/config/setup-banners/${key}/dismiss`, { method: 'POST' })
banners.value = banners.value.filter(b => b.key !== key)
}
function taskIcon(taskType: string): string {
const icons: Record<string, string> = {
cover_letter: '✉️', company_research: '🔍', discovery: '🌐',
enrich_descriptions: '📝', email_sync: '📧', score: '📊',
scrape_url: '🔗',
}
return icons[taskType] ?? '⚙️'
}
onMounted(async () => {
store.refresh()
const { data } = await useApiFetch<{ name: string }>('/api/config/user')
if (data?.name) userName.value = data.name
fetchActiveTasks()
fetchBanners()
taskPollInterval = setInterval(fetchActiveTasks, 5000)
})
onUnmounted(() => {
if (taskPollInterval) clearInterval(taskPollInterval)
})
</script>
@@ -392,12 +649,11 @@ onMounted(async () => {
.home__actions {
display: grid;
-grid-template-columns: repeat(auto-fit, minmax(200px, 1fr));
grid-template-columns: repeat(auto-fit, minmax(180px, 1fr));
gap: var(--space-3);
}
.home__actions--secondary { grid-template-columns: repeat(auto-fit, minmax(240px, 1fr)); }
-.home__actions--danger { grid-template-columns: repeat(auto-fit, minmax(220px, 1fr)); }
.sync-banner {
display: flex;
@@ -451,9 +707,7 @@ onMounted(async () => {
.action-btn--secondary { background: var(--color-surface-alt); color: var(--color-text); border: 1px solid var(--color-border); }
.action-btn--secondary:hover { background: var(--color-border-light); }
.action-btn--secondary:disabled { opacity: 0.4; cursor: not-allowed; }
-.action-btn--danger { background: transparent; color: var(--color-error); border: 1px solid var(--color-error); }
-.action-btn--danger:hover { background: rgba(192, 57, 43, 0.08); }
.enrichment-row {
display: flex;
@@ -528,13 +782,15 @@ onMounted(async () => {
.add-jobs__textarea:focus { outline: 2px solid var(--app-primary); outline-offset: 1px; }
-.advanced {
/* ── Danger Zone ──────────────────────── */
.danger-zone {
background: var(--color-surface-raised);
border: 1px solid var(--color-border-light);
border-radius: var(--radius-md);
}
-.advanced__summary {
.danger-zone__summary {
padding: var(--space-3) var(--space-4);
cursor: pointer;
font-size: var(--text-sm);
@@ -544,21 +800,172 @@ onMounted(async () => {
user-select: none;
}
-.advanced__summary::-webkit-details-marker { display: none; }
-.advanced__summary::before { content: '▶ '; font-size: 0.7em; }
-details[open] > .advanced__summary::before { content: '▼ '; }
-.advanced__body { padding: 0 var(--space-4) var(--space-4); display: flex; flex-direction: column; gap: var(--space-4); }
-.advanced__warning {
-font-size: var(--text-sm);
-color: var(--color-warning);
-background: rgba(212, 137, 26, 0.08);
.danger-zone__summary::-webkit-details-marker { display: none; }
.danger-zone__summary::before { content: '▶ '; font-size: 0.7em; }
details[open] > .danger-zone__summary::before { content: '▼ '; }
.danger-zone__body {
padding: 0 var(--space-4) var(--space-4);
display: flex;
flex-direction: column;
gap: var(--space-5);
}
.dz-block { display: flex; flex-direction: column; gap: var(--space-3); }
.dz-block__title {
font-size: var(--text-sm);
font-weight: 600;
color: var(--color-text);
}
.dz-block__desc {
font-size: var(--text-sm);
color: var(--color-text-muted);
}
.dz-scope {
border: none;
padding: 0;
margin: 0;
display: flex;
gap: var(--space-5);
flex-wrap: wrap;
}
.dz-scope__legend {
font-size: var(--text-xs);
color: var(--color-text-muted);
margin-bottom: var(--space-2);
float: left;
width: 100%;
}
.dz-scope__option {
display: flex;
align-items: center;
gap: var(--space-2);
font-size: var(--text-sm);
cursor: pointer;
}
.dz-actions {
display: flex;
gap: var(--space-3);
flex-wrap: wrap;
}
.dz-confirm {
-padding: var(--space-3) var(--space-4);
-border-radius: var(--radius-md);
-border-left: 3px solid var(--color-warning);
-}
padding: var(--space-3) var(--space-4);
border-radius: var(--radius-md);
display: flex;
flex-direction: column;
gap: var(--space-3);
}
.dz-confirm__msg {
font-size: var(--text-sm);
padding: var(--space-3) var(--space-4);
border-radius: var(--radius-md);
border-left: 3px solid;
}
.dz-confirm__msg--info {
background: rgba(52, 152, 219, 0.1);
border-color: var(--app-primary);
color: var(--color-text);
}
.dz-confirm__msg--warn {
background: rgba(192, 57, 43, 0.08);
border-color: var(--color-error);
color: var(--color-text);
}
.dz-confirm__actions {
display: flex;
gap: var(--space-3);
}
.dz-divider {
border: none;
border-top: 1px solid var(--color-border-light);
margin: 0;
}
.dz-task {
display: flex;
align-items: center;
gap: var(--space-2);
padding: var(--space-2) var(--space-3);
background: var(--color-surface-alt);
border-radius: var(--radius-md);
font-size: var(--text-xs);
}
.dz-task__icon { flex-shrink: 0; }
.dz-task__type { font-family: var(--font-mono); color: var(--color-text-muted); min-width: 120px; }
.dz-task__label { flex: 1; color: var(--color-text); overflow: hidden; text-overflow: ellipsis; white-space: nowrap; }
.dz-task__status { color: var(--color-text-muted); font-style: italic; }
.dz-task__cancel { margin-left: var(--space-2); }
.dz-kill { align-self: flex-start; }
.dz-more {
background: transparent;
border: none;
}
.dz-more__summary {
cursor: pointer;
font-size: var(--text-sm);
font-weight: 600;
color: var(--color-text-muted);
list-style: none;
user-select: none;
padding: var(--space-1) 0;
}
.dz-more__summary::-webkit-details-marker { display: none; }
.dz-more__summary::before { content: '▶ '; font-size: 0.7em; }
details[open] > .dz-more__summary::before { content: '▼ '; }
.dz-more__body {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(220px, 1fr));
gap: var(--space-5);
margin-top: var(--space-4);
}
.dz-more__item { display: flex; flex-direction: column; gap: var(--space-2); }
/* ── Setup banners ────────────────────────────────────── */
.banners {
display: flex;
flex-direction: column;
gap: var(--space-2);
}
.banner {
display: flex;
align-items: center;
gap: var(--space-3);
padding: var(--space-3) var(--space-4);
background: var(--color-surface-raised);
border: 1px solid var(--color-border-light);
border-radius: var(--radius-md);
font-size: var(--text-sm);
}
.banner__icon { flex-shrink: 0; }
.banner__text { flex: 1; color: var(--color-text); }
.banner__link { color: var(--app-primary); text-decoration: none; white-space: nowrap; font-weight: 500; }
.banner__link:hover { text-decoration: underline; }
.banner__dismiss { margin-left: var(--space-1); }
/* ── Toast ────────────────────────────────────────────── */
.stoop-toast {
position: fixed;
bottom: var(--space-6);
@@ -588,6 +995,7 @@ details[open] > .advanced__summary::before { content: '▼ '; }
.home { padding: var(--space-4); gap: var(--space-6); }
.home__greeting { font-size: var(--text-2xl); }
.home__metrics { grid-template-columns: repeat(3, 1fr); }
.dz-more__body { grid-template-columns: 1fr; }
}
@media (max-width: 480px) {

@@ -64,7 +64,7 @@
/>
</div>
-<div class="field-row">
<div v-if="!config.isCloud" class="field-row">
<label class="field-label" for="profile-inference">Inference profile</label>
<select id="profile-inference" v-model="store.inference_profile" class="select-input">
<option value="remote">Remote</option>

@@ -15,7 +15,13 @@
<div class="empty-card">
<h3>Upload & Parse</h3>
<p>Upload a PDF, DOCX, or ODT and we'll extract your info automatically.</p>
-<input type="file" accept=".pdf,.docx,.odt" @change="handleUpload" ref="fileInput" />
<input type="file" accept=".pdf,.docx,.odt" @change="handleFileSelect" ref="fileInput" />
<button
v-if="pendingFile"
@click="handleUpload"
:disabled="uploading"
style="margin-top:10px"
>{{ uploading ? 'Parsing…' : `Parse "${pendingFile.name}"` }}</button>
<p v-if="uploadError" class="error">{{ uploadError }}</p>
</div>
<!-- Blank -->
@@ -24,8 +30,8 @@
<p>Start with a blank form and fill in your details.</p>
<button @click="store.createBlank()" :disabled="store.loading">Start from Scratch</button>
</div>
-<!-- Wizard -->
-<div class="empty-card">
<!-- Wizard (self-hosted only) -->
<div v-if="!config.isCloud" class="empty-card">
<h3>Run Setup Wizard</h3>
<p>Walk through the onboarding wizard to set up your profile step by step.</p>
<RouterLink to="/setup">Open Setup Wizard →</RouterLink>
@@ -35,6 +41,21 @@
<!-- Full form (when resume exists) -->
<template v-else-if="store.hasResume">
<!-- Replace resume via upload -->
<section class="form-section replace-section">
<h3>Replace Resume</h3>
<p class="section-note">Upload a new PDF, DOCX, or ODT to re-parse and overwrite the current data.</p>
<input type="file" accept=".pdf,.docx,.odt" @change="handleFileSelect" ref="replaceFileInput" />
<button
v-if="pendingFile"
@click="handleUpload"
:disabled="uploading"
class="btn-primary"
style="margin-top:10px"
>{{ uploading ? 'Parsing…' : `Parse "${pendingFile.name}"` }}</button>
<p v-if="uploadError" class="error">{{ uploadError }}</p>
</section>
<!-- Personal Information -->
<section class="form-section">
<h3>Personal Information</h3>
@@ -221,17 +242,22 @@ import { ref, onMounted } from 'vue'
import { storeToRefs } from 'pinia'
import { useResumeStore } from '../../stores/settings/resume'
import { useProfileStore } from '../../stores/settings/profile'
import { useAppConfigStore } from '../../stores/appConfig'
import { useApiFetch } from '../../composables/useApi'
const store = useResumeStore()
const profileStore = useProfileStore()
const config = useAppConfigStore()
const { loadError } = storeToRefs(store)
const showSelfId = ref(false)
const skillInput = ref('')
const domainInput = ref('')
const kwInput = ref('')
const uploadError = ref<string | null>(null)
const uploading = ref(false)
const pendingFile = ref<File | null>(null)
const fileInput = ref<HTMLInputElement | null>(null)
const replaceFileInput = ref<HTMLInputElement | null>(null)
onMounted(async () => {
await store.load()
@@ -246,9 +272,16 @@ onMounted(async () => {
}
})
-async function handleUpload(event: Event) {
function handleFileSelect(event: Event) {
const file = (event.target as HTMLInputElement).files?.[0]
pendingFile.value = file ?? null
uploadError.value = null
}
async function handleUpload() {
const file = pendingFile.value
if (!file) return
uploading.value = true
uploadError.value = null
const formData = new FormData()
formData.append('file', file)
@@ -256,10 +289,14 @@ async function handleUpload(event: Event) {
'/api/settings/resume/upload',
{ method: 'POST', body: formData }
)
uploading.value = false
if (error || !data?.ok) {
uploadError.value = data?.error ?? (typeof error === 'string' ? error : (error?.kind === 'network' ? error.message : error?.detail ?? 'Upload failed'))
return
}
pendingFile.value = null
if (fileInput.value) fileInput.value.value = ''
if (replaceFileInput.value) replaceFileInput.value.value = ''
if (data.data) {
await store.load()
}
@@ -307,4 +344,5 @@ h3 { font-size: 1rem; font-weight: 600; margin-bottom: var(--space-3, 16px); col
.section-note { font-size: 0.8rem; color: var(--color-text-secondary, #94a3b8); margin-bottom: 16px; }
.toggle-btn { margin-left: 10px; padding: 2px 10px; background: transparent; border: 1px solid var(--color-border, rgba(255,255,255,0.15)); border-radius: 4px; color: var(--color-text-secondary, #94a3b8); cursor: pointer; font-size: 0.78rem; }
.loading { text-align: center; padding: var(--space-8, 48px); color: var(--color-text-secondary, #94a3b8); }
.replace-section { background: var(--color-surface-2, rgba(255,255,255,0.03)); border-radius: 8px; padding: var(--space-4, 24px); }
</style>

@@ -41,7 +41,8 @@ const config = useAppConfigStore()
const devOverride = computed(() => !!config.devTierOverride)
const gpuProfiles = ['single-gpu', 'dual-gpu']
const showSystem = computed(() => !config.isCloud)
const showData = computed(() => !config.isCloud)
const showFineTune = computed(() => {
if (config.isCloud) return config.tier === 'premium'
return gpuProfiles.includes(config.inferenceProfile)
@@ -65,7 +66,7 @@
]},
{ label: 'Account', items: [
{ key: 'license', path: '/settings/license', label: 'License', show: true },
-{ key: 'data', path: '/settings/data', label: 'Data', show: true },
{ key: 'data', path: '/settings/data', label: 'Data', show: showData },
{ key: 'privacy', path: '/settings/privacy', label: 'Privacy', show: true },
]},
{ label: 'Dev', items: [
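The `show: showData` change mixes a computed ref into a list whose other entries use plain booleans, so the nav's filter presumably unwraps both forms (Vue's `unref` does exactly this). A minimal self-contained sketch, with a hypothetical `unwrap` standing in for `unref`:

```typescript
// A nav item's `show` may be a plain boolean (show: true) or a ref-like
// object (show: showData); the filter must accept both.
type MaybeRef<T> = T | { value: T }

function unwrap<T>(v: MaybeRef<T>): T {
  // Ref-like values carry their payload in `.value`; plain values pass through.
  return typeof v === 'object' && v !== null && 'value' in v
    ? (v as { value: T }).value
    : (v as T)
}

interface NavItem { key: string; show: MaybeRef<boolean> }

function visibleItems(items: NavItem[]): NavItem[] {
  return items.filter((item) => unwrap(item.show))
}

const items: NavItem[] = [
  { key: 'license', show: true },
  { key: 'data', show: { value: false } }, // e.g. showData evaluated on cloud
]
console.log(visibleItems(items).map((i) => i.key)) // ['license']
```

Using a computed for `showData` (rather than a snapshot boolean) keeps the Data entry reactive if `config.isCloud` changes after the nav is built.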