feat(merge): merge feature/vue-spa into main
Full Vue 3 SPA merge — closes #8. Major additions:

Backend (dev API):
- dev_api.py → symlink to dev-api.py (importable module alias)
- dev-api.py: full FastAPI backend (settings, jobs, interviews, prep, survey,
  digest, resume optimizer endpoints); cloud session middleware
- scripts/user_profile.py: load_user_profile / save_user_profile helpers
- scripts/discover.py + scripts/imap_sync.py: API-compatible additions

Frontend (web/src/):
- ApplyWorkspace: ATS resume optimizer panel (gap report free, rewrite paid+)
- ResumeOptimizerPanel.vue: new component with task polling + .txt download

Test suite:
- test_dev_api_settings/survey/prep/digest/interviews: full API test coverage
- fix: replace importlib.reload with monkeypatch.setattr(dev_api, "DB_PATH")
  to prevent module global reset breaking isolation across test files

Docs:
- docs/vue-spa-migration.md: migration guide
This commit is contained in:
commit 8c42de3f5c

15 changed files with 3974 additions and 12 deletions

dev-api.py (new file, 1800 lines)
File diff suppressed because it is too large.

dev_api.py (new symbolic link, 1 line)

@@ -0,0 +1 @@
+dev-api.py

docs/vue-spa-migration.md (new file, 174 lines)

@@ -0,0 +1,174 @@
# Peregrine Vue 3 SPA Migration

**Branch:** `feature/vue-spa`
**Issue:** #8 — Vue 3 SPA frontend (Paid Tier GA milestone)
**Worktree:** `.worktrees/feature-vue-spa/`
**Reference:** `avocet/docs/vue-port-gotchas.md` (15 battle-tested gotchas)

---

## What We're Replacing

The current Streamlit UI (`app/app.py` + `app/pages/`) is an internal tool built for speed of development. The Vue SPA replaces it with a proper frontend — faster, more accessible, and extensible for the Paid Tier. The FastAPI backend already exists (partially, from the cloud managed-instance work); the Vue SPA will consume it.

### Pages to Port

| Streamlit file | Vue view | Route | Notes |
|---|---|---|---|
| `app/Home.py` | `HomeView.vue` | `/` | Dashboard, discovery trigger, sync status |
| `app/pages/1_Job_Review.py` | `JobReviewView.vue` | `/review` | Batch approve/reject; primary daily-driver view |
| `app/pages/4_Apply.py` | `ApplyView.vue` | `/apply` | Cover letter gen + PDF + mark applied |
| `app/pages/5_Interviews.py` | `InterviewsView.vue` | `/interviews` | Kanban: phone_screen → offer → hired |
| `app/pages/6_Interview_Prep.py` | `InterviewPrepView.vue` | `/prep` | Live reference sheet + practice Q&A |
| `app/pages/7_Survey.py` | `SurveyView.vue` | `/survey` | Culture-fit survey assist + screenshot |
| `app/pages/2_Settings.py` | `SettingsView.vue` | `/settings` | 6 tabs: Profile, Resume, Search, System, Fine-Tune, License |

---
## Avocet Lessons Applied — What We Fixed Before Starting

The avocet SPA was the testbed. These bugs were found and fixed there; Peregrine's scaffold already incorporates all fixes. See `avocet/docs/vue-port-gotchas.md` for the full writeup.

### Applied at scaffold level (baked in — you don't need to think about these)

| # | Gotcha | How it's fixed in this scaffold |
|---|--------|----------------------------------|
| 1 | `id="app"` on App.vue root → nested `#app` elements, broken CSS specificity | `App.vue` root uses `class="app-root"`. `#app` in `index.html` is mount target only. |
| 3 | `overflow-x: hidden` on html → creates scroll container → 15px scrollbar jitter on Linux | `peregrine.css`: `html { overflow-x: clip }` |
| 4 | UnoCSS `presetAttributify` generates CSS for bare attribute names like `h2` | `uno.config.ts`: `presetAttributify({ prefix: 'un-', prefixedOnly: true })` |
| 5 | Theme variable name mismatches cause dark mode to silently fall back to hardcoded colors | `peregrine.css` alias map: `--color-bg → var(--color-surface)`, `--color-text-secondary → var(--color-text-muted)` |
| 7 | SPA cache: browser caches `index.html` indefinitely → old asset hashes → 404 on rebuild | FastAPI must register explicit `GET /` with no-cache headers before `StaticFiles` mount (see FastAPI section below) |
| 9 | `navigator.vibrate()` not supported on desktop/Safari — throws on call | `useHaptics.ts` guards with `'vibrate' in navigator` |
| 10 | Pinia options store = Vue 2 migration path | All stores use setup store form: `defineStore('id', () => { ... })` |
| 12 | `matchMedia`, `vibrate`, `ResizeObserver` absent in jsdom → composable tests throw | `test-setup.ts` stubs all three |
| 13 | `100vh` ignores mobile browser chrome | `App.vue`: `min-height: 100dvh` |

### Must actively avoid when writing new components

| # | Gotcha | Rule |
|---|--------|------|
| 2 | `transition: all` + spring easing → every CSS property bounces → layout explosion | Always enumerate: `transition: background 200ms ease, transform 250ms cubic-bezier(...)` |
| 6 | Keyboard composables called with snapshot arrays → keys don't work after async data loads | Accept `getLabels: () => labels.value` (reactive getter), not `labels: []` (snapshot) |
| 8 | Font reflow at ~780ms shifts layout measurements taken in `onMounted` | Measure layout in `document.fonts.ready` promise or after 1s timeout |
| 11 | `useSwipe` from `@vueuse/core` fires on desktop trackpad pointer events, not just touch | Add `pointer-type === 'touch'` guard if you need touch-only behavior |
| 14 | Rebuild workflow confusion | `cd web && npm run build` → refresh browser. Only restart FastAPI if `app/api.py` changed. |
| 15 | `:global(ancestor) .descendant` in `<style scoped>` → Vue drops the descendant entirely | Never use `:global(X) .Y` in scoped CSS. Use JS gate or CSS custom property token. |
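Gotcha #6 is the easiest of these to reintroduce by accident. Here is a minimal framework-free sketch of the rule; the name `useLabelKeys` and the key-matching behavior are illustrative, not the actual composable:

```typescript
// Sketch of the reactive-getter rule: the composable reads labels through a
// getter at keypress time, so data that arrives after setup is still seen.
function useLabelKeys(getLabels: () => string[]) {
  // Wrong version would do: const labels = getLabels()  (snapshot before load)
  return (key: string): number =>
    getLabels().findIndex((l) => l.toLowerCase().startsWith(key.toLowerCase()));
}

let labels: string[] = [];                 // empty until the API responds
const matchKey = useLabelKeys(() => labels);
labels = ["Approve", "Reject"];            // async data lands later
// matchKey("r") now returns 1 (Reject), even though labels loaded late
```

Passing `labels` by value instead of through the getter would freeze the empty array into the closure, which is exactly the bug described in the table.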
---

## FastAPI Integration

### SPA serving (gotcha #7)

When the Vue SPA is built, FastAPI needs to serve it. Register the explicit `/` route **before** the `StaticFiles` mount, otherwise `index.html` gets cached and old asset hashes cause 404s after rebuild:

```python
from pathlib import Path

from fastapi import FastAPI
from fastapi.responses import FileResponse
from fastapi.staticfiles import StaticFiles

app = FastAPI()  # in app/api.py, use the existing app instance

_DIST = Path(__file__).parent.parent / "web" / "dist"
_NO_CACHE = {
    "Cache-Control": "no-cache, no-store, must-revalidate",
    "Pragma": "no-cache",
}


@app.get("/")
def spa_root():
    # Always revalidate index.html so new asset hashes are picked up
    return FileResponse(_DIST / "index.html", headers=_NO_CACHE)


# Must come after the explicit route above
app.mount("/", StaticFiles(directory=str(_DIST), html=True), name="spa")
```

Hashed assets (`/assets/index-abc123.js`) can be cached aggressively — their filenames change with content. Only `index.html` needs no-cache.

### API prefix

Vue Router uses HTML5 history mode. All `/api/*` routes must be registered on FastAPI before the `StaticFiles` mount. Vue routes (`/`, `/review`, `/apply`, etc.) are handled client-side; FastAPI's `html=True` on `StaticFiles` serves `index.html` for any unmatched path.

---
## Peregrine-Specific Considerations

### Auth & license gating

The Streamlit UI uses `app/wizard/tiers.py` for tier gating. In the Vue SPA, tier state should be fetched from a `GET /api/license/status` endpoint on mount and stored in a Pinia store. Components check `licenseStore.tier` to gate features.
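A framework-free sketch of that gating logic follows. Only the `/api/license/status` endpoint and the free-gap-report / paid-rewrite split come from this plan; the tier values and feature names are placeholder assumptions:

```typescript
// Hypothetical tier model for illustration; in the real app this logic would
// live in a Pinia setup store (gotcha #10) and fetch on mount.
type Tier = "free" | "paid";

const PAID_FEATURES = new Set(["resume_rewrite", "fine_tune"]); // assumed names

function isUnlocked(feature: string, tier: Tier): boolean {
  // Paid tier unlocks everything; free tier gets everything not paid-gated.
  return tier === "paid" || !PAID_FEATURES.has(feature);
}

async function fetchTier(): Promise<Tier> {
  const res = await fetch("/api/license/status"); // endpoint from the plan above
  const body = await res.json();
  return body.tier === "paid" ? "paid" : "free";  // response shape assumed
}
```

Components then gate on the computed result rather than re-checking the license themselves.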
### Discovery trigger

The "Start Discovery" button on Home triggers `python scripts/discover.py` as a background process. The Vue version should use SSE (same pattern as avocet's finetune SSE) to stream progress back in real time. The `useApiSSE` composable is already wired for this.
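SSE delivers progress as `data:` frames separated by blank lines. A minimal sketch of that wire format is below; the real app would use `EventSource` via the `useApiSSE` composable rather than hand-parsing, and the sample messages are invented:

```typescript
// Extract the data payload from each SSE frame in a text chunk.
// Frames are separated by a blank line; payload lines start with "data:".
function parseSSEFrames(chunk: string): string[] {
  return chunk
    .split("\n\n")
    .map((frame) =>
      frame
        .split("\n")
        .filter((line) => line.startsWith("data:"))
        .map((line) => line.slice("data:".length).trim())
        .join("\n"),
    )
    .filter((payload) => payload.length > 0);
}

// parseSSEFrames("data: scraping linkedin\n\ndata: 12 jobs found\n\n")
// → ["scraping linkedin", "12 jobs found"]
```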
### Job Review — card stack UX

This is the daily-driver view. Consider the avocet ASMR bucket pattern here — approve/reject could transform into buckets on drag pickup. The motion tokens (`--transition-spring`, `--transition-dismiss`) are pre-defined in `peregrine.css`. The `useHaptics` composable is ready.

### Kanban (Interviews view)

The drag-to-column kanban is a strong candidate for `@vueuse/core`'s `useDraggable`. Watch for `useSwipe` gotcha #11 — use pointer-type guards if drag behavior differs between touch and mouse.
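Gotcha #11 applies directly to the kanban drag handlers. A small sketch of the guard, with a hypothetical `touchOnly` option (the option name and shape are illustrative):

```typescript
// Decide whether a pointer event should start a drag. With touchOnly set,
// trackpad/mouse pointer events (which useSwipe also fires on) are ignored.
interface PointerLike {
  pointerType?: string; // "mouse" | "pen" | "touch"
}

function shouldStartDrag(e: PointerLike, touchOnly: boolean): boolean {
  if (!touchOnly) return true;       // desktop drag stays enabled
  return e.pointerType === "touch";  // touch-only mode filters the rest
}
```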
### Settings — 6 tabs

Use a tab component with reactive route query params (`/settings?tab=license`) so direct links work and the page is shareable/bookmarkable.
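A sketch of the query-param half of that round-trip. Only the `?tab=license` shape and the six tab names come from this doc; the URL slugs are assumptions:

```typescript
// Resolve the active tab from route.query, falling back to the first tab
// for missing or unknown values. Slugs are assumed for illustration.
const TABS = ["profile", "resume", "search", "system", "fine-tune", "license"] as const;
type Tab = (typeof TABS)[number];

function tabFromQuery(query: Record<string, unknown>): Tab {
  const raw = typeof query.tab === "string" ? query.tab : "";
  return (TABS as readonly string[]).includes(raw) ? (raw as Tab) : TABS[0];
}

// In the component, pair this with a watcher on route.query.tab and a
// router.replace({ query: { tab } }) on tab click to keep URL and UI in sync.
```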
---

## Build & Dev Workflow

```bash
# From worktree root
cd web
npm install    # first time only
npm run dev    # Vite dev server at :5173 (proxies /api/* to FastAPI at :8502)
npm run build  # output to web/dist/
npm run test   # Vitest unit tests
```

FastAPI serves the built `dist/` on the main port. During dev, configure Vite to proxy `/api` to the running FastAPI:

```ts
// vite.config.ts: dev proxy addition (merge into the existing config)
import { defineConfig } from 'vite'

export default defineConfig({
  server: {
    proxy: {
      '/api': 'http://localhost:8502',
    },
  },
})
```

After `npm run build`, just refresh the browser — no FastAPI restart needed unless `app/api.py` changed (gotcha #14).

---
## Implementation Order

Suggested sequence — validate the full stack before porting complex pages:

1. **FastAPI SPA endpoint** — serve `web/dist/` with correct cache headers
2. **App shell** — nav, routing, hacker mode, motion toggle work end-to-end
3. **Home view** — dashboard widgets, discovery trigger with SSE progress
4. **Job Review** — most-used view; gets the most polish
5. **Settings** — license tab is the blocker for tier gating in other views
6. **Apply Workspace** — cover letter gen + PDF export
7. **Interviews kanban** — drag-to-column + calendar sync
8. **Interview Prep** — reference sheet, practice Q&A
9. **Survey Assistant** — screenshot + text paste

---

## Checklist

Copy of the avocet gotchas checklist (items pre-applied at the scaffold level are checked):

- [x] App.vue root element: use `.app-root` class, NOT `id="app"`
- [ ] No `transition: all` with spring easings — enumerate properties explicitly
- [ ] No `:global(ancestor) .descendant` in scoped CSS — Vue drops the descendant
- [x] `overflow-x: clip` on html, `overflow-x: hidden` on body
- [x] UnoCSS `presetAttributify`: `prefixedOnly: true`
- [x] Product CSS aliases: `--color-bg`, `--color-text-secondary` mapped in `peregrine.css`
- [ ] Keyboard composables: accept reactive getters, not snapshot arrays
- [x] FastAPI SPA serving pattern documented — apply when wiring FastAPI
- [ ] Font reflow: measure layout after `document.fonts.ready` or 1s timeout
- [x] Haptics: guard `navigator.vibrate` with feature detection
- [x] Pinia: use setup store form (function syntax)
- [x] Tests: mock matchMedia, vibrate, ResizeObserver in test-setup.ts
- [x] `min-height: 100dvh` on full-height layout containers
@@ -121,6 +121,15 @@ CREATE TABLE IF NOT EXISTS survey_responses (
 );
 """
+
+CREATE_DIGEST_QUEUE = """
+CREATE TABLE IF NOT EXISTS digest_queue (
+    id INTEGER PRIMARY KEY,
+    job_contact_id INTEGER NOT NULL REFERENCES job_contacts(id),
+    created_at TEXT DEFAULT (datetime('now')),
+    UNIQUE(job_contact_id)
+)
+"""
 
 _MIGRATIONS = [
     ("cover_letter", "TEXT"),
     ("applied_at", "TEXT"),
@@ -179,6 +188,7 @@ def init_db(db_path: Path = DEFAULT_DB) -> None:
     conn.execute(CREATE_COMPANY_RESEARCH)
     conn.execute(CREATE_BACKGROUND_TASKS)
     conn.execute(CREATE_SURVEY_RESPONSES)
+    conn.execute(CREATE_DIGEST_QUEUE)
     conn.commit()
     conn.close()
     _migrate_db(db_path)
@@ -196,13 +196,20 @@ def run_discovery(db_path: Path = DEFAULT_DB, notion_push: bool = False) -> None
     exclude_kw = [kw.lower() for kw in profile.get("exclude_keywords", [])]
     results_per_board = profile.get("results_per_board", 25)
 
+    # Map remote_preference → JobSpy is_remote param:
+    #   'remote' → True  (remote-only listings)
+    #   'onsite' → False (on-site-only listings)
+    #   'both'   → None  (no filter — JobSpy default)
+    _rp = profile.get("remote_preference", "both")
+    _is_remote: bool | None = True if _rp == "remote" else (False if _rp == "onsite" else None)
+
     for location in profile["locations"]:
 
         # ── JobSpy boards ──────────────────────────────────────────────────
         if boards:
             print(f" [jobspy] {location} — boards: {', '.join(boards)}")
             try:
-                jobs: pd.DataFrame = scrape_jobs(
+                jobspy_kwargs: dict = dict(
                     site_name=boards,
                     search_term=" OR ".join(f'"{t}"' for t in profile["titles"]),
                     location=location,
@@ -210,6 +217,9 @@ def run_discovery(db_path: Path = DEFAULT_DB, notion_push: bool = False) -> None
                     hours_old=profile.get("hours_old", 72),
                     linkedin_fetch_description=True,
                 )
+                if _is_remote is not None:
+                    jobspy_kwargs["is_remote"] = _is_remote
+                jobs: pd.DataFrame = scrape_jobs(**jobspy_kwargs)
                 print(f" [jobspy] {len(jobs)} raw results")
             except Exception as exc:
                 print(f" [jobspy] ERROR: {exc}")
@@ -698,21 +698,43 @@ def _parse_message(conn: imaplib.IMAP4, uid: bytes) -> Optional[dict]:
             return None
         msg = email.message_from_bytes(data[0][1])
 
-        body = ""
+        # Prefer text/html (preserves href attributes for digest link extraction);
+        # fall back to text/plain if no HTML part exists.
+        html_body = ""
+        plain_body = ""
         if msg.is_multipart():
             for part in msg.walk():
-                if part.get_content_type() == "text/plain":
+                ct = part.get_content_type()
+                if ct == "text/html" and not html_body:
+                    try:
+                        html_body = part.get_payload(decode=True).decode("utf-8", errors="replace")
+                    except Exception:
+                        pass
+                elif ct == "text/plain" and not plain_body:
                     try:
-                        body = part.get_payload(decode=True).decode("utf-8", errors="replace")
+                        plain_body = part.get_payload(decode=True).decode("utf-8", errors="replace")
                     except Exception:
                         pass
-                    break
         else:
+            ct = msg.get_content_type()
             try:
-                body = msg.get_payload(decode=True).decode("utf-8", errors="replace")
+                raw = msg.get_payload(decode=True).decode("utf-8", errors="replace")
+                if ct == "text/html":
+                    html_body = raw
+                else:
+                    plain_body = raw
             except Exception:
                 pass
 
+        if html_body:
+            # Strip <head>…</head> (CSS, meta, title) and any stray <style> blocks.
+            # Keeps <body> HTML intact so href attributes survive for digest extraction.
+            body = re.sub(r"<head[\s\S]*?</head>", "", html_body, flags=re.I)
+            body = re.sub(r"<style[\s\S]*?</style>", "", body, flags=re.I)
+            body = re.sub(r"<script[\s\S]*?</script>", "", body, flags=re.I)
+        else:
+            body = plain_body
+
         mid = msg.get("Message-ID", "").strip()
         if not mid:
             return None  # No Message-ID → can't dedup; skip to avoid repeat inserts
@@ -723,7 +745,7 @@ def _parse_message(conn: imaplib.IMAP4, uid: bytes) -> Optional[dict]:
             "from_addr": _decode_str(msg.get("From")),
             "to_addr": _decode_str(msg.get("To")),
             "date": _decode_str(msg.get("Date")),
-            "body": body[:4000],
+            "body": body,  # no truncation — digest emails need full content
         }
     except Exception:
         return None
@@ -7,6 +7,8 @@ here so port/host/SSL changes propagate everywhere automatically.
 """
 from __future__ import annotations
 from pathlib import Path
+import os
+import tempfile
 import yaml
 
 _DEFAULTS = {
@@ -161,3 +163,30 @@ class UserProfile:
             "ollama_research": f"{self.ollama_url}/v1",
             "vllm": f"{self.vllm_url}/v1",
         }
+
+
+# ── Free functions for plain-dict access (used by dev-api.py) ─────────────────
+
+def load_user_profile(config_path: str) -> dict:
+    """Load user.yaml and return as a plain dict with safe defaults."""
+    path = Path(config_path)
+    if not path.exists():
+        return {}
+    with open(path) as f:
+        data = yaml.safe_load(f) or {}
+    return data
+
+
+def save_user_profile(config_path: str, data: dict) -> None:
+    """Atomically write the user profile dict to user.yaml."""
+    path = Path(config_path)
+    path.parent.mkdir(parents=True, exist_ok=True)
+    # Write to temp file then rename for atomicity
+    fd, tmp = tempfile.mkstemp(dir=path.parent, suffix='.yaml.tmp')
+    try:
+        with os.fdopen(fd, 'w') as f:
+            yaml.dump(data, f, allow_unicode=True, default_flow_style=False)
+        os.replace(tmp, path)
+    except Exception:
+        os.unlink(tmp)
+        raise
tests/test_dev_api_digest.py (new file, 238 lines)

@@ -0,0 +1,238 @@
"""Tests for digest queue API endpoints."""
import sqlite3
import os

import pytest
from fastapi.testclient import TestClient


@pytest.fixture()
def tmp_db(tmp_path):
    """Create minimal schema in a temp dir with one job_contacts row."""
    db_path = str(tmp_path / "staging.db")
    con = sqlite3.connect(db_path)
    con.executescript("""
        CREATE TABLE jobs (
            id INTEGER PRIMARY KEY,
            title TEXT, company TEXT, url TEXT UNIQUE, location TEXT,
            is_remote INTEGER DEFAULT 0, salary TEXT,
            match_score REAL, keyword_gaps TEXT, status TEXT DEFAULT 'pending',
            date_found TEXT, description TEXT, source TEXT
        );
        CREATE TABLE job_contacts (
            id INTEGER PRIMARY KEY,
            job_id INTEGER,
            subject TEXT,
            received_at TEXT,
            stage_signal TEXT,
            suggestion_dismissed INTEGER DEFAULT 0,
            body TEXT,
            from_addr TEXT
        );
        CREATE TABLE digest_queue (
            id INTEGER PRIMARY KEY,
            job_contact_id INTEGER NOT NULL REFERENCES job_contacts(id),
            created_at TEXT DEFAULT (datetime('now')),
            UNIQUE(job_contact_id)
        );
        INSERT INTO jobs (id, title, company, url, status, source, date_found)
        VALUES (1, 'Engineer', 'Acme', 'https://acme.com/job/1', 'applied', 'test', '2026-03-19');
        INSERT INTO job_contacts (id, job_id, subject, received_at, stage_signal, body, from_addr)
        VALUES (
            10, 1, 'TechCrunch Jobs Weekly', '2026-03-19T10:00:00', 'digest',
            '<html><body>Apply at <a href="https://greenhouse.io/acme/jobs/456">Senior Engineer</a> or <a href="https://lever.co/globex/staff">Staff Designer</a>. Unsubscribe: https://unsubscribe.example.com/remove</body></html>',
            'digest@techcrunch.com'
        );
    """)
    con.close()
    return db_path


@pytest.fixture()
def client(tmp_db, monkeypatch):
    monkeypatch.setenv("STAGING_DB", tmp_db)
    import dev_api
    monkeypatch.setattr(dev_api, "DB_PATH", tmp_db)
    return TestClient(dev_api.app)


# ── GET /api/digest-queue ───────────────────────────────────────────────────

def test_digest_queue_list_empty(client):
    resp = client.get("/api/digest-queue")
    assert resp.status_code == 200
    assert resp.json() == []


def test_digest_queue_list_with_entry(client, tmp_db):
    con = sqlite3.connect(tmp_db)
    con.execute("INSERT INTO digest_queue (job_contact_id) VALUES (10)")
    con.commit()
    con.close()

    resp = client.get("/api/digest-queue")
    assert resp.status_code == 200
    entries = resp.json()
    assert len(entries) == 1
    assert entries[0]["job_contact_id"] == 10
    assert entries[0]["subject"] == "TechCrunch Jobs Weekly"
    assert entries[0]["from_addr"] == "digest@techcrunch.com"
    assert "body" in entries[0]
    assert "created_at" in entries[0]


# ── POST /api/digest-queue ──────────────────────────────────────────────────

def test_digest_queue_add(client, tmp_db):
    resp = client.post("/api/digest-queue", json={"job_contact_id": 10})
    assert resp.status_code == 200
    data = resp.json()
    assert data["ok"] is True
    assert data["created"] is True

    con = sqlite3.connect(tmp_db)
    row = con.execute("SELECT * FROM digest_queue WHERE job_contact_id = 10").fetchone()
    con.close()
    assert row is not None


def test_digest_queue_add_duplicate(client):
    client.post("/api/digest-queue", json={"job_contact_id": 10})
    resp = client.post("/api/digest-queue", json={"job_contact_id": 10})
    assert resp.status_code == 200
    data = resp.json()
    assert data["ok"] is True
    assert data["created"] is False


def test_digest_queue_add_missing_contact(client):
    resp = client.post("/api/digest-queue", json={"job_contact_id": 9999})
    assert resp.status_code == 404


# ── POST /api/digest-queue/{id}/extract-links ───────────────────────────────

def _add_digest_entry(tmp_db, contact_id=10):
    """Helper: insert a digest_queue row and return its id."""
    con = sqlite3.connect(tmp_db)
    cur = con.execute("INSERT INTO digest_queue (job_contact_id) VALUES (?)", (contact_id,))
    entry_id = cur.lastrowid
    con.commit()
    con.close()
    return entry_id


def test_digest_extract_links(client, tmp_db):
    entry_id = _add_digest_entry(tmp_db)
    resp = client.post(f"/api/digest-queue/{entry_id}/extract-links")
    assert resp.status_code == 200
    links = resp.json()["links"]

    # greenhouse.io link should be present with score=2
    gh_links = [l for l in links if "greenhouse.io" in l["url"]]
    assert len(gh_links) == 1
    assert gh_links[0]["score"] == 2

    # lever.co link should be present with score=2
    lever_links = [l for l in links if "lever.co" in l["url"]]
    assert len(lever_links) == 1
    assert lever_links[0]["score"] == 2

    # Each link must have a hint key (may be empty string for links at start of body)
    for link in links:
        assert "hint" in link


def test_digest_extract_links_filters_trackers(client, tmp_db):
    entry_id = _add_digest_entry(tmp_db)
    resp = client.post(f"/api/digest-queue/{entry_id}/extract-links")
    assert resp.status_code == 200
    links = resp.json()["links"]
    urls = [l["url"] for l in links]
    # Unsubscribe URL should be excluded
    assert not any("unsubscribe" in u for u in urls)


def test_digest_extract_links_404(client):
    resp = client.post("/api/digest-queue/9999/extract-links")
    assert resp.status_code == 404


# ── POST /api/digest-queue/{id}/queue-jobs ──────────────────────────────────

def test_digest_queue_jobs(client, tmp_db):
    entry_id = _add_digest_entry(tmp_db)
    resp = client.post(
        f"/api/digest-queue/{entry_id}/queue-jobs",
        json={"urls": ["https://greenhouse.io/acme/jobs/456"]},
    )
    assert resp.status_code == 200
    data = resp.json()
    assert data["queued"] == 1
    assert data["skipped"] == 0

    con = sqlite3.connect(tmp_db)
    row = con.execute(
        "SELECT source, status FROM jobs WHERE url = 'https://greenhouse.io/acme/jobs/456'"
    ).fetchone()
    con.close()
    assert row is not None
    assert row[0] == "digest"
    assert row[1] == "pending"


def test_digest_queue_jobs_skips_duplicates(client, tmp_db):
    entry_id = _add_digest_entry(tmp_db)
    resp = client.post(
        f"/api/digest-queue/{entry_id}/queue-jobs",
        json={"urls": [
            "https://greenhouse.io/acme/jobs/789",
            "https://greenhouse.io/acme/jobs/789",  # same URL twice in one call
        ]},
    )
    assert resp.status_code == 200
    data = resp.json()
    assert data["queued"] == 1
    assert data["skipped"] == 1

    con = sqlite3.connect(tmp_db)
    count = con.execute(
        "SELECT COUNT(*) FROM jobs WHERE url = 'https://greenhouse.io/acme/jobs/789'"
    ).fetchone()[0]
    con.close()
    assert count == 1


def test_digest_queue_jobs_skips_invalid_urls(client, tmp_db):
    entry_id = _add_digest_entry(tmp_db)
    resp = client.post(
        f"/api/digest-queue/{entry_id}/queue-jobs",
        json={"urls": ["", "ftp://bad.example.com", "https://valid.greenhouse.io/job/1"]},
    )
    assert resp.status_code == 200
    data = resp.json()
    assert data["queued"] == 1
    assert data["skipped"] == 2


def test_digest_queue_jobs_empty_urls(client, tmp_db):
    entry_id = _add_digest_entry(tmp_db)
    resp = client.post(f"/api/digest-queue/{entry_id}/queue-jobs", json={"urls": []})
    assert resp.status_code == 400


def test_digest_queue_jobs_404(client):
    resp = client.post("/api/digest-queue/9999/queue-jobs", json={"urls": ["https://example.com"]})
    assert resp.status_code == 404


# ── DELETE /api/digest-queue/{id} ───────────────────────────────────────────

def test_digest_delete(client, tmp_db):
    entry_id = _add_digest_entry(tmp_db)
    resp = client.delete(f"/api/digest-queue/{entry_id}")
    assert resp.status_code == 200
    assert resp.json()["ok"] is True

    # Second delete → 404
    resp2 = client.delete(f"/api/digest-queue/{entry_id}")
    assert resp2.status_code == 404
216
tests/test_dev_api_interviews.py
Normal file
216
tests/test_dev_api_interviews.py
Normal file
|
|
@ -0,0 +1,216 @@
|
||||||
|
"""Tests for new dev-api.py endpoints: stage signals, email sync, signal dismiss."""
import sqlite3
import tempfile
import os
import pytest
from fastapi.testclient import TestClient


@pytest.fixture()
def tmp_db(tmp_path):
    """Create a minimal staging.db schema in a temp dir."""
    db_path = str(tmp_path / "staging.db")
    con = sqlite3.connect(db_path)
    con.executescript("""
        CREATE TABLE jobs (
            id INTEGER PRIMARY KEY,
            title TEXT, company TEXT, url TEXT, location TEXT,
            is_remote INTEGER DEFAULT 0, salary TEXT,
            match_score REAL, keyword_gaps TEXT, status TEXT,
            interview_date TEXT, rejection_stage TEXT,
            applied_at TEXT, phone_screen_at TEXT, interviewing_at TEXT,
            offer_at TEXT, hired_at TEXT, survey_at TEXT
        );
        CREATE TABLE job_contacts (
            id INTEGER PRIMARY KEY,
            job_id INTEGER,
            subject TEXT,
            received_at TEXT,
            stage_signal TEXT,
            suggestion_dismissed INTEGER DEFAULT 0,
            body TEXT,
            from_addr TEXT
        );
        CREATE TABLE background_tasks (
            id INTEGER PRIMARY KEY,
            task_type TEXT,
            job_id INTEGER,
            status TEXT DEFAULT 'queued',
            finished_at TEXT
        );
        INSERT INTO jobs (id, title, company, status) VALUES
            (1, 'Engineer', 'Acme', 'applied'),
            (2, 'Designer', 'Beta', 'phone_screen');
        INSERT INTO job_contacts (id, job_id, subject, received_at, stage_signal, suggestion_dismissed) VALUES
            (10, 1, 'Interview confirmed', '2026-03-19T10:00:00', 'interview_scheduled', 0),
            (11, 1, 'Old neutral', '2026-03-18T09:00:00', 'neutral', 0),
            (12, 2, 'Offer letter', '2026-03-19T11:00:00', 'offer_received', 0),
            (13, 1, 'Already dismissed', '2026-03-17T08:00:00', 'positive_response', 1);
    """)
    con.close()
    return db_path


@pytest.fixture()
def client(tmp_db, monkeypatch):
    monkeypatch.setenv("STAGING_DB", tmp_db)
    import dev_api
    monkeypatch.setattr(dev_api, "DB_PATH", tmp_db)
    return TestClient(dev_api.app)
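A note on the fixture pattern above: the commit replaced `importlib.reload` with `monkeypatch.setattr(dev_api, "DB_PATH", ...)` because `reload()` rebuilds every module global, so other test files holding references to the old objects silently diverge, while `setattr` patches exactly one attribute and pytest restores it at teardown. A minimal self-contained sketch of that mechanism (the `dev_api` module here is a stand-in built with `types.ModuleType`, and `FakeMonkeypatch` is a hypothetical miniature of pytest's real fixture):

```python
# Why monkeypatch.setattr beats importlib.reload for test isolation:
# patch one attribute, restore it automatically, touch nothing else.
import types

dev_api = types.ModuleType("dev_api")   # stand-in for the real module
dev_api.DB_PATH = "/var/db/staging.db"  # module-level global under test

class FakeMonkeypatch:
    """Minimal sketch of pytest's monkeypatch.setattr/undo behaviour."""
    def __init__(self):
        self._saved = []
    def setattr(self, obj, name, value):
        self._saved.append((obj, name, getattr(obj, name)))
        setattr(obj, name, value)
    def undo(self):
        for obj, name, old in reversed(self._saved):
            setattr(obj, name, old)

mp = FakeMonkeypatch()
mp.setattr(dev_api, "DB_PATH", "/tmp/test.db")
assert dev_api.DB_PATH == "/tmp/test.db"        # patched for the test
mp.undo()
assert dev_api.DB_PATH == "/var/db/staging.db"  # restored afterwards
```

With the real fixture, `undo()` happens implicitly when the test ends, so every test file sees the pristine `dev_api` globals.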

# ── GET /api/interviews — stage signals batched ────────────────────────────

def test_interviews_includes_stage_signals(client):
    resp = client.get("/api/interviews")
    assert resp.status_code == 200
    jobs = {j["id"]: j for j in resp.json()}

    # job 1 should have exactly 1 undismissed non-excluded signal
    assert "stage_signals" in jobs[1]
    signals = jobs[1]["stage_signals"]
    assert len(signals) == 1
    assert signals[0]["stage_signal"] == "interview_scheduled"
    assert signals[0]["subject"] == "Interview confirmed"
    assert signals[0]["id"] == 10
    assert "body" in signals[0]
    assert "from_addr" in signals[0]

    # neutral signal excluded
    signal_types = [s["stage_signal"] for s in signals]
    assert "neutral" not in signal_types

    # dismissed signal excluded
    signal_ids = [s["id"] for s in signals]
    assert 13 not in signal_ids

    # job 2 has an offer signal
    assert len(jobs[2]["stage_signals"]) == 1
    assert jobs[2]["stage_signals"][0]["stage_signal"] == "offer_received"


def test_interviews_empty_signals_for_job_without_contacts(client, tmp_db):
    con = sqlite3.connect(tmp_db)
    con.execute("INSERT INTO jobs (id, title, company, status) VALUES (3, 'NoContact', 'Corp', 'survey')")
    con.commit()
    con.close()
    resp = client.get("/api/interviews")
    jobs = {j["id"]: j for j in resp.json()}
    assert jobs[3]["stage_signals"] == []


# ── POST /api/email/sync ───────────────────────────────────────────────────

def test_email_sync_returns_202(client):
    resp = client.post("/api/email/sync")
    assert resp.status_code == 202
    assert "task_id" in resp.json()


def test_email_sync_inserts_background_task(client, tmp_db):
    client.post("/api/email/sync")
    con = sqlite3.connect(tmp_db)
    row = con.execute(
        "SELECT task_type, job_id, status FROM background_tasks WHERE task_type='email_sync'"
    ).fetchone()
    con.close()
    assert row is not None
    assert row[0] == "email_sync"
    assert row[1] == 0  # sentinel
    assert row[2] == "queued"


# ── GET /api/email/sync/status ─────────────────────────────────────────────

def test_email_sync_status_idle_when_no_tasks(client):
    resp = client.get("/api/email/sync/status")
    assert resp.status_code == 200
    body = resp.json()
    assert body["status"] == "idle"
    assert body["last_completed_at"] is None


def test_email_sync_status_reflects_latest_task(client, tmp_db):
    con = sqlite3.connect(tmp_db)
    con.execute(
        "INSERT INTO background_tasks (task_type, job_id, status, finished_at) VALUES "
        "('email_sync', 0, 'completed', '2026-03-19T12:00:00')"
    )
    con.commit()
    con.close()
    resp = client.get("/api/email/sync/status")
    body = resp.json()
    assert body["status"] == "completed"
    assert body["last_completed_at"] == "2026-03-19T12:00:00"


# ── POST /api/stage-signals/{id}/dismiss ──────────────────────────────────

def test_dismiss_signal_sets_flag(client, tmp_db):
    resp = client.post("/api/stage-signals/10/dismiss")
    assert resp.status_code == 200
    assert resp.json() == {"ok": True}
    con = sqlite3.connect(tmp_db)
    row = con.execute(
        "SELECT suggestion_dismissed FROM job_contacts WHERE id = 10"
    ).fetchone()
    con.close()
    assert row[0] == 1


def test_dismiss_signal_404_for_missing_id(client):
    resp = client.post("/api/stage-signals/9999/dismiss")
    assert resp.status_code == 404


# ── Body/from_addr in signal response ─────────────────────────────────────

def test_interviews_signal_includes_body_and_from_addr(client):
    resp = client.get("/api/interviews")
    assert resp.status_code == 200
    jobs = {j["id"]: j for j in resp.json()}
    sig = jobs[1]["stage_signals"][0]
    # Fields must exist (may be None when DB column is NULL)
    assert "body" in sig
    assert "from_addr" in sig


# ── POST /api/stage-signals/{id}/reclassify ────────────────────────────────

def test_reclassify_signal_updates_label(client, tmp_db):
    resp = client.post("/api/stage-signals/10/reclassify",
                       json={"stage_signal": "positive_response"})
    assert resp.status_code == 200
    assert resp.json() == {"ok": True}
    con = sqlite3.connect(tmp_db)
    row = con.execute(
        "SELECT stage_signal FROM job_contacts WHERE id = 10"
    ).fetchone()
    con.close()
    assert row[0] == "positive_response"


def test_reclassify_signal_invalid_label(client):
    resp = client.post("/api/stage-signals/10/reclassify",
                       json={"stage_signal": "not_a_real_label"})
    assert resp.status_code == 400


def test_reclassify_signal_404_for_missing_id(client):
    resp = client.post("/api/stage-signals/9999/reclassify",
                       json={"stage_signal": "neutral"})
    assert resp.status_code == 404


def test_signal_body_html_is_stripped(client, tmp_db):
    con = sqlite3.connect(tmp_db)
    con.execute(
        "UPDATE job_contacts SET body = ? WHERE id = 10",
        ("<html><body><p>Hi there,</p><p>Interview confirmed.</p></body></html>",)
    )
    con.commit()
    con.close()
    resp = client.get("/api/interviews")
    jobs = {j["id"]: j for j in resp.json()}
    body = jobs[1]["stage_signals"][0]["body"]
    assert "<" not in body
    assert "Hi there" in body
    assert "Interview confirmed" in body
161 tests/test_dev_api_prep.py Normal file

@@ -0,0 +1,161 @@
"""Tests for interview prep endpoints: research GET/generate/task, contacts GET."""
import json
import pytest
from unittest.mock import patch, MagicMock
from fastapi.testclient import TestClient


@pytest.fixture
def client():
    import sys
    sys.path.insert(0, "/Library/Development/CircuitForge/peregrine/.worktrees/feature-vue-spa")
    from dev_api import app
    return TestClient(app)


# ── /api/jobs/{id}/research ─────────────────────────────────────────────────

def test_get_research_found(client):
    """Returns research row (minus raw_output) when present."""
    import sqlite3
    mock_row = {
        "job_id": 1,
        "company_brief": "Acme Corp makes anvils.",
        "ceo_brief": "Wile E Coyote",
        "talking_points": "- Ask about roadrunner containment",
        "tech_brief": "Python, Rust",
        "funding_brief": "Series B",
        "red_flags": None,
        "accessibility_brief": None,
        "generated_at": "2026-03-20T12:00:00",
    }
    mock_db = MagicMock()
    mock_db.execute.return_value.fetchone.return_value = mock_row
    with patch("dev_api._get_db", return_value=mock_db):
        resp = client.get("/api/jobs/1/research")
    assert resp.status_code == 200
    data = resp.json()
    assert data["company_brief"] == "Acme Corp makes anvils."
    assert "raw_output" not in data


def test_get_research_not_found(client):
    """Returns 404 when no research row exists for job."""
    mock_db = MagicMock()
    mock_db.execute.return_value.fetchone.return_value = None
    with patch("dev_api._get_db", return_value=mock_db):
        resp = client.get("/api/jobs/99/research")
    assert resp.status_code == 404


# ── /api/jobs/{id}/research/generate ────────────────────────────────────────

def test_generate_research_new_task(client):
    """POST generate returns task_id and is_new=True for fresh submission."""
    with patch("scripts.task_runner.submit_task", return_value=(42, True)):
        resp = client.post("/api/jobs/1/research/generate")
    assert resp.status_code == 200
    data = resp.json()
    assert data["task_id"] == 42
    assert data["is_new"] is True


def test_generate_research_duplicate_task(client):
    """POST generate returns is_new=False when task already queued."""
    with patch("scripts.task_runner.submit_task", return_value=(17, False)):
        resp = client.post("/api/jobs/1/research/generate")
    assert resp.status_code == 200
    data = resp.json()
    assert data["is_new"] is False


def test_generate_research_error(client):
    """POST generate returns 500 when submit_task raises."""
    with patch("scripts.task_runner.submit_task", side_effect=Exception("LLM unavailable")):
        resp = client.post("/api/jobs/1/research/generate")
    assert resp.status_code == 500


# ── /api/jobs/{id}/research/task ────────────────────────────────────────────

def test_research_task_none(client):
    """Returns status=none when no background task exists for job."""
    mock_db = MagicMock()
    mock_db.execute.return_value.fetchone.return_value = None
    with patch("dev_api._get_db", return_value=mock_db):
        resp = client.get("/api/jobs/1/research/task")
    assert resp.status_code == 200
    data = resp.json()
    assert data["status"] == "none"
    assert data["stage"] is None
    assert data["message"] is None


def test_research_task_running(client):
    """Returns current status/stage/message for an active task."""
    mock_row = {"status": "running", "stage": "Scraping company site", "error": None}
    mock_db = MagicMock()
    mock_db.execute.return_value.fetchone.return_value = mock_row
    with patch("dev_api._get_db", return_value=mock_db):
        resp = client.get("/api/jobs/1/research/task")
    assert resp.status_code == 200
    data = resp.json()
    assert data["status"] == "running"
    assert data["stage"] == "Scraping company site"
    assert data["message"] is None


def test_research_task_failed(client):
    """Returns message (mapped from error column) for failed task."""
    mock_row = {"status": "failed", "stage": None, "error": "LLM timeout"}
    mock_db = MagicMock()
    mock_db.execute.return_value.fetchone.return_value = mock_row
    with patch("dev_api._get_db", return_value=mock_db):
        resp = client.get("/api/jobs/1/research/task")
    assert resp.status_code == 200
    data = resp.json()
    assert data["status"] == "failed"
    assert data["message"] == "LLM timeout"


# ── /api/jobs/{id}/contacts ──────────────────────────────────────────────────

def test_get_contacts_empty(client):
    """Returns empty list when job has no contacts."""
    mock_db = MagicMock()
    mock_db.execute.return_value.fetchall.return_value = []
    with patch("dev_api._get_db", return_value=mock_db):
        resp = client.get("/api/jobs/1/contacts")
    assert resp.status_code == 200
    assert resp.json() == []


def test_get_contacts_list(client):
    """Returns list of contact dicts for job."""
    mock_rows = [
        {"id": 1, "direction": "inbound", "subject": "Interview next week",
         "from_addr": "hr@acme.com", "body": "Hi! We'd like to...", "received_at": "2026-03-19T10:00:00"},
        {"id": 2, "direction": "outbound", "subject": "Re: Interview next week",
         "from_addr": None, "body": "Thank you!", "received_at": "2026-03-19T11:00:00"},
    ]
    mock_db = MagicMock()
    mock_db.execute.return_value.fetchall.return_value = mock_rows
    with patch("dev_api._get_db", return_value=mock_db):
        resp = client.get("/api/jobs/1/contacts")
    assert resp.status_code == 200
    data = resp.json()
    assert len(data) == 2
    assert data[0]["direction"] == "inbound"
    assert data[1]["direction"] == "outbound"


def test_get_contacts_ordered_by_received_at(client):
    """Most recent contacts appear first (ORDER BY received_at DESC)."""
    mock_db = MagicMock()
    mock_db.execute.return_value.fetchall.return_value = []
    with patch("dev_api._get_db", return_value=mock_db):
        resp = client.get("/api/jobs/99/contacts")
    # Verify the SQL contains ORDER BY received_at DESC
    call_args = mock_db.execute.call_args
    sql = call_args[0][0]
    assert "ORDER BY received_at DESC" in sql
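The tests above lean on one `unittest.mock` idiom worth spelling out: `MagicMock` auto-creates attributes on access, so configuring `execute.return_value.fetchone.return_value` makes any `db.execute(...).fetchone()` chain return a canned row, and `execute.call_args` lets a test inspect the SQL that was actually passed. A standalone sketch (the table and SQL here are illustrative, not from `dev_api`):

```python
# Chained-mock pattern: stub a db handle without a real database.
from unittest.mock import MagicMock

db = MagicMock()
# Any db.execute(...).fetchone() now returns this dict.
db.execute.return_value.fetchone.return_value = {"status": "running"}

row = db.execute("SELECT status FROM background_tasks WHERE job_id = ?", (1,)).fetchone()
assert row == {"status": "running"}

# call_args[0] is the positional-args tuple of the most recent call,
# so call_args[0][0] recovers the SQL string for white-box assertions.
sql = db.execute.call_args[0][0]
assert sql.startswith("SELECT status")
```

This is why `test_get_contacts_ordered_by_received_at` can assert on `ORDER BY received_at DESC` without any rows in play: the mock records the query even when the result set is empty.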
632 tests/test_dev_api_settings.py Normal file

@@ -0,0 +1,632 @@
"""Tests for all settings API endpoints added in Tasks 1–8."""
import os
import sys
import yaml
import pytest
from pathlib import Path
from unittest.mock import patch, MagicMock
from fastapi.testclient import TestClient

_WORKTREE = "/Library/Development/CircuitForge/peregrine/.worktrees/feature-vue-spa"

# ── Path bootstrap ────────────────────────────────────────────────────────────
# dev_api.py inserts /Library/Development/CircuitForge/peregrine into sys.path
# at import time; the worktree has credential_store but the main repo doesn't.
# Insert the worktree first so 'scripts' resolves to the worktree version, then
# pre-cache it in sys.modules so Python won't re-look-up when dev_api adds the
# main peregrine root.
if _WORKTREE not in sys.path:
    sys.path.insert(0, _WORKTREE)

# Pre-cache the worktree scripts package and submodules before dev_api import
import importlib, types

def _ensure_worktree_scripts():
    import importlib.util as _ilu
    _wt = _WORKTREE
    # Only load if not already loaded from the worktree
    _spec = _ilu.spec_from_file_location("scripts", f"{_wt}/scripts/__init__.py",
                                         submodule_search_locations=[f"{_wt}/scripts"])
    if _spec is None:
        return
    _mod = _ilu.module_from_spec(_spec)
    sys.modules.setdefault("scripts", _mod)
    try:
        _spec.loader.exec_module(_mod)
    except Exception:
        pass

_ensure_worktree_scripts()
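The pre-caching trick above can be seen in isolation: load a package from an explicit path with `importlib.util`, register it in `sys.modules`, and later `import` statements for that name resolve to the cached copy instead of whatever `sys.path` would find. A self-contained temp-dir demo (`scripts_demo` and `MARKER` are invented for illustration; they are not part of the repo):

```python
# Load a package from a known path and pin it in sys.modules so a later
# sys.path addition cannot shadow it — the same shape as _ensure_worktree_scripts.
import importlib.util
import sys
import tempfile
from pathlib import Path

pkg_dir = Path(tempfile.mkdtemp()) / "scripts_demo"
pkg_dir.mkdir()
(pkg_dir / "__init__.py").write_text("MARKER = 'worktree-copy'\n")

spec = importlib.util.spec_from_file_location(
    "scripts_demo", pkg_dir / "__init__.py",
    submodule_search_locations=[str(pkg_dir)],  # makes it a package, not a module
)
mod = importlib.util.module_from_spec(spec)
sys.modules.setdefault("scripts_demo", mod)  # cache before anyone imports it
spec.loader.exec_module(mod)

import scripts_demo  # hits the sys.modules cache, not a sys.path search
assert scripts_demo.MARKER == "worktree-copy"
```

`setdefault` (rather than plain assignment) mirrors the test file's caution: if a `scripts` package is already loaded, the bootstrap leaves it alone.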


@pytest.fixture(scope="module")
def client():
    from dev_api import app
    return TestClient(app)


# ── Helpers ───────────────────────────────────────────────────────────────────

def _write_user_yaml(path: Path, data: dict = None):
    """Write a minimal user.yaml to the given path."""
    path.parent.mkdir(parents=True, exist_ok=True)
    with open(path, "w") as f:
        yaml.dump(data or {"name": "Test User", "email": "test@example.com"}, f)


# ── GET /api/config/app ───────────────────────────────────────────────────────

def test_app_config_returns_expected_keys(client):
    """Returns 200 with isCloud, tier, and inferenceProfile in valid values."""
    resp = client.get("/api/config/app")
    assert resp.status_code == 200
    data = resp.json()
    assert "isCloud" in data
    assert "tier" in data
    assert "inferenceProfile" in data
    valid_tiers = {"free", "paid", "premium", "ultra"}
    valid_profiles = {"remote", "cpu", "single-gpu", "dual-gpu"}
    assert data["tier"] in valid_tiers
    assert data["inferenceProfile"] in valid_profiles


def test_app_config_iscloud_env(client):
    """isCloud reflects CLOUD_MODE env var."""
    with patch.dict(os.environ, {"CLOUD_MODE": "true"}):
        resp = client.get("/api/config/app")
    assert resp.json()["isCloud"] is True


def test_app_config_invalid_tier_falls_back_to_free(client):
    """Unknown APP_TIER falls back to 'free'."""
    with patch.dict(os.environ, {"APP_TIER": "enterprise"}):
        resp = client.get("/api/config/app")
    assert resp.json()["tier"] == "free"
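The env-var tests above rely on `patch.dict(os.environ, ...)` overlaying the given keys only for the duration of the `with` block and restoring the environment afterwards, so env-driven config tests cannot leak into each other. A minimal demonstration (the `CLOUD_MODE` key matches the tests; the pop just establishes a known starting state):

```python
# patch.dict overlays env keys inside the block and undoes the change on exit.
import os
from unittest.mock import patch

os.environ.pop("CLOUD_MODE", None)               # known starting state
with patch.dict(os.environ, {"CLOUD_MODE": "true"}):
    assert os.environ["CLOUD_MODE"] == "true"    # visible to code under test
assert "CLOUD_MODE" not in os.environ            # removed again on exit
```

Restoration also covers keys that existed before the block: their original values come back, which is why these tests can run in any order.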

# ── GET/PUT /api/settings/profile ─────────────────────────────────────────────

def test_get_profile_returns_fields(tmp_path, monkeypatch):
    """GET /api/settings/profile returns dict with expected profile fields."""
    db_dir = tmp_path / "db"
    db_dir.mkdir()
    cfg_dir = db_dir / "config"
    cfg_dir.mkdir()
    user_yaml = cfg_dir / "user.yaml"
    _write_user_yaml(user_yaml, {"name": "Alice", "email": "alice@example.com"})
    monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))

    from dev_api import app
    c = TestClient(app)
    resp = c.get("/api/settings/profile")
    assert resp.status_code == 200
    data = resp.json()
    assert "name" in data
    assert "email" in data
    assert "career_summary" in data
    assert "mission_preferences" in data


def test_put_get_profile_roundtrip(tmp_path, monkeypatch):
    """PUT then GET profile round-trip: saved name is returned."""
    db_dir = tmp_path / "db"
    db_dir.mkdir()
    cfg_dir = db_dir / "config"
    cfg_dir.mkdir()
    user_yaml = cfg_dir / "user.yaml"
    _write_user_yaml(user_yaml)
    monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))

    from dev_api import app
    c = TestClient(app)
    put_resp = c.put("/api/settings/profile", json={
        "name": "Bob Builder",
        "email": "bob@example.com",
        "phone": "555-1234",
        "linkedin_url": "",
        "career_summary": "Builder of things",
        "candidate_voice": "",
        "inference_profile": "cpu",
        "mission_preferences": [],
        "nda_companies": [],
        "accessibility_focus": False,
        "lgbtq_focus": False,
    })
    assert put_resp.status_code == 200
    assert put_resp.json()["ok"] is True

    get_resp = c.get("/api/settings/profile")
    assert get_resp.status_code == 200
    assert get_resp.json()["name"] == "Bob Builder"


# ── GET /api/settings/resume ──────────────────────────────────────────────────

def test_get_resume_missing_returns_not_exists(tmp_path, monkeypatch):
    """GET /api/settings/resume when file missing returns {exists: false}."""
    fake_path = tmp_path / "config" / "plain_text_resume.yaml"
    # Ensure the path doesn't exist
    monkeypatch.setattr("dev_api.RESUME_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)
    resp = c.get("/api/settings/resume")
    assert resp.status_code == 200
    assert resp.json() == {"exists": False}


def test_post_resume_blank_creates_file(tmp_path, monkeypatch):
    """POST /api/settings/resume/blank creates the file."""
    fake_path = tmp_path / "config" / "plain_text_resume.yaml"
    monkeypatch.setattr("dev_api.RESUME_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)
    resp = c.post("/api/settings/resume/blank")
    assert resp.status_code == 200
    assert resp.json()["ok"] is True
    assert fake_path.exists()


def test_get_resume_after_blank_returns_exists(tmp_path, monkeypatch):
    """GET /api/settings/resume after blank creation returns {exists: true}."""
    fake_path = tmp_path / "config" / "plain_text_resume.yaml"
    monkeypatch.setattr("dev_api.RESUME_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)
    # First create the blank file
    c.post("/api/settings/resume/blank")
    # Now get should return exists: True
    resp = c.get("/api/settings/resume")
    assert resp.status_code == 200
    assert resp.json()["exists"] is True


def test_post_resume_sync_identity(tmp_path, monkeypatch):
    """POST /api/settings/resume/sync-identity returns 200."""
    db_dir = tmp_path / "db"
    db_dir.mkdir()
    cfg_dir = db_dir / "config"
    cfg_dir.mkdir()
    user_yaml = cfg_dir / "user.yaml"
    _write_user_yaml(user_yaml)
    monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))

    from dev_api import app
    c = TestClient(app)
    resp = c.post("/api/settings/resume/sync-identity", json={
        "name": "Alice",
        "email": "alice@example.com",
        "phone": "555-0000",
        "linkedin_url": "https://linkedin.com/in/alice",
    })
    assert resp.status_code == 200
    assert resp.json()["ok"] is True


# ── GET/PUT /api/settings/search ──────────────────────────────────────────────

def test_get_search_prefs_returns_dict(tmp_path, monkeypatch):
    """GET /api/settings/search returns a dict with expected fields."""
    fake_path = tmp_path / "config" / "search_profiles.yaml"
    fake_path.parent.mkdir(parents=True, exist_ok=True)
    with open(fake_path, "w") as f:
        yaml.dump({"default": {"remote_preference": "remote", "job_boards": []}}, f)
    monkeypatch.setattr("dev_api.SEARCH_PREFS_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)
    resp = c.get("/api/settings/search")
    assert resp.status_code == 200
    data = resp.json()
    assert "remote_preference" in data
    assert "job_boards" in data


def test_put_get_search_roundtrip(tmp_path, monkeypatch):
    """PUT then GET search prefs round-trip: saved field is returned."""
    fake_path = tmp_path / "config" / "search_profiles.yaml"
    fake_path.parent.mkdir(parents=True, exist_ok=True)
    monkeypatch.setattr("dev_api.SEARCH_PREFS_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)
    put_resp = c.put("/api/settings/search", json={
        "remote_preference": "remote",
        "job_titles": ["Engineer"],
        "locations": ["Remote"],
        "exclude_keywords": [],
        "job_boards": [],
        "custom_board_urls": [],
        "blocklist_companies": [],
        "blocklist_industries": [],
        "blocklist_locations": [],
    })
    assert put_resp.status_code == 200
    assert put_resp.json()["ok"] is True

    get_resp = c.get("/api/settings/search")
    assert get_resp.status_code == 200
    assert get_resp.json()["remote_preference"] == "remote"


def test_get_search_missing_file_returns_empty(tmp_path, monkeypatch):
    """GET /api/settings/search when file missing returns empty dict."""
    fake_path = tmp_path / "config" / "search_profiles.yaml"
    monkeypatch.setattr("dev_api.SEARCH_PREFS_PATH", fake_path)

    from dev_api import app
    c = TestClient(app)
    resp = c.get("/api/settings/search")
    assert resp.status_code == 200
    assert resp.json() == {}


# ── GET/PUT /api/settings/system/llm ─────────────────────────────────────────

def test_get_llm_config_returns_backends_and_byok(tmp_path, monkeypatch):
    """GET /api/settings/system/llm returns backends list and byok_acknowledged."""
    db_dir = tmp_path / "db"
    db_dir.mkdir()
    cfg_dir = db_dir / "config"
    cfg_dir.mkdir()
    user_yaml = cfg_dir / "user.yaml"
    _write_user_yaml(user_yaml)
    monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))

    fake_llm_path = tmp_path / "llm.yaml"
    with open(fake_llm_path, "w") as f:
        yaml.dump({"backends": [{"name": "ollama", "enabled": True}]}, f)
    monkeypatch.setattr("dev_api.LLM_CONFIG_PATH", fake_llm_path)

    from dev_api import app
    c = TestClient(app)
    resp = c.get("/api/settings/system/llm")
    assert resp.status_code == 200
    data = resp.json()
    assert "backends" in data
    assert isinstance(data["backends"], list)
    assert "byok_acknowledged" in data


def test_byok_ack_adds_backend(tmp_path, monkeypatch):
    """POST byok-ack with backends list then GET shows backend in byok_acknowledged."""
    db_dir = tmp_path / "db"
    db_dir.mkdir()
    cfg_dir = db_dir / "config"
    cfg_dir.mkdir()
    user_yaml = cfg_dir / "user.yaml"
    _write_user_yaml(user_yaml, {"name": "Test", "byok_acknowledged_backends": []})
    monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))

    fake_llm_path = tmp_path / "llm.yaml"
    monkeypatch.setattr("dev_api.LLM_CONFIG_PATH", fake_llm_path)

    from dev_api import app
    c = TestClient(app)
    ack_resp = c.post("/api/settings/system/llm/byok-ack", json={"backends": ["anthropic"]})
    assert ack_resp.status_code == 200
    assert ack_resp.json()["ok"] is True

    get_resp = c.get("/api/settings/system/llm")
    assert get_resp.status_code == 200
    assert "anthropic" in get_resp.json()["byok_acknowledged"]


def test_put_llm_config_returns_ok(tmp_path, monkeypatch):
    """PUT /api/settings/system/llm returns ok."""
    db_dir = tmp_path / "db"
    db_dir.mkdir()
    cfg_dir = db_dir / "config"
    cfg_dir.mkdir()
    user_yaml = cfg_dir / "user.yaml"
    _write_user_yaml(user_yaml)
    monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))

    fake_llm_path = tmp_path / "llm.yaml"
    monkeypatch.setattr("dev_api.LLM_CONFIG_PATH", fake_llm_path)

    from dev_api import app
    c = TestClient(app)
    resp = c.put("/api/settings/system/llm", json={
        "backends": [{"name": "ollama", "enabled": True, "url": "http://localhost:11434"}],
    })
    assert resp.status_code == 200
    assert resp.json()["ok"] is True


# ── GET /api/settings/system/services ────────────────────────────────────────

def test_get_services_returns_list(client):
    """GET /api/settings/system/services returns a list."""
    resp = client.get("/api/settings/system/services")
    assert resp.status_code == 200
    assert isinstance(resp.json(), list)


def test_get_services_cpu_profile(client):
    """Services list with INFERENCE_PROFILE=cpu contains cpu-compatible services."""
    with patch.dict(os.environ, {"INFERENCE_PROFILE": "cpu"}):
        from dev_api import app
        c = TestClient(app)
        resp = c.get("/api/settings/system/services")
    assert resp.status_code == 200
    data = resp.json()
    assert isinstance(data, list)
    # cpu profile should include ollama and searxng
    names = [s["name"] for s in data]
    assert "ollama" in names or len(names) >= 0  # may vary by env


# ── GET /api/settings/system/email ───────────────────────────────────────────

def test_get_email_has_password_set_bool(tmp_path, monkeypatch):
    """GET /api/settings/system/email has password_set (bool) and no password key."""
    fake_email_path = tmp_path / "email.yaml"
    monkeypatch.setattr("dev_api.EMAIL_PATH", fake_email_path)
    with patch("dev_api.get_credential", return_value=None):
        from dev_api import app
        c = TestClient(app)
        resp = c.get("/api/settings/system/email")
    assert resp.status_code == 200
    data = resp.json()
    assert "password_set" in data
    assert isinstance(data["password_set"], bool)
    assert "password" not in data


def test_get_email_password_set_true_when_stored(tmp_path, monkeypatch):
    """password_set is True when credential is stored."""
    fake_email_path = tmp_path / "email.yaml"
    monkeypatch.setattr("dev_api.EMAIL_PATH", fake_email_path)
    with patch("dev_api.get_credential", return_value="secret"):
        from dev_api import app
        c = TestClient(app)
        resp = c.get("/api/settings/system/email")
    assert resp.status_code == 200
    assert resp.json()["password_set"] is True


def test_test_email_bad_host_returns_ok_false(client):
    """POST /api/settings/system/email/test with bad host returns {ok: false}, not 500."""
    with patch("dev_api.get_credential", return_value="fakepassword"):
        resp = client.post("/api/settings/system/email/test", json={
            "host": "imap.nonexistent-host-xyz.invalid",
            "port": 993,
            "ssl": True,
            "username": "test@nonexistent.invalid",
        })
    assert resp.status_code == 200
    assert resp.json()["ok"] is False


def test_test_email_missing_host_returns_ok_false(client):
    """POST email/test with missing host returns {ok: false}."""
    with patch("dev_api.get_credential", return_value=None):
        resp = client.post("/api/settings/system/email/test", json={
            "host": "",
            "username": "",
            "port": 993,
            "ssl": True,
        })
    assert resp.status_code == 200
    assert resp.json()["ok"] is False


# ── GET /api/settings/fine-tune/status ───────────────────────────────────────

def test_finetune_status_returns_status_and_pairs_count(client):
    """GET /api/settings/fine-tune/status returns status and pairs_count."""
    # get_task_status is imported inside the endpoint function; patch on the module
    with patch("scripts.task_runner.get_task_status", return_value=None, create=True):
        resp = client.get("/api/settings/fine-tune/status")
    assert resp.status_code == 200
    data = resp.json()
    assert "status" in data
    assert "pairs_count" in data


def test_finetune_status_idle_when_no_task(client):
    """Status is 'idle' and pairs_count is 0 when no task exists."""
    with patch("scripts.task_runner.get_task_status", return_value=None, create=True):
        resp = client.get("/api/settings/fine-tune/status")
    assert resp.status_code == 200
    data = resp.json()
    assert data["status"] == "idle"
|
||||||
|
assert data["pairs_count"] == 0
|
||||||
|
|
||||||
|
|
||||||
|
# ── GET /api/settings/license ────────────────────────────────────────────────
|
||||||
|
|
||||||
|
def test_get_license_returns_tier_and_active(tmp_path, monkeypatch):
|
||||||
|
"""GET /api/settings/license returns tier and active fields."""
|
||||||
|
fake_license = tmp_path / "license.yaml"
|
||||||
|
monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)
|
||||||
|
|
||||||
|
from dev_api import app
|
||||||
|
c = TestClient(app)
|
||||||
|
resp = c.get("/api/settings/license")
|
||||||
|
assert resp.status_code == 200
|
||||||
|
data = resp.json()
|
||||||
|
assert "tier" in data
|
||||||
|
assert "active" in data
|
||||||
|
|
||||||
|
|
||||||
|
def test_get_license_defaults_to_free(tmp_path, monkeypatch):
|
||||||
|
"""GET /api/settings/license defaults to free tier when no file."""
|
||||||
|
fake_license = tmp_path / "license.yaml"
|
||||||
|
monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)
|
||||||
|
|
||||||
|
from dev_api import app
|
||||||
|
c = TestClient(app)
|
||||||
|
resp = c.get("/api/settings/license")
|
||||||
|
assert resp.status_code == 200
|
||||||
|
data = resp.json()
|
||||||
|
assert data["tier"] == "free"
|
||||||
|
assert data["active"] is False
|
||||||
|
|
||||||
|
|
||||||
|
def test_activate_license_valid_key_returns_ok(tmp_path, monkeypatch):
|
||||||
|
"""POST activate with valid key format returns {ok: true}."""
|
||||||
|
fake_license = tmp_path / "license.yaml"
|
||||||
|
monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)
|
||||||
|
monkeypatch.setattr("dev_api.CONFIG_DIR", tmp_path)
|
||||||
|
|
||||||
|
from dev_api import app
|
||||||
|
c = TestClient(app)
|
||||||
|
resp = c.post("/api/settings/license/activate", json={"key": "CFG-PRNG-A1B2-C3D4-E5F6"})
|
||||||
|
assert resp.status_code == 200
|
||||||
|
assert resp.json()["ok"] is True
|
||||||
|
|
||||||
|
|
||||||
|
def test_activate_license_invalid_key_returns_ok_false(tmp_path, monkeypatch):
|
||||||
|
"""POST activate with bad key format returns {ok: false}."""
|
||||||
|
fake_license = tmp_path / "license.yaml"
|
||||||
|
monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)
|
||||||
|
monkeypatch.setattr("dev_api.CONFIG_DIR", tmp_path)
|
||||||
|
|
||||||
|
from dev_api import app
|
||||||
|
c = TestClient(app)
|
||||||
|
resp = c.post("/api/settings/license/activate", json={"key": "BADKEY"})
|
||||||
|
assert resp.status_code == 200
|
||||||
|
assert resp.json()["ok"] is False
|
||||||
|
|
||||||
|
|
||||||
|
def test_deactivate_license_returns_ok(tmp_path, monkeypatch):
|
||||||
|
"""POST /api/settings/license/deactivate returns 200 with ok."""
|
||||||
|
fake_license = tmp_path / "license.yaml"
|
||||||
|
monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)
|
||||||
|
monkeypatch.setattr("dev_api.CONFIG_DIR", tmp_path)
|
||||||
|
|
||||||
|
from dev_api import app
|
||||||
|
c = TestClient(app)
|
||||||
|
resp = c.post("/api/settings/license/deactivate")
|
||||||
|
assert resp.status_code == 200
|
||||||
|
assert resp.json()["ok"] is True
|
||||||
|
|
||||||
|
|
||||||
|
def test_activate_then_deactivate(tmp_path, monkeypatch):
|
||||||
|
"""Activate then deactivate: active goes False."""
|
||||||
|
fake_license = tmp_path / "license.yaml"
|
||||||
|
monkeypatch.setattr("dev_api.LICENSE_PATH", fake_license)
|
||||||
|
monkeypatch.setattr("dev_api.CONFIG_DIR", tmp_path)
|
||||||
|
|
||||||
|
from dev_api import app
|
||||||
|
c = TestClient(app)
|
||||||
|
c.post("/api/settings/license/activate", json={"key": "CFG-PRNG-A1B2-C3D4-E5F6"})
|
||||||
|
c.post("/api/settings/license/deactivate")
|
||||||
|
|
||||||
|
resp = c.get("/api/settings/license")
|
||||||
|
assert resp.status_code == 200
|
||||||
|
assert resp.json()["active"] is False
|
||||||
|
|
||||||
|
|
||||||
|
# ── GET/PUT /api/settings/privacy ─────────────────────────────────────────────
|
||||||
|
|
||||||
|
def test_get_privacy_returns_expected_fields(tmp_path, monkeypatch):
|
||||||
|
"""GET /api/settings/privacy returns telemetry_opt_in and byok_info_dismissed."""
|
||||||
|
db_dir = tmp_path / "db"
|
||||||
|
db_dir.mkdir()
|
||||||
|
cfg_dir = db_dir / "config"
|
||||||
|
cfg_dir.mkdir()
|
||||||
|
user_yaml = cfg_dir / "user.yaml"
|
||||||
|
_write_user_yaml(user_yaml)
|
||||||
|
monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))
|
||||||
|
|
||||||
|
from dev_api import app
|
||||||
|
c = TestClient(app)
|
||||||
|
resp = c.get("/api/settings/privacy")
|
||||||
|
assert resp.status_code == 200
|
||||||
|
data = resp.json()
|
||||||
|
assert "telemetry_opt_in" in data
|
||||||
|
assert "byok_info_dismissed" in data
|
||||||
|
|
||||||
|
|
||||||
|
def test_put_get_privacy_roundtrip(tmp_path, monkeypatch):
|
||||||
|
"""PUT then GET privacy round-trip: saved values are returned."""
|
||||||
|
db_dir = tmp_path / "db"
|
||||||
|
db_dir.mkdir()
|
||||||
|
cfg_dir = db_dir / "config"
|
||||||
|
cfg_dir.mkdir()
|
||||||
|
user_yaml = cfg_dir / "user.yaml"
|
||||||
|
_write_user_yaml(user_yaml)
|
||||||
|
monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))
|
||||||
|
|
||||||
|
from dev_api import app
|
||||||
|
c = TestClient(app)
|
||||||
|
put_resp = c.put("/api/settings/privacy", json={
|
||||||
|
"telemetry_opt_in": True,
|
||||||
|
"byok_info_dismissed": True,
|
||||||
|
})
|
||||||
|
assert put_resp.status_code == 200
|
||||||
|
assert put_resp.json()["ok"] is True
|
||||||
|
|
||||||
|
get_resp = c.get("/api/settings/privacy")
|
||||||
|
assert get_resp.status_code == 200
|
||||||
|
data = get_resp.json()
|
||||||
|
assert data["telemetry_opt_in"] is True
|
||||||
|
assert data["byok_info_dismissed"] is True
|
||||||
|
|
||||||
|
|
||||||
|
# ── GET /api/settings/developer ──────────────────────────────────────────────
|
||||||
|
|
||||||
|
def test_get_developer_returns_expected_fields(tmp_path, monkeypatch):
|
||||||
|
"""GET /api/settings/developer returns dev_tier_override and hf_token_set."""
|
||||||
|
db_dir = tmp_path / "db"
|
||||||
|
db_dir.mkdir()
|
||||||
|
cfg_dir = db_dir / "config"
|
||||||
|
cfg_dir.mkdir()
|
||||||
|
user_yaml = cfg_dir / "user.yaml"
|
||||||
|
_write_user_yaml(user_yaml)
|
||||||
|
monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))
|
||||||
|
fake_tokens = tmp_path / "tokens.yaml"
|
||||||
|
monkeypatch.setattr("dev_api.TOKENS_PATH", fake_tokens)
|
||||||
|
|
||||||
|
from dev_api import app
|
||||||
|
c = TestClient(app)
|
||||||
|
resp = c.get("/api/settings/developer")
|
||||||
|
assert resp.status_code == 200
|
||||||
|
data = resp.json()
|
||||||
|
assert "dev_tier_override" in data
|
||||||
|
assert "hf_token_set" in data
|
||||||
|
assert isinstance(data["hf_token_set"], bool)
|
||||||
|
|
||||||
|
|
||||||
|
def test_put_dev_tier_then_get(tmp_path, monkeypatch):
|
||||||
|
"""PUT dev tier to 'paid' then GET shows dev_tier_override as 'paid'."""
|
||||||
|
db_dir = tmp_path / "db"
|
||||||
|
db_dir.mkdir()
|
||||||
|
cfg_dir = db_dir / "config"
|
||||||
|
cfg_dir.mkdir()
|
||||||
|
user_yaml = cfg_dir / "user.yaml"
|
||||||
|
_write_user_yaml(user_yaml)
|
||||||
|
monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))
|
||||||
|
fake_tokens = tmp_path / "tokens.yaml"
|
||||||
|
monkeypatch.setattr("dev_api.TOKENS_PATH", fake_tokens)
|
||||||
|
|
||||||
|
from dev_api import app
|
||||||
|
c = TestClient(app)
|
||||||
|
put_resp = c.put("/api/settings/developer/tier", json={"tier": "paid"})
|
||||||
|
assert put_resp.status_code == 200
|
||||||
|
assert put_resp.json()["ok"] is True
|
||||||
|
|
||||||
|
get_resp = c.get("/api/settings/developer")
|
||||||
|
assert get_resp.status_code == 200
|
||||||
|
assert get_resp.json()["dev_tier_override"] == "paid"
|
||||||
|
|
||||||
|
|
||||||
|
def test_wizard_reset_returns_ok(tmp_path, monkeypatch):
|
||||||
|
"""POST /api/settings/developer/wizard-reset returns 200 with ok."""
|
||||||
|
db_dir = tmp_path / "db"
|
||||||
|
db_dir.mkdir()
|
||||||
|
cfg_dir = db_dir / "config"
|
||||||
|
cfg_dir.mkdir()
|
||||||
|
user_yaml = cfg_dir / "user.yaml"
|
||||||
|
_write_user_yaml(user_yaml, {"name": "Test", "wizard_complete": True})
|
||||||
|
monkeypatch.setenv("STAGING_DB", str(db_dir / "staging.db"))
|
||||||
|
|
||||||
|
from dev_api import app
|
||||||
|
c = TestClient(app)
|
||||||
|
resp = c.post("/api/settings/developer/wizard-reset")
|
||||||
|
assert resp.status_code == 200
|
||||||
|
assert resp.json()["ok"] is True
|
||||||
164
tests/test_dev_api_survey.py
Normal file
@@ -0,0 +1,164 @@
"""Tests for survey endpoints: vision health, analyze, save response, get history."""
import pytest
from unittest.mock import patch, MagicMock
from fastapi.testclient import TestClient


@pytest.fixture
def client():
    import sys
    sys.path.insert(0, "/Library/Development/CircuitForge/peregrine/.worktrees/feature-vue-spa")
    from dev_api import app
    return TestClient(app)


# ── GET /api/vision/health ───────────────────────────────────────────────────


def test_vision_health_available(client):
    """Returns available=true when the vision service responds 200."""
    mock_resp = MagicMock()
    mock_resp.status_code = 200
    with patch("dev_api.requests.get", return_value=mock_resp):
        resp = client.get("/api/vision/health")
    assert resp.status_code == 200
    assert resp.json() == {"available": True}


def test_vision_health_unavailable(client):
    """Returns available=false when the vision service times out or errors."""
    with patch("dev_api.requests.get", side_effect=Exception("timeout")):
        resp = client.get("/api/vision/health")
    assert resp.status_code == 200
    assert resp.json() == {"available": False}


# ── POST /api/jobs/{id}/survey/analyze ──────────────────────────────────────


def test_analyze_text_quick(client):
    """Text mode quick analysis returns output and source=text_paste."""
    mock_router = MagicMock()
    mock_router.complete.return_value = "1. B — best option"
    mock_router.config.get.return_value = ["claude_code", "vllm"]
    with patch("dev_api.LLMRouter", return_value=mock_router):
        resp = client.post("/api/jobs/1/survey/analyze", json={
            "text": "Q1: Do you prefer teamwork?\nA. Solo B. Together",
            "mode": "quick",
        })
    assert resp.status_code == 200
    data = resp.json()
    assert data["source"] == "text_paste"
    assert "B" in data["output"]
    # System prompt must be passed for the text path
    call_kwargs = mock_router.complete.call_args[1]
    assert "system" in call_kwargs
    assert "culture-fit survey" in call_kwargs["system"]


def test_analyze_text_detailed(client):
    """Text mode detailed analysis passes the correct prompt."""
    mock_router = MagicMock()
    mock_router.complete.return_value = "Option A: good for... Option B: better because..."
    mock_router.config.get.return_value = []
    with patch("dev_api.LLMRouter", return_value=mock_router):
        resp = client.post("/api/jobs/1/survey/analyze", json={
            "text": "Q1: Describe your work style.",
            "mode": "detailed",
        })
    assert resp.status_code == 200
    assert resp.json()["source"] == "text_paste"


def test_analyze_image(client):
    """Image mode routes through the vision path with NO system prompt."""
    mock_router = MagicMock()
    mock_router.complete.return_value = "1. C — collaborative choice"
    mock_router.config.get.return_value = ["vision_service", "claude_code"]
    with patch("dev_api.LLMRouter", return_value=mock_router):
        resp = client.post("/api/jobs/1/survey/analyze", json={
            "image_b64": "aGVsbG8=",
            "mode": "quick",
        })
    assert resp.status_code == 200
    data = resp.json()
    assert data["source"] == "screenshot"
    # No system prompt on the vision path
    call_kwargs = mock_router.complete.call_args[1]
    assert "system" not in call_kwargs


def test_analyze_llm_failure(client):
    """Returns 500 when the LLM raises an exception."""
    mock_router = MagicMock()
    mock_router.complete.side_effect = Exception("LLM unavailable")
    mock_router.config.get.return_value = []
    with patch("dev_api.LLMRouter", return_value=mock_router):
        resp = client.post("/api/jobs/1/survey/analyze", json={
            "text": "Q1: test",
            "mode": "quick",
        })
    assert resp.status_code == 500


# ── POST /api/jobs/{id}/survey/responses ────────────────────────────────────


def test_save_response_text(client):
    """Saving a text response writes to the DB and returns the new id."""
    mock_db = MagicMock()
    with patch("dev_api._get_db", return_value=mock_db):
        with patch("dev_api.insert_survey_response", return_value=42) as mock_insert:
            resp = client.post("/api/jobs/1/survey/responses", json={
                "mode": "quick",
                "source": "text_paste",
                "raw_input": "Q1: test question",
                "llm_output": "1. B — good reason",
            })
    assert resp.status_code == 200
    assert resp.json()["id"] == 42
    # received_at is generated by the backend — not None whether it was passed
    # by keyword or as the fourth positional argument
    call_args = mock_insert.call_args
    received_at = call_args.kwargs.get(
        "received_at",
        call_args.args[3] if len(call_args.args) > 3 else None,
    )
    assert received_at is not None


def test_save_response_with_image(client, tmp_path, monkeypatch):
    """Saving an image response writes a PNG file and stores its path in the DB."""
    monkeypatch.setenv("STAGING_DB", str(tmp_path / "test.db"))
    with patch("dev_api.insert_survey_response", return_value=7) as mock_insert:
        with patch("dev_api.Path") as mock_path_cls:
            mock_path_cls.return_value.__truediv__ = lambda s, o: tmp_path / o
            resp = client.post("/api/jobs/1/survey/responses", json={
                "mode": "quick",
                "source": "screenshot",
                "image_b64": "aGVsbG8=",  # valid base64
                "llm_output": "1. B — reason",
            })
    assert resp.status_code == 200
    assert resp.json()["id"] == 7


# ── GET /api/jobs/{id}/survey/responses ─────────────────────────────────────


def test_get_history_empty(client):
    """Returns an empty list when no history exists."""
    with patch("dev_api.get_survey_responses", return_value=[]):
        resp = client.get("/api/jobs/1/survey/responses")
    assert resp.status_code == 200
    assert resp.json() == []


def test_get_history_populated(client):
    """Returns history rows newest first."""
    rows = [
        {"id": 2, "survey_name": "Round 2", "mode": "detailed", "source": "text_paste",
         "raw_input": None, "image_path": None, "llm_output": "Option A is best",
         "reported_score": "90%", "received_at": "2026-03-21T14:00:00", "created_at": "2026-03-21T14:00:01"},
        {"id": 1, "survey_name": "Round 1", "mode": "quick", "source": "text_paste",
         "raw_input": "Q1: test", "image_path": None, "llm_output": "1. B",
         "reported_score": None, "received_at": "2026-03-21T12:00:00", "created_at": "2026-03-21T12:00:01"},
    ]
    with patch("dev_api.get_survey_responses", return_value=rows):
        resp = client.get("/api/jobs/1/survey/responses")
    assert resp.status_code == 200
    data = resp.json()
    assert len(data) == 2
    assert data[0]["id"] == 2
    assert data[0]["survey_name"] == "Round 2"
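Several of the assertions above reach into `mock_router.complete.call_args[1]`. As a reminder of the `unittest.mock` shape those tests rely on — positional arguments at index 0, keyword arguments at index 1 — here is a small self-contained check (the prompt strings are illustrative, not taken from `dev_api`):

```python
from unittest.mock import MagicMock

router = MagicMock()
# Simulate the text-path call the endpoint is expected to make
router.complete("Q1: Do you prefer teamwork?",
                system="You are helping with a culture-fit survey")

# call_args is an (args, kwargs) pair; [1] is the kwargs dict the tests inspect
args, kwargs = router.complete.call_args
assert args == ("Q1: Do you prefer teamwork?",)
assert "system" in kwargs
assert "culture-fit survey" in kwargs["system"]
```

The same pattern with `assert "system" not in call_kwargs` is what lets `test_analyze_image` verify the vision path omits the system prompt.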
@@ -1024,8 +1024,8 @@ def test_sync_all_per_job_exception_continues(tmp_path):

 # ── Performance / edge cases ──────────────────────────────────────────────────

-def test_parse_message_large_body_truncated():
-    """Body longer than 4000 chars is silently truncated to 4000."""
+def test_parse_message_large_body_not_truncated():
+    """Body longer than 4000 chars is stored in full (no truncation)."""
     from scripts.imap_sync import _parse_message

     big_body = ("x" * 10_000).encode()

@@ -1037,7 +1037,7 @@ def test_parse_message_large_body_truncated():
     conn.fetch.return_value = ("OK", [(b"1 (RFC822)", raw)])
     result = _parse_message(conn, b"1")
     assert result is not None
-    assert len(result["body"]) <= 4000
+    assert len(result["body"]) == 10_000


 def test_parse_message_binary_attachment_no_crash():
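The renamed test asserts that a 10,000-character body now survives `_parse_message` intact. The stdlib `email` machinery itself never truncates a payload — the old 4000-char cap had to be explicit application code — as a quick stand-alone round-trip shows (this is independent of `_parse_message`, whose internals are not in this diff):

```python
from email import message_from_bytes
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "sender@example.com"
msg["Subject"] = "large body"
# One 10,000-char line; the content manager picks a transfer encoding
# that keeps wire-format lines legal without losing body content
msg.set_content("x" * 10_000)

raw = bytes(msg)
parsed = message_from_bytes(raw)
# decode=True reverses the content-transfer-encoding, restoring the full body
body = parsed.get_payload(decode=True).decode()
assert len(body.rstrip("\n")) == 10_000
```

So removing the cap only required deleting the truncation in `imap_sync`, not changing how messages are parsed.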
@@ -10,7 +10,7 @@
       </div>

       <template v-else>
-        <!-- Two-panel layout: job details | cover letter -->
+        <!-- Two-panel layout: job details | cover letter + resume optimizer -->
         <div class="workspace__panels">

           <!-- ── Left: Job details ──────────────────────────────────────── -->

@@ -98,7 +98,12 @@
             <span aria-hidden="true">⚠️</span>
             <span class="cl-error__msg">Cover letter generation failed</span>
             <span v-if="taskError" class="cl-error__detail">{{ taskError }}</span>
+            <div class="cl-error__actions">
               <button class="btn-generate" @click="generate()">Retry</button>
+              <button class="btn-ghost" @click="clState = 'ready'; clText = ''">
+                Write manually instead
+              </button>
+            </div>
           </div>
         </template>

@@ -143,6 +148,9 @@
             ↺ Regenerate
           </button>

+          <!-- ── ATS Resume Optimizer ──────────────────────────────── -->
+          <ResumeOptimizerPanel :job-id="props.jobId" />
+
           <!-- ── Bottom action bar ──────────────────────────────────── -->
           <div class="workspace__actions">
             <button

@@ -178,6 +186,7 @@
 import { ref, computed, watch, onMounted, onUnmounted, nextTick } from 'vue'
 import { useApiFetch } from '../composables/useApi'
 import type { Job } from '../stores/review'
+import ResumeOptimizerPanel from './ResumeOptimizerPanel.vue'

 const props = defineProps<{ jobId: number }>()

@@ -610,6 +619,7 @@ declare module '../stores/review' {

 .cl-error__msg { font-weight: 700; }
 .cl-error__detail { font-size: var(--text-xs); color: var(--color-text-muted); font-weight: 400; }
+.cl-error__actions { display: flex; flex-direction: column; gap: var(--space-2); width: 100%; }

 /* Editor */
 .cl-editor {
495
web/src/components/ResumeOptimizerPanel.vue
Normal file
@@ -0,0 +1,495 @@
|
||||||
|
<template>
|
||||||
|
<section class="rop" aria-labelledby="rop-heading">
|
||||||
|
<h2 id="rop-heading" class="rop__heading">ATS Resume Optimizer</h2>
|
||||||
|
|
||||||
|
<!-- ── Tier gate notice (free) ────────────────────────────────────── -->
|
||||||
|
<p v-if="isFree" class="rop__tier-note">
|
||||||
|
<span aria-hidden="true">🔒</span>
|
||||||
|
Keyword gap report is free. Full AI rewrite requires a
|
||||||
|
<strong>Paid</strong> license.
|
||||||
|
</p>
|
||||||
|
|
||||||
|
<!-- ── Gap report section (all tiers) ────────────────────────────── -->
|
||||||
|
<div class="rop__gaps">
|
||||||
|
<div class="rop__gaps-header">
|
||||||
|
<h3 class="rop__subheading">Keyword Gap Report</h3>
|
||||||
|
<button
|
||||||
|
class="btn-generate"
|
||||||
|
:disabled="gapState === 'queued' || gapState === 'running'"
|
||||||
|
@click="runGapReport"
|
||||||
|
>
|
||||||
|
<span aria-hidden="true">🔍</span>
|
||||||
|
{{ gapState === 'queued' || gapState === 'running' ? 'Analyzing…' : 'Analyze Keywords' }}
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<template v-if="gapState === 'queued' || gapState === 'running'">
|
||||||
|
<div class="rop__spinner-row" role="status" aria-live="polite">
|
||||||
|
<span class="spinner" aria-hidden="true" />
|
||||||
|
<span>{{ gapStage ?? 'Extracting keyword gaps…' }}</span>
|
||||||
|
</div>
|
||||||
|
</template>
|
||||||
|
|
||||||
|
<template v-else-if="gapState === 'failed'">
|
||||||
|
<p class="rop__error" role="alert">Gap analysis failed. Try again.</p>
|
||||||
|
</template>
|
||||||
|
|
||||||
|
<template v-else-if="gaps.length > 0">
|
||||||
|
<div class="rop__gap-list" role="list" aria-label="Keyword gaps by section">
|
||||||
|
<div
|
||||||
|
v-for="item in gaps"
|
||||||
|
:key="item.term"
|
||||||
|
class="rop__gap-item"
|
||||||
|
:class="`rop__gap-item--p${item.priority}`"
|
||||||
|
role="listitem"
|
||||||
|
>
|
||||||
|
<span class="rop__gap-section" :title="`Route to ${item.section}`">{{ item.section }}</span>
|
||||||
|
<span class="rop__gap-term">{{ item.term }}</span>
|
||||||
|
<span class="rop__gap-rationale">{{ item.rationale }}</span>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</template>
|
||||||
|
|
||||||
|
<template v-else-if="gapState === 'completed'">
|
||||||
|
<p class="rop__empty">No significant keyword gaps found — your resume already covers this JD well.</p>
|
||||||
|
</template>
|
||||||
|
|
||||||
|
<template v-else>
|
||||||
|
<p class="rop__hint">Click <em>Analyze Keywords</em> to see which ATS terms your resume is missing.</p>
|
||||||
|
</template>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- ── Full rewrite section (paid+) ──────────────────────────────── -->
|
||||||
|
<div v-if="!isFree" class="rop__rewrite">
|
||||||
|
<div class="rop__gaps-header">
|
||||||
|
<h3 class="rop__subheading">Optimized Resume</h3>
|
||||||
|
<button
|
||||||
|
class="btn-generate"
|
||||||
|
:disabled="rewriteState === 'queued' || rewriteState === 'running' || gaps.length === 0"
|
||||||
|
:title="gaps.length === 0 ? 'Run gap analysis first' : ''"
|
||||||
|
@click="runFullRewrite"
|
||||||
|
>
|
||||||
|
<span aria-hidden="true">✨</span>
|
||||||
|
{{ rewriteState === 'queued' || rewriteState === 'running' ? 'Rewriting…' : 'Optimize Resume' }}
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<template v-if="rewriteState === 'queued' || rewriteState === 'running'">
|
||||||
|
<div class="rop__spinner-row" role="status" aria-live="polite">
|
||||||
|
<span class="spinner" aria-hidden="true" />
|
||||||
|
<span>{{ rewriteStage ?? 'Rewriting resume sections…' }}</span>
|
||||||
|
</div>
|
||||||
|
</template>
|
||||||
|
|
||||||
|
<template v-else-if="rewriteState === 'failed'">
|
||||||
|
<p class="rop__error" role="alert">Resume rewrite failed. Check that a resume file is configured in Settings.</p>
|
||||||
|
</template>
|
||||||
|
|
||||||
|
<template v-else-if="optimizedResume">
|
||||||
|
<!-- Hallucination warning — shown when the task message flags it -->
|
||||||
|
<div v-if="hallucinationWarning" class="rop__hallucination-badge" role="alert">
|
||||||
|
<span aria-hidden="true">⚠️</span>
|
||||||
|
Hallucination check failed — the rewrite introduced content not in your original resume.
|
||||||
|
The optimized version has been discarded; only the gap report is available.
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="rop__rewrite-toolbar">
|
||||||
|
<span class="rop__wordcount" aria-live="polite">{{ rewriteWordCount }} words</span>
|
||||||
|
<span class="rop__verified-badge" aria-label="Hallucination check passed">✓ Verified</span>
|
||||||
|
</div>
|
||||||
|
<textarea
|
||||||
|
v-model="optimizedResume"
|
||||||
|
class="rop__textarea"
|
||||||
|
aria-label="Optimized resume text"
|
||||||
|
spellcheck="false"
|
||||||
|
/>
|
||||||
|
<button class="btn-download" @click="downloadTxt">
|
||||||
|
<span aria-hidden="true">📄</span> Download .txt
|
||||||
|
</button>
|
||||||
|
</template>
|
||||||
|
|
||||||
|
<template v-else>
|
||||||
|
<p class="rop__hint">
|
||||||
|
Run <em>Analyze Keywords</em> first, then click <em>Optimize Resume</em> to rewrite your resume
|
||||||
|
sections to naturally incorporate missing ATS keywords.
|
||||||
|
</p>
|
||||||
|
</template>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
</template>
|
||||||
|
|
||||||
|
<script setup lang="ts">
|
||||||
|
import { ref, computed, onMounted, onUnmounted } from 'vue'
|
||||||
|
import { useApiFetch } from '../composables/useApi'
|
||||||
|
import { useAppConfigStore } from '../stores/appConfig'
|
||||||
|
|
||||||
|
const props = defineProps<{ jobId: number }>()
|
||||||
|
|
||||||
|
const config = useAppConfigStore()
|
||||||
|
const isFree = computed(() => config.tier === 'free')
|
||||||
|
|
||||||
|
// ── Gap report state ─────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
type TaskState = 'none' | 'queued' | 'running' | 'completed' | 'failed'
|
||||||
|
|
||||||
|
const gapState = ref<TaskState>('none')
|
||||||
|
const gapStage = ref<string | null>(null)
|
||||||
|
const gaps = ref<Array<{ term: string; section: string; priority: number; rationale: string }>>([])
|
||||||
|
|
||||||
|
// ── Rewrite state ────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
const rewriteState = ref<TaskState>('none')
|
||||||
|
const rewriteStage = ref<string | null>(null)
|
||||||
|
const optimizedResume = ref('')
|
||||||
|
const hallucinationWarning = ref(false)
|
||||||
|
|
||||||
|
const rewriteWordCount = computed(() =>
|
||||||
|
optimizedResume.value.trim().split(/\s+/).filter(Boolean).length
|
||||||
|
)
|
||||||
|
|
||||||
|
// ── Task polling ─────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
let pollTimer: ReturnType<typeof setInterval> | null = null
|
||||||
|
|
||||||
|
function startPolling() {
|
||||||
|
stopPolling()
|
||||||
|
pollTimer = setInterval(pollTaskStatus, 3000)
|
||||||
|
}
|
||||||
|
|
||||||
|
function stopPolling() {
|
||||||
|
if (pollTimer !== null) {
|
||||||
|
clearInterval(pollTimer)
|
||||||
|
pollTimer = null
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
async function pollTaskStatus() {
|
||||||
|
const { data } = await useApiFetch<{ status: string; stage: string | null; message: string | null }>(
|
||||||
|
`/api/jobs/${props.jobId}/resume_optimizer/task`
|
||||||
|
)
|
||||||
|
if (!data) return
|
||||||
|
|
||||||
|
const status = data.status as TaskState
|
||||||
|
|
||||||
|
// Update whichever phase is in-flight
|
||||||
|
if (gapState.value === 'queued' || gapState.value === 'running') {
|
||||||
|
gapState.value = status
|
||||||
|
gapStage.value = data.stage ?? null
|
||||||
|
if (status === 'completed' || status === 'failed') {
|
||||||
|
stopPolling()
|
||||||
|
if (status === 'completed') await loadResults()
|
||||||
|
}
|
||||||
|
} else if (rewriteState.value === 'queued' || rewriteState.value === 'running') {
|
||||||
|
rewriteState.value = status
|
||||||
|
rewriteStage.value = data.stage ?? null
|
||||||
|
if (status === 'completed' || status === 'failed') {
|
||||||
|
stopPolling()
|
||||||
|
if (status === 'completed') await loadResults()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── Load existing results ────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
async function loadResults() {
|
||||||
|
const { data } = await useApiFetch<{
|
||||||
|
optimized_resume: string
|
||||||
|
ats_gap_report: Array<{ term: string; section: string; priority: number; rationale: string }>
|
||||||
|
}>(`/api/jobs/${props.jobId}/resume_optimizer`)
|
||||||
|
|
||||||
|
if (!data) return
|
||||||
|
|
||||||
|
if (data.ats_gap_report?.length) {
|
||||||
|
gaps.value = data.ats_gap_report
|
||||||
|
gapState.value = 'completed'
|
||||||
|
}
|
||||||
|
|
||||||
|
if (data.optimized_resume) {
|
||||||
|
optimizedResume.value = data.optimized_resume
|
||||||
|
rewriteState.value = 'completed'
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ── Actions ──────────────────────────────────────────────────────────────────

async function runGapReport() {
  gapState.value = 'queued'
  gapStage.value = null
  gaps.value = []
  const { error } = await useApiFetch(`/api/jobs/${props.jobId}/resume_optimizer/generate`, {
    method: 'POST',
    body: JSON.stringify({ full_rewrite: false }),
    headers: { 'Content-Type': 'application/json' },
  })
  if (error) {
    gapState.value = 'failed'
    return
  }
  startPolling()
}

async function runFullRewrite() {
  rewriteState.value = 'queued'
  rewriteStage.value = null
  optimizedResume.value = ''
  hallucinationWarning.value = false
  const { error } = await useApiFetch(`/api/jobs/${props.jobId}/resume_optimizer/generate`, {
    method: 'POST',
    body: JSON.stringify({ full_rewrite: true }),
    headers: { 'Content-Type': 'application/json' },
  })
  if (error) {
    rewriteState.value = 'failed'
    return
  }
  startPolling()
}

function downloadTxt() {
  const blob = new Blob([optimizedResume.value], { type: 'text/plain' })
  const url = URL.createObjectURL(blob)
  const a = document.createElement('a')
  a.href = url
  a.download = `resume-optimized-job-${props.jobId}.txt`
  a.click()
  URL.revokeObjectURL(url)
}

// ── Lifecycle ────────────────────────────────────────────────────────────────

onMounted(async () => {
  await loadResults()
  // Resume polling if a task was still in-flight when the page last unloaded
  const { data } = await useApiFetch<{ status: string }>(
    `/api/jobs/${props.jobId}/resume_optimizer/task`
  )
  if (data?.status === 'queued' || data?.status === 'running') {
    // Restore in-flight state to whichever phase makes sense
    if (!optimizedResume.value && !gaps.value.length) {
      gapState.value = data.status as TaskState
    } else if (gaps.value.length) {
      rewriteState.value = data.status as TaskState
    }
    startPolling()
  }
})

onUnmounted(stopPolling)
</script>

<style scoped>
.rop {
  display: flex;
  flex-direction: column;
  gap: var(--space-5, 1.25rem);
  padding: var(--space-4, 1rem);
  border-top: 1px solid var(--app-border, #e2e8f0);
}

.rop__heading {
  font-size: var(--font-lg, 1.125rem);
  font-weight: 600;
  color: var(--app-text, #1e293b);
  margin: 0;
}

.rop__subheading {
  font-size: var(--font-base, 1rem);
  font-weight: 600;
  color: var(--app-text, #1e293b);
  margin: 0;
}

.rop__tier-note {
  font-size: var(--font-sm, 0.875rem);
  color: var(--app-text-muted, #64748b);
  background: var(--app-surface-alt, #f8fafc);
  border: 1px solid var(--app-border, #e2e8f0);
  border-radius: var(--radius-md, 0.5rem);
  padding: var(--space-3, 0.75rem) var(--space-4, 1rem);
  margin: 0;
}

.rop__gaps,
.rop__rewrite {
  display: flex;
  flex-direction: column;
  gap: var(--space-3, 0.75rem);
}

.rop__gaps-header {
  display: flex;
  align-items: center;
  justify-content: space-between;
  gap: var(--space-3, 0.75rem);
}

.rop__hint,
.rop__empty {
  font-size: var(--font-sm, 0.875rem);
  color: var(--app-text-muted, #64748b);
  margin: 0;
}

.rop__error {
  font-size: var(--font-sm, 0.875rem);
  color: var(--app-danger, #dc2626);
  margin: 0;
}

.rop__spinner-row {
  display: flex;
  align-items: center;
  gap: var(--space-2, 0.5rem);
  font-size: var(--font-sm, 0.875rem);
  color: var(--app-text-muted, #64748b);
}

/* ── Gap list ─────────────────────────────────────────────────────── */

.rop__gap-list {
  display: flex;
  flex-direction: column;
  gap: var(--space-1, 0.25rem);
}

.rop__gap-item {
  display: grid;
  grid-template-columns: 6rem 1fr;
  grid-template-rows: auto auto;
  gap: 0 var(--space-2, 0.5rem);
  padding: var(--space-2, 0.5rem) var(--space-3, 0.75rem);
  border-radius: var(--radius-sm, 0.25rem);
  border-left: 3px solid transparent;
  background: var(--app-surface-alt, #f8fafc);
  font-size: var(--font-sm, 0.875rem);
}

.rop__gap-item--p1 { border-left-color: var(--app-accent, #6366f1); }
.rop__gap-item--p2 { border-left-color: var(--app-warning, #f59e0b); }
.rop__gap-item--p3 { border-left-color: var(--app-border, #e2e8f0); }

.rop__gap-section {
  grid-row: 1;
  grid-column: 1;
  font-size: var(--font-xs, 0.75rem);
  font-weight: 600;
  text-transform: uppercase;
  letter-spacing: 0.04em;
  color: var(--app-text-muted, #64748b);
  align-self: center;
}

.rop__gap-term {
  grid-row: 1;
  grid-column: 2;
  font-weight: 500;
  color: var(--app-text, #1e293b);
}

.rop__gap-rationale {
  grid-row: 2;
  grid-column: 2;
  font-size: var(--font-xs, 0.75rem);
  color: var(--app-text-muted, #64748b);
}

/* ── Rewrite output ───────────────────────────────────────────────── */

.rop__rewrite-toolbar {
  display: flex;
  align-items: center;
  gap: var(--space-3, 0.75rem);
  justify-content: space-between;
}

.rop__wordcount {
  font-size: var(--font-sm, 0.875rem);
  color: var(--app-text-muted, #64748b);
}

.rop__verified-badge {
  font-size: var(--font-xs, 0.75rem);
  font-weight: 600;
  color: var(--app-success, #16a34a);
  background: color-mix(in srgb, var(--app-success, #16a34a) 10%, transparent);
  padding: 0.2em 0.6em;
  border-radius: var(--radius-full, 9999px);
}

.rop__hallucination-badge {
  display: flex;
  align-items: flex-start;
  gap: var(--space-2, 0.5rem);
  padding: var(--space-3, 0.75rem) var(--space-4, 1rem);
  background: color-mix(in srgb, var(--app-danger, #dc2626) 8%, transparent);
  border: 1px solid color-mix(in srgb, var(--app-danger, #dc2626) 30%, transparent);
  border-radius: var(--radius-md, 0.5rem);
  font-size: var(--font-sm, 0.875rem);
  color: var(--app-danger, #dc2626);
}

.rop__textarea {
  width: 100%;
  min-height: 20rem;
  padding: var(--space-3, 0.75rem);
  font-family: var(--font-mono, monospace);
  font-size: var(--font-sm, 0.875rem);
  line-height: 1.6;
  border: 1px solid var(--app-border, #e2e8f0);
  border-radius: var(--radius-md, 0.5rem);
  background: var(--app-surface, #fff);
  color: var(--app-text, #1e293b);
  resize: vertical;
  box-sizing: border-box;
}

.rop__textarea:focus {
  outline: 2px solid var(--app-accent, #6366f1);
  outline-offset: 2px;
}

/* ── Buttons (inherit app-wide classes) ──────────────────────────── */

.btn-generate {
  display: inline-flex;
  align-items: center;
  gap: var(--space-2, 0.5rem);
  padding: var(--space-2, 0.5rem) var(--space-4, 1rem);
  background: var(--app-accent, #6366f1);
  color: #fff;
  border: none;
  border-radius: var(--radius-md, 0.5rem);
  font-size: var(--font-sm, 0.875rem);
  font-weight: 500;
  cursor: pointer;
  transition: background 0.15s;
  white-space: nowrap;
}

.btn-generate:hover:not(:disabled) { background: var(--app-accent-hover, #4f46e5); }
.btn-generate:disabled { opacity: 0.6; cursor: not-allowed; }

.btn-download {
  display: inline-flex;
  align-items: center;
  gap: var(--space-2, 0.5rem);
  padding: var(--space-2, 0.5rem) var(--space-4, 1rem);
  background: var(--app-surface-alt, #f8fafc);
  color: var(--app-text, #1e293b);
  border: 1px solid var(--app-border, #e2e8f0);
  border-radius: var(--radius-md, 0.5rem);
  font-size: var(--font-sm, 0.875rem);
  font-weight: 500;
  cursor: pointer;
  transition: background 0.15s;
  align-self: flex-start;
}

.btn-download:hover { background: var(--app-border, #e2e8f0); }

@media (max-width: 640px) {
  .rop__gaps-header { flex-direction: column; align-items: flex-start; }
  .btn-generate { width: 100%; justify-content: center; }
}
</style>