diff --git a/docs/superpowers/plans/2026-03-19-interviews-improvements.md b/docs/superpowers/plans/2026-03-19-interviews-improvements.md new file mode 100644 index 0000000..d55404f --- /dev/null +++ b/docs/superpowers/plans/2026-03-19-interviews-improvements.md @@ -0,0 +1,1091 @@ +# Interviews Page Improvements — Implementation Plan + +> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking. + +**Goal:** Add three improvements to the Interviews page: (1) collapsible Applied/Survey pre-kanban section with localStorage persistence, (2) email sync status pill in the page header, (3) stage signal banners on job cards in both the pre-list and the kanban. + +**Architecture:** Backend adds four new endpoints to `dev-api.py` (stage signals batched into `GET /api/interviews`, email sync trigger/status, signal dismiss). The store gets a new exported `StageSignal` type. `MoveToSheet` gains an optional `preSelectedStage` prop. `InterviewCard` gains the signal banner and an extended `move` emit. `InterviewsView` gets the collapsible section and email sync pill — both wired together. 
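The contract between backend and frontend can be pictured as one job object in the `GET /api/interviews` response. This is an illustrative sketch only — values are made up, and the authoritative field list is defined by the queries in Task 1:

```python
# Illustrative shape of one pipeline job after this plan lands.
# `stage_signals` carries the undismissed email signals, newest first.
example_job = {
    "id": 1,
    "title": "Engineer",
    "company": "Acme",
    "status": "applied",
    "is_remote": False,
    "stage_signals": [
        {
            "id": 10,  # job_contacts.id — used by POST /api/stage-signals/{id}/dismiss
            "subject": "Interview confirmed",
            "received_at": "2026-03-19T10:00:00",
            "stage_signal": "interview_scheduled",
        },
    ],
}
```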
+ +**Tech Stack:** Python FastAPI (dev-api.py), Vue 3, TypeScript, Pinia, CSS `max-height` transition + +--- + +## File Map + +| File | Action | +|---|---| +| `dev-api.py` | Stage signals in `/api/interviews`; `POST /api/email/sync`; `GET /api/email/sync/status`; `POST /api/stage-signals/{id}/dismiss` | +| `tests/test_dev_api_interviews.py` | **NEW** — pytest tests for all four new dev-api behaviors | +| `web/src/stores/interviews.ts` | Export `StageSignal` interface; add `stage_signals: StageSignal[]` to `PipelineJob`; update `fetchAll()` identity map | +| `web/src/components/MoveToSheet.vue` | Add optional `preSelectedStage?: PipelineStage` prop; pre-select on open | +| `web/src/components/InterviewCard.vue` | Signal banner at card bottom; extend `move` emit signature | +| `web/src/views/InterviewsView.vue` | Collapsible Applied section (localStorage, `max-height` CSS, signal count in header); email sync pill + polling; wire `preSelectedStage` through `openMove` → `MoveToSheet` | + +--- + +## Task 1: Backend — new dev-api.py endpoints + +**Files:** +- Modify: `dev-api.py` +- Create: `tests/test_dev_api_interviews.py` + +### Context + +`list_interviews()` (line 286) currently runs one query then closes the DB. We'll refactor it to keep the connection open, run a second query for undismissed signals, group results by `job_id` in Python, then close. The three new endpoints follow the existing `_get_db()` + `db.close()` pattern. SQLite column is `finished_at` (NOT `completed_at`) in `background_tasks`. Use `job_id = 0` as sentinel for global email sync tasks. + +Signal types to **exclude** from the query: `'neutral'`, `'unrelated'`, `'digest'`, `'event_rescheduled'`. 
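The grouping step described above can be sketched independently of the DB layer (a standalone illustration; Step 4 below contains the real SQLite-backed implementation). The key detail is pre-seeding the dict with every job id, so jobs without signals get an empty list rather than a missing key:

```python
def group_signals_by_job(jobs: list[dict], signal_rows: list[dict]) -> dict[int, list[dict]]:
    """Group signal rows under their parent job by job_id."""
    # Pre-seed with every job id so jobs with no undismissed signals map to [].
    by_job: dict[int, list[dict]] = {job["id"]: [] for job in jobs}
    for sig in signal_rows:  # rows arrive newest-first from the ORDER BY
        by_job[sig["job_id"]].append(sig)
    return by_job

jobs = [{"id": 1}, {"id": 2}]
signals = [{"id": 10, "job_id": 1, "stage_signal": "interview_scheduled"}]
grouped = group_signals_by_job(jobs, signals)
# grouped[2] is [] — no KeyError for jobs without contacts
```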
+ +- [ ] **Step 1: Write the failing tests** + +Create `tests/test_dev_api_interviews.py`: + +```python +"""Tests for new dev-api.py endpoints: stage signals, email sync, signal dismiss.""" +import sqlite3 +import tempfile +import os +import pytest +from fastapi.testclient import TestClient + + +@pytest.fixture() +def tmp_db(tmp_path): + """Create a minimal staging.db schema in a temp dir.""" + db_path = str(tmp_path / "staging.db") + con = sqlite3.connect(db_path) + con.executescript(""" + CREATE TABLE jobs ( + id INTEGER PRIMARY KEY, + title TEXT, company TEXT, url TEXT, location TEXT, + is_remote INTEGER DEFAULT 0, salary TEXT, + match_score REAL, keyword_gaps TEXT, status TEXT, + interview_date TEXT, rejection_stage TEXT, + applied_at TEXT, phone_screen_at TEXT, interviewing_at TEXT, + offer_at TEXT, hired_at TEXT, survey_at TEXT + ); + CREATE TABLE job_contacts ( + id INTEGER PRIMARY KEY, + job_id INTEGER, + subject TEXT, + received_at TEXT, + stage_signal TEXT, + suggestion_dismissed INTEGER DEFAULT 0 + ); + CREATE TABLE background_tasks ( + id INTEGER PRIMARY KEY, + task_type TEXT, + job_id INTEGER, + status TEXT DEFAULT 'queued', + finished_at TEXT + ); + INSERT INTO jobs (id, title, company, status) VALUES + (1, 'Engineer', 'Acme', 'applied'), + (2, 'Designer', 'Beta', 'phone_screen'); + INSERT INTO job_contacts (id, job_id, subject, received_at, stage_signal, suggestion_dismissed) VALUES + (10, 1, 'Interview confirmed', '2026-03-19T10:00:00', 'interview_scheduled', 0), + (11, 1, 'Old neutral', '2026-03-18T09:00:00', 'neutral', 0), + (12, 2, 'Offer letter', '2026-03-19T11:00:00', 'offer_received', 0), + (13, 1, 'Already dismissed', '2026-03-17T08:00:00', 'positive_response', 1); + """) + con.close() + return db_path + + +@pytest.fixture() +def client(tmp_db, monkeypatch): + monkeypatch.setenv("STAGING_DB", tmp_db) + # Re-import after env var is set so DB_PATH picks it up + import importlib + import dev_api + importlib.reload(dev_api) + return 
TestClient(dev_api.app) + + +# ── GET /api/interviews — stage signals batched ──────────────────────────── + +def test_interviews_includes_stage_signals(client): + resp = client.get("/api/interviews") + assert resp.status_code == 200 + jobs = {j["id"]: j for j in resp.json()} + + # job 1 should have exactly 1 undismissed non-excluded signal + assert "stage_signals" in jobs[1] + signals = jobs[1]["stage_signals"] + assert len(signals) == 1 + assert signals[0]["stage_signal"] == "interview_scheduled" + assert signals[0]["subject"] == "Interview confirmed" + assert signals[0]["id"] == 10 + + # neutral signal excluded + signal_types = [s["stage_signal"] for s in signals] + assert "neutral" not in signal_types + + # dismissed signal excluded + signal_ids = [s["id"] for s in signals] + assert 13 not in signal_ids + + # job 2 has an offer signal + assert len(jobs[2]["stage_signals"]) == 1 + assert jobs[2]["stage_signals"][0]["stage_signal"] == "offer_received" + + +def test_interviews_empty_signals_for_job_without_contacts(client, tmp_db): + con = sqlite3.connect(tmp_db) + con.execute("INSERT INTO jobs (id, title, company, status) VALUES (3, 'NoContact', 'Corp', 'survey')") + con.commit(); con.close() + resp = client.get("/api/interviews") + jobs = {j["id"]: j for j in resp.json()} + assert jobs[3]["stage_signals"] == [] + + +# ── POST /api/email/sync ─────────────────────────────────────────────────── + +def test_email_sync_returns_202(client): + resp = client.post("/api/email/sync") + assert resp.status_code == 202 + assert "task_id" in resp.json() + + +def test_email_sync_inserts_background_task(client, tmp_db): + client.post("/api/email/sync") + con = sqlite3.connect(tmp_db) + row = con.execute( + "SELECT task_type, job_id, status FROM background_tasks WHERE task_type='email_sync'" + ).fetchone() + con.close() + assert row is not None + assert row[0] == "email_sync" + assert row[1] == 0 # sentinel + assert row[2] == "queued" + + +# ── GET /api/email/sync/status 
───────────────────────────────────────────── + +def test_email_sync_status_idle_when_no_tasks(client): + resp = client.get("/api/email/sync/status") + assert resp.status_code == 200 + body = resp.json() + assert body["status"] == "idle" + assert body["last_completed_at"] is None + + +def test_email_sync_status_reflects_latest_task(client, tmp_db): + con = sqlite3.connect(tmp_db) + con.execute( + "INSERT INTO background_tasks (task_type, job_id, status, finished_at) VALUES " + "('email_sync', 0, 'completed', '2026-03-19T12:00:00')" + ) + con.commit(); con.close() + resp = client.get("/api/email/sync/status") + body = resp.json() + assert body["status"] == "completed" + assert body["last_completed_at"] == "2026-03-19T12:00:00" + + +# ── POST /api/stage-signals/{id}/dismiss ────────────────────────────────── + +def test_dismiss_signal_sets_flag(client, tmp_db): + resp = client.post("/api/stage-signals/10/dismiss") + assert resp.status_code == 200 + assert resp.json() == {"ok": True} + con = sqlite3.connect(tmp_db) + row = con.execute( + "SELECT suggestion_dismissed FROM job_contacts WHERE id = 10" + ).fetchone() + con.close() + assert row[0] == 1 + + +def test_dismiss_signal_404_for_missing_id(client): + resp = client.post("/api/stage-signals/9999/dismiss") + assert resp.status_code == 404 +``` + +- [ ] **Step 2: Run tests to verify they fail** + +```bash +cd /Library/Development/CircuitForge/peregrine/.worktrees/feature-vue-spa +/devl/miniconda3/envs/job-seeker/bin/pytest tests/test_dev_api_interviews.py -v +``` + +Expected: FAIL — `dev_api` module not found (tests reference `dev_api` not `dev-api`). + +- [ ] **Step 3: Create a `dev_api.py` symlink (module alias)** + +The test imports `dev_api` (underscore). 
Create a thin module alias:

```bash
cd /Library/Development/CircuitForge/peregrine/.worktrees/feature-vue-spa
ln -sf dev-api.py dev_api.py
```

Re-run — tests should now fail with 404 responses or failed assertions rather than `ModuleNotFoundError`, which confirms the test infrastructure works.

- [ ] **Step 4: Implement the four backend changes in `dev-api.py`**

**4a — Stage signals in `list_interviews()`:** Replace lines 286–301 with:

```python
SIGNAL_EXCLUDED = ("neutral", "unrelated", "digest", "event_rescheduled")

@app.get("/api/interviews")
def list_interviews():
    db = _get_db()
    placeholders = ",".join("?" * len(PIPELINE_STATUSES))
    rows = db.execute(
        f"SELECT id, title, company, url, location, is_remote, salary, "
        f"match_score, keyword_gaps, status, "
        f"interview_date, rejection_stage, "
        f"applied_at, phone_screen_at, interviewing_at, offer_at, hired_at, survey_at "
        f"FROM jobs WHERE status IN ({placeholders}) "
        f"ORDER BY match_score DESC NULLS LAST",
        list(PIPELINE_STATUSES),
    ).fetchall()

    job_ids = [r["id"] for r in rows]
    signals_by_job: dict[int, list] = {r["id"]: [] for r in rows}

    if job_ids:
        sig_placeholders = ",".join("?" * len(job_ids))
        excl_placeholders = ",".join("?" 
* len(SIGNAL_EXCLUDED))
        sig_rows = db.execute(
            f"SELECT id, job_id, subject, received_at, stage_signal "
            f"FROM job_contacts "
            f"WHERE job_id IN ({sig_placeholders}) "
            f"  AND suggestion_dismissed = 0 "
            f"  AND stage_signal NOT IN ({excl_placeholders}) "
            f"  AND stage_signal IS NOT NULL "
            f"ORDER BY received_at DESC",
            job_ids + list(SIGNAL_EXCLUDED),
        ).fetchall()
        for sr in sig_rows:
            signals_by_job[sr["job_id"]].append({
                "id": sr["id"],
                "subject": sr["subject"],
                "received_at": sr["received_at"],
                "stage_signal": sr["stage_signal"],
            })

    db.close()
    return [
        {**dict(r), "is_remote": bool(r["is_remote"]), "stage_signals": signals_by_job[r["id"]]}
        for r in rows
    ]
```

**4b — Email sync endpoints:** Add after the `list_interviews` function (before the `POST /api/jobs/{id}/move` block):

```python
# ── POST /api/email/sync ──────────────────────────────────────────────────

@app.post("/api/email/sync", status_code=202)
def trigger_email_sync():
    db = _get_db()
    cur = db.execute(
        "INSERT INTO background_tasks (task_type, job_id, status) VALUES ('email_sync', 0, 'queued')"
    )
    db.commit()
    task_id = cur.lastrowid
    db.close()
    return {"task_id": task_id}


# ── GET /api/email/sync/status ────────────────────────────────────────────

@app.get("/api/email/sync/status")
def email_sync_status():
    db = _get_db()
    # SELECT * rather than naming columns: the optional `error` column may be
    # absent from the staging schema, and naming it explicitly would make the
    # query itself fail. Guard with dict access below instead.
    row = db.execute(
        "SELECT * FROM background_tasks "
        "WHERE task_type = 'email_sync' "
        "ORDER BY id DESC LIMIT 1"
    ).fetchone()
    db.close()
    if row is None:
        return {"status": "idle", "last_completed_at": None, "error": None}
    row_dict = dict(row)
    return {
        "status": row_dict["status"],
        "last_completed_at": row_dict.get("finished_at"),
        "error": row_dict.get("error"),
    }


# ── POST /api/stage-signals/{id}/dismiss 
───────────────────────────────── + +@app.post("/api/stage-signals/{signal_id}/dismiss") +def dismiss_signal(signal_id: int): + db = _get_db() + result = db.execute( + "UPDATE job_contacts SET suggestion_dismissed = 1 WHERE id = ?", + (signal_id,), + ) + db.commit() + db.close() + if result.rowcount == 0: + raise HTTPException(404, "Signal not found") + return {"ok": True} +``` + +- [ ] **Step 5: Run tests again — verify they pass** + +```bash +/devl/miniconda3/envs/job-seeker/bin/pytest tests/test_dev_api_interviews.py -v +``` + +Expected: all tests PASS. + +- [ ] **Step 6: Run the full test suite to check for regressions** + +```bash +/devl/miniconda3/envs/job-seeker/bin/pytest tests/ -v --ignore=tests/e2e +``` + +Expected: existing tests still pass. + +- [ ] **Step 7: Commit** + +Note: `dev_api.py` is a symlink committed to the repo so that pytest can import the `dev_api` module by its Python-valid name. It points to `dev-api.py` (which uses a hyphen and is not directly importable). This is intentional. + +```bash +cd /Library/Development/CircuitForge/peregrine/.worktrees/feature-vue-spa +git add dev-api.py dev_api.py tests/test_dev_api_interviews.py +git commit -m "feat(interviews): add stage signals, email sync, and dismiss endpoints to dev-api" +``` + +--- + +## Task 2: Store — StageSignal type + PipelineJob update + +**Files:** +- Modify: `web/src/stores/interviews.ts` + +### Context + +Add and export the `StageSignal` interface before `PipelineJob`. Add `stage_signals: StageSignal[]` to `PipelineJob`. The `fetchAll()` function already maps data with `{ ...j }` — since the API now returns `stage_signals`, it will be included automatically. No other logic changes. 
+ +- [ ] **Step 1: Add the `StageSignal` export and update `PipelineJob`** + +In `web/src/stores/interviews.ts`, insert the `StageSignal` interface before the `PipelineJob` interface and add `stage_signals` to `PipelineJob`: + +```typescript +// ADD before PipelineJob: +export interface StageSignal { + id: number // job_contacts.id — used for POST /api/stage-signals/{id}/dismiss + subject: string + received_at: string // ISO timestamp + stage_signal: 'interview_scheduled' | 'positive_response' | 'offer_received' | 'survey_received' | 'rejected' +} + +// MODIFY PipelineJob — add as last field: + stage_signals: StageSignal[] // undismissed signals, newest first +``` + +- [ ] **Step 2: Type-check** + +```bash +cd /Library/Development/CircuitForge/peregrine/.worktrees/feature-vue-spa/web +npx vue-tsc --noEmit +``` + +Expected: 0 errors. + +- [ ] **Step 3: Commit** + +```bash +cd /Library/Development/CircuitForge/peregrine/.worktrees/feature-vue-spa +git add web/src/stores/interviews.ts +git commit -m "feat(interviews): export StageSignal interface; add stage_signals to PipelineJob" +``` + +--- + +## Task 3: MoveToSheet — preSelectedStage prop + +**Files:** +- Modify: `web/src/components/MoveToSheet.vue` + +### Context + +`MoveToSheet` currently initializes `selectedStage` to `null`. Adding an optional `preSelectedStage` prop means: if it's provided, `selectedStage` starts with that value (the stage button appears pre-highlighted). The prop is typed `PipelineStage | undefined` and defaults to `undefined`. All existing non-signal `openMove()` calls pass no `preSelectedStage`, so the sheet defaults to null-selected as before. + +- [ ] **Step 1: Add the `preSelectedStage` optional prop** + +Open `web/src/components/MoveToSheet.vue`. The file currently has a bare `defineProps<{...}>()` call with no `withDefaults`. Simply add the optional field — no `withDefaults` wrapper needed since optional props default to `undefined` in Vue 3. 
```typescript
// BEFORE (lines 6–9):
const props = defineProps<{
  currentStatus: string
  jobTitle: string
}>()

// AFTER:
const props = defineProps<{
  currentStatus: string
  jobTitle: string
  preSelectedStage?: PipelineStage
}>()
```

`PipelineStage` is already imported on line 4 (`import type { PipelineStage } from '../stores/interviews'`) — no new import needed.

- [ ] **Step 2: Pre-select the stage on mount**

Replace the current `selectedStage` initialization (line 16):

```typescript
// BEFORE:
const selectedStage = ref(null)

// AFTER:
const selectedStage = ref(props.preSelectedStage ?? null)
```

- [ ] **Step 3: Type-check**

```bash
cd /Library/Development/CircuitForge/peregrine/.worktrees/feature-vue-spa/web
npx vue-tsc --noEmit
```

Expected: 0 errors.

- [ ] **Step 4: Commit**

```bash
cd /Library/Development/CircuitForge/peregrine/.worktrees/feature-vue-spa
git add web/src/components/MoveToSheet.vue
git commit -m "feat(interviews): add preSelectedStage prop to MoveToSheet"
```

---

## Task 4: InterviewCard — signal banner + move emit extension

**Files:**
- Modify: `web/src/components/InterviewCard.vue`

### Context

The card currently ends at its root element's closing tag with no signal section. We add a signal banner block inside the card border (after existing content). The `move` emit is extended from `move: [jobId: number]` to `move: [jobId: number, preSelectedStage?: PipelineStage]` — the second arg is optional so existing `@move="openMove"` usages remain valid.

`StageSignal` and `PipelineStage` are imported from the store. The `job` prop already comes in as `PipelineJob` (post-Task 2 it now includes `stage_signals`). 
**Signal → stage mapping (must be hardcoded; `rejected` → `'interview_rejected'`, not `'rejected'`)**:

```
interview_scheduled → phone_screen, amber, "Move to Phone Screen"
positive_response   → phone_screen, amber, "Move to Phone Screen"
offer_received      → offer, green, "Move to Offer"
survey_received     → survey, amber, "Move to Survey"
rejected            → interview_rejected, red, "Mark Rejected"
```

**Multiple signals**: when `stage_signals.length > 1`, only the most recent banner shows. A `+N more` link below it toggles showing all signals. A `sigExpanded` ref tracks this state.

**Dismiss**: optimistic — remove the signal from the local `job.stage_signals` array immediately, then `POST /api/stage-signals/{id}/dismiss`. No error recovery needed (optimistic per spec).

The card's existing `overflow: hidden` rule would clip the banner's bottom border-radius. Remove that rule — the card already has a border, so there is no visual change.

- [ ] **Step 1: Add signal helpers to the `