Compare commits

...

15 commits

Author SHA1 Message Date
303b4bfb6f feat: SSE live score push for background enrichment (#1)
After a search, the API now returns a session_id. If any trust scores are
partial (pending seller age or category data), the frontend opens a
Server-Sent Events stream to /api/updates/{session_id}. As the background
BTF (account age) and category enrichment threads complete, they re-score
affected listings and push updated TrustScore payloads over SSE. The
frontend patches the trustScores and sellers maps reactively so signal
dots light up without requiring a manual re-search.

Backend:
- _update_queues registry maps session_id -> SimpleQueue (thread-safe bridge)
- _trigger_scraper_enrichment accepts session_id/user_db/query, builds a
  seller->listings map, calls _push_updates() after each enrichment pass
  which re-scores, saves trust scores, and puts events on the queue
- New GET /api/updates/{session_id} SSE endpoint: polls queue every 500ms,
  emits heartbeats every 15s, closes on sentinel None or 90s timeout
- search endpoint generates session_id and returns it in response

Frontend:
- search store adds enriching state and _openUpdates() / closeUpdates()
- On search completion, if partial scores exist, opens EventSource stream
- onmessage: patches trustScores and sellers maps (new Map() to trigger
  Vue reactivity), updates marketPrice if included
- on 'done' event or error: closes stream, enriching = false
- SearchView: pulsing 'Updating scores...' badge in toolbar while enriching
2026-04-05 23:12:27 -07:00
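The stream described in this commit uses standard `text/event-stream` framing: `data:` lines carry JSON payloads, named events (`event: done`) signal completion, and comment lines (`: heartbeat`) keep proxies from closing the connection. A minimal Python sketch of how a client parses that wire format — the helper name and sample payload are illustrative, not from the repo:

```python
import json

def parse_sse_events(raw: str) -> list[tuple[str, dict]]:
    """Split a text/event-stream buffer into (event, payload) pairs.

    Events are separated by a blank line. Lines starting with ':' are
    comments (used as heartbeats here) and are skipped. The default
    event name is 'message', matching EventSource.onmessage.
    """
    events = []
    for block in raw.split("\n\n"):
        name, data_lines = "message", []
        for line in block.split("\n"):
            if line.startswith(":") or not line.strip():
                continue  # heartbeat comment or padding
            if line.startswith("event:"):
                name = line[len("event:"):].strip()
            elif line.startswith("data:"):
                data_lines.append(line[len("data:"):].strip())
        if data_lines:
            events.append((name, json.loads("\n".join(data_lines))))
    return events

# Hypothetical two-event stream: one score update, then the 'done' event.
stream = (
    'data: {"platform_listing_id": "123", "market_price": 42.5}\n\n'
    ": heartbeat\n\n"
    "event: done\ndata: {}\n\n"
)
```

A real client would feed chunks from the HTTP response into a buffer and parse complete blocks as they arrive; the frontend's `EventSource` does this framing automatically.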
45c758bb53 revert: remove ADD COLUMN IF NOT EXISTS (not a SQLite feature)
SQLite does not support ADD COLUMN IF NOT EXISTS regardless of version.
The idempotency fix lives in cf-core's migration runner instead.
2026-04-05 22:24:06 -07:00
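Since SQLite has no `ADD COLUMN IF NOT EXISTS`, an idempotent migration runner typically consults `PRAGMA table_info` before altering the table. A sketch of that pattern (hypothetical helper — not cf-core's actual runner code):

```python
import sqlite3

def add_column_if_missing(conn: sqlite3.Connection, table: str,
                          column: str, decl: str) -> bool:
    """Emulate ADD COLUMN IF NOT EXISTS by checking the schema first.

    Returns True if the column was added, False if it already existed,
    so a migration interrupted mid-script is safe to re-run.
    """
    # Row index 1 of PRAGMA table_info is the column name.
    existing = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
    if column in existing:
        return False
    conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {decl}")
    return True

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE listings (id INTEGER PRIMARY KEY, first_seen_at TEXT)")
add_column_if_missing(conn, "listings", "last_seen_at", "TEXT")   # adds the column
add_column_if_missing(conn, "listings", "last_seen_at", "TEXT")   # no-op on re-run
```

This sidesteps the `duplicate column name` error described in the commit below regardless of SQLite version, which is why the fix belongs in the runner rather than the SQL.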
59f728cba0 fix: make ALTER TABLE migrations idempotent with IF NOT EXISTS
SQLite's executescript() auto-commits each DDL statement, so a partial
migration failure leaves columns in the DB without marking the migration
done. On the next startup the runner retries and hits duplicate column errors.

Use ADD COLUMN IF NOT EXISTS (SQLite 3.35+, shipped in Python 3.11+)
so migrations 004 and 005 are safe to re-run in any partial state.
2026-04-05 22:18:25 -07:00
81e41e39ab fix: remove duplicate first_seen_at ALTER TABLE in migration 004
001_init.sql already defines first_seen_at in the CREATE TABLE statement.
On fresh installs, migration 004 failed with 'duplicate column name: first_seen_at'.
Remove the redundant ALTER TABLE; last_seen_at/times_seen/price_at_first_seen
are still added by 004 as before.
2026-04-05 22:12:47 -07:00
234c76e686 feat: add no-Docker install path (conda/venv + uvicorn + npm build) 2026-04-05 22:09:24 -07:00
7672dd758a fix: self-hosted install — network_mode, cf-core bind mount, install script 2026-04-05 22:02:50 -07:00
663d92fc11 refactor: use shorter circuitforge_core.api import for feedback router 2026-04-05 21:21:54 -07:00
c2fa107c47 fix: use correct tab field name in feedback test 2026-04-05 20:50:13 -07:00
1ca9398df4 refactor: move feedback router import to top-level block 2026-04-05 18:45:16 -07:00
5a3f9cb460 feat: migrate feedback endpoint to circuitforge-core router
Replaces the inline feedback block (FeedbackRequest/FeedbackResponse
models, _fb_headers, _ensure_feedback_labels, status + submit routes)
with make_feedback_router() from circuitforge-core. Removes now-unused
imports (_requests, _platform, Literal, subprocess, datetime/timezone).
Adds 7 tests covering status + submit paths via TestClient.
2026-04-05 18:18:25 -07:00
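The router-factory pattern this commit adopts lets a shared library own the route logic while each host app supplies its own configuration. A dependency-free sketch of the idea (illustrative stand-in only — the real `make_feedback_router()` returns a FastAPI `APIRouter`):

```python
import os

def make_feedback_router(repo: str, product: str) -> dict:
    """Sketch of a router factory: config is captured in closures so each
    host app mounts the same routes with its own repo/product values.
    """
    def status() -> dict:
        # Feedback is enabled only when a Forgejo token is configured.
        return {"enabled": bool(os.environ.get("FORGEJO_API_TOKEN")),
                "product": product}

    def submit(title: str, body: str) -> dict:
        # The real implementation files a Forgejo issue against `repo`.
        return {"repo": repo, "title": title, "body": body}

    return {"GET /status": status, "POST /": submit}

router = make_feedback_router(repo="Circuit-Forge/snipe", product="snipe")
```

The payoff is exactly what the commit diff shows: the app deletes its inline models and helpers and keeps only the factory call plus an `include_router` line.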
f7d5b20aa5 chore: bump circuitforge-core dep to >=0.8.0 (orch split) 2026-04-04 22:48:48 -07:00
bdbcb046cc fix: detect eBay condition field for parts/repair listings; add clear-filters btn
- aggregator: also check listing.condition against damage keywords so listings
  with eBay condition "for parts or not working" flag scratch_dent_mentioned
  even when the title looks clean
- aggregator: add "parts/repair" (slash) + "parts or not working" to keyword set
- trust/__init__.py: pass listing.condition into aggregate()
- 3 new regression tests (synthetic fixtures, 17 total passing)
- SearchView: extract DEFAULT_FILTERS const + resetFilters(); add "Clear filters"
  button that shows only when activeFilterCount > 0 with count badge
- .env.example: document LLM inference env vars (ANTHROPIC/OPENAI/OLLAMA/CF_ORCH_URL)
  and cf-core wiring notes; closes #17
2026-04-04 22:42:56 -07:00
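The detection change above can be boiled down to checking the keyword set against both the title and the eBay condition string. A self-contained sketch (keyword subset and helper names mirror the aggregator diff later in this page, but this is illustrative code, not the shipped module):

```python
# Subset of the aggregator's damage keyword set, for illustration.
_SCRATCH_DENT_KEYWORDS = frozenset([
    "for parts", "parts only", "parts/repair", "parts or not working",
    "not working", "as is", "cracked", "cosmetic damage",
])

def has_damage_keywords(text: str) -> bool:
    """Case-insensitive substring match against the damage keyword set."""
    lowered = text.lower()
    return any(kw in lowered for kw in _SCRATCH_DENT_KEYWORDS)

def scratch_dent_mentioned(title: str, condition: str) -> bool:
    """The flag fires on either the listing title or the eBay condition
    field, so a clean-looking title can't mask a 'For parts or not
    working' condition."""
    return (bool(title) and has_damage_keywords(title)) or (
        bool(condition) and has_damage_keywords(condition)
    )

# Clean title, parts/repair condition — flag still fires:
scratch_dent_mentioned("MacBook Pro 2019 16in", "For parts or not working")
```

The condition strings are matched verbatim because eBay's canonical condition labels ("For parts or not working") contain the keywords directly.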
ccbbe58bd4 chore: pin circuitforge-core>=0.7.0 (affiliates + preferences modules) 2026-04-04 19:17:49 -07:00
c5988a059d Merge pull request 'feat: eBay affiliate link builder' (#20) from feature/affiliate-links into main 2026-04-04 19:16:33 -07:00
3f7c2b9135 Merge pull request 'feat: in-app feedback FAB' (#18) from feature/feedback-button into main 2026-04-03 22:01:06 -07:00
15 changed files with 774 additions and 157 deletions


@@ -54,6 +54,28 @@ SNIPE_DB=data/snipe.db
 # own ID; the CF cloud instance uses CF's campaign ID (disclosed in the UI).
 # EBAY_AFFILIATE_CAMPAIGN_ID=
+# ── LLM inference (vision / photo analysis) ──────────────────────────────────
+# circuitforge-core LLMRouter auto-detects backends from these env vars
+# (no llm.yaml required). Backends are tried in this priority order:
+#   1. ANTHROPIC_API_KEY → Claude API (cloud; requires Paid tier key)
+#   2. OPENAI_API_KEY → OpenAI-compatible endpoint
+#   3. OLLAMA_HOST → local Ollama (default: http://localhost:11434)
+# Leave all unset to disable LLM features (photo analysis won't run).
+# ANTHROPIC_API_KEY=
+# ANTHROPIC_MODEL=claude-haiku-4-5-20251001
+# OPENAI_API_KEY=
+# OPENAI_BASE_URL=https://api.openai.com/v1
+# OPENAI_MODEL=gpt-4o-mini
+# OLLAMA_HOST=http://localhost:11434
+# OLLAMA_MODEL=llava:7b
+# CF Orchestrator — managed inference for Paid+ cloud users (internal use only).
+# Self-hosted users leave this unset; it has no effect without a valid allocation token.
+# CF_ORCH_URL=https://orch.circuitforge.tech
 # ── In-app feedback (beta) ────────────────────────────────────────────────────
 # When set, a feedback FAB appears in the UI and routes submissions to Forgejo.
 # Leave unset to silently hide the button (demo/offline deployments).


@@ -4,6 +4,50 @@
 **Status:** Active — eBay listing intelligence MVP complete (search, trust scoring, affiliate links, feedback FAB, vision task scheduling). Auction sniping engine and multi-platform support are next.
+
+## Quick install (self-hosted)
+
+**Requirements:** Docker with Compose plugin, Git. No API keys needed to get started.
+
+```bash
+# One-line install — clones to ~/snipe by default
+bash <(curl -fsSL https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/raw/branch/main/install.sh)
+
+# Or clone manually and run the script:
+git clone https://git.opensourcesolarpunk.com/Circuit-Forge/snipe.git
+bash snipe/install.sh
+```
+
+Then open **http://localhost:8509**.
+
+### Manual setup (if you prefer)
+
+Snipe's API image is built from a parent context that includes `circuitforge-core`. Both repos must sit as siblings in the same directory:
+
+```
+workspace/
+├── snipe/              ← this repo
+└── circuitforge-core/  ← required sibling
+```
+
+```bash
+mkdir snipe-workspace && cd snipe-workspace
+git clone https://git.opensourcesolarpunk.com/Circuit-Forge/snipe.git
+git clone https://git.opensourcesolarpunk.com/Circuit-Forge/circuitforge-core.git
+cd snipe
+cp .env.example .env   # edit if you have eBay API credentials (optional)
+./manage.sh start
+```
+
+### Optional: eBay API credentials
+
+Snipe works without any credentials using its Playwright scraper fallback. Adding eBay API credentials unlocks faster searches and inline seller account age (no extra scrape needed):
+
+1. Register at [developer.ebay.com](https://developer.ebay.com/my/keys)
+2. Copy your Production **App ID** and **Cert ID** into `.env`
+3. Restart: `./manage.sh restart`
+
+---
+
 ## What it does
 Snipe has two layers that work together:


@@ -3,27 +3,27 @@ from __future__ import annotations
 import dataclasses
 import hashlib
+import json as _json
 import logging
 import os
+import queue as _queue
+import uuid
 from concurrent.futures import ThreadPoolExecutor
 from contextlib import asynccontextmanager
 from pathlib import Path
+import asyncio
 import csv
 import io
-import platform as _platform
-import subprocess
-from datetime import datetime, timezone
-from typing import Literal
-import requests as _requests
-from fastapi import Depends, FastAPI, HTTPException, UploadFile, File
+from fastapi import Depends, FastAPI, HTTPException, Request, UploadFile, File
 from fastapi.responses import StreamingResponse
 from pydantic import BaseModel
 from fastapi.middleware.cors import CORSMiddleware
 from circuitforge_core.config import load_env
 from circuitforge_core.affiliates import wrap_url as _wrap_affiliate_url
+from circuitforge_core.api import make_feedback_router as _make_feedback_router
 from app.db.store import Store
 from app.db.models import SavedSearch as SavedSearchModel, ScammerEntry
 from app.platforms import SearchFilters
@@ -38,6 +38,12 @@ from api.ebay_webhook import router as ebay_webhook_router
 load_env(Path(".env"))
 log = logging.getLogger(__name__)
+
+# ── SSE update registry ───────────────────────────────────────────────────────
+# Maps session_id → SimpleQueue of update events.
+# SimpleQueue is always thread-safe; no asyncio loop needed to write from threads.
+# Keys are cleaned up when the SSE stream ends (client disconnect or timeout).
+_update_queues: dict[str, _queue.SimpleQueue] = {}
 @asynccontextmanager
 async def _lifespan(app: FastAPI):
@@ -81,6 +87,12 @@ app.add_middleware(
     allow_headers=["*"],
 )
+
+_feedback_router = _make_feedback_router(
+    repo="Circuit-Forge/snipe",
+    product="snipe",
+)
+app.include_router(_feedback_router, prefix="/api/feedback")
 @app.get("/api/health")
 def health():
@@ -107,31 +119,34 @@ def _trigger_scraper_enrichment(
     listings: list,
     shared_store: Store,
     shared_db: Path,
+    user_db: Path | None = None,
+    query: str = "",
+    session_id: str | None = None,
 ) -> None:
     """Fire-and-forget background enrichment for missing seller signals.
-    Two enrichment passes run concurrently in the same daemon thread:
+    Two enrichment passes run in the same daemon thread:
       1. BTF (/itm/ pages) fills account_age_days for sellers where it is None.
       2. _ssn search pages fills category_history_json for sellers with no history.
-    The main response returns immediately; enriched data lands in the DB for
-    future searches. Uses ScrapedEbayAdapter's Playwright stack regardless of
-    which adapter was used for the main search (Shopping API handles age for
-    the API adapter inline; BTF is the fallback for no-creds / scraper mode).
+    When session_id is provided, pushes re-scored trust score updates to the
+    SSE queue after each pass so the frontend can update scores live.
     shared_store: used for pre-flight seller checks (same-thread reads).
-    shared_db: path passed to background thread — it creates its own Store
-    (sqlite3 connections are not thread-safe).
+    shared_db: path passed to background thread (sqlite3 is not thread-safe).
+    user_db: path to per-user listings/trust_scores DB (same as shared_db in local mode).
+    query: original search query used for market comp lookup during re-score.
+    session_id: SSE session key; if set, updates are pushed to _update_queues[session_id].
     """
+    # Caps per search: limits Playwright sessions launched in the background so we
+    # don't hammer Kasada or spin up dozens of Xvfb instances after a large search.
+    # Remaining sellers get enriched incrementally on subsequent searches.
     _BTF_MAX_PER_SEARCH = 3
     _CAT_MAX_PER_SEARCH = 3
     needs_btf: dict[str, str] = {}
     needs_categories: list[str] = []
+    # Map seller_id → [listings] for this search so we know what to re-score
+    seller_listing_map: dict[str, list] = {}
     for listing in listings:
         sid = listing.seller_platform_id
         if not sid:
@@ -139,6 +154,7 @@ def _trigger_scraper_enrichment(
         seller = shared_store.get_seller("ebay", sid)
         if not seller:
             continue
+        seller_listing_map.setdefault(sid, []).append(listing)
         if ((seller.account_age_days is None or seller.feedback_count == 0)
                 and sid not in needs_btf
                 and len(needs_btf) < _BTF_MAX_PER_SEARCH):
@@ -149,6 +165,8 @@ def _trigger_scraper_enrichment(
             needs_categories.append(sid)
     if not needs_btf and not needs_categories:
+        if session_id and session_id in _update_queues:
+            _update_queues[session_id].put(None)  # sentinel — nothing to enrich
         return
     log.info(
@@ -156,17 +174,55 @@ def _trigger_scraper_enrichment(
         len(needs_btf), len(needs_categories),
     )
+
+    def _push_updates(enriched_seller_ids: list[str]) -> None:
+        """Re-score listings for enriched sellers and push updates to SSE queue."""
+        if not session_id or session_id not in _update_queues:
+            return
+        q = _update_queues[session_id]
+        thread_shared = Store(shared_db)
+        thread_user = Store(user_db or shared_db)
+        scorer = TrustScorer(thread_shared)
+        comp = thread_shared.get_market_comp("ebay", hashlib.md5(query.encode()).hexdigest())
+        market_price = comp.median_price if comp else None
+        for sid in enriched_seller_ids:
+            seller = thread_shared.get_seller("ebay", sid)
+            if not seller:
+                continue
+            affected = seller_listing_map.get(sid, [])
+            if not affected:
+                continue
+            new_scores = scorer.score_batch(affected, query)
+            thread_user.save_trust_scores(new_scores)
+            for listing, ts in zip(affected, new_scores):
+                if ts is None:
+                    continue
+                q.put({
+                    "platform_listing_id": listing.platform_listing_id,
+                    "trust_score": dataclasses.asdict(ts),
+                    "seller": dataclasses.asdict(seller),
+                    "market_price": market_price,
+                })
+
     def _run():
         try:
             enricher = ScrapedEbayAdapter(Store(shared_db))
             if needs_btf:
                 enricher.enrich_sellers_btf(needs_btf, max_workers=2)
                 log.info("BTF enrichment complete for %d sellers", len(needs_btf))
+                _push_updates(list(needs_btf.keys()))
             if needs_categories:
                 enricher.enrich_sellers_categories(needs_categories, max_workers=2)
                 log.info("Category enrichment complete for %d sellers", len(needs_categories))
+                # Re-score only sellers not already covered by BTF push
+                cat_only = [s for s in needs_categories if s not in needs_btf]
+                if cat_only:
+                    _push_updates(cat_only)
         except Exception as e:
             log.warning("Scraper enrichment failed: %s", e)
+        finally:
+            # Sentinel: tells SSE stream the enrichment thread is done
+            if session_id and session_id in _update_queues:
+                _update_queues[session_id].put(None)
+
     import threading
     t = threading.Thread(target=_run, daemon=True)
@@ -363,9 +419,15 @@ def search(
     listings = [staged.get(l.platform_listing_id, l) for l in listings]
     # BTF enrichment: scrape /itm/ pages for sellers missing account_age_days.
-    # Runs in the background so it doesn't delay the response; next search of
-    # the same sellers will have full scores.
-    _trigger_scraper_enrichment(listings, shared_store, shared_db)
+    # Runs in the background so it doesn't delay the response. A session_id is
+    # generated so the frontend can open an SSE stream and receive live score
+    # updates as enrichment completes.
+    session_id = str(uuid.uuid4())
+    _update_queues[session_id] = _queue.SimpleQueue()
+    _trigger_scraper_enrichment(
+        listings, shared_store, shared_db,
+        user_db=user_db, query=q, session_id=session_id,
+    )
     scorer = TrustScorer(shared_store)
     trust_scores_list = scorer.score_batch(listings, q)
@@ -409,6 +471,7 @@ def search(
         "market_price": market_price,
         "adapter_used": adapter_used,
         "affiliate_active": bool(os.environ.get("EBAY_AFFILIATE_CAMPAIGN_ID", "").strip()),
+        "session_id": session_id,
     }
@@ -506,6 +569,73 @@ def enrich_seller(
     }
+
+# ── SSE live score updates ────────────────────────────────────────────────────
+@app.get("/api/updates/{session_id}")
+async def stream_updates(session_id: str, request: Request):
+    """Server-Sent Events stream for live trust score updates.
+
+    Opens after a search when any listings have score_is_partial=true.
+    Streams re-scored trust score payloads as enrichment completes, then
+    sends a 'done' event and closes.
+
+    Each event payload:
+        { platform_listing_id, trust_score, seller, market_price }
+
+    Closes automatically after 90 seconds (worst-case Playwright enrichment).
+    The client should also close on 'done' event.
+    """
+    if session_id not in _update_queues:
+        raise HTTPException(status_code=404, detail="Unknown session_id")
+    q = _update_queues[session_id]
+    deadline = asyncio.get_event_loop().time() + 90.0
+    heartbeat_interval = 15.0
+    next_heartbeat = asyncio.get_event_loop().time() + heartbeat_interval
+
+    async def generate():
+        nonlocal next_heartbeat
+        try:
+            while asyncio.get_event_loop().time() < deadline:
+                if await request.is_disconnected():
+                    break
+                # Drain all available updates (non-blocking)
+                while True:
+                    try:
+                        item = q.get_nowait()
+                    except _queue.Empty:
+                        break
+                    if item is None:
+                        # Sentinel: enrichment thread is done
+                        yield "event: done\ndata: {}\n\n"
+                        return
+                    yield f"data: {_json.dumps(item)}\n\n"
+                # Heartbeat to keep the connection alive through proxies
+                now = asyncio.get_event_loop().time()
+                if now >= next_heartbeat:
+                    yield ": heartbeat\n\n"
+                    next_heartbeat = now + heartbeat_interval
+                await asyncio.sleep(0.5)
+            # Timeout reached
+            yield "event: done\ndata: {\"reason\": \"timeout\"}\n\n"
+        finally:
+            _update_queues.pop(session_id, None)
+
+    return StreamingResponse(
+        generate(),
+        media_type="text/event-stream",
+        headers={
+            "Cache-Control": "no-cache",
+            "X-Accel-Buffering": "no",  # nginx: disable proxy buffering for SSE
+            "Connection": "keep-alive",
+        },
+    )
+
 # ── Saved Searches ────────────────────────────────────────────────────────────
 class SavedSearchCreate(BaseModel):
@@ -645,117 +775,3 @@ async def import_blocklist(
     return {"imported": imported, "errors": errors}
-
-# ── Feedback ────────────────────────────────────────────────────────────────
-# Creates Forgejo issues from in-app beta feedback.
-# Silently disabled when FORGEJO_API_TOKEN is not set.
-_FEEDBACK_LABEL_COLORS = {
-    "beta-feedback": "#0075ca",
-    "needs-triage": "#e4e669",
-    "bug": "#d73a4a",
-    "feature-request": "#a2eeef",
-    "question": "#d876e3",
-}
-
-def _fb_headers() -> dict:
-    token = os.environ.get("FORGEJO_API_TOKEN", "")
-    return {"Authorization": f"token {token}", "Content-Type": "application/json"}
-
-def _ensure_feedback_labels(names: list[str]) -> list[int]:
-    base = os.environ.get("FORGEJO_API_URL", "https://git.opensourcesolarpunk.com/api/v1")
-    repo = os.environ.get("FORGEJO_REPO", "Circuit-Forge/snipe")
-    resp = _requests.get(f"{base}/repos/{repo}/labels", headers=_fb_headers(), timeout=10)
-    existing = {lb["name"]: lb["id"] for lb in resp.json()} if resp.ok else {}
-    ids: list[int] = []
-    for name in names:
-        if name in existing:
-            ids.append(existing[name])
-        else:
-            r = _requests.post(
-                f"{base}/repos/{repo}/labels",
-                headers=_fb_headers(),
-                json={"name": name, "color": _FEEDBACK_LABEL_COLORS.get(name, "#ededed")},
-                timeout=10,
-            )
-            if r.ok:
-                ids.append(r.json()["id"])
-    return ids
-
-class FeedbackRequest(BaseModel):
-    title: str
-    description: str
-    type: Literal["bug", "feature", "other"] = "other"
-    repro: str = ""
-    view: str = "unknown"
-    submitter: str = ""
-
-class FeedbackResponse(BaseModel):
-    issue_number: int
-    issue_url: str
-
-@app.get("/api/feedback/status")
-def feedback_status() -> dict:
-    """Return whether feedback submission is configured on this instance."""
-    demo = os.environ.get("DEMO_MODE", "false").lower() in ("1", "true", "yes")
-    return {"enabled": bool(os.environ.get("FORGEJO_API_TOKEN")) and not demo}
-
-@app.post("/api/feedback", response_model=FeedbackResponse)
-def submit_feedback(payload: FeedbackRequest) -> FeedbackResponse:
-    """File a Forgejo issue from in-app feedback."""
-    token = os.environ.get("FORGEJO_API_TOKEN", "")
-    if not token:
-        raise HTTPException(status_code=503, detail="Feedback disabled: FORGEJO_API_TOKEN not configured.")
-    if os.environ.get("DEMO_MODE", "false").lower() in ("1", "true", "yes"):
-        raise HTTPException(status_code=403, detail="Feedback disabled in demo mode.")
-    try:
-        version = subprocess.check_output(
-            ["git", "describe", "--tags", "--always"],
-            cwd=Path(__file__).resolve().parents[1], text=True, timeout=5,
-        ).strip()
-    except Exception:
-        version = "dev"
-    _TYPE_LABELS = {"bug": "🐛 Bug", "feature": "✨ Feature Request", "other": "💬 Other"}
-    body_lines = [
-        f"## {_TYPE_LABELS.get(payload.type, '💬 Other')}",
-        "",
-        payload.description,
-        "",
-    ]
-    if payload.type == "bug" and payload.repro:
-        body_lines += ["### Reproduction Steps", "", payload.repro, ""]
-    body_lines += [
-        "### Context", "",
-        f"- **view:** {payload.view}",
-        f"- **version:** {version}",
-        f"- **platform:** {_platform.platform()}",
-        f"- **timestamp:** {datetime.now(timezone.utc).isoformat().replace('+00:00', 'Z')}",
-        "",
-    ]
-    if payload.submitter:
-        body_lines += ["---", f"*Submitted by: {payload.submitter}*"]
-    labels = ["beta-feedback", "needs-triage",
-              {"bug": "bug", "feature": "feature-request"}.get(payload.type, "question")]
-    base = os.environ.get("FORGEJO_API_URL", "https://git.opensourcesolarpunk.com/api/v1")
-    repo = os.environ.get("FORGEJO_REPO", "Circuit-Forge/snipe")
-    label_ids = _ensure_feedback_labels(labels)
-    resp = _requests.post(
-        f"{base}/repos/{repo}/issues",
-        headers=_fb_headers(),
-        json={"title": payload.title, "body": "\n".join(body_lines), "labels": label_ids},
-        timeout=15,
-    )
-    if not resp.ok:
-        raise HTTPException(status_code=502, detail=f"Forgejo error: {resp.text[:200]}")
-    data = resp.json()
-    return FeedbackResponse(issue_number=data["number"], issue_url=data["html_url"])


@@ -52,6 +52,7 @@ class TrustScorer:
             signal_scores, is_dup, seller,
             listing_id=listing.id or 0,
             listing_title=listing.title,
+            listing_condition=listing.condition,
             times_seen=listing.times_seen,
             first_seen_at=listing.first_seen_at,
             price=listing.price,


@@ -23,8 +23,9 @@ _SCRATCH_DENT_KEYWORDS = frozenset([
     "crack", "cracked", "chip", "chipped",
     "damage", "damaged", "cosmetic damage",
     "blemish", "wear", "worn", "worn in",
-    # Parts / condition catch-alls
+    # Parts / condition catch-alls (also matches eBay condition field strings verbatim)
     "as is", "for parts", "parts only", "spares or repair", "parts or repair",
+    "parts/repair", "parts or not working", "not working",
     # Evasive redirects — seller hiding damage detail in listing body
     "see description", "read description", "read listing", "see listing",
     "see photos for", "see pics for", "see images for",
@@ -72,6 +73,7 @@ class Aggregator:
         seller: Optional[Seller],
         listing_id: int = 0,
         listing_title: str = "",
+        listing_condition: str = "",
         times_seen: int = 1,
         first_seen_at: Optional[str] = None,
         price: float = 0.0,
@@ -137,7 +139,9 @@ class Aggregator:
         )
         if photo_hash_duplicate and not is_established_retailer:
             red_flags.append("duplicate_photo")
-        if listing_title and _has_damage_keywords(listing_title):
+        if (listing_title and _has_damage_keywords(listing_title)) or (
+            listing_condition and _has_damage_keywords(listing_condition)
+        ):
             red_flags.append("scratch_dent_mentioned")
         # Staging DB signals


@@ -1,21 +1,17 @@
+# compose.override.yml — dev-only additions (auto-applied by Docker Compose in dev).
+# Safe to delete on a self-hosted machine — compose.yml is self-contained.
+#
+# What this adds over compose.yml:
+#   - Live source mounts so code changes take effect without rebuilding images
+#   - RELOAD=true to enable uvicorn --reload for the API
+#   - NOTE: circuitforge-core is NOT mounted here — use `./manage.sh build` to
+#     pick up cf-core changes. Mounting it as a bind volume would break self-hosted
+#     installs that don't have the sibling directory.
 services:
   api:
-    build:
-      context: ..
-      dockerfile: snipe/Dockerfile
-    network_mode: host
     volumes:
-      - ../circuitforge-core:/app/circuitforge-core
       - ./api:/app/snipe/api
       - ./app:/app/snipe/app
-      - ./data:/app/snipe/data
       - ./tests:/app/snipe/tests
     environment:
       - RELOAD=true
-  web:
-    build:
-      context: .
-      dockerfile: docker/web/Dockerfile
-    volumes:
-      - ./web/src:/app/src  # not used at runtime but keeps override valid


@@ -3,11 +3,14 @@ services:
     build:
       context: ..
      dockerfile: snipe/Dockerfile
-    ports:
-      - "8510:8510"
+    # Host networking lets nginx (in the web container) reach the API at
+    # 172.17.0.1:8510 (the Docker bridge gateway). Required — nginx.conf
+    # is baked into the image and hard-codes that address.
+    network_mode: host
     env_file: .env
     volumes:
       - ./data:/app/snipe/data
+    restart: unless-stopped
   web:
     build:

install.sh (new executable file, 226 lines)

@@ -0,0 +1,226 @@
#!/usr/bin/env bash
# Snipe — self-hosted install script
#
# Supports two install paths:
# Docker (recommended) — everything in containers, no system Python deps required
# No-Docker — conda or venv + direct uvicorn, for machines without Docker
#
# Usage:
# bash install.sh # installs to ~/snipe
# bash install.sh /opt/snipe # custom install directory
# bash install.sh ~/snipe --no-docker # force no-Docker path even if Docker present
#
# Requirements (Docker path): Docker with Compose plugin, Git
# Requirements (no-Docker path): Python 3.11+, Node.js 20+, Git, xvfb (system)
set -euo pipefail
INSTALL_DIR="${1:-$HOME/snipe}"
FORCE_NO_DOCKER="${2:-}"
FORGEJO="https://git.opensourcesolarpunk.com/Circuit-Forge"
CONDA_ENV="cf"
info() { echo " [snipe] $*"; }
ok() { echo "$*"; }
warn() { echo "! $*"; }
fail() { echo "$*" >&2; exit 1; }
hr() { echo "────────────────────────────────────────────────────────"; }
echo ""
echo " Snipe — self-hosted installer"
echo " Install directory: $INSTALL_DIR"
echo ""
# ── Detect capabilities ──────────────────────────────────────────────────────
HAS_DOCKER=false
HAS_CONDA=false
HAS_PYTHON=false
HAS_NODE=false
docker compose version >/dev/null 2>&1 && HAS_DOCKER=true
conda --version >/dev/null 2>&1 && HAS_CONDA=true
python3 --version >/dev/null 2>&1 && HAS_PYTHON=true
node --version >/dev/null 2>&1 && HAS_NODE=true
command -v git >/dev/null 2>&1 || fail "Git is required. Install with: sudo apt-get install git"
# Honour --no-docker flag
[[ "$FORCE_NO_DOCKER" == "--no-docker" ]] && HAS_DOCKER=false
if $HAS_DOCKER; then
  INSTALL_PATH="docker"
  ok "Docker found — using Docker install path (recommended)"
elif $HAS_PYTHON; then
  INSTALL_PATH="python"
  warn "Docker not found — using no-Docker path (conda or venv)"
else
  fail "Docker or Python 3.11+ is required. Install Docker: https://docs.docker.com/get-docker/"
fi
# ── Clone repos ──────────────────────────────────────────────────────────────
# compose.yml and the Dockerfile both use context: .. (parent directory), so
# snipe/ and circuitforge-core/ must be siblings inside INSTALL_DIR.
SNIPE_DIR="$INSTALL_DIR/snipe"
CORE_DIR="$INSTALL_DIR/circuitforge-core"
if [[ -d "$SNIPE_DIR" ]]; then
  info "Snipe already cloned — pulling latest..."
  git -C "$SNIPE_DIR" pull --ff-only
else
  info "Cloning Snipe..."
  mkdir -p "$INSTALL_DIR"
  git clone "$FORGEJO/snipe.git" "$SNIPE_DIR"
fi
ok "Snipe → $SNIPE_DIR"
if [[ -d "$CORE_DIR" ]]; then
  info "circuitforge-core already cloned — pulling latest..."
  git -C "$CORE_DIR" pull --ff-only
else
  info "Cloning circuitforge-core (shared library)..."
  git clone "$FORGEJO/circuitforge-core.git" "$CORE_DIR"
fi
ok "circuitforge-core → $CORE_DIR"
# ── Configure environment ────────────────────────────────────────────────────
ENV_FILE="$SNIPE_DIR/.env"
if [[ ! -f "$ENV_FILE" ]]; then
  cp "$SNIPE_DIR/.env.example" "$ENV_FILE"
  # Safe defaults for local installs — no eBay registration, no Heimdall
  sed -i 's/^EBAY_WEBHOOK_VERIFY_SIGNATURES=true/EBAY_WEBHOOK_VERIFY_SIGNATURES=false/' "$ENV_FILE"
  ok ".env created from .env.example"
  echo ""
  info "Snipe works out of the box with no API keys."
  info "Add EBAY_APP_ID / EBAY_CERT_ID later for faster searches (optional)."
  echo ""
else
  info ".env already exists — skipping (delete it to reset)"
fi
cd "$SNIPE_DIR"
# ── Docker install path ───────────────────────────────────────────────────────
if [[ "$INSTALL_PATH" == "docker" ]]; then
  info "Building Docker images (~1 GB download on first run)..."
  docker compose build
  info "Starting Snipe..."
  docker compose up -d
  echo ""
  ok "Snipe is running!"
  hr
  echo "  Web UI:  http://localhost:8509"
  echo "  API:     http://localhost:8510/docs"
  echo ""
  echo "  Manage:  cd $SNIPE_DIR && ./manage.sh {start|stop|restart|logs|test}"
  hr
  echo ""
  exit 0
fi
# ── No-Docker install path ───────────────────────────────────────────────────
# System deps: Xvfb is required for Playwright (Kasada bypass via headed Chromium)
if ! command -v Xvfb >/dev/null 2>&1; then
info "Installing Xvfb (required for eBay scraper)..."
if command -v apt-get >/dev/null 2>&1; then
sudo apt-get update -qq && sudo apt-get install -y --no-install-recommends xvfb
elif command -v dnf >/dev/null 2>&1; then
sudo dnf install -y xorg-x11-server-Xvfb
elif command -v brew >/dev/null 2>&1; then
warn "macOS: Xvfb not available. The scraper fallback may fail."
warn "Add eBay API credentials to .env to use the API adapter instead."
else
warn "Could not install Xvfb automatically. Install it with your package manager."
fi
fi
# ── Python environment setup ─────────────────────────────────────────────────
if $HAS_CONDA; then
info "Setting up conda environment '$CONDA_ENV'..."
if conda env list | grep -q "^$CONDA_ENV "; then
info "Conda env '$CONDA_ENV' already exists — updating..."
conda run -n "$CONDA_ENV" pip install --quiet -e "$CORE_DIR"
conda run -n "$CONDA_ENV" pip install --quiet -e "$SNIPE_DIR"
else
conda create -n "$CONDA_ENV" python=3.11 -y
conda run -n "$CONDA_ENV" pip install --quiet -e "$CORE_DIR"
conda run -n "$CONDA_ENV" pip install --quiet -e "$SNIPE_DIR"
fi
conda run -n "$CONDA_ENV" playwright install chromium
conda run -n "$CONDA_ENV" playwright install-deps chromium
PYTHON_RUN="conda run -n $CONDA_ENV"
ok "Conda environment '$CONDA_ENV' ready"
else
info "Setting up Python venv at $SNIPE_DIR/.venv ..."
python3 -m venv "$SNIPE_DIR/.venv"
"$SNIPE_DIR/.venv/bin/pip" install --quiet -e "$CORE_DIR"
"$SNIPE_DIR/.venv/bin/pip" install --quiet -e "$SNIPE_DIR"
"$SNIPE_DIR/.venv/bin/playwright" install chromium
"$SNIPE_DIR/.venv/bin/playwright" install-deps chromium
PYTHON_RUN="$SNIPE_DIR/.venv/bin"
ok "Python venv ready at $SNIPE_DIR/.venv"
fi
# ── Frontend ─────────────────────────────────────────────────────────────────
if $HAS_NODE; then
info "Building Vue frontend..."
cd "$SNIPE_DIR/web"
npm ci --prefer-offline --silent
npm run build
cd "$SNIPE_DIR"
ok "Frontend built → web/dist/"
else
warn "Node.js not found — skipping frontend build."
warn "Install Node.js 20+ from https://nodejs.org and re-run install.sh to build the UI."
warn "Until then, you can access the API directly at http://localhost:8510/docs"
fi
# ── Write start/stop scripts ─────────────────────────────────────────────────
cat > "$SNIPE_DIR/start-local.sh" << 'STARTSCRIPT'
#!/usr/bin/env bash
# Start Snipe without Docker (API only — run from the snipe/ directory)
set -euo pipefail
cd "$(dirname "$0")"
if [[ -f .venv/bin/uvicorn ]]; then
UVICORN=".venv/bin/uvicorn"
# NOTE: assumes the default conda env name "cf" created by install.sh
elif command -v conda >/dev/null 2>&1 && conda env list | grep -q "^cf "; then
UVICORN="conda run -n cf uvicorn"
else
echo "No Python env found. Run install.sh first." >&2; exit 1
fi
mkdir -p data
echo "Starting Snipe API on http://localhost:8510 ..."
$UVICORN api.main:app --host 0.0.0.0 --port 8510 "${@}"
STARTSCRIPT
chmod +x "$SNIPE_DIR/start-local.sh"
# Frontend serving (if built)
cat > "$SNIPE_DIR/serve-ui.sh" << 'UISCRIPT'
#!/usr/bin/env bash
# Serve the pre-built Vue frontend on port 8509 (dev only — use nginx for production)
cd "$(dirname "$0")/web/dist"
python3 -m http.server 8509
UISCRIPT
chmod +x "$SNIPE_DIR/serve-ui.sh"
echo ""
ok "Snipe installed (no-Docker mode)"
hr
echo " Start API: cd $SNIPE_DIR && ./start-local.sh"
echo " Serve UI: cd $SNIPE_DIR && ./serve-ui.sh (separate terminal)"
echo " API docs: http://localhost:8510/docs"
echo " Web UI: http://localhost:8509 (after ./serve-ui.sh)"
echo ""
echo " For production, point nginx at web/dist/ and proxy /api/ to localhost:8510"
hr
echo ""


@@ -78,7 +78,7 @@ case "$cmd" in
  test)
    echo "Running test suite..."
    docker compose -f "$COMPOSE_FILE" exec api \
-     conda run -n job-seeker python -m pytest /app/snipe/tests/ -v "${@}"
+     python -m pytest /app/snipe/tests/ -v "${@}"
    ;;
# ── Cloud commands ────────────────────────────────────────────────────────


@@ -8,7 +8,7 @@ version = "0.1.0"
 description = "Auction listing monitor and trust scorer"
 requires-python = ">=3.11"
 dependencies = [
-    "circuitforge-core",
+    "circuitforge-core>=0.8.0",
     "streamlit>=1.32",
     "requests>=2.31",
     "imagehash>=4.3",

tests/test_feedback.py (new file, 134 lines)

@@ -0,0 +1,134 @@
"""Tests for the shared feedback router (circuitforge-core) mounted in snipe."""
from __future__ import annotations

from collections.abc import Callable
from unittest.mock import MagicMock, patch

from fastapi import FastAPI
from fastapi.testclient import TestClient

from circuitforge_core.api.feedback import make_feedback_router


# ── Test app factory ──────────────────────────────────────────────────────────
def _make_client(demo_mode_fn: Callable[[], bool] | None = None) -> TestClient:
    app = FastAPI()
    router = make_feedback_router(
        repo="Circuit-Forge/snipe",
        product="snipe",
        demo_mode_fn=demo_mode_fn,
    )
    app.include_router(router, prefix="/api/feedback")
    return TestClient(app)


# ── GET /api/feedback/status ──────────────────────────────────────────────────
def test_status_disabled_when_no_token(monkeypatch):
    monkeypatch.delenv("FORGEJO_API_TOKEN", raising=False)
    monkeypatch.delenv("DEMO_MODE", raising=False)
    client = _make_client(demo_mode_fn=lambda: False)
    res = client.get("/api/feedback/status")
    assert res.status_code == 200
    assert res.json() == {"enabled": False}


def test_status_enabled_when_token_set(monkeypatch):
    monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
    client = _make_client(demo_mode_fn=lambda: False)
    res = client.get("/api/feedback/status")
    assert res.status_code == 200
    assert res.json() == {"enabled": True}


def test_status_disabled_in_demo_mode(monkeypatch):
    monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
    demo = True
    client = _make_client(demo_mode_fn=lambda: demo)
    res = client.get("/api/feedback/status")
    assert res.status_code == 200
    assert res.json() == {"enabled": False}


# ── POST /api/feedback ────────────────────────────────────────────────────────
def test_submit_returns_503_when_no_token(monkeypatch):
    monkeypatch.delenv("FORGEJO_API_TOKEN", raising=False)
    client = _make_client(demo_mode_fn=lambda: False)
    res = client.post("/api/feedback", json={
        "title": "Test", "description": "desc", "type": "bug",
    })
    assert res.status_code == 503


def test_submit_returns_403_in_demo_mode(monkeypatch):
    monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
    demo = True
    client = _make_client(demo_mode_fn=lambda: demo)
    res = client.post("/api/feedback", json={
        "title": "Test", "description": "desc", "type": "bug",
    })
    assert res.status_code == 403


def test_submit_creates_issue(monkeypatch):
    monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
    label_response = MagicMock()
    label_response.ok = True
    label_response.json.return_value = [
        {"id": 1, "name": "beta-feedback"},
        {"id": 2, "name": "needs-triage"},
        {"id": 3, "name": "bug"},
    ]
    issue_response = MagicMock()
    issue_response.ok = True
    issue_response.json.return_value = {
        "number": 7,
        "html_url": "https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/7",
    }
    client = _make_client(demo_mode_fn=lambda: False)
    with patch("circuitforge_core.api.feedback.requests.get", return_value=label_response), \
         patch("circuitforge_core.api.feedback.requests.post", return_value=issue_response):
        res = client.post("/api/feedback", json={
            "title": "Listing scores wrong",
            "description": "Trust score shows 0 when seller has 1000 feedback",
            "type": "bug",
            "repro": "1. Search for anything\n2. Check trust score",
            "tab": "search",
        })
    assert res.status_code == 200
    data = res.json()
    assert data["issue_number"] == 7
    assert data["issue_url"] == "https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/7"


def test_submit_returns_502_on_forgejo_error(monkeypatch):
    monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
    label_response = MagicMock()
    label_response.ok = True
    label_response.json.return_value = [
        {"id": 1, "name": "beta-feedback"},
        {"id": 2, "name": "needs-triage"},
        {"id": 3, "name": "question"},
    ]
    bad_response = MagicMock()
    bad_response.ok = False
    bad_response.text = "internal server error"
    client = _make_client(demo_mode_fn=lambda: False)
    with patch("circuitforge_core.api.feedback.requests.get", return_value=label_response), \
         patch("circuitforge_core.api.feedback.requests.post", return_value=bad_response):
        res = client.post("/api/feedback", json={
            "title": "Oops", "description": "desc", "type": "other",
        })
    assert res.status_code == 502
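The status/submit contract these tests pin down can be summarized without FastAPI at all. The sketch below is a hypothetical stand-in for the routing logic, not the actual make_feedback_router implementation in circuitforge-core; the `create_issue` callable is an assumed abstraction over the Forgejo label/issue requests:

```python
import os
from typing import Callable, Optional

def feedback_status(demo_mode_fn: Callable[[], bool]) -> dict:
    # Enabled only when a Forgejo token is configured and demo mode is off.
    enabled = bool(os.environ.get("FORGEJO_API_TOKEN")) and not demo_mode_fn()
    return {"enabled": enabled}

def submit_feedback(payload: dict, demo_mode_fn: Callable[[], bool],
                    create_issue: Callable[[dict], Optional[dict]]) -> tuple[int, dict]:
    """Return (status_code, body): 503 without a token, 403 in demo mode,
    502 when the Forgejo call fails, 200 with issue details on success."""
    if not os.environ.get("FORGEJO_API_TOKEN"):
        return 503, {"detail": "feedback disabled: no FORGEJO_API_TOKEN"}
    if demo_mode_fn():
        return 403, {"detail": "feedback disabled in demo mode"}
    issue = create_issue(payload)  # caller wraps the actual Forgejo POST
    if issue is None:
        return 502, {"detail": "Forgejo returned an error"}
    return 200, {"issue_number": issue["number"], "issue_url": issue["html_url"]}
```

Each test above maps onto one branch of `submit_feedback`, which is why the suite needs only environment variables and mocked responses to cover all four status codes.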


@@ -80,6 +80,45 @@ def test_suspicious_price_flagged_when_price_genuinely_low():
    assert "suspicious_price" in result.red_flags_json


def test_scratch_dent_flagged_from_title_slash_variant():
    """Title containing 'parts/repair' (slash variant, no 'or') must trigger scratch_dent_mentioned."""
    agg = Aggregator()
    scores = {k: 15 for k in ["account_age", "feedback_count",
                              "feedback_ratio", "price_vs_market", "category_history"]}
    result = agg.aggregate(
        scores, photo_hash_duplicate=False, seller=None,
        listing_title="Generic Widget XL - Parts/Repair",
    )
    assert "scratch_dent_mentioned" in result.red_flags_json


def test_scratch_dent_flagged_from_condition_field():
    """eBay formal condition 'for parts or not working' must trigger scratch_dent_mentioned
    even when the listing title contains no damage keywords."""
    agg = Aggregator()
    scores = {k: 15 for k in ["account_age", "feedback_count",
                              "feedback_ratio", "price_vs_market", "category_history"]}
    result = agg.aggregate(
        scores, photo_hash_duplicate=False, seller=None,
        listing_title="Generic Widget XL",
        listing_condition="for parts or not working",
    )
    assert "scratch_dent_mentioned" in result.red_flags_json


def test_scratch_dent_not_flagged_for_clean_listing():
    """Clean title + 'New' condition must NOT trigger scratch_dent_mentioned."""
    agg = Aggregator()
    scores = {k: 15 for k in ["account_age", "feedback_count",
                              "feedback_ratio", "price_vs_market", "category_history"]}
    result = agg.aggregate(
        scores, photo_hash_duplicate=False, seller=None,
        listing_title="Generic Widget XL",
        listing_condition="new",
    )
    assert "scratch_dent_mentioned" not in result.red_flags_json


def test_new_account_not_flagged_when_age_absent():
    """account_age_days=None (scraper tier) must NOT trigger new_account or account_under_30_days."""
    agg = Aggregator()


@@ -123,8 +123,10 @@ export const useSearchStore = defineStore('search', () => {
const affiliateActive = ref<boolean>(false)
const loading = ref(false)
const error = ref<string | null>(null)
const enriching = ref(false) // true while SSE stream is open

let _abort: AbortController | null = null
let _sse: EventSource | null = null

function cancelSearch() {
_abort?.abort()
@@ -166,6 +168,7 @@ export const useSearchStore = defineStore('search', () => {
market_price: number | null
adapter_used: 'api' | 'scraper'
affiliate_active: boolean
session_id: string | null
}

results.value = data.listings ?? []
@@ -182,6 +185,12 @@ export const useSearchStore = defineStore('search', () => {
marketPrice: marketPrice.value,
adapterUsed: adapterUsed.value,
})
// Open SSE stream if any scores are partial and a session_id was provided
const hasPartial = Object.values(data.trust_scores ?? {}).some(ts => ts.score_is_partial)
if (data.session_id && hasPartial) {
_openUpdates(data.session_id, apiBase)
}
} catch (e) {
if (e instanceof DOMException && e.name === 'AbortError') {
// User cancelled — clear loading but don't surface as an error
@@ -196,6 +205,57 @@
}
}
function closeUpdates() {
if (_sse) {
_sse.close()
_sse = null
}
enriching.value = false
}
function _openUpdates(sessionId: string, apiBase: string) {
closeUpdates() // close any previous stream
enriching.value = true
const es = new EventSource(`${apiBase}/api/updates/${sessionId}`)
_sse = es
es.onmessage = (e) => {
try {
const update = JSON.parse(e.data) as {
platform_listing_id: string
trust_score: TrustScore
seller: Record<string, unknown>
market_price: number | null
}
if (update.platform_listing_id && update.trust_score) {
trustScores.value = new Map(trustScores.value)
trustScores.value.set(update.platform_listing_id, update.trust_score)
}
if (update.seller) {
const s = update.seller as Seller
if (s.platform_seller_id) {
sellers.value = new Map(sellers.value)
sellers.value.set(s.platform_seller_id, s)
}
}
if (update.market_price != null) {
marketPrice.value = update.market_price
}
} catch {
// malformed event — ignore
}
}
es.addEventListener('done', () => {
closeUpdates()
})
es.onerror = () => {
closeUpdates()
}
}
async function enrichSeller(sellerUsername: string, listingId: string): Promise<void> {
const apiBase = (import.meta.env.VITE_API_BASE as string) ?? ''
const params = new URLSearchParams({
@@ -230,10 +290,12 @@ export const useSearchStore = defineStore('search', () => {
adapterUsed,
affiliateActive,
loading,
enriching,
error,
search,
cancelSearch,
enrichSeller,
closeUpdates,
clearResults,
}
})


@@ -91,6 +91,17 @@
aria-label="Search filters"
>
<!-- Clear all filters only shown when at least one filter is active -->
<button
v-if="activeFilterCount > 0"
type="button"
class="filter-clear-btn"
@click="resetFilters"
aria-label="Clear all filters"
>
Clear filters ({{ activeFilterCount }})
</button>
<!-- eBay Search Parameters -->
<!-- These are sent to eBay. Changes require a new search to take effect. -->
<h2 class="filter-section-heading filter-section-heading--search">
@@ -301,6 +312,11 @@
</span>
</p>
<div class="toolbar-actions">
<!-- Live enrichment indicator visible while SSE stream is open -->
<span v-if="store.enriching" class="enriching-badge" aria-live="polite" title="Scores updating as seller data arrives">
<span class="enriching-dot" aria-hidden="true"></span>
Updating scores
</span>
<label for="sort-select" class="sr-only">Sort by</label>
<select id="sort-select" v-model="sortBy" class="sort-select">
<option v-for="opt in SORT_OPTIONS" :key="opt.value" :value="opt.value">
@@ -405,7 +421,7 @@ onMounted(() => {
// Filters
-const filters = reactive<SearchFilters>({
+const DEFAULT_FILTERS: SearchFilters = {
minTrustScore: 0,
minPrice: undefined,
maxPrice: undefined,
@@ -424,7 +440,13 @@ const filters = reactive<SearchFilters>({
mustExclude: '',
categoryId: '',
adapter: 'auto' as 'auto' | 'api' | 'scraper',
-})
+}
const filters = reactive<SearchFilters>({ ...DEFAULT_FILTERS })
function resetFilters() {
Object.assign(filters, DEFAULT_FILTERS)
}
// Parse comma-separated keyword strings into trimmed, lowercase, non-empty term arrays
const parsedMustInclude = computed(() =>
@@ -767,6 +789,27 @@ async function onSearch() {
margin-bottom: var(--space-2);
}
/* Clear all filters button */
.filter-clear-btn {
display: flex;
align-items: center;
gap: var(--space-1);
width: 100%;
padding: var(--space-1) var(--space-2);
margin-bottom: var(--space-2);
background: color-mix(in srgb, var(--color-red, #ef4444) 12%, transparent);
color: var(--color-red, #ef4444);
border: 1px solid color-mix(in srgb, var(--color-red, #ef4444) 30%, transparent);
border-radius: var(--radius-sm);
font-size: 0.75rem;
font-weight: 600;
cursor: pointer;
transition: background 0.15s, color 0.15s;
}
.filter-clear-btn:hover {
background: color-mix(in srgb, var(--color-red, #ef4444) 22%, transparent);
}
/* Section headings that separate eBay Search params from local filters */
.filter-section-heading {
font-size: 0.6875rem;
@@ -1041,6 +1084,33 @@
flex-wrap: wrap;
}
.enriching-badge {
display: inline-flex;
align-items: center;
gap: var(--space-1);
padding: var(--space-1) var(--space-2);
background: color-mix(in srgb, var(--app-primary) 10%, transparent);
border: 1px solid color-mix(in srgb, var(--app-primary) 30%, transparent);
border-radius: var(--radius-full, 9999px);
color: var(--app-primary);
font-size: 0.75rem;
font-weight: 500;
white-space: nowrap;
}
.enriching-dot {
width: 6px;
height: 6px;
border-radius: 50%;
background: var(--app-primary);
animation: enriching-pulse 1.2s ease-in-out infinite;
}
@keyframes enriching-pulse {
0%, 100% { opacity: 1; transform: scale(1); }
50% { opacity: 0.4; transform: scale(0.7); }
}
.save-btn {
display: flex;
align-items: center;