Compare commits
20 commits: feature/sh ... main
| SHA1 |
|---|
| 303b4bfb6f |
| 45c758bb53 |
| 59f728cba0 |
| 81e41e39ab |
| 234c76e686 |
| 7672dd758a |
| 663d92fc11 |
| c2fa107c47 |
| 1ca9398df4 |
| 5a3f9cb460 |
| f7d5b20aa5 |
| bdbcb046cc |
| ccbbe58bd4 |
| c5988a059d |
| 0a93b7386a |
| 860276420e |
| 0430454dad |
| 3f7c2b9135 |
| 0617fc8256 |
| d5419d2b1b |
17 changed files with 1242 additions and 41 deletions
36  .env.example
@@ -46,3 +46,39 @@ SNIPE_DB=data/snipe.db
 # Heimdall license server — for tier resolution and free-key auto-provisioning.
 # HEIMDALL_URL=https://license.circuitforge.tech
 # HEIMDALL_ADMIN_TOKEN=
+
+# ── eBay Affiliate (optional) ─────────────────────────────────────────────────
+# Set to your eBay Partner Network (EPN) campaign ID to earn commissions on
+# listing click-throughs. Leave blank for clean /itm/ URLs (no tracking).
+# Register at https://partnernetwork.ebay.com — self-hosted users can use their
+# own ID; the CF cloud instance uses CF's campaign ID (disclosed in the UI).
+# EBAY_AFFILIATE_CAMPAIGN_ID=
+
+# ── LLM inference (vision / photo analysis) ──────────────────────────────────
+# circuitforge-core LLMRouter auto-detects backends from these env vars
+# (no llm.yaml required). Backends are tried in this priority order:
+# 1. ANTHROPIC_API_KEY → Claude API (cloud; requires Paid tier key)
+# 2. OPENAI_API_KEY → OpenAI-compatible endpoint
+# 3. OLLAMA_HOST → local Ollama (default: http://localhost:11434)
+# Leave all unset to disable LLM features (photo analysis won't run).
+
+# ANTHROPIC_API_KEY=
+# ANTHROPIC_MODEL=claude-haiku-4-5-20251001
+
+# OPENAI_API_KEY=
+# OPENAI_BASE_URL=https://api.openai.com/v1
+# OPENAI_MODEL=gpt-4o-mini
+
+# OLLAMA_HOST=http://localhost:11434
+# OLLAMA_MODEL=llava:7b
+
+# CF Orchestrator — managed inference for Paid+ cloud users (internal use only).
+# Self-hosted users leave this unset; it has no effect without a valid allocation token.
+# CF_ORCH_URL=https://orch.circuitforge.tech
+
+# ── In-app feedback (beta) ────────────────────────────────────────────────────
+# When set, a feedback FAB appears in the UI and routes submissions to Forgejo.
+# Leave unset to silently hide the button (demo/offline deployments).
+# FORGEJO_API_TOKEN=
+# FORGEJO_REPO=Circuit-Forge/snipe
+# FORGEJO_API_URL=https://git.opensourcesolarpunk.com/api/v1
60  README.md
@@ -2,7 +2,51 @@

 > *Part of the Circuit Forge LLC "AI for the tasks you hate most" suite.*

-**Status:** Active — eBay listing search + seller trust scoring MVP complete. Auction sniping engine and multi-platform support are next.
+**Status:** Active — eBay listing intelligence MVP complete (search, trust scoring, affiliate links, feedback FAB, vision task scheduling). Auction sniping engine and multi-platform support are next.
+
+## Quick install (self-hosted)
+
+**Requirements:** Docker with Compose plugin, Git. No API keys needed to get started.
+
+```bash
+# One-line install — clones to ~/snipe by default
+bash <(curl -fsSL https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/raw/branch/main/install.sh)
+
+# Or clone manually and run the script:
+git clone https://git.opensourcesolarpunk.com/Circuit-Forge/snipe.git
+bash snipe/install.sh
+```
+
+Then open **http://localhost:8509**.
+
+### Manual setup (if you prefer)
+
+Snipe's API image is built from a parent context that includes `circuitforge-core`. Both repos must sit as siblings in the same directory:
+
+```
+workspace/
+├── snipe/              ← this repo
+└── circuitforge-core/  ← required sibling
+```
+
+```bash
+mkdir snipe-workspace && cd snipe-workspace
+git clone https://git.opensourcesolarpunk.com/Circuit-Forge/snipe.git
+git clone https://git.opensourcesolarpunk.com/Circuit-Forge/circuitforge-core.git
+cd snipe
+cp .env.example .env   # edit if you have eBay API credentials (optional)
+./manage.sh start
+```
+
+### Optional: eBay API credentials
+
+Snipe works without any credentials using its Playwright scraper fallback. Adding eBay API credentials unlocks faster searches and inline seller account age (no extra scrape needed):
+
+1. Register at [developer.ebay.com](https://developer.ebay.com/my/keys)
+2. Copy your Production **App ID** and **Cert ID** into `.env`
+3. Restart: `./manage.sh restart`
+
+---

 ## What it does
@@ -68,6 +112,20 @@ Scans listing titles for signals the item may have undisclosed damage or problem
 - **On-demand**: ↻ button on any listing card triggers `POST /api/enrich` — runs enrichment and re-scores without waiting for a second search
 - **Category history**: derived from the seller's accumulated listing data (Browse API `categories` field); improves with every search, no extra API calls
+
+### Affiliate link builder
+
+Listing cards surface eBay affiliate-wrapped URLs. Uses `circuitforge_core.affiliates.wrap_url` — resolution order: user opted out → plain URL; user has BYOK affiliate ID → their ID; CF env var set (`EBAY_AFFILIATE_ID`) → CF's ID; otherwise plain URL. Users can configure their own eBay Partner Network ID or opt out entirely in Settings.
+
+Disclosure tooltip appears on first encounter per-session and on each wrapped link (per-retailer copy from `get_disclosure_text`).
+
+### Feedback FAB
+
+In-app feedback button (bottom-right FAB) opens a modal: title, description, optional screenshot. Posts to the CF feedback endpoint. Status probed on load; FAB hidden if endpoint unreachable.
+
+### Vision task scheduling
+
+Photo condition assessment tasks queued through `circuitforge_core.tasks.TaskScheduler` — VRAM-aware slot management shared with any other LLM workloads on the same host. Runs moondream2 locally (free tier) or Claude vision (paid/cloud). Results stored per-listing and update the trust score card.
+
 ### Market price comparison
 Completed sales fetched via eBay Marketplace Insights API (with Browse API fallback for app tiers that don't have Insights access). Median stored per query hash, used to score `price_vs_market` across all listings in a search.
176  api/main.py
@@ -3,20 +3,27 @@ from __future__ import annotations

 import dataclasses
 import hashlib
+import json as _json
 import logging
 import os
+import queue as _queue
+import uuid
 from concurrent.futures import ThreadPoolExecutor
 from contextlib import asynccontextmanager
 from pathlib import Path

+import asyncio
 import csv
 import io
-from fastapi import Depends, FastAPI, HTTPException, UploadFile, File
+from fastapi import Depends, FastAPI, HTTPException, Request, UploadFile, File
 from fastapi.responses import StreamingResponse
 from pydantic import BaseModel
 from fastapi.middleware.cors import CORSMiddleware

 from circuitforge_core.config import load_env
+from circuitforge_core.affiliates import wrap_url as _wrap_affiliate_url
+from circuitforge_core.api import make_feedback_router as _make_feedback_router
 from app.db.store import Store
 from app.db.models import SavedSearch as SavedSearchModel, ScammerEntry
 from app.platforms import SearchFilters
@@ -31,6 +38,12 @@ from api.ebay_webhook import router as ebay_webhook_router
 load_env(Path(".env"))
 log = logging.getLogger(__name__)

+# ── SSE update registry ───────────────────────────────────────────────────────
+# Maps session_id → SimpleQueue of update events.
+# SimpleQueue is always thread-safe; no asyncio loop needed to write from threads.
+# Keys are cleaned up when the SSE stream ends (client disconnect or timeout).
+_update_queues: dict[str, _queue.SimpleQueue] = {}
+

 @asynccontextmanager
 async def _lifespan(app: FastAPI):
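The registry comment above leans on `queue.SimpleQueue` being safe to write from plain threads with no event loop. A standalone illustration of that producer/consumer shape (sketch only, not app code):

```python
import queue
import threading

# One SimpleQueue per session: a daemon worker thread puts updates,
# the reader drains with get_nowait() and treats None as a done sentinel.
def run_session() -> list[str]:
    q: queue.SimpleQueue = queue.SimpleQueue()

    def worker() -> None:
        for sid in ("seller-a", "seller-b"):
            q.put(sid)
        q.put(None)  # sentinel: worker finished

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    t.join()  # deterministic here; the API drains concurrently instead

    drained: list[str] = []
    while True:
        try:
            item = q.get_nowait()
        except queue.Empty:
            break
        if item is None:
            break
        drained.append(item)
    return drained
```

Unlike `queue.Queue`, `SimpleQueue.put` never blocks and holds no lock state that an async reader needs to coordinate with, which is why the registry can be written from Playwright worker threads directly.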
@@ -63,6 +76,7 @@ def _ebay_creds() -> tuple[str, str, str]:
     client_secret = (os.environ.get("EBAY_CERT_ID") or os.environ.get("EBAY_CLIENT_SECRET", "")).strip()
     return client_id, client_secret, env


 app = FastAPI(title="Snipe API", version="0.1.0", lifespan=_lifespan)
 app.include_router(ebay_webhook_router)
@@ -73,6 +87,12 @@ app.add_middleware(
     allow_headers=["*"],
 )

+_feedback_router = _make_feedback_router(
+    repo="Circuit-Forge/snipe",
+    product="snipe",
+)
+app.include_router(_feedback_router, prefix="/api/feedback")
+

 @app.get("/api/health")
 def health():
@@ -99,31 +119,34 @@ def _trigger_scraper_enrichment(
     listings: list,
     shared_store: Store,
     shared_db: Path,
+    user_db: Path | None = None,
+    query: str = "",
+    session_id: str | None = None,
 ) -> None:
     """Fire-and-forget background enrichment for missing seller signals.

-    Two enrichment passes run concurrently in the same daemon thread:
+    Two enrichment passes run in the same daemon thread:
     1. BTF (/itm/ pages) — fills account_age_days for sellers where it is None.
     2. _ssn search pages — fills category_history_json for sellers with no history.

-    The main response returns immediately; enriched data lands in the DB for
-    future searches. Uses ScrapedEbayAdapter's Playwright stack regardless of
-    which adapter was used for the main search (Shopping API handles age for
-    the API adapter inline; BTF is the fallback for no-creds / scraper mode).
+    When session_id is provided, pushes re-scored trust score updates to the
+    SSE queue after each pass so the frontend can update scores live.

     shared_store: used for pre-flight seller checks (same-thread reads).
-    shared_db: path passed to background thread — it creates its own Store
-    (sqlite3 connections are not thread-safe).
+    shared_db: path passed to background thread (sqlite3 is not thread-safe).
+    user_db: path to per-user listings/trust_scores DB (same as shared_db in local mode).
+    query: original search query — used for market comp lookup during re-score.
+    session_id: SSE session key; if set, updates are pushed to _update_queues[session_id].
     """
-    # Caps per search: limits Playwright sessions launched in the background so we
-    # don't hammer Kasada or spin up dozens of Xvfb instances after a large search.
-    # Remaining sellers get enriched incrementally on subsequent searches.
     _BTF_MAX_PER_SEARCH = 3
     _CAT_MAX_PER_SEARCH = 3

     needs_btf: dict[str, str] = {}
     needs_categories: list[str] = []

+    # Map seller_id → [listings] for this search so we know what to re-score
+    seller_listing_map: dict[str, list] = {}
+
     for listing in listings:
         sid = listing.seller_platform_id
         if not sid:
@@ -131,6 +154,7 @@ def _trigger_scraper_enrichment(
         seller = shared_store.get_seller("ebay", sid)
         if not seller:
             continue
+        seller_listing_map.setdefault(sid, []).append(listing)
         if ((seller.account_age_days is None or seller.feedback_count == 0)
                 and sid not in needs_btf
                 and len(needs_btf) < _BTF_MAX_PER_SEARCH):
@@ -141,6 +165,8 @@ def _trigger_scraper_enrichment(
             needs_categories.append(sid)

     if not needs_btf and not needs_categories:
+        if session_id and session_id in _update_queues:
+            _update_queues[session_id].put(None)  # sentinel — nothing to enrich
         return

     log.info(
@@ -148,17 +174,55 @@ def _trigger_scraper_enrichment(
         len(needs_btf), len(needs_categories),
     )

+    def _push_updates(enriched_seller_ids: list[str]) -> None:
+        """Re-score listings for enriched sellers and push updates to SSE queue."""
+        if not session_id or session_id not in _update_queues:
+            return
+        q = _update_queues[session_id]
+        thread_shared = Store(shared_db)
+        thread_user = Store(user_db or shared_db)
+        scorer = TrustScorer(thread_shared)
+        comp = thread_shared.get_market_comp("ebay", hashlib.md5(query.encode()).hexdigest())
+        market_price = comp.median_price if comp else None
+        for sid in enriched_seller_ids:
+            seller = thread_shared.get_seller("ebay", sid)
+            if not seller:
+                continue
+            affected = seller_listing_map.get(sid, [])
+            if not affected:
+                continue
+            new_scores = scorer.score_batch(affected, query)
+            thread_user.save_trust_scores(new_scores)
+            for listing, ts in zip(affected, new_scores):
+                if ts is None:
+                    continue
+                q.put({
+                    "platform_listing_id": listing.platform_listing_id,
+                    "trust_score": dataclasses.asdict(ts),
+                    "seller": dataclasses.asdict(seller),
+                    "market_price": market_price,
+                })
+
     def _run():
         try:
             enricher = ScrapedEbayAdapter(Store(shared_db))
             if needs_btf:
                 enricher.enrich_sellers_btf(needs_btf, max_workers=2)
                 log.info("BTF enrichment complete for %d sellers", len(needs_btf))
+                _push_updates(list(needs_btf.keys()))
             if needs_categories:
                 enricher.enrich_sellers_categories(needs_categories, max_workers=2)
                 log.info("Category enrichment complete for %d sellers", len(needs_categories))
+                # Re-score only sellers not already covered by BTF push
+                cat_only = [s for s in needs_categories if s not in needs_btf]
+                if cat_only:
+                    _push_updates(cat_only)
         except Exception as e:
             log.warning("Scraper enrichment failed: %s", e)
+        finally:
+            # Sentinel: tells SSE stream the enrichment thread is done
+            if session_id and session_id in _update_queues:
+                _update_queues[session_id].put(None)

     import threading
     t = threading.Thread(target=_run, daemon=True)
@@ -355,9 +419,15 @@ def search(
     listings = [staged.get(l.platform_listing_id, l) for l in listings]

     # BTF enrichment: scrape /itm/ pages for sellers missing account_age_days.
-    # Runs in the background so it doesn't delay the response; next search of
-    # the same sellers will have full scores.
-    _trigger_scraper_enrichment(listings, shared_store, shared_db)
+    # Runs in the background so it doesn't delay the response. A session_id is
+    # generated so the frontend can open an SSE stream and receive live score
+    # updates as enrichment completes.
+    session_id = str(uuid.uuid4())
+    _update_queues[session_id] = _queue.SimpleQueue()
+    _trigger_scraper_enrichment(
+        listings, shared_store, shared_db,
+        user_db=user_db, query=q, session_id=session_id,
+    )

     scorer = TrustScorer(shared_store)
     trust_scores_list = scorer.score_batch(listings, q)
@@ -389,12 +459,19 @@ def search(
         and shared_store.get_seller("ebay", listing.seller_platform_id)
     }

+    def _serialize_listing(l: object) -> dict:
+        d = dataclasses.asdict(l)
+        d["url"] = _wrap_affiliate_url(d["url"], retailer="ebay")
+        return d
+
     return {
-        "listings": [dataclasses.asdict(l) for l in listings],
+        "listings": [_serialize_listing(l) for l in listings],
         "trust_scores": trust_map,
         "sellers": seller_map,
         "market_price": market_price,
         "adapter_used": adapter_used,
+        "affiliate_active": bool(os.environ.get("EBAY_AFFILIATE_CAMPAIGN_ID", "").strip()),
+        "session_id": session_id,
     }
@@ -492,6 +569,73 @@ def enrich_seller(
     }


+# ── SSE live score updates ────────────────────────────────────────────────────
+
+@app.get("/api/updates/{session_id}")
+async def stream_updates(session_id: str, request: Request):
+    """Server-Sent Events stream for live trust score updates.
+
+    Opens after a search when any listings have score_is_partial=true.
+    Streams re-scored trust score payloads as enrichment completes, then
+    sends a 'done' event and closes.
+
+    Each event payload:
+        { platform_listing_id, trust_score, seller, market_price }
+
+    Closes automatically after 90 seconds (worst-case Playwright enrichment).
+    The client should also close on 'done' event.
+    """
+    if session_id not in _update_queues:
+        raise HTTPException(status_code=404, detail="Unknown session_id")
+
+    q = _update_queues[session_id]
+    deadline = asyncio.get_event_loop().time() + 90.0
+    heartbeat_interval = 15.0
+    next_heartbeat = asyncio.get_event_loop().time() + heartbeat_interval
+
+    async def generate():
+        nonlocal next_heartbeat
+        try:
+            while asyncio.get_event_loop().time() < deadline:
+                if await request.is_disconnected():
+                    break
+
+                # Drain all available updates (non-blocking)
+                while True:
+                    try:
+                        item = q.get_nowait()
+                    except _queue.Empty:
+                        break
+                    if item is None:
+                        # Sentinel: enrichment thread is done
+                        yield "event: done\ndata: {}\n\n"
+                        return
+                    yield f"data: {_json.dumps(item)}\n\n"
+
+                # Heartbeat to keep the connection alive through proxies
+                now = asyncio.get_event_loop().time()
+                if now >= next_heartbeat:
+                    yield ": heartbeat\n\n"
+                    next_heartbeat = now + heartbeat_interval
+
+                await asyncio.sleep(0.5)
+
+            # Timeout reached
+            yield "event: done\ndata: {\"reason\": \"timeout\"}\n\n"
+        finally:
+            _update_queues.pop(session_id, None)
+
+    return StreamingResponse(
+        generate(),
+        media_type="text/event-stream",
+        headers={
+            "Cache-Control": "no-cache",
+            "X-Accel-Buffering": "no",  # nginx: disable proxy buffering for SSE
+            "Connection": "keep-alive",
+        },
+    )
+
+
 # ── Saved Searches ────────────────────────────────────────────────────────────

 class SavedSearchCreate(BaseModel):
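A client consuming `/api/updates/{session_id}` only has to split the stream on blank lines and honor `event:`/`data:` fields, ignoring `:` comment lines such as the heartbeats. A minimal framing parser sketch (transport and reconnection omitted; field handling follows the general SSE wire format, not any Snipe-specific client):

```python
import json

def parse_sse_chunk(chunk: str) -> list[tuple[str, dict]]:
    """Parse raw SSE text into (event, payload) pairs.

    Comment lines (': heartbeat') are ignored; events default to 'message'.
    """
    events: list[tuple[str, dict]] = []
    for frame in chunk.split("\n\n"):
        event, data = "message", None
        for line in frame.splitlines():
            if line.startswith(":"):
                continue  # heartbeat / comment line
            if line.startswith("event:"):
                event = line[len("event:"):].strip()
            elif line.startswith("data:"):
                data = json.loads(line[len("data:"):].strip())
        if data is not None:
            events.append((event, data))
    return events
```

Score updates arrive as default `message` events carrying the payload shape documented in the endpoint docstring; the `done` event signals the client to close.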
@@ -629,3 +773,5 @@ async def import_blocklist(
     log.info("Blocklist import: %d added, %d errors", imported, len(errors))
     return {"imported": imported, "errors": errors}
@@ -1,9 +1,9 @@
 -- Staging DB: persistent listing tracking across searches.
 -- Adds temporal metadata to listings so we can detect stale/repriced/recurring items.

 ALTER TABLE listings ADD COLUMN first_seen_at TEXT;
 ALTER TABLE listings ADD COLUMN last_seen_at TEXT;
 ALTER TABLE listings ADD COLUMN times_seen INTEGER NOT NULL DEFAULT 1;
 ALTER TABLE listings ADD COLUMN price_at_first_seen REAL;

 -- Backfill existing rows so columns are non-null where we have data
@@ -52,6 +52,7 @@ class TrustScorer:
             signal_scores, is_dup, seller,
             listing_id=listing.id or 0,
             listing_title=listing.title,
+            listing_condition=listing.condition,
             times_seen=listing.times_seen,
             first_seen_at=listing.first_seen_at,
             price=listing.price,
@@ -23,8 +23,9 @@ _SCRATCH_DENT_KEYWORDS = frozenset([
     "crack", "cracked", "chip", "chipped",
     "damage", "damaged", "cosmetic damage",
     "blemish", "wear", "worn", "worn in",
-    # Parts / condition catch-alls
+    # Parts / condition catch-alls (also matches eBay condition field strings verbatim)
     "as is", "for parts", "parts only", "spares or repair", "parts or repair",
+    "parts/repair", "parts or not working", "not working",
     # Evasive redirects — seller hiding damage detail in listing body
     "see description", "read description", "read listing", "see listing",
     "see photos for", "see pics for", "see images for",
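How a keyword set like the one above gets applied to title and condition strings can be sketched as a case-insensitive whole-phrase match. This is a hypothetical stand-in for `_has_damage_keywords`, whose real normalization is not shown in this diff, and it uses a trimmed keyword subset:

```python
import re

# Trimmed subset of the frozenset above, for illustration only.
_KEYWORDS = frozenset(["for parts", "not working", "cracked", "as is", "see description"])

def has_damage_keywords(text: str) -> bool:
    """True when any keyword appears as a whole-word phrase (case-insensitive)."""
    lowered = text.lower()
    return any(
        re.search(rf"\b{re.escape(kw)}\b", lowered) is not None
        for kw in _KEYWORDS
    )
```

Word boundaries keep "as is" from firing inside unrelated words, and lowercasing is what lets eBay condition field strings like "For parts or not working" match the same set as listing titles.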
@@ -72,6 +73,7 @@ class Aggregator:
         seller: Optional[Seller],
         listing_id: int = 0,
         listing_title: str = "",
+        listing_condition: str = "",
         times_seen: int = 1,
         first_seen_at: Optional[str] = None,
         price: float = 0.0,
@@ -137,7 +139,9 @@ class Aggregator:
         )
         if photo_hash_duplicate and not is_established_retailer:
             red_flags.append("duplicate_photo")
-        if listing_title and _has_damage_keywords(listing_title):
+        if (listing_title and _has_damage_keywords(listing_title)) or (
+            listing_condition and _has_damage_keywords(listing_condition)
+        ):
             red_flags.append("scratch_dent_mentioned")

         # Staging DB signals
@@ -1,21 +1,17 @@
+# compose.override.yml — dev-only additions (auto-applied by Docker Compose in dev).
+# Safe to delete on a self-hosted machine — compose.yml is self-contained.
+#
+# What this adds over compose.yml:
+# - Live source mounts so code changes take effect without rebuilding images
+# - RELOAD=true to enable uvicorn --reload for the API
+# - NOTE: circuitforge-core is NOT mounted here — use `./manage.sh build` to
+#   pick up cf-core changes. Mounting it as a bind volume would break self-hosted
+#   installs that don't have the sibling directory.
 services:
   api:
-    build:
-      context: ..
-      dockerfile: snipe/Dockerfile
-    network_mode: host
     volumes:
-      - ../circuitforge-core:/app/circuitforge-core
       - ./api:/app/snipe/api
       - ./app:/app/snipe/app
-      - ./data:/app/snipe/data
       - ./tests:/app/snipe/tests
     environment:
       - RELOAD=true
-
-  web:
-    build:
-      context: .
-      dockerfile: docker/web/Dockerfile
-    volumes:
-      - ./web/src:/app/src  # not used at runtime but keeps override valid
@@ -3,11 +3,14 @@ services:
     build:
       context: ..
       dockerfile: snipe/Dockerfile
-    ports:
-      - "8510:8510"
+    # Host networking lets nginx (in the web container) reach the API at
+    # 172.17.0.1:8510 (the Docker bridge gateway). Required — nginx.conf
+    # is baked into the image and hard-codes that address.
+    network_mode: host
     env_file: .env
     volumes:
       - ./data:/app/snipe/data
+    restart: unless-stopped

   web:
     build:
226  install.sh (executable file)
@@ -0,0 +1,226 @@
+#!/usr/bin/env bash
+# Snipe — self-hosted install script
+#
+# Supports two install paths:
+#   Docker (recommended) — everything in containers, no system Python deps required
+#   No-Docker — conda or venv + direct uvicorn, for machines without Docker
+#
+# Usage:
+#   bash install.sh                      # installs to ~/snipe
+#   bash install.sh /opt/snipe           # custom install directory
+#   bash install.sh ~/snipe --no-docker  # force no-Docker path even if Docker present
+#
+# Requirements (Docker path): Docker with Compose plugin, Git
+# Requirements (no-Docker path): Python 3.11+, Node.js 20+, Git, xvfb (system)
+
+set -euo pipefail
+
+INSTALL_DIR="${1:-$HOME/snipe}"
+FORCE_NO_DOCKER="${2:-}"
+FORGEJO="https://git.opensourcesolarpunk.com/Circuit-Forge"
+CONDA_ENV="cf"
+
+info() { echo "  [snipe] $*"; }
+ok()   { echo "✓ $*"; }
+warn() { echo "! $*"; }
+fail() { echo "✗ $*" >&2; exit 1; }
+hr()   { echo "────────────────────────────────────────────────────────"; }
+
+echo ""
+echo "  Snipe — self-hosted installer"
+echo "  Install directory: $INSTALL_DIR"
+echo ""
+
+# ── Detect capabilities ──────────────────────────────────────────────────────
+
+HAS_DOCKER=false
+HAS_CONDA=false
+HAS_PYTHON=false
+HAS_NODE=false
+
+docker compose version >/dev/null 2>&1 && HAS_DOCKER=true
+conda --version >/dev/null 2>&1 && HAS_CONDA=true
+python3 --version >/dev/null 2>&1 && HAS_PYTHON=true
+node --version >/dev/null 2>&1 && HAS_NODE=true
+command -v git >/dev/null 2>&1 || fail "Git is required. Install with: sudo apt-get install git"
+
+# Honour --no-docker flag
+[[ "$FORCE_NO_DOCKER" == "--no-docker" ]] && HAS_DOCKER=false
+
+if $HAS_DOCKER; then
+  INSTALL_PATH="docker"
+  ok "Docker found — using Docker install path (recommended)"
+elif $HAS_PYTHON; then
+  INSTALL_PATH="python"
+  warn "Docker not found — using no-Docker path (conda or venv)"
+else
+  fail "Docker or Python 3.11+ is required. Install Docker: https://docs.docker.com/get-docker/"
+fi
+
+# ── Clone repos ──────────────────────────────────────────────────────────────
+
+# compose.yml and the Dockerfile both use context: .. (parent directory), so
+# snipe/ and circuitforge-core/ must be siblings inside INSTALL_DIR.
+SNIPE_DIR="$INSTALL_DIR/snipe"
+CORE_DIR="$INSTALL_DIR/circuitforge-core"
+
+if [[ -d "$SNIPE_DIR" ]]; then
+  info "Snipe already cloned — pulling latest..."
+  git -C "$SNIPE_DIR" pull --ff-only
+else
+  info "Cloning Snipe..."
+  mkdir -p "$INSTALL_DIR"
+  git clone "$FORGEJO/snipe.git" "$SNIPE_DIR"
+fi
+ok "Snipe → $SNIPE_DIR"
+
+if [[ -d "$CORE_DIR" ]]; then
+  info "circuitforge-core already cloned — pulling latest..."
+  git -C "$CORE_DIR" pull --ff-only
+else
+  info "Cloning circuitforge-core (shared library)..."
+  git clone "$FORGEJO/circuitforge-core.git" "$CORE_DIR"
+fi
+ok "circuitforge-core → $CORE_DIR"
+
+# ── Configure environment ────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
ENV_FILE="$SNIPE_DIR/.env"
|
||||||
|
if [[ ! -f "$ENV_FILE" ]]; then
|
||||||
|
cp "$SNIPE_DIR/.env.example" "$ENV_FILE"
|
||||||
|
# Safe defaults for local installs — no eBay registration, no Heimdall
|
||||||
|
sed -i 's/^EBAY_WEBHOOK_VERIFY_SIGNATURES=true/EBAY_WEBHOOK_VERIFY_SIGNATURES=false/' "$ENV_FILE"
|
||||||
|
ok ".env created from .env.example"
|
||||||
|
echo ""
|
||||||
|
info "Snipe works out of the box with no API keys."
|
||||||
|
info "Add EBAY_APP_ID / EBAY_CERT_ID later for faster searches (optional)."
|
||||||
|
echo ""
|
||||||
|
else
|
||||||
|
info ".env already exists — skipping (delete it to reset)"
|
||||||
|
fi
|
||||||
|
|
||||||
|
cd "$SNIPE_DIR"
|
||||||
|
|
||||||
|
# ── Docker install path ───────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
if [[ "$INSTALL_PATH" == "docker" ]]; then
|
||||||
|
info "Building Docker images (~1 GB download on first run)..."
|
||||||
|
docker compose build
|
||||||
|
|
||||||
|
info "Starting Snipe..."
|
||||||
|
docker compose up -d
|
||||||
|
|
||||||
|
echo ""
|
||||||
|
ok "Snipe is running!"
|
||||||
|
hr
|
||||||
|
echo " Web UI: http://localhost:8509"
|
||||||
|
echo " API: http://localhost:8510/docs"
|
||||||
|
echo ""
|
||||||
|
echo " Manage: cd $SNIPE_DIR && ./manage.sh {start|stop|restart|logs|test}"
|
||||||
|
hr
|
||||||
|
echo ""
|
||||||
|
exit 0
|
||||||
|
fi
|
||||||
|
|
||||||
|
# ── No-Docker install path ───────────────────────────────────────────────────
|
||||||
|
|
||||||
|
# System deps: Xvfb is required for Playwright (Kasada bypass via headed Chromium)
|
||||||
|
if ! command -v Xvfb >/dev/null 2>&1; then
|
||||||
|
info "Installing Xvfb (required for eBay scraper)..."
|
||||||
|
if command -v apt-get >/dev/null 2>&1; then
|
||||||
|
sudo apt-get install -y --no-install-recommends xvfb
|
||||||
|
elif command -v dnf >/dev/null 2>&1; then
|
||||||
|
sudo dnf install -y xorg-x11-server-Xvfb
|
||||||
|
elif command -v brew >/dev/null 2>&1; then
|
||||||
|
warn "macOS: Xvfb not available. The scraper fallback may fail."
|
||||||
|
warn "Add eBay API credentials to .env to use the API adapter instead."
|
||||||
|
else
|
||||||
|
warn "Could not install Xvfb automatically. Install it with your package manager."
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
# ── Python environment setup ─────────────────────────────────────────────────
|
||||||
|
|
||||||
|
if $HAS_CONDA; then
|
||||||
|
info "Setting up conda environment '$CONDA_ENV'..."
|
||||||
|
if conda env list | grep -q "^$CONDA_ENV "; then
|
||||||
|
info "Conda env '$CONDA_ENV' already exists — updating..."
|
||||||
|
conda run -n "$CONDA_ENV" pip install --quiet -e "$CORE_DIR"
|
||||||
|
conda run -n "$CONDA_ENV" pip install --quiet -e "$SNIPE_DIR"
|
||||||
|
else
|
||||||
|
conda create -n "$CONDA_ENV" python=3.11 -y
|
||||||
|
conda run -n "$CONDA_ENV" pip install --quiet -e "$CORE_DIR"
|
||||||
|
conda run -n "$CONDA_ENV" pip install --quiet -e "$SNIPE_DIR"
|
||||||
|
fi
|
||||||
|
conda run -n "$CONDA_ENV" playwright install chromium
|
||||||
|
conda run -n "$CONDA_ENV" playwright install-deps chromium
|
||||||
|
PYTHON_RUN="conda run -n $CONDA_ENV"
|
||||||
|
ok "Conda environment '$CONDA_ENV' ready"
|
||||||
|
else
|
||||||
|
info "Setting up Python venv at $SNIPE_DIR/.venv ..."
|
||||||
|
python3 -m venv "$SNIPE_DIR/.venv"
|
||||||
|
"$SNIPE_DIR/.venv/bin/pip" install --quiet -e "$CORE_DIR"
|
||||||
|
"$SNIPE_DIR/.venv/bin/pip" install --quiet -e "$SNIPE_DIR"
|
||||||
|
"$SNIPE_DIR/.venv/bin/playwright" install chromium
|
||||||
|
"$SNIPE_DIR/.venv/bin/playwright" install-deps chromium
|
||||||
|
PYTHON_RUN="$SNIPE_DIR/.venv/bin"
|
||||||
|
ok "Python venv ready at $SNIPE_DIR/.venv"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# ── Frontend ─────────────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
if $HAS_NODE; then
|
||||||
|
info "Building Vue frontend..."
|
||||||
|
cd "$SNIPE_DIR/web"
|
||||||
|
npm ci --prefer-offline --silent
|
||||||
|
npm run build
|
||||||
|
cd "$SNIPE_DIR"
|
||||||
|
ok "Frontend built → web/dist/"
|
||||||
|
else
|
||||||
|
warn "Node.js not found — skipping frontend build."
|
||||||
|
warn "Install Node.js 20+ from https://nodejs.org and re-run install.sh to build the UI."
|
||||||
|
warn "Until then, you can access the API directly at http://localhost:8510/docs"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# ── Write start/stop scripts ─────────────────────────────────────────────────
|
||||||
|
|
||||||
|
cat > "$SNIPE_DIR/start-local.sh" << 'STARTSCRIPT'
|
||||||
|
#!/usr/bin/env bash
|
||||||
|
# Start Snipe without Docker (API only — run from the snipe/ directory)
|
||||||
|
set -euo pipefail
|
||||||
|
cd "$(dirname "$0")"
|
||||||
|
|
||||||
|
if [[ -f .venv/bin/uvicorn ]]; then
|
||||||
|
UVICORN=".venv/bin/uvicorn"
|
||||||
|
elif command -v conda >/dev/null 2>&1 && conda env list | grep -q "^cf "; then
|
||||||
|
UVICORN="conda run -n cf uvicorn"
|
||||||
|
else
|
||||||
|
echo "No Python env found. Run install.sh first." >&2; exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
mkdir -p data
|
||||||
|
echo "Starting Snipe API on http://localhost:8510 ..."
|
||||||
|
$UVICORN api.main:app --host 0.0.0.0 --port 8510 "${@}"
|
||||||
|
STARTSCRIPT
|
||||||
|
chmod +x "$SNIPE_DIR/start-local.sh"
|
||||||
|
|
||||||
|
# Frontend serving (if built)
|
||||||
|
cat > "$SNIPE_DIR/serve-ui.sh" << 'UISCRIPT'
|
||||||
|
#!/usr/bin/env bash
|
||||||
|
# Serve the pre-built Vue frontend on port 8509 (dev only — use nginx for production)
|
||||||
|
cd "$(dirname "$0")/web/dist"
|
||||||
|
python3 -m http.server 8509
|
||||||
|
UISCRIPT
|
||||||
|
chmod +x "$SNIPE_DIR/serve-ui.sh"
|
||||||
|
|
||||||
|
echo ""
|
||||||
|
ok "Snipe installed (no-Docker mode)"
|
||||||
|
hr
|
||||||
|
echo " Start API: cd $SNIPE_DIR && ./start-local.sh"
|
||||||
|
echo " Serve UI: cd $SNIPE_DIR && ./serve-ui.sh (separate terminal)"
|
||||||
|
echo " API docs: http://localhost:8510/docs"
|
||||||
|
echo " Web UI: http://localhost:8509 (after ./serve-ui.sh)"
|
||||||
|
echo ""
|
||||||
|
echo " For production, point nginx at web/dist/ and proxy /api/ to localhost:8510"
|
||||||
|
hr
|
||||||
|
echo ""
|
||||||
|
|
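The installer's Docker-vs-Python path selection above reduces to a probe-then-branch pattern: attempt each tool's version command, record availability, then pick the first viable path. A minimal standalone sketch of that logic (the `detect_path` helper name is ours, not part of install.sh):

```shell
#!/usr/bin/env bash
# Probe for each tool the same way install.sh does, then report
# which install path would be chosen on this machine.
detect_path() {
  local has_docker=false has_python=false
  # Probes rely only on exit status; output is discarded.
  docker compose version >/dev/null 2>&1 && has_docker=true
  python3 --version >/dev/null 2>&1 && has_python=true
  if $has_docker; then
    echo "docker"
  elif $has_python; then
    echo "python"
  else
    echo "none"
  fi
}

detect_path
```

Because the probes check exit status rather than parsing output, the sketch degrades cleanly on machines where neither tool is installed.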
@ -78,7 +78,7 @@ case "$cmd" in
   test)
     echo "Running test suite..."
     docker compose -f "$COMPOSE_FILE" exec api \
-      conda run -n job-seeker python -m pytest /app/snipe/tests/ -v "${@}"
+      python -m pytest /app/snipe/tests/ -v "${@}"
     ;;

 # ── Cloud commands ────────────────────────────────────────────────────────
@ -8,7 +8,7 @@ version = "0.1.0"
 description = "Auction listing monitor and trust scorer"
 requires-python = ">=3.11"
 dependencies = [
-    "circuitforge-core",
+    "circuitforge-core>=0.8.0",
     "streamlit>=1.32",
     "requests>=2.31",
     "imagehash>=4.3",
134 tests/test_feedback.py Normal file

@ -0,0 +1,134 @@
"""Tests for the shared feedback router (circuitforge-core) mounted in snipe."""
from __future__ import annotations

from collections.abc import Callable
from unittest.mock import MagicMock, patch

from fastapi import FastAPI
from fastapi.testclient import TestClient

from circuitforge_core.api.feedback import make_feedback_router


# ── Test app factory ──────────────────────────────────────────────────────────

def _make_client(demo_mode_fn: Callable[[], bool] | None = None) -> TestClient:
    app = FastAPI()
    router = make_feedback_router(
        repo="Circuit-Forge/snipe",
        product="snipe",
        demo_mode_fn=demo_mode_fn,
    )
    app.include_router(router, prefix="/api/feedback")
    return TestClient(app)


# ── GET /api/feedback/status ──────────────────────────────────────────────────

def test_status_disabled_when_no_token(monkeypatch):
    monkeypatch.delenv("FORGEJO_API_TOKEN", raising=False)
    monkeypatch.delenv("DEMO_MODE", raising=False)
    client = _make_client(demo_mode_fn=lambda: False)
    res = client.get("/api/feedback/status")
    assert res.status_code == 200
    assert res.json() == {"enabled": False}


def test_status_enabled_when_token_set(monkeypatch):
    monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
    client = _make_client(demo_mode_fn=lambda: False)
    res = client.get("/api/feedback/status")
    assert res.status_code == 200
    assert res.json() == {"enabled": True}


def test_status_disabled_in_demo_mode(monkeypatch):
    monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
    demo = True
    client = _make_client(demo_mode_fn=lambda: demo)
    res = client.get("/api/feedback/status")
    assert res.status_code == 200
    assert res.json() == {"enabled": False}


# ── POST /api/feedback ────────────────────────────────────────────────────────

def test_submit_returns_503_when_no_token(monkeypatch):
    monkeypatch.delenv("FORGEJO_API_TOKEN", raising=False)
    client = _make_client(demo_mode_fn=lambda: False)
    res = client.post("/api/feedback", json={
        "title": "Test", "description": "desc", "type": "bug",
    })
    assert res.status_code == 503


def test_submit_returns_403_in_demo_mode(monkeypatch):
    monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
    demo = True
    client = _make_client(demo_mode_fn=lambda: demo)
    res = client.post("/api/feedback", json={
        "title": "Test", "description": "desc", "type": "bug",
    })
    assert res.status_code == 403


def test_submit_creates_issue(monkeypatch):
    monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")

    label_response = MagicMock()
    label_response.ok = True
    label_response.json.return_value = [
        {"id": 1, "name": "beta-feedback"},
        {"id": 2, "name": "needs-triage"},
        {"id": 3, "name": "bug"},
    ]

    issue_response = MagicMock()
    issue_response.ok = True
    issue_response.json.return_value = {
        "number": 7,
        "html_url": "https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/7",
    }

    client = _make_client(demo_mode_fn=lambda: False)

    with patch("circuitforge_core.api.feedback.requests.get", return_value=label_response), \
         patch("circuitforge_core.api.feedback.requests.post", return_value=issue_response):
        res = client.post("/api/feedback", json={
            "title": "Listing scores wrong",
            "description": "Trust score shows 0 when seller has 1000 feedback",
            "type": "bug",
            "repro": "1. Search for anything\n2. Check trust score",
            "tab": "search",
        })

    assert res.status_code == 200
    data = res.json()
    assert data["issue_number"] == 7
    assert data["issue_url"] == "https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/7"


def test_submit_returns_502_on_forgejo_error(monkeypatch):
    monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")

    label_response = MagicMock()
    label_response.ok = True
    label_response.json.return_value = [
        {"id": 1, "name": "beta-feedback"},
        {"id": 2, "name": "needs-triage"},
        {"id": 3, "name": "question"},
    ]

    bad_response = MagicMock()
    bad_response.ok = False
    bad_response.text = "internal server error"

    client = _make_client(demo_mode_fn=lambda: False)

    with patch("circuitforge_core.api.feedback.requests.get", return_value=label_response), \
         patch("circuitforge_core.api.feedback.requests.post", return_value=bad_response):
        res = client.post("/api/feedback", json={
            "title": "Oops", "description": "desc", "type": "other",
        })

    assert res.status_code == 502
@ -80,6 +80,45 @@ def test_suspicious_price_flagged_when_price_genuinely_low():
     assert "suspicious_price" in result.red_flags_json


+def test_scratch_dent_flagged_from_title_slash_variant():
+    """Title containing 'parts/repair' (slash variant, no 'or') must trigger scratch_dent_mentioned."""
+    agg = Aggregator()
+    scores = {k: 15 for k in ["account_age", "feedback_count",
+                              "feedback_ratio", "price_vs_market", "category_history"]}
+    result = agg.aggregate(
+        scores, photo_hash_duplicate=False, seller=None,
+        listing_title="Generic Widget XL - Parts/Repair",
+    )
+    assert "scratch_dent_mentioned" in result.red_flags_json
+
+
+def test_scratch_dent_flagged_from_condition_field():
+    """eBay formal condition 'for parts or not working' must trigger scratch_dent_mentioned
+    even when the listing title contains no damage keywords."""
+    agg = Aggregator()
+    scores = {k: 15 for k in ["account_age", "feedback_count",
+                              "feedback_ratio", "price_vs_market", "category_history"]}
+    result = agg.aggregate(
+        scores, photo_hash_duplicate=False, seller=None,
+        listing_title="Generic Widget XL",
+        listing_condition="for parts or not working",
+    )
+    assert "scratch_dent_mentioned" in result.red_flags_json
+
+
+def test_scratch_dent_not_flagged_for_clean_listing():
+    """Clean title + 'New' condition must NOT trigger scratch_dent_mentioned."""
+    agg = Aggregator()
+    scores = {k: 15 for k in ["account_age", "feedback_count",
+                              "feedback_ratio", "price_vs_market", "category_history"]}
+    result = agg.aggregate(
+        scores, photo_hash_duplicate=False, seller=None,
+        listing_title="Generic Widget XL",
+        listing_condition="new",
+    )
+    assert "scratch_dent_mentioned" not in result.red_flags_json
+
+
 def test_new_account_not_flagged_when_age_absent():
     """account_age_days=None (scraper tier) must NOT trigger new_account or account_under_30_days."""
     agg = Aggregator()
@ -8,23 +8,28 @@
     <a href="#main-content" class="skip-link">Skip to main content</a>
     <RouterView />
   </main>
+
+  <!-- Feedback FAB — hidden when FORGEJO_API_TOKEN not configured -->
+  <FeedbackButton :current-view="String(route.name ?? 'unknown')" />
 </div>
 </template>

 <script setup lang="ts">
 import { onMounted } from 'vue'
-import { RouterView } from 'vue-router'
+import { RouterView, useRoute } from 'vue-router'
 import { useMotion } from './composables/useMotion'
 import { useSnipeMode } from './composables/useSnipeMode'
 import { useKonamiCode } from './composables/useKonamiCode'
 import { useSessionStore } from './stores/session'
 import { useBlocklistStore } from './stores/blocklist'
 import AppNav from './components/AppNav.vue'
+import FeedbackButton from './components/FeedbackButton.vue'

 const motion = useMotion()
 const { activate, restore } = useSnipeMode()
 const session = useSessionStore()
 const blocklistStore = useBlocklistStore()
+const route = useRoute()

 useKonamiCode(activate)
413 web/src/components/FeedbackButton.vue Normal file

@ -0,0 +1,413 @@
<template>
  <!-- Floating trigger button -->
  <button
    v-if="enabled"
    class="feedback-fab"
    @click="open = true"
    aria-label="Send feedback or report a bug"
    title="Send feedback or report a bug"
  >
    <svg class="feedback-fab-icon" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="1.8" stroke-linecap="round" stroke-linejoin="round">
      <path d="M21 15a2 2 0 01-2 2H7l-4 4V5a2 2 0 012-2h14a2 2 0 012 2z"/>
    </svg>
    <span class="feedback-fab-label">Feedback</span>
  </button>

  <!-- Modal — teleported to body to avoid z-index / overflow clipping -->
  <Teleport to="body">
    <Transition name="modal-fade">
      <div v-if="open" class="feedback-overlay" @click.self="close">
        <div class="feedback-modal" role="dialog" aria-modal="true" aria-label="Send Feedback">

          <!-- Header -->
          <div class="feedback-header">
            <h2 class="feedback-title">{{ step === 1 ? "What's on your mind?" : "Review & submit" }}</h2>
            <button class="feedback-close" @click="close" aria-label="Close">
              <svg viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" width="18" height="18">
                <line x1="18" y1="6" x2="6" y2="18"/><line x1="6" y1="6" x2="18" y2="18"/>
              </svg>
            </button>
          </div>

          <!-- ── Step 1: Form ─────────────────────────────────────────── -->
          <div v-if="step === 1" class="feedback-body">
            <div class="form-group">
              <label class="form-label">Type</label>
              <div class="filter-chip-row">
                <button
                  v-for="t in types"
                  :key="t.value"
                  :class="['btn-chip', { active: form.type === t.value }]"
                  @click="form.type = t.value"
                  type="button"
                >{{ t.label }}</button>
              </div>
            </div>

            <div class="form-group">
              <label class="form-label">Title <span class="form-required">*</span></label>
              <input
                v-model="form.title"
                class="form-input"
                type="text"
                placeholder="Short summary of the issue or idea"
                maxlength="120"
              />
            </div>

            <div class="form-group">
              <label class="form-label">Description <span class="form-required">*</span></label>
              <textarea
                v-model="form.description"
                class="form-input feedback-textarea"
                placeholder="Describe what happened or what you'd like to see…"
                rows="4"
              />
            </div>

            <div v-if="form.type === 'bug'" class="form-group">
              <label class="form-label">Reproduction steps</label>
              <textarea
                v-model="form.repro"
                class="form-input feedback-textarea"
                placeholder="1. Go to… 2. Tap… 3. See error"
                rows="3"
              />
            </div>

            <p v-if="stepError" class="feedback-error">{{ stepError }}</p>
          </div>

          <!-- ── Step 2: Attribution + confirm ──────────────────────────── -->
          <div v-if="step === 2" class="feedback-body">
            <div class="feedback-summary card">
              <div class="feedback-summary-row">
                <span class="text-muted text-sm">Type</span>
                <span class="text-sm font-semibold">{{ typeLabel }}</span>
              </div>
              <div class="feedback-summary-row">
                <span class="text-muted text-sm">Title</span>
                <span class="text-sm">{{ form.title }}</span>
              </div>
              <div class="feedback-summary-row">
                <span class="text-muted text-sm">Description</span>
                <span class="text-sm feedback-summary-desc">{{ form.description }}</span>
              </div>
            </div>

            <div class="form-group mt-md">
              <label class="form-label">Attribution (optional)</label>
              <input
                v-model="form.submitter"
                class="form-input"
                type="text"
                placeholder="Your name <email@example.com>"
              />
              <p class="text-muted text-xs mt-xs">Include your name and email in the issue if you'd like a response. Never required.</p>
            </div>

            <p v-if="submitError" class="feedback-error">{{ submitError }}</p>
            <div v-if="submitted" class="feedback-success">
              Issue filed! <a :href="issueUrl" target="_blank" rel="noopener" class="feedback-link">View on Forgejo →</a>
            </div>
          </div>

          <!-- Footer nav -->
          <div class="feedback-footer">
            <button v-if="step === 2 && !submitted" class="btn btn-ghost" @click="step = 1" :disabled="loading">← Back</button>
            <button v-if="!submitted" class="btn btn-ghost" @click="close" :disabled="loading">Cancel</button>
            <button
              v-if="step === 1"
              class="btn btn-primary"
              @click="nextStep"
            >Next →</button>
            <button
              v-if="step === 2 && !submitted"
              class="btn btn-primary"
              @click="submit"
              :disabled="loading"
            >{{ loading ? 'Filing…' : 'Submit' }}</button>
            <button v-if="submitted" class="btn btn-primary" @click="close">Done</button>
          </div>
        </div>
      </div>
    </Transition>
  </Teleport>
</template>

<script setup lang="ts">
import { ref, computed, onMounted } from 'vue'

const props = defineProps<{ currentView?: string }>()

// Probe once on mount — hidden until confirmed enabled so button never flashes
const enabled = ref(false)
onMounted(async () => {
  try {
    const res = await fetch('/api/feedback/status')
    if (res.ok) {
      const data = await res.json()
      enabled.value = data.enabled === true
    }
  } catch { /* network error — stay hidden */ }
})

const open = ref(false)
const step = ref(1)
const loading = ref(false)
const stepError = ref('')
const submitError = ref('')
const submitted = ref(false)
const issueUrl = ref('')

const types: { value: 'bug' | 'feature' | 'other'; label: string }[] = [
  { value: 'bug', label: '🐛 Bug' },
  { value: 'feature', label: '✨ Feature request' },
  { value: 'other', label: '💬 Other' },
]

const form = ref({
  type: 'bug' as 'bug' | 'feature' | 'other',
  title: '',
  description: '',
  repro: '',
  submitter: '',
})

const typeLabel = computed(() => types.find(t => t.value === form.value.type)?.label ?? '')

function close() {
  open.value = false
  // reset after transition
  setTimeout(reset, 300)
}

function reset() {
  step.value = 1
  loading.value = false
  stepError.value = ''
  submitError.value = ''
  submitted.value = false
  issueUrl.value = ''
  form.value = { type: 'bug', title: '', description: '', repro: '', submitter: '' }
}

function nextStep() {
  stepError.value = ''
  if (!form.value.title.trim() || !form.value.description.trim()) {
    stepError.value = 'Please fill in both Title and Description.'
    return
  }
  step.value = 2
}

async function submit() {
  loading.value = true
  submitError.value = ''
  try {
    const res = await fetch('/api/feedback', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        title: form.value.title.trim(),
        description: form.value.description.trim(),
        type: form.value.type,
        repro: form.value.repro.trim(),
        view: props.currentView ?? 'unknown',
        submitter: form.value.submitter.trim(),
      }),
    })
    if (!res.ok) {
      const err = await res.json().catch(() => ({ detail: res.statusText }))
      submitError.value = err.detail ?? 'Submission failed.'
      return
    }
    const data = await res.json()
    issueUrl.value = data.issue_url
    submitted.value = true
  } catch (e) {
    submitError.value = 'Network error — please try again.'
  } finally {
    loading.value = false
  }
}
</script>

<style scoped>
/* ── Floating action button ─────────────────────────────────────────── */
.feedback-fab {
  position: fixed;
  right: var(--spacing-md);
  bottom: calc(68px + var(--spacing-md)); /* above mobile bottom nav */
  z-index: 190;
  display: flex;
  align-items: center;
  gap: var(--spacing-xs);
  padding: 9px var(--spacing-md);
  background: var(--color-bg-elevated);
  border: 1px solid var(--color-border);
  border-radius: 999px;
  color: var(--color-text-secondary);
  font-size: var(--font-size-sm);
  font-family: var(--font-body);
  font-weight: 500;
  cursor: pointer;
  box-shadow: var(--shadow-md);
  transition: background 0.15s, color 0.15s, box-shadow 0.15s, border-color 0.15s;
}
.feedback-fab:hover {
  background: var(--color-bg-card);
  color: var(--color-text-primary);
  border-color: var(--color-border-focus);
  box-shadow: var(--shadow-lg);
}
.feedback-fab-icon { width: 15px; height: 15px; flex-shrink: 0; }
.feedback-fab-label { white-space: nowrap; }

/* On desktop, bottom nav is gone — drop to standard corner */
@media (min-width: 769px) {
  .feedback-fab {
    bottom: var(--spacing-lg);
  }
}

/* ── Overlay ──────────────────────────────────────────────────────────── */
.feedback-overlay {
  position: fixed;
  inset: 0;
  background: rgba(0, 0, 0, 0.55);
  z-index: 1000;
  display: flex;
  align-items: flex-end;
  justify-content: center;
  padding: 0;
}

@media (min-width: 500px) {
  .feedback-overlay {
    align-items: center;
    padding: var(--spacing-md);
  }
}

/* ── Modal ────────────────────────────────────────────────────────────── */
.feedback-modal {
  background: var(--color-bg-elevated);
  border: 1px solid var(--color-border);
  border-radius: var(--radius-lg) var(--radius-lg) 0 0;
  width: 100%;
  max-height: 90vh;
  overflow-y: auto;
  display: flex;
  flex-direction: column;
  box-shadow: var(--shadow-xl);
}

@media (min-width: 500px) {
  .feedback-modal {
    border-radius: var(--radius-lg);
    width: 100%;
    max-width: 520px;
    max-height: 85vh;
  }
}

.feedback-header {
  display: flex;
  align-items: center;
  justify-content: space-between;
  padding: var(--spacing-md) var(--spacing-md) var(--spacing-sm);
  border-bottom: 1px solid var(--color-border);
  flex-shrink: 0;
}
.feedback-title {
  font-family: var(--font-display);
  font-size: var(--font-size-lg);
  font-weight: 600;
  margin: 0;
}
.feedback-close {
  background: transparent;
  border: none;
  color: var(--color-text-muted);
  cursor: pointer;
  padding: 4px;
  border-radius: var(--radius-sm);
  display: flex;
  align-items: center;
  justify-content: center;
}
.feedback-close:hover { color: var(--color-text-primary); }

.feedback-body {
  padding: var(--spacing-md);
  flex: 1;
  overflow-y: auto;
  display: flex;
  flex-direction: column;
  gap: var(--spacing-md);
}

.feedback-footer {
  display: flex;
  align-items: center;
  justify-content: flex-end;
  gap: var(--spacing-sm);
  padding: var(--spacing-sm) var(--spacing-md);
  border-top: 1px solid var(--color-border);
  flex-shrink: 0;
}

.feedback-textarea {
  resize: vertical;
  min-height: 80px;
  font-family: var(--font-body);
  font-size: var(--font-size-sm);
}

.form-required { color: var(--color-error); margin-left: 2px; }

.feedback-error {
  color: var(--color-error);
  font-size: var(--font-size-sm);
  margin: 0;
}

.feedback-success {
  color: var(--color-success);
  font-size: var(--font-size-sm);
  padding: var(--spacing-sm) var(--spacing-md);
  background: var(--color-success-bg);
  border: 1px solid var(--color-success-border);
  border-radius: var(--radius-md);
}
.feedback-link { color: var(--color-success); font-weight: 600; text-decoration: underline; }

/* Summary card (step 2) */
|
||||||
|
.feedback-summary {
|
||||||
|
display: flex;
|
||||||
|
flex-direction: column;
|
||||||
|
gap: var(--spacing-xs);
|
||||||
|
padding: var(--spacing-sm) var(--spacing-md);
|
||||||
|
background: var(--color-bg-secondary);
|
||||||
|
border-radius: var(--radius-md);
|
||||||
|
border: 1px solid var(--color-border);
|
||||||
|
}
|
||||||
|
.feedback-summary-row {
|
||||||
|
display: flex;
|
||||||
|
gap: var(--spacing-md);
|
||||||
|
align-items: flex-start;
|
||||||
|
}
|
||||||
|
.feedback-summary-row > :first-child { min-width: 72px; flex-shrink: 0; }
|
||||||
|
.feedback-summary-desc {
|
||||||
|
white-space: pre-wrap;
|
||||||
|
word-break: break-word;
|
||||||
|
}
|
||||||
|
|
||||||
|
.mt-md { margin-top: var(--spacing-md); }
|
||||||
|
.mt-xs { margin-top: var(--spacing-xs); }
|
||||||
|
|
||||||
|
/* Transition */
|
||||||
|
.modal-fade-enter-active, .modal-fade-leave-active { transition: opacity 0.2s ease; }
|
||||||
|
.modal-fade-enter-from, .modal-fade-leave-to { opacity: 0; }
|
||||||
|
</style>
|
||||||
|
|
@@ -120,10 +120,13 @@ export const useSearchStore = defineStore('search', () => {
   )
   const marketPrice = ref<number | null>(cached?.marketPrice ?? null)
   const adapterUsed = ref<'api' | 'scraper' | null>(cached?.adapterUsed ?? null)
+  const affiliateActive = ref<boolean>(false)
   const loading = ref(false)
   const error = ref<string | null>(null)
+  const enriching = ref(false) // true while SSE stream is open
 
   let _abort: AbortController | null = null
+  let _sse: EventSource | null = null
 
   function cancelSearch() {
     _abort?.abort()
@@ -164,6 +167,8 @@
     sellers: Record<string, Seller>
     market_price: number | null
     adapter_used: 'api' | 'scraper'
+    affiliate_active: boolean
+    session_id: string | null
   }
 
   results.value = data.listings ?? []
@@ -171,6 +176,7 @@
   sellers.value = new Map(Object.entries(data.sellers ?? {}))
   marketPrice.value = data.market_price ?? null
   adapterUsed.value = data.adapter_used ?? null
+  affiliateActive.value = data.affiliate_active ?? false
   saveCache({
     query: q,
     results: results.value,
@@ -179,6 +185,12 @@
     marketPrice: marketPrice.value,
     adapterUsed: adapterUsed.value,
   })
+
+  // Open SSE stream if any scores are partial and a session_id was provided
+  const hasPartial = Object.values(data.trust_scores ?? {}).some(ts => ts.score_is_partial)
+  if (data.session_id && hasPartial) {
+    _openUpdates(data.session_id, apiBase)
+  }
 } catch (e) {
   if (e instanceof DOMException && e.name === 'AbortError') {
     // User cancelled — clear loading but don't surface as an error
@@ -193,6 +205,57 @@
     }
   }
 
+  function closeUpdates() {
+    if (_sse) {
+      _sse.close()
+      _sse = null
+    }
+    enriching.value = false
+  }
+
+  function _openUpdates(sessionId: string, apiBase: string) {
+    closeUpdates() // close any previous stream
+    enriching.value = true
+
+    const es = new EventSource(`${apiBase}/api/updates/${sessionId}`)
+    _sse = es
+
+    es.onmessage = (e) => {
+      try {
+        const update = JSON.parse(e.data) as {
+          platform_listing_id: string
+          trust_score: TrustScore
+          seller: Record<string, unknown>
+          market_price: number | null
+        }
+        if (update.platform_listing_id && update.trust_score) {
+          trustScores.value = new Map(trustScores.value)
+          trustScores.value.set(update.platform_listing_id, update.trust_score)
+        }
+        if (update.seller) {
+          const s = update.seller as Seller
+          if (s.platform_seller_id) {
+            sellers.value = new Map(sellers.value)
+            sellers.value.set(s.platform_seller_id, s)
+          }
+        }
+        if (update.market_price != null) {
+          marketPrice.value = update.market_price
+        }
+      } catch {
+        // malformed event — ignore
+      }
+    }
+
+    es.addEventListener('done', () => {
+      closeUpdates()
+    })
+
+    es.onerror = () => {
+      closeUpdates()
+    }
+  }
+
   async function enrichSeller(sellerUsername: string, listingId: string): Promise<void> {
     const apiBase = (import.meta.env.VITE_API_BASE as string) ?? ''
     const params = new URLSearchParams({
@@ -225,11 +288,14 @@
     sellers,
     marketPrice,
     adapterUsed,
+    affiliateActive,
     loading,
+    enriching,
     error,
     search,
     cancelSearch,
     enrichSeller,
+    closeUpdates,
     clearResults,
   }
 })
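The SSE handler in the store replaces the `trustScores` and `sellers` Maps instead of mutating them in place. A minimal sketch of that merge step as a pure function (the `TrustScoreLite`/`Update` shapes and the `applyUpdate` name are illustrative stand-ins, not part of the diff):

```typescript
// Illustrative sketch of the copy-then-set merge used in es.onmessage.
// A ref holding a Map only notifies shallow watchers when the Map reference
// itself changes, which is why the diff builds a new Map before set().
type TrustScoreLite = { score: number; score_is_partial: boolean }

type Update = {
  platform_listing_id?: string
  trust_score?: TrustScoreLite
}

function applyUpdate(
  scores: Map<string, TrustScoreLite>,
  update: Update,
): Map<string, TrustScoreLite> {
  // Ignore payloads missing either field, as the store does
  if (!update.platform_listing_id || !update.trust_score) return scores
  const next = new Map(scores) // fresh reference so reactivity fires
  next.set(update.platform_listing_id, update.trust_score)
  return next
}
```

The same pattern applies to the `sellers` Map; the original Map is left untouched, so any code still holding it sees a consistent snapshot.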
@@ -91,6 +91,17 @@
   aria-label="Search filters"
 >
 
+  <!-- Clear all filters — only shown when at least one filter is active -->
+  <button
+    v-if="activeFilterCount > 0"
+    type="button"
+    class="filter-clear-btn"
+    @click="resetFilters"
+    aria-label="Clear all filters"
+  >
+    ✕ Clear filters ({{ activeFilterCount }})
+  </button>
+
   <!-- ── eBay Search Parameters ─────────────────────────────────────── -->
   <!-- These are sent to eBay. Changes require a new search to take effect. -->
   <h2 class="filter-section-heading filter-section-heading--search">
@@ -296,8 +307,16 @@
   <span v-if="hiddenCount > 0" class="results-hidden">
     · {{ hiddenCount }} hidden by filters
   </span>
+  <span v-if="store.affiliateActive" class="affiliate-disclosure">
+    · Links may include an affiliate code
+  </span>
 </p>
 <div class="toolbar-actions">
+  <!-- Live enrichment indicator — visible while SSE stream is open -->
+  <span v-if="store.enriching" class="enriching-badge" aria-live="polite" title="Scores updating as seller data arrives">
+    <span class="enriching-dot" aria-hidden="true"></span>
+    Updating scores…
+  </span>
   <label for="sort-select" class="sr-only">Sort by</label>
   <select id="sort-select" v-model="sortBy" class="sort-select">
     <option v-for="opt in SORT_OPTIONS" :key="opt.value" :value="opt.value">
@@ -402,7 +421,7 @@ onMounted(() => {
 
 // ── Filters ──────────────────────────────────────────────────────────────────
 
-const filters = reactive<SearchFilters>({
+const DEFAULT_FILTERS: SearchFilters = {
   minTrustScore: 0,
   minPrice: undefined,
   maxPrice: undefined,
@@ -421,7 +440,13 @@ const filters = reactive<SearchFilters>({
   mustExclude: '',
   categoryId: '',
   adapter: 'auto' as 'auto' | 'api' | 'scraper',
-})
+}
+
+const filters = reactive<SearchFilters>({ ...DEFAULT_FILTERS })
+
+function resetFilters() {
+  Object.assign(filters, DEFAULT_FILTERS)
+}
 
 // Parse comma-separated keyword strings into trimmed, lowercase, non-empty term arrays
 const parsedMustInclude = computed(() =>
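The `DEFAULT_FILTERS` refactor above gives filter defaults a single source of truth. A small sketch of why the reset goes through `Object.assign` (plain objects stand in for Vue's `reactive()` here, and the field values are trimmed-down placeholders):

```typescript
// Illustrative sketch of the resetFilters pattern: the reactive proxy cannot
// simply be reassigned, because template bindings hold the original reference,
// so defaults are copied back key-by-key with Object.assign.
const DEFAULT_FILTERS = {
  minTrustScore: 0,
  minPrice: undefined as number | undefined,
  categoryId: '',
}

// In the component this is reactive({ ...DEFAULT_FILTERS }); the spread
// keeps DEFAULT_FILTERS itself immune to later mutation of `filters`.
const filters = { ...DEFAULT_FILTERS }

filters.minTrustScore = 70
filters.categoryId = '12345' // placeholder value

// Object.assign copies every own enumerable key, including keys whose
// default value is undefined, so all fields return to their defaults.
Object.assign(filters, DEFAULT_FILTERS)
```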
@@ -764,6 +789,27 @@ async function onSearch() {
   margin-bottom: var(--space-2);
 }
 
+/* Clear all filters button */
+.filter-clear-btn {
+  display: flex;
+  align-items: center;
+  gap: var(--space-1);
+  width: 100%;
+  padding: var(--space-1) var(--space-2);
+  margin-bottom: var(--space-2);
+  background: color-mix(in srgb, var(--color-red, #ef4444) 12%, transparent);
+  color: var(--color-red, #ef4444);
+  border: 1px solid color-mix(in srgb, var(--color-red, #ef4444) 30%, transparent);
+  border-radius: var(--radius-sm);
+  font-size: 0.75rem;
+  font-weight: 600;
+  cursor: pointer;
+  transition: background 0.15s, color 0.15s;
+}
+.filter-clear-btn:hover {
+  background: color-mix(in srgb, var(--color-red, #ef4444) 22%, transparent);
+}
+
 /* Section headings that separate eBay Search params from local filters */
 .filter-section-heading {
   font-size: 0.6875rem;
@@ -1029,6 +1075,7 @@ async function onSearch() {
 }
 
 .results-hidden { color: var(--color-warning); }
+.affiliate-disclosure { color: var(--color-text-muted, #8b949e); font-size: 0.8em; }
 
 .toolbar-actions {
   display: flex;
@@ -1037,6 +1084,33 @@ async function onSearch() {
   flex-wrap: wrap;
 }
 
+.enriching-badge {
+  display: inline-flex;
+  align-items: center;
+  gap: var(--space-1);
+  padding: var(--space-1) var(--space-2);
+  background: color-mix(in srgb, var(--app-primary) 10%, transparent);
+  border: 1px solid color-mix(in srgb, var(--app-primary) 30%, transparent);
+  border-radius: var(--radius-full, 9999px);
+  color: var(--app-primary);
+  font-size: 0.75rem;
+  font-weight: 500;
+  white-space: nowrap;
+}
+
+.enriching-dot {
+  width: 6px;
+  height: 6px;
+  border-radius: 50%;
+  background: var(--app-primary);
+  animation: enriching-pulse 1.2s ease-in-out infinite;
+}
+
+@keyframes enriching-pulse {
+  0%, 100% { opacity: 1; transform: scale(1); }
+  50% { opacity: 0.4; transform: scale(0.7); }
+}
+
 .save-btn {
   display: flex;
   align-items: center;