Compare commits


13 commits

Author SHA1 Message Date
98695b00f0 feat(snipe): eBay trust scoring MVP — search, filters, enrichment, comps
Core trust scoring:
- Five metadata signals (account age, feedback count/ratio, price vs market,
  category history), composited 0–100
- CV-based price signal suppression for heterogeneous search results
  (e.g. mixed laptop generations won't false-positive suspicious_price)
- Expanded scratch/dent title detection: evasive redirects, functional problem
  phrases, DIY/repair indicators
- Hard filters: new_account, established_bad_actor
- Soft flags: low_feedback, suspicious_price, duplicate_photo, scratch_dent,
  long_on_market, significant_price_drop

Search & filtering:
- Browse API adapter (up to 200 items/page) + Playwright scraper fallback
- OR-group query expansion for comprehensive variant coverage
- Must-include (AND/ANY/groups), must-exclude, category, price range filters
- Saved searches with full filter round-trip via URL params

Seller enrichment:
- Background BTF /itm/ scraping for account age (Kasada-safe headed Chromium)
- On-demand enrichment: POST /api/enrich + ListingCard ↻ button
- Category history derived from Browse API categories field (free, no extra calls)
- Shopping API GetUserProfile inline enrichment for API adapter

Market comps:
- eBay Marketplace Insights API with Browse API fallback (catches 403 + 404)
- Comps prioritised in ThreadPoolExecutor (submitted first)

Infrastructure:
- Staging DB fields: times_seen, first_seen_at, price_at_first_seen, category_name
- Migrations 004 (staging tracking) + 005 (listing category)
- eBay webhook handler stub
- Cloud compose stack (compose.cloud.yml)
- Vue frontend: search store, saved searches store, ListingCard, filter sidebar

Docs:
- README fully rewritten to reflect MVP status + full feature documentation
- Roadmap table linked to all 13 Forgejo issues
2026-03-26 23:37:09 -07:00
a8add8e96b feat(snipe): cloud deployment under menagerie.circuitforge.tech/snipe
- compose.cloud.yml: snipe-cloud project, proper Docker bridge network
  (api is internal-only, no host port), port 8514 for nginx
- docker/web/Dockerfile: VITE_BASE_URL + VITE_API_BASE build args so
  Vite bakes the /snipe path prefix into the bundle at cloud build time
- docker/web/nginx.cloud.conf: upstream api:8510 via Docker network
  (vs 172.17.0.1:8510 in dev which uses host networking)
- manage.sh: cloud-start/stop/restart/status/logs/build commands
- stores/search.ts: VITE_API_BASE prefix on all /api fetch calls

Gate: Caddy basicauth (username: cf) — temporary gate while proper
Heimdall license validation UI is built. Password stored at
/devl/snipe-cloud-data/.beta-password (host-only, not in repo).

Note: Caddyfile updated separately (caddy-proxy volume, not this repo).
2026-03-26 08:14:01 -07:00
11f2a3c2b3 feat(snipe): keyword must-include/must-exclude filtering
- Two sidebar fields: 'Must include' and 'Must exclude' (comma-separated)
- Must-exclude terms forwarded to eBay _nkw as -term prefixes (native eBay
  support) so exclusions reduce the eBay result set at the source — improves
  market comp quality as a side effect
- Must-include applied client-side only (substring, case-insensitive)
- Both applied client-side via passesFilter() for instant response without
  re-fetching (cache-friendly)
- Exclude input has subtle red border tint (color-mix) to signal intent
- Hint text: 're-search to apply to eBay' reminds user negatives need a
  new search to take effect at the eBay level
2026-03-25 22:54:24 -07:00
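The exclusion forwarding described in this commit can be sketched roughly as follows. Helper names are illustrative, not the project's actual functions; the real logic lives in the scraper/adapter and `passesFilter()`:

```python
def build_nkw(query: str, must_exclude: str) -> str:
    """Append eBay-native negative terms (-term) to the _nkw search string.

    must_exclude is the comma-separated sidebar field; eBay itself drops
    matching listings, which also keeps them out of the market comps.
    """
    terms = [t.strip() for t in must_exclude.split(",") if t.strip()]
    negatives = " ".join(f"-{t}" for t in terms)
    return f"{query} {negatives}".strip() if negatives else query


def passes_filter(title: str, must_include: list[str], must_exclude: list[str]) -> bool:
    """Client-side pass: substring match, case-insensitive (instant, cache-friendly)."""
    t = title.lower()
    return all(m.lower() in t for m in must_include) and not any(
        x.lower() in t for x in must_exclude
    )
```

The split between the two matters: negatives shrink the result set at the source (so a re-search is needed), while both filters also run client-side for instant response on cached results.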
ea78b9c2cd feat(snipe): parallel search+comps, pagination, title fix, price flag fix
- Parallel execution: search() and get_completed_sales() now run
  concurrently via ThreadPoolExecutor — each gets its own Store/SQLite
  connection for thread safety. First cold search time ~halved.

- Pagination: SearchFilters.pages (default 1) controls how many eBay
  result pages are fetched. Both search and sold-comps support up to 3
  parallel Playwright sessions per call (capped to avoid Xvfb overload).
  UI: segmented 1/2/3/5 pages selector in filter sidebar with cost hint.

- True median: get_completed_sales() now averages the two middle values
  for even-length price lists instead of always taking the lower bound.

- Fix suspicious_price false positive: aggregator now checks
  signal_scores.get("price_vs_market") == 0 (pre-None-substitution)
  so listings without market data are never flagged as suspicious.

- Fix title pollution: scraper strips eBay's hidden screen-reader span
  ("Opens in a new window or tab") from listing titles via regex.
  Lazy-imports playwright/playwright_stealth inside _get() so pure
  parsing functions are importable without the full browser stack.

- Tests: 48 pass on host (scraper tests now runnable without Docker),
  new regression guards for all three bug fixes.
2026-03-25 22:16:08 -07:00
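A minimal sketch of the parallel pattern this commit describes: two tasks submitted to a `ThreadPoolExecutor`, each opening its own SQLite connection because SQLite objects must not cross threads. Function names and values are stand-ins, not the project's actual code:

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

DB = ":memory:"  # stand-in for data/snipe.db


def search_task(query: str) -> list[str]:
    conn = sqlite3.connect(DB)  # per-thread connection for thread safety
    try:
        return [f"listing for {query}"]  # stub result
    finally:
        conn.close()


def comps_task(query: str) -> float:
    conn = sqlite3.connect(DB)
    try:
        return 249.99  # stub: median of completed sales
    finally:
        conn.close()


with ThreadPoolExecutor(max_workers=2) as pool:
    comps_f = pool.submit(comps_task, "laptop")  # comps submitted first (prioritised)
    search_f = pool.submit(search_task, "laptop")
    listings, market_price = search_f.result(), comps_f.result()
```

Since the two tasks are roughly equal in cost, running them concurrently is what halves the cold search time.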
2ab41219f8 fix: account_age_days=None for scraper tier, stop false new_account flags
Scraper can't fetch seller profile age without following each listing's
seller link. Using 0 as sentinel caused every scraped seller to trigger
new_account and account_under_30_days red flags erroneously.

- Seller.account_age_days: int → Optional[int] (None = not yet fetched)
- Migration 003: recreate sellers table without NOT NULL constraint
- MetadataScorer: return None for unknown age → score_is_partial=True
- Aggregator: gate age flags on is not None
- Scraper: account_age_days=None instead of 0
2026-03-25 20:36:43 -07:00
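The None-gating this fix introduces can be illustrated with a simplified sketch (thresholds taken from the README's flag definitions; the real logic lives in the aggregator):

```python
from typing import Optional


def age_flags(account_age_days: Optional[int]) -> list[str]:
    """None means 'not yet fetched', not 'zero days old' — so unknown age
    yields a partial score instead of false red flags."""
    if account_age_days is None:
        return []
    flags = []
    if account_age_days <= 7:
        flags.append("new_account")
    if account_age_days < 30:
        flags.append("account_under_30_days")
    return flags
```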
58263d814a feat(snipe): FastAPI layer, Playwright+Xvfb scraper, caching, tests
- FastAPI service (port 8510) wrapping scraper + trust scorer
- Playwright+Xvfb+stealth transport to bypass eBay Kasada bot protection
- li.s-card selector migration (eBay markup change from li.s-item)
- Three-layer caching: HTML (5min), phash (permanent), market comp (6h SQLite)
- Batch DB writes (executemany + single commit) — warm requests <1s
- Unique Xvfb display counter (:200–:299) prevents lock file collisions
- Vue 3 nginx web service (port 8509) proxying /api/ to FastAPI
- Auction card de-emphasis: opacity 0.72 for listings with >1h remaining
- 35 scraper unit tests updated for new li.s-card fixture markup
- tests/ volume-mounted in compose.override.yml for live test editing
2026-03-25 20:09:30 -07:00
720744f75e chore: remove node_modules from tracking 2026-03-25 15:13:06 -07:00
c787ed751c chore: gitignore web/node_modules and web/dist 2026-03-25 15:12:57 -07:00
7a704441a6 feat(snipe): Vue 3 frontend scaffold + Docker web service
- web/: Vue 3 + Vite + UnoCSS + Pinia, dark tactical theme (amber/#0d1117)
- AppNav, ListingCard, SearchView with filters/sort, composables
  (useSnipeMode, useKonamiCode, useMotion), Pinia search store
- Steal shimmer, auction countdown, Snipe Mode easter egg all native in Vue
- docker/web/: nginx + multi-stage Dockerfile (node build → nginx serve)
- compose.yml: api (8510) + web (8509) services
- Dockerfile CMD updated to uvicorn for upcoming FastAPI layer
- Clean build: 0 TS errors, 380 modules
2026-03-25 15:11:35 -07:00
07794ee163 fix: rename app/app.py → streamlit_app.py to resolve package shadowing 2026-03-25 15:05:12 -07:00
6ec0f957b9 feat(snipe): auction support + easter eggs (Konami, The Steal, de-emphasis)
Auction metadata:
- Listing model gains buying_format + ends_at fields
- Migration 002 adds columns to existing databases
- scraper.py: parse s-item__time-left → absolute ends_at ISO timestamp
- normaliser.py: extract buyingOptions + itemEndDate from Browse API
- store.py: save/get updated for new fields

Easter eggs (app/ui/components/easter_eggs.py):
- Konami code detector (JS → URL param → Streamlit rerun)
- Web Audio API snipe call synthesis, gated behind sidebar checkbox
  (disabled by default for safety/accessibility)
- "The Steal" gold shimmer: trust ≥ 90, price 15–30% below market,
  no suspicious_price flag
- Auction de-emphasis: soft caption when > 1h remaining

UI updates:
- listing_row: steal banner + auction notice per row
- Search: inject CSS, check snipe mode, "Ending soon" sort option,
  pass market_price from comp cache to row renderer
- app.py: Konami detector + audio enable/disable sidebar toggle

Tests: 22 new tests (72 total, all green)
2026-03-25 14:27:02 -07:00
68a9879191 feat: add scraper adapter with auto-detect fallback and partial score logging 2026-03-25 14:12:29 -07:00
4977e517fe feat: Snipe MVP v0.1 — eBay trust scorer with faceted filter UI 2026-03-25 13:09:49 -07:00
64 changed files with 11084 additions and 103 deletions


@ -1,4 +1,33 @@
EBAY_CLIENT_ID=your-client-id-here
EBAY_CLIENT_SECRET=your-client-secret-here
EBAY_ENV=production # or: sandbox
# Snipe works out of the box with the scraper (no credentials needed).
# Set eBay API credentials to unlock full trust scores —
# account age and category history signals require the eBay Browse API.
# Without credentials the app logs a warning and falls back to the scraper.
# ── eBay Developer Keys — Production ──────────────────────────────────────────
# From https://developer.ebay.com/my/keys (Production tab)
EBAY_APP_ID=
EBAY_DEV_ID=
EBAY_CERT_ID=
# ── eBay Developer Keys — Sandbox ─────────────────────────────────────────────
# From https://developer.ebay.com/my/keys (Sandbox tab)
EBAY_SANDBOX_APP_ID=
EBAY_SANDBOX_DEV_ID=
EBAY_SANDBOX_CERT_ID=
# ── Active environment ─────────────────────────────────────────────────────────
# production | sandbox
EBAY_ENV=production
# ── eBay Account Deletion Webhook ──────────────────────────────────────────────
# Register endpoint at https://developer.ebay.com/my/notification — required for
# production key activation. Set EBAY_NOTIFICATION_ENDPOINT to the public HTTPS
# URL eBay will POST to (e.g. https://snipe.circuitforge.tech/api/ebay/account-deletion).
EBAY_NOTIFICATION_TOKEN=
EBAY_NOTIFICATION_ENDPOINT=
# Set to false during sandbox/registration (no production token available yet).
# Set to true once production credentials are active — enforces ECDSA verification.
EBAY_WEBHOOK_VERIFY_SIGNATURES=true
# ── Database ───────────────────────────────────────────────────────────────────
SNIPE_DB=data/snipe.db

.gitignore (+2)

@ -7,3 +7,5 @@ dist/
.pytest_cache/
data/
.superpowers/
web/node_modules/
web/dist/


@ -2,6 +2,11 @@ FROM python:3.11-slim
WORKDIR /app
# System deps for Playwright/Chromium
RUN apt-get update && apt-get install -y --no-install-recommends \
    xvfb \
    && rm -rf /var/lib/apt/lists/*
# Install circuitforge-core from sibling directory (compose sets context: ..)
COPY circuitforge-core/ ./circuitforge-core/
RUN pip install --no-cache-dir -e ./circuitforge-core
@ -11,5 +16,10 @@ COPY snipe/ ./snipe/
WORKDIR /app/snipe
RUN pip install --no-cache-dir -e .
EXPOSE 8506
CMD ["streamlit", "run", "app/app.py", "--server.port=8506", "--server.address=0.0.0.0"]
# Install Playwright + Chromium (after snipe deps so layer is cached separately)
RUN pip install --no-cache-dir playwright playwright-stealth && \
    playwright install chromium && \
    playwright install-deps chromium
EXPOSE 8510
CMD ["uvicorn", "api.main:app", "--host", "0.0.0.0", "--port", "8510"]

PRIVACY.md (new file, +7)

@ -0,0 +1,7 @@
# Privacy Policy
CircuitForge LLC's privacy policy applies to this product and is published at:
**<https://circuitforge.tech/privacy>**
Last reviewed: March 2026.

README.md (+196)

@ -1,3 +1,195 @@
# snipe
# Snipe — Auction Sniping & Listing Intelligence
snipe by Circuit Forge LLC — Auction sniping — CT Bids, antiques, estate auctions, eBay
> *Part of the Circuit Forge LLC "AI for the tasks you hate most" suite.*
**Status:** Active — eBay listing search + seller trust scoring MVP complete. Auction sniping engine and multi-platform support are next.
## What it does
Snipe has two layers that work together:
**Layer 1 — Listing intelligence (MVP, implemented)**
Before you bid, Snipe tells you whether a listing is worth your time. It fetches eBay listings, scores each seller's trustworthiness across five signals, flags suspicious pricing relative to completed sales, and surfaces red flags like new accounts, cosmetic damage buried in titles, and listings that have been sitting unsold for weeks.
**Layer 2 — Auction sniping (roadmap)**
Snipe manages the bid itself: monitors listings across platforms, schedules last-second bids, handles soft-close extensions, and guides you through the post-win logistics (payment routing, shipping coordination, provenance documentation for antiques).
The name nods to the origin of the word "sniping": common snipes are notoriously elusive birds, secretive and camouflaged, that flush suddenly from cover. Shooting one required extreme patience, stillness, and a precise last-second shot. That's the auction strategy.
---
## Implemented: eBay Listing Intelligence
### Search & filtering
- Full-text eBay search via Browse API (with Playwright scraper fallback when no API credentials configured)
- Price range, must-include keywords (AND / ANY / OR-groups mode), must-exclude terms, eBay category filter
- OR-group mode expands keyword combinations into multiple targeted queries and deduplicates results — eBay relevance won't silently drop variants
- Pages-to-fetch control: each Browse API page returns up to 200 listings
- Saved searches with one-click re-run that restores all filter settings
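A rough sketch of the OR-group expansion idea: each group contributes one alternative per query, and the cartesian product gives one targeted query per combination. Syntax and helper names are assumptions, not the actual `query_builder` implementation:

```python
from itertools import product


def expand_or_groups(groups: list[list[str]]) -> list[str]:
    """Expand [["x220", "x230"], ["i5", "i7"]] into one targeted query per
    combination, so eBay relevance can't silently drop variants."""
    return [" ".join(combo) for combo in product(*groups)]


queries = expand_or_groups([["x220", "x230"], ["i5", "i7"]])
# four queries; results are deduplicated by listing ID after fetching
```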
### Seller trust scoring
Five signals, each scored 0–20, composited to 0–100:
| Signal | What it measures |
|--------|-----------------|
| `account_age` | Days since eBay account registration |
| `feedback_count` | Total feedback received |
| `feedback_ratio` | Positive feedback percentage |
| `price_vs_market` | Listing price vs. median of recent completed sales |
| `category_history` | Whether seller has history selling in this category |
Scores are marked **partial** when signals are unavailable (e.g. account age not yet enriched). Partial scores are displayed with a visual indicator rather than penalizing the seller for missing data.
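A minimal sketch of the compositing scheme, using the signal names from the table above. The equal weighting and rescaling are illustrative assumptions, not the scorer's exact formula:

```python
from typing import Optional


def composite(signals: dict[str, Optional[int]]) -> tuple[int, bool]:
    """Each signal scores 0-20. Missing signals (None) are skipped and the
    result is rescaled to 0-100 and marked partial, rather than penalising
    the seller for data that simply hasn't been enriched yet."""
    known = {k: v for k, v in signals.items() if v is not None}
    partial = len(known) < len(signals)
    if not known:
        return 0, True
    score = round(sum(known.values()) / (20 * len(known)) * 100)
    return score, partial


score, partial = composite({
    "account_age": None,  # not yet enriched (scraper tier)
    "feedback_count": 18,
    "feedback_ratio": 20,
    "price_vs_market": 15,
    "category_history": 12,
})
```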
### Red flags
Hard filters that override the composite score:
- `new_account` — account registered within 7 days
- `established_bad_actor` — feedback ratio < 80% with 20+ reviews
Soft flags surfaced as warnings:
- `account_under_30_days` — account under 30 days old
- `low_feedback_count` — fewer than 10 reviews
- `suspicious_price` — listing price below 50% of market median *(suppressed automatically when the search returns a heterogeneous price distribution — e.g. mixed laptop generations — to prevent false positives)*
- `duplicate_photo` — same image found on another listing (perceptual hash)
- `scratch_dent_mentioned` — title keywords indicating cosmetic damage, functional problems, or evasive language (see below)
- `long_on_market` — listing has been seen 5+ times over 14+ days without selling
- `significant_price_drop` — current price more than 20% below first-seen price
### Scratch & dent title detection
Scans listing titles for signals the item may have undisclosed damage or problems:
- **Explicit damage**: scratch, scuff, dent, crack, chip, blemish, worn
- **Condition catch-alls**: as is, for parts, parts only, spares or repair
- **Evasive redirects**: "see description", "read description", "see photos for" (seller hiding damage detail in listing body)
- **Functional problems**: "not working", "stopped working", "no power", "dead on arrival", "powers on but", "faulty", "broken screen/hinge/port"
- **DIY/repair listings**: "needs repair", "needs tlc", "project laptop", "for repair", "sold as is"
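The detection above amounts to a case-insensitive phrase scan over the title. A simplified sketch, with the phrase list abbreviated from the categories above (the real detector's list is longer):

```python
import re

DAMAGE_PHRASES = [
    "scratch", "scuff", "dent", "crack", "as is", "for parts",
    "see description", "not working", "no power", "needs repair",
    "powers on but",
]

# Leading word boundary so "dent" doesn't match inside "identity",
# while prefixes like "crack" still catch "cracked".
_PATTERN = re.compile(
    r"\b(?:" + "|".join(re.escape(p) for p in DAMAGE_PHRASES) + r")",
    re.IGNORECASE,
)


def scratch_dent_flag(title: str) -> bool:
    return bool(_PATTERN.search(title))
```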
### Seller enrichment
- **Inline (API adapter)**: account age filled from Browse API `registrationDate` field
- **Background (scraper)**: `/itm/` listing pages scraped for seller "Joined" date via Playwright + Xvfb (Kasada-safe headed Chromium)
- **On-demand**: ↻ button on any listing card triggers `POST /api/enrich` — runs enrichment and re-scores without waiting for a second search
- **Category history**: derived from the seller's accumulated listing data (Browse API `categories` field); improves with every search, no extra API calls
### Market price comparison
Completed sales fetched via eBay Marketplace Insights API (with Browse API fallback for app tiers that don't have Insights access). Median stored per query hash, used to score `price_vs_market` across all listings in a search.
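A sketch of the comp caching idea: a true median (the two middle values averaged for even-length lists) keyed by a hash of the query. The dict stands in for the actual SQLite table with its 6-hour TTL:

```python
import hashlib
from statistics import median

comp_cache: dict[str, float] = {}  # query_hash -> market median (stand-in for SQLite)


def market_median(query: str, sold_prices: list[float]) -> float:
    key = hashlib.sha256(query.lower().encode()).hexdigest()
    if key not in comp_cache:
        # statistics.median averages the two middle values for even-length
        # lists — a "true" median, vs. always taking the lower bound
        comp_cache[key] = median(sold_prices)
    return comp_cache[key]
```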
### Adapters
| Adapter | When used | Signals available |
|---------|-----------|-------------------|
| Browse API (`api`) | eBay API credentials configured | All signals; account age inline |
| Playwright scraper (`scraper`) | No credentials / forced | All signals except account age (async BTF enrichment) |
| `auto` (default) | — | API if credentials present, scraper otherwise |
---
## Stack
| Layer | Tech | Port |
|-------|------|------|
| Frontend | Vue 3 + Pinia + UnoCSS + Vite (nginx) | 8509 |
| API | FastAPI (uvicorn) | 8510 |
| Scraper | Playwright + playwright-stealth + Xvfb | — |
| DB | SQLite (`data/snipe.db`) | — |
| Core | circuitforge-core (editable install) | — |
## Running
```bash
./manage.sh start # start all services
./manage.sh stop # stop
./manage.sh logs # tail logs
./manage.sh open # open in browser
```
Cloud stack (shared DB, multi-user):
```bash
docker compose -f compose.cloud.yml -p snipe-cloud up -d
docker compose -f compose.cloud.yml -p snipe-cloud build api # after Python changes
```
---
## Roadmap
### Near-term (eBay)
| Issue | Feature |
|-------|---------|
| [#1](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/1) | SSE/WebSocket live score push — enriched data appears without re-search |
| [#2](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/2) | eBay OAuth (Connect eBay Account) for full trust score access via Trading API |
| [#4](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/4) | Scammer database: community blocklist + batch eBay Trust & Safety reporting |
| [#5](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/5) | UPC/product lookup → LLM-crafted search terms (paid tier) |
| [#8](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/8) | "Triple Red" easter egg: CSS animation when all hard flags fire simultaneously |
| [#11](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/11) | Vision-based photo condition assessment — moondream2 (local) / Claude vision (cloud, paid) |
| [#12](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/12) | Background saved-search monitoring with configurable alerts |
### Cloud / infrastructure
| Issue | Feature |
|-------|---------|
| [#6](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/6) | Shared seller/scammer/comps DB across cloud users (public data, no re-scraping) |
| [#7](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/7) | Shared image hash DB — requires explicit opt-in consent (CF privacy-by-architecture) |
### Auction sniping engine
| Issue | Feature |
|-------|---------|
| [#9](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/9) | Bid scheduling + snipe execution (NTP-synchronized, soft-close handling, human approval gate) |
| [#13](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/13) | Post-win workflow: payment routing, shipping coordination, provenance documentation |
### Multi-platform expansion
| Issue | Feature |
|-------|---------|
| [#10](https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/10) | CT Bids, HiBid, AuctionZip, Invaluable, GovPlanet, Bidsquare, Proxibid |
---
## Primary platforms (full vision)
- **eBay** — general + collectibles *(search + trust scoring: implemented)*
- **CT Bids** — Connecticut state surplus and municipal auctions
- **GovPlanet / IronPlanet** — government surplus equipment
- **AuctionZip** — antique auction house aggregator (1,000+ houses)
- **Invaluable / LiveAuctioneers** — fine art and antiques
- **Bidsquare** — antiques and collectibles
- **HiBid** — estate auctions
- **Proxibid** — industrial and collector auctions
## Why auctions are hard
Online auctions are frustrating because:
- Winning requires being present at the exact closing moment — sometimes 2 AM
- Platforms vary wildly: some allow proxy bids, some don't; closing times extend on activity
- Price history is hidden — you don't know if an item is underpriced or a trap
- Sellers hide damage in descriptions rather than titles to avoid automated filters
- Shipping logistics for large / fragile antiques require coordination with the auction house
- Provenance documentation is inconsistent across auction houses
## Bidding strategy engine (planned)
- **Hard snipe**: submit bid N seconds before close (default: 8s)
- **Soft-close handling**: detect if platform extends on last-minute bids; adjust strategy
- **Proxy ladder**: set max and let the engine bid in increments, reserve snipe for final window
- **Reserve detection**: identify likely reserve price from bid history patterns
- **Comparable sales**: pull recent auction results for same/similar items across platforms
## Post-win workflow (planned)
1. Payment method routing (platform-specific: CC, wire, check)
2. Shipping quote requests to approved carriers (freight / large items via uShip; parcel via FedEx/UPS)
3. Condition report request from auction house
4. Provenance packet generation (for antiques / fine art resale or insurance)
5. Add to inventory (for dealers / collectors tracking portfolio value)
## Product code (license key)
`CFG-SNPE-XXXX-XXXX-XXXX`
## Tech notes
- Shared `circuitforge-core` scaffold (DB, LLM router, tier system, config)
- Platform adapters: currently eBay only; AuctionZip, Invaluable, HiBid, CT Bids planned (Playwright + API where available)
- Bid execution: Playwright automation with precise timing (NTP-synchronized)
- Soft-close detection: platform-specific rules engine
- Comparable sales: eBay completed listings via Marketplace Insights API + Browse API fallback
- Vision module: condition assessment from listing photos — moondream2 / Claude vision (paid tier stub in `app/trust/photo.py`)
- **Kasada bypass**: headed Chromium via Xvfb; all scraping uses this path — headless and `requests`-based approaches are blocked by eBay

api/__init__.py (new file, empty)

api/ebay_webhook.py (new file, +149)

@ -0,0 +1,149 @@
"""eBay Marketplace Account Deletion webhook.

Required to activate eBay production API credentials.

Protocol (https://developer.ebay.com/develop/guides-v2/marketplace-user-account-deletion):

    GET /api/ebay/account-deletion?challenge_code=<hex>
        {"challengeResponse": SHA256(code + token + endpoint_url)}

    POST /api/ebay/account-deletion
        Header: X-EBAY-SIGNATURE: <base64-JSON {"kid": "...", "signature": "<b64>"}>
        Body: JSON notification payload
        200 on valid + deleted, 412 on bad signature

Public keys are fetched from the eBay Notification API and cached for 1 hour.
"""
from __future__ import annotations

import base64
import hashlib
import json
import logging
import os
import time
from pathlib import Path
from typing import Optional

import requests
from fastapi import APIRouter, Header, HTTPException, Request
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ec import ECDSA
from cryptography.hazmat.primitives.hashes import SHA1
from cryptography.hazmat.primitives.serialization import load_pem_public_key

from app.db.store import Store

log = logging.getLogger(__name__)
router = APIRouter()

_DB_PATH = Path(os.environ.get("SNIPE_DB", "data/snipe.db"))

# ── Public-key cache ──────────────────────────────────────────────────────────
# eBay key rotation is rare; 1-hour TTL is appropriate.
_KEY_CACHE_TTL = 3600
_key_cache: dict[str, tuple[bytes, float]] = {}  # kid → (pem_bytes, expiry)

# The eBay Notification service is a unified production-side system — signing keys
# always live at api.ebay.com regardless of whether the app uses sandbox or production
# Browse API credentials.
_EBAY_KEY_URL = "https://api.ebay.com/commerce/notification/v1/public_key/{kid}"


def _fetch_public_key(kid: str) -> bytes:
    """Return PEM public key bytes for the given kid, using a 1-hour cache."""
    cached = _key_cache.get(kid)
    if cached and time.time() < cached[1]:
        return cached[0]
    key_url = _EBAY_KEY_URL.format(kid=kid)
    resp = requests.get(key_url, timeout=10)
    if not resp.ok:
        log.error("public key fetch failed: %s %s — body: %s", resp.status_code, key_url, resp.text[:500])
    resp.raise_for_status()
    pem_str: str = resp.json()["key"]
    pem_bytes = pem_str.encode()
    _key_cache[kid] = (pem_bytes, time.time() + _KEY_CACHE_TTL)
    return pem_bytes


# ── GET — challenge verification ──────────────────────────────────────────────
@router.get("/api/ebay/account-deletion")
def ebay_challenge(challenge_code: str):
    """Respond to eBay's endpoint verification challenge.

    eBay sends this GET once when you register the endpoint URL.
    Response must be the SHA-256 hex digest of (code + token + endpoint).
    """
    token = os.environ.get("EBAY_NOTIFICATION_TOKEN", "")
    endpoint = os.environ.get("EBAY_NOTIFICATION_ENDPOINT", "")
    if not token or not endpoint:
        log.error("EBAY_NOTIFICATION_TOKEN or EBAY_NOTIFICATION_ENDPOINT not set")
        raise HTTPException(status_code=500, detail="Webhook not configured")
    digest = hashlib.sha256(
        (challenge_code + token + endpoint).encode()
    ).hexdigest()
    return {"challengeResponse": digest}


# ── POST — deletion notification ──────────────────────────────────────────────
@router.post("/api/ebay/account-deletion", status_code=200)
async def ebay_account_deletion(
    request: Request,
    x_ebay_signature: Optional[str] = Header(default=None),
):
    """Process an eBay Marketplace Account Deletion notification.

    Verifies the ECDSA/SHA1 signature, then permanently deletes all stored
    data (sellers + listings) for the named eBay user.
    """
    body_bytes = await request.body()

    # 1. Parse and verify signature header
    if not x_ebay_signature:
        log.warning("ebay_account_deletion: missing X-EBAY-SIGNATURE header")
        raise HTTPException(status_code=412, detail="Missing signature")
    try:
        sig_json = json.loads(base64.b64decode(x_ebay_signature))
        kid: str = sig_json["kid"]
        sig_b64: str = sig_json["signature"]
        sig_bytes = base64.b64decode(sig_b64)
    except Exception as exc:
        log.warning("ebay_account_deletion: malformed signature header — %s", exc)
        raise HTTPException(status_code=412, detail="Malformed signature header")

    # 2. Fetch and verify with eBay public key
    # EBAY_WEBHOOK_VERIFY_SIGNATURES=false skips ECDSA during sandbox/registration phase.
    # Set to true (default) once production credentials are active.
    skip_verify = os.environ.get("EBAY_WEBHOOK_VERIFY_SIGNATURES", "true").lower() == "false"
    if skip_verify:
        log.warning("ebay_account_deletion: signature verification DISABLED — enable before production")
    else:
        try:
            pem_bytes = _fetch_public_key(kid)
            pub_key = load_pem_public_key(pem_bytes)
            pub_key.verify(sig_bytes, body_bytes, ECDSA(SHA1()))
        except InvalidSignature:
            log.warning("ebay_account_deletion: ECDSA signature verification failed (kid=%s)", kid)
            raise HTTPException(status_code=412, detail="Signature verification failed")
        except Exception as exc:
            log.error("ebay_account_deletion: unexpected error during verification — %s", exc)
            raise HTTPException(status_code=412, detail="Verification error")

    # 3. Extract username from notification payload and delete data
    try:
        payload = json.loads(body_bytes)
        username: str = payload["notification"]["data"]["username"]
    except (KeyError, json.JSONDecodeError) as exc:
        log.error("ebay_account_deletion: could not parse payload — %s", exc)
        raise HTTPException(status_code=400, detail="Unrecognisable payload")

    store = Store(_DB_PATH)
    store.delete_seller_data("ebay", username)
    log.info("ebay_account_deletion: deleted data for eBay user %r", username)
    return {}

api/main.py (new file, +395)

@ -0,0 +1,395 @@
"""Snipe FastAPI — search endpoint wired to ScrapedEbayAdapter + TrustScorer."""
from __future__ import annotations
import dataclasses
import hashlib
import logging
import os
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from fastapi.middleware.cors import CORSMiddleware
from circuitforge_core.config import load_env
from app.db.store import Store
from app.db.models import SavedSearch as SavedSearchModel
from app.platforms import SearchFilters
from app.platforms.ebay.scraper import ScrapedEbayAdapter
from app.platforms.ebay.adapter import EbayAdapter
from app.platforms.ebay.auth import EbayTokenManager
from app.platforms.ebay.query_builder import expand_queries, parse_groups
from app.trust import TrustScorer
from api.ebay_webhook import router as ebay_webhook_router
load_env(Path(".env"))
log = logging.getLogger(__name__)
_DB_PATH = Path(os.environ.get("SNIPE_DB", "data/snipe.db"))
_DB_PATH.parent.mkdir(exist_ok=True)
def _ebay_creds() -> tuple[str, str, str]:
"""Return (client_id, client_secret, env) from env vars.
New names: EBAY_APP_ID / EBAY_CERT_ID (sandbox: EBAY_SANDBOX_APP_ID / EBAY_SANDBOX_CERT_ID)
Legacy fallback: EBAY_CLIENT_ID / EBAY_CLIENT_SECRET
"""
env = os.environ.get("EBAY_ENV", "production").strip()
if env == "sandbox":
client_id = os.environ.get("EBAY_SANDBOX_APP_ID", "").strip()
client_secret = os.environ.get("EBAY_SANDBOX_CERT_ID", "").strip()
else:
client_id = (os.environ.get("EBAY_APP_ID") or os.environ.get("EBAY_CLIENT_ID", "")).strip()
client_secret = (os.environ.get("EBAY_CERT_ID") or os.environ.get("EBAY_CLIENT_SECRET", "")).strip()
return client_id, client_secret, env
app = FastAPI(title="Snipe API", version="0.1.0")
app.include_router(ebay_webhook_router)
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_methods=["*"],
allow_headers=["*"],
)
@app.get("/api/health")
def health():
return {"status": "ok"}
def _trigger_scraper_enrichment(listings: list, store: Store) -> None:
"""Fire-and-forget background enrichment for missing seller signals.
Two enrichment passes run concurrently in the same daemon thread:
1. BTF (/itm/ pages) fills account_age_days for sellers where it is None.
2. _ssn search pages fills category_history_json for sellers with no history.
The main response returns immediately; enriched data lands in the DB for
future searches. Uses ScrapedEbayAdapter's Playwright stack regardless of
which adapter was used for the main search (Shopping API handles age for
the API adapter inline; BTF is the fallback for no-creds / scraper mode).
"""
# Caps per search: limits Playwright sessions launched in the background so we
# don't hammer Kasada or spin up dozens of Xvfb instances after a large search.
# Remaining sellers get enriched incrementally on subsequent searches.
_BTF_MAX_PER_SEARCH = 3
_CAT_MAX_PER_SEARCH = 3
needs_btf: dict[str, str] = {}
needs_categories: list[str] = []
for listing in listings:
sid = listing.seller_platform_id
if not sid:
continue
seller = store.get_seller("ebay", sid)
if not seller:
continue
if (seller.account_age_days is None
and sid not in needs_btf
and len(needs_btf) < _BTF_MAX_PER_SEARCH):
needs_btf[sid] = listing.platform_listing_id
if (seller.category_history_json in ("{}", "", None)
and sid not in needs_categories
and len(needs_categories) < _CAT_MAX_PER_SEARCH):
needs_categories.append(sid)
if not needs_btf and not needs_categories:
return
log.info(
"Scraper enrichment: %d BTF age + %d category pages queued",
len(needs_btf), len(needs_categories),
)
def _run():
try:
enricher = ScrapedEbayAdapter(Store(_DB_PATH))
if needs_btf:
enricher.enrich_sellers_btf(needs_btf, max_workers=2)
log.info("BTF enrichment complete for %d sellers", len(needs_btf))
if needs_categories:
enricher.enrich_sellers_categories(needs_categories, max_workers=2)
log.info("Category enrichment complete for %d sellers", len(needs_categories))
except Exception as e:
log.warning("Scraper enrichment failed: %s", e)
import threading
t = threading.Thread(target=_run, daemon=True)
t.start()
def _parse_terms(raw: str) -> list[str]:
"""Split a comma-separated keyword string into non-empty, stripped terms."""
return [t.strip() for t in raw.split(",") if t.strip()]
def _make_adapter(store: Store, force: str = "auto"):
"""Return the appropriate adapter.
force: "auto" | "api" | "scraper"
auto    → API if creds present, else scraper
api     → Browse API (raises if no creds)
scraper → Playwright scraper regardless of creds
"""
client_id, client_secret, env = _ebay_creds()
has_creds = bool(client_id and client_secret)
if force == "scraper":
return ScrapedEbayAdapter(store)
if force == "api":
if not has_creds:
raise ValueError("adapter=api requested but no eBay API credentials configured")
return EbayAdapter(EbayTokenManager(client_id, client_secret, env), store, env=env)
# auto
if has_creds:
return EbayAdapter(EbayTokenManager(client_id, client_secret, env), store, env=env)
log.debug("No eBay API credentials — using scraper adapter (partial trust scores)")
return ScrapedEbayAdapter(store)
def _adapter_name(force: str = "auto") -> str:
"""Return the name of the adapter that would be used — without creating it."""
client_id, client_secret, _ = _ebay_creds()
if force == "scraper":
return "scraper"
if force == "api" or (force == "auto" and client_id and client_secret):
return "api"
return "scraper"
@app.get("/api/search")
def search(
q: str = "",
max_price: float = 0,
min_price: float = 0,
pages: int = 1,
must_include: str = "", # raw filter string; always applied client-side
must_include_mode: str = "all", # "all" | "any" | "groups" — drives eBay expansion
must_exclude: str = "", # comma-separated; forwarded to eBay -term + client-side
category_id: str = "", # eBay category ID — forwarded to Browse API / scraper _sacat
adapter: str = "auto", # "auto" | "api" | "scraper" — override adapter selection
):
if not q.strip():
return {"listings": [], "trust_scores": {}, "sellers": {}, "market_price": None, "adapter_used": _adapter_name(adapter)}
must_exclude_terms = _parse_terms(must_exclude)
# In Groups mode, expand OR groups into multiple targeted eBay queries so that
# eBay's relevance ranking can't silently drop variants from the result set.
if must_include_mode == "groups" and must_include.strip():
or_groups = parse_groups(must_include)
ebay_queries = expand_queries(q, or_groups)
else:
ebay_queries = [q]
base_filters = SearchFilters(
max_price=max_price if max_price > 0 else None,
min_price=min_price if min_price > 0 else None,
pages=max(1, pages),
must_exclude=must_exclude_terms, # forwarded to eBay -term by the scraper
category_id=category_id.strip() or None,
)
adapter_used = _adapter_name(adapter)
# Each thread creates its own Store — sqlite3 connections can't be shared across
# threads (check_same_thread=True by default).
def _run_search(ebay_query: str) -> list:
return _make_adapter(Store(_DB_PATH), adapter).search(ebay_query, base_filters)
def _run_comps() -> None:
try:
_make_adapter(Store(_DB_PATH), adapter).get_completed_sales(q, pages)
except Exception:
log.warning("comps: unhandled exception for %r", q, exc_info=True)
try:
# Comps submitted first — guarantees an immediate worker slot even at max concurrency.
# Seller enrichment runs after the executor exits (background thread), so comps are
# always prioritised over tracking seller age / category history.
max_workers = min(len(ebay_queries) + 1, 5)
with ThreadPoolExecutor(max_workers=max_workers) as ex:
comps_future = ex.submit(_run_comps)
search_futures = [ex.submit(_run_search, eq) for eq in ebay_queries]
# Merge and deduplicate across all search queries
seen_ids: set[str] = set()
listings: list = []
for fut in search_futures:
for listing in fut.result():
if listing.platform_listing_id not in seen_ids:
seen_ids.add(listing.platform_listing_id)
listings.append(listing)
comps_future.result() # side-effect: market comp written to DB
except Exception as e:
log.warning("eBay scrape failed: %s", e)
raise HTTPException(status_code=502, detail=f"eBay search failed: {e}")
log.info("Multi-search: %d queries → %d unique listings", len(ebay_queries), len(listings))
# Main-thread store for all post-search reads/writes — fresh connection, same thread.
store = Store(_DB_PATH)
store.save_listings(listings)
# Derive category_history from accumulated listing data — free for API adapter
# (category_name comes from Browse API response), no-op for scraper listings (category_name=None).
seller_ids = list({l.seller_platform_id for l in listings if l.seller_platform_id})
n_cat = store.refresh_seller_categories("ebay", seller_ids)
if n_cat:
log.info("Category history derived for %d sellers from listing data", n_cat)
# Re-fetch to hydrate staging fields (times_seen, first_seen_at, id, price_at_first_seen)
# that are only available from the DB after the upsert.
staged = store.get_listings_staged("ebay", [l.platform_listing_id for l in listings])
listings = [staged.get(l.platform_listing_id, l) for l in listings]
# BTF enrichment: scrape /itm/ pages for sellers missing account_age_days.
# Runs in the background so it doesn't delay the response; next search of
# the same sellers will have full scores.
_trigger_scraper_enrichment(listings, store)
scorer = TrustScorer(store)
trust_scores_list = scorer.score_batch(listings, q)
query_hash = hashlib.md5(q.encode()).hexdigest()
comp = store.get_market_comp("ebay", query_hash)
market_price = comp.median_price if comp else None
# Serialize — keyed by platform_listing_id for easy Vue lookup
trust_map = {
listing.platform_listing_id: dataclasses.asdict(ts)
for listing, ts in zip(listings, trust_scores_list)
if ts is not None
}
seller_map = {
listing.seller_platform_id: dataclasses.asdict(
store.get_seller("ebay", listing.seller_platform_id)
)
for listing in listings
if listing.seller_platform_id
and store.get_seller("ebay", listing.seller_platform_id)
}
return {
"listings": [dataclasses.asdict(l) for l in listings],
"trust_scores": trust_map,
"sellers": seller_map,
"market_price": market_price,
"adapter_used": adapter_used,
}
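The helpers `parse_groups` and `expand_queries` used above aren't shown in this diff. Under the assumption that each OR group contributes exactly one member per expanded query, the expansion amounts to a cartesian product over the groups — a hypothetical sketch (`expand_queries_sketch` is an illustrative stand-in, not the actual implementation):

```python
from itertools import product

def expand_queries_sketch(base: str, or_groups: list[list[str]]) -> list[str]:
    """Hypothetical stand-in for expand_queries(): one targeted eBay query
    per combination of OR-group members, appended to the base query."""
    if not or_groups:
        return [base]
    return [" ".join([base, *combo]) for combo in product(*or_groups)]

queries = expand_queries_sketch("thinkpad", [["t480", "t490"], ["16gb", "32gb"]])
# 2 x 2 = 4 targeted queries, e.g. "thinkpad t480 16gb"
```

This is why the executor sizes itself on `len(ebay_queries) + 1`: group expansion multiplies the number of searches submitted.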
# ── On-demand enrichment ──────────────────────────────────────────────────────
@app.post("/api/enrich")
def enrich_seller(seller: str, listing_id: str, query: str = ""):
"""Synchronous on-demand enrichment for a single seller + re-score.
Runs enrichment paths in parallel:
- Shopping API GetUserProfile (fast, ~500ms) → account_age_days, when API creds present
- BTF /itm/ Playwright scrape (~20s) → account_age_days fallback
- _ssn Playwright scrape (~20s) → category_history_json
BTF and _ssn run concurrently; total wall time ~20s when Playwright needed.
Returns the updated trust_score and seller so the frontend can patch in-place.
"""
import threading
store = Store(_DB_PATH)
seller_obj = store.get_seller("ebay", seller)
if not seller_obj:
raise HTTPException(status_code=404, detail=f"Seller '{seller}' not found")
# Fast path: Shopping API for account age (inline, no Playwright)
try:
api_adapter = _make_adapter(store, "api")
if hasattr(api_adapter, "enrich_sellers_shopping_api"):
api_adapter.enrich_sellers_shopping_api([seller])
except Exception:
pass # no API creds — fall through to BTF
seller_obj = store.get_seller("ebay", seller)
needs_btf = seller_obj is not None and seller_obj.account_age_days is None
needs_categories = seller_obj is None or seller_obj.category_history_json in ("{}", "", None)
# Slow path: Playwright for remaining gaps (BTF + _ssn in parallel threads)
if needs_btf or needs_categories:
scraper = ScrapedEbayAdapter(Store(_DB_PATH))
errors: list[Exception] = []
def _btf():
try:
scraper.enrich_sellers_btf({seller: listing_id}, max_workers=1)
except Exception as e:
errors.append(e)
def _ssn():
try:
ScrapedEbayAdapter(Store(_DB_PATH)).enrich_sellers_categories([seller], max_workers=1)
except Exception as e:
errors.append(e)
threads = []
if needs_btf:
threads.append(threading.Thread(target=_btf, daemon=True))
if needs_categories:
threads.append(threading.Thread(target=_ssn, daemon=True))
for t in threads:
t.start()
for t in threads:
t.join(timeout=60)
if errors:
log.warning("enrich_seller: %d scrape error(s): %s", len(errors), errors[0])
# Re-fetch listing with staging fields, re-score
staged = store.get_listings_staged("ebay", [listing_id])
listing = staged.get(listing_id)
if not listing:
raise HTTPException(status_code=404, detail=f"Listing '{listing_id}' not found")
scorer = TrustScorer(store)
trust_list = scorer.score_batch([listing], query or listing.title)
trust = trust_list[0] if trust_list else None
seller_final = store.get_seller("ebay", seller)
return {
"trust_score": dataclasses.asdict(trust) if trust else None,
"seller": dataclasses.asdict(seller_final) if seller_final else None,
}
# ── Saved Searches ────────────────────────────────────────────────────────────
class SavedSearchCreate(BaseModel):
name: str
query: str
filters_json: str = "{}"
@app.get("/api/saved-searches")
def list_saved_searches():
return {"saved_searches": [dataclasses.asdict(s) for s in Store(_DB_PATH).list_saved_searches()]}
@app.post("/api/saved-searches", status_code=201)
def create_saved_search(body: SavedSearchCreate):
created = Store(_DB_PATH).save_saved_search(
SavedSearchModel(name=body.name, query=body.query, platform="ebay", filters_json=body.filters_json)
)
return dataclasses.asdict(created)
@app.delete("/api/saved-searches/{saved_id}", status_code=204)
def delete_saved_search(saved_id: int):
Store(_DB_PATH).delete_saved_search(saved_id)
@app.patch("/api/saved-searches/{saved_id}/run")
def mark_saved_search_run(saved_id: int):
Store(_DB_PATH).update_saved_search_last_run(saved_id)
return {"ok": True}


@@ -0,0 +1,3 @@
-- Add auction metadata to listings (v0.1.1)
ALTER TABLE listings ADD COLUMN buying_format TEXT NOT NULL DEFAULT 'fixed_price';
ALTER TABLE listings ADD COLUMN ends_at TEXT;


@@ -0,0 +1,23 @@
-- Make account_age_days nullable — scraper tier cannot fetch it without
-- following each seller's profile link, so NULL means "not yet fetched"
-- rather than "genuinely zero days old". This prevents false new_account
-- flags for all scraped listings.
--
-- SQLite doesn't support ALTER COLUMN, so we recreate the sellers table.
CREATE TABLE sellers_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
platform TEXT NOT NULL,
platform_seller_id TEXT NOT NULL,
username TEXT NOT NULL,
account_age_days INTEGER, -- NULL = not yet fetched
feedback_count INTEGER NOT NULL,
feedback_ratio REAL NOT NULL,
category_history_json TEXT NOT NULL DEFAULT '{}',
fetched_at TEXT DEFAULT CURRENT_TIMESTAMP,
UNIQUE(platform, platform_seller_id)
);
INSERT INTO sellers_new SELECT * FROM sellers;
DROP TABLE sellers;
ALTER TABLE sellers_new RENAME TO sellers;


@@ -0,0 +1,24 @@
-- Staging DB: persistent listing tracking across searches.
-- Adds temporal metadata to listings so we can detect stale/repriced/recurring items.
ALTER TABLE listings ADD COLUMN first_seen_at TEXT;
ALTER TABLE listings ADD COLUMN last_seen_at TEXT;
ALTER TABLE listings ADD COLUMN times_seen INTEGER NOT NULL DEFAULT 1;
ALTER TABLE listings ADD COLUMN price_at_first_seen REAL;
-- Backfill existing rows so columns are non-null where we have data
UPDATE listings SET
first_seen_at = fetched_at,
last_seen_at = fetched_at,
price_at_first_seen = price
WHERE first_seen_at IS NULL;
-- Price history: append-only snapshots; one row per distinct (listing, price) pair.
-- Duplicate prices are ignored (INSERT OR IGNORE), so each price a listing has held
-- is recorded once; a return to an earlier price is not re-recorded.
CREATE TABLE IF NOT EXISTS listing_price_history (
id INTEGER PRIMARY KEY AUTOINCREMENT,
listing_id INTEGER NOT NULL REFERENCES listings(id),
price REAL NOT NULL,
captured_at TEXT DEFAULT CURRENT_TIMESTAMP,
UNIQUE(listing_id, price)
);
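The `UNIQUE(listing_id, price)` constraint plus `INSERT OR IGNORE` is what makes repeated sightings at an unchanged price a no-op. A self-contained sketch of the migration's behaviour:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE listing_price_history (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        listing_id INTEGER NOT NULL,
        price REAL NOT NULL,
        captured_at TEXT DEFAULT CURRENT_TIMESTAMP,
        UNIQUE(listing_id, price)
    )
""")
# Same listing sighted four times: 100 → 100 → 90 → back to 100
for price in (100.0, 100.0, 90.0, 100.0):
    conn.execute(
        "INSERT OR IGNORE INTO listing_price_history (listing_id, price) VALUES (?, ?)",
        (1, price),
    )
rows = conn.execute(
    "SELECT price FROM listing_price_history WHERE listing_id = 1 ORDER BY id"
).fetchall()
# Only the two distinct prices survive; the repeat and the return to 100.0 are ignored.
```

Note the consequence: the table records each distinct price once, not every transition.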


@@ -0,0 +1,3 @@
-- Add per-listing category name, extracted from eBay API response.
-- Used to derive seller category_history_json without _ssn scraping.
ALTER TABLE listings ADD COLUMN category_name TEXT;


@@ -9,7 +9,7 @@ class Seller:
platform: str
platform_seller_id: str
username: str
account_age_days: int
account_age_days: Optional[int] # None = not yet fetched (scraper tier)
feedback_count: int
feedback_ratio: float # 0.0–1.0
category_history_json: str # JSON blob of past category sales
@@ -29,9 +29,17 @@ class Listing:
url: str
photo_urls: list[str] = field(default_factory=list)
listing_age_days: int = 0
buying_format: str = "fixed_price" # "fixed_price", "auction", "best_offer"
ends_at: Optional[str] = None # ISO8601 auction end time; None for fixed-price
id: Optional[int] = None
fetched_at: Optional[str] = None
trust_score_id: Optional[int] = None
category_name: Optional[str] = None # leaf category from eBay API (e.g. "Graphics/Video Cards")
# Staging DB fields — populated from DB after upsert
first_seen_at: Optional[str] = None
last_seen_at: Optional[str] = None
times_seen: int = 1
price_at_first_seen: Optional[float] = None
@dataclass


@@ -7,7 +7,7 @@ from typing import Optional
from circuitforge_core.db import get_connection, run_migrations
from .models import Listing, Seller, TrustScore, MarketComp
from .models import Listing, Seller, TrustScore, MarketComp, SavedSearch
MIGRATIONS_DIR = Path(__file__).parent / "migrations"
@@ -19,15 +19,32 @@ class Store:
# --- Seller ---
def save_seller(self, seller: Seller) -> None:
def delete_seller_data(self, platform: str, platform_seller_id: str) -> None:
"""Permanently erase a seller and all their listings — GDPR/eBay deletion compliance."""
self._conn.execute(
"DELETE FROM sellers WHERE platform=? AND platform_seller_id=?",
(platform, platform_seller_id),
)
self._conn.execute(
"DELETE FROM listings WHERE platform=? AND seller_platform_id=?",
(platform, platform_seller_id),
)
self._conn.commit()
def save_seller(self, seller: Seller) -> None:
self.save_sellers([seller])
def save_sellers(self, sellers: list[Seller]) -> None:
self._conn.executemany(
"INSERT OR REPLACE INTO sellers "
"(platform, platform_seller_id, username, account_age_days, "
"feedback_count, feedback_ratio, category_history_json) "
"VALUES (?,?,?,?,?,?,?)",
(seller.platform, seller.platform_seller_id, seller.username,
seller.account_age_days, seller.feedback_count, seller.feedback_ratio,
seller.category_history_json),
[
(s.platform, s.platform_seller_id, s.username, s.account_age_days,
s.feedback_count, s.feedback_ratio, s.category_history_json)
for s in sellers
],
)
self._conn.commit()
@@ -42,25 +59,141 @@ class Store:
return None
return Seller(*row[:7], id=row[7], fetched_at=row[8])
def refresh_seller_categories(self, platform: str, seller_ids: list[str]) -> int:
"""Derive category_history_json for sellers that lack it by aggregating
their stored listings' category_name values.
Returns the count of sellers updated.
"""
from app.platforms.ebay.scraper import _classify_category_label # lazy to avoid circular
if not seller_ids:
return 0
updated = 0
for sid in seller_ids:
seller = self.get_seller(platform, sid)
if not seller or seller.category_history_json not in ("{}", "", None):
continue # already enriched
rows = self._conn.execute(
"SELECT category_name, COUNT(*) FROM listings "
"WHERE platform=? AND seller_platform_id=? AND category_name IS NOT NULL "
"GROUP BY category_name",
(platform, sid),
).fetchall()
if not rows:
continue
counts: dict[str, int] = {}
for cat_name, cnt in rows:
key = _classify_category_label(cat_name)
if key:
counts[key] = counts.get(key, 0) + cnt
if counts:
from dataclasses import replace
updated_seller = replace(seller, category_history_json=json.dumps(counts))
self.save_seller(updated_seller)
updated += 1
return updated
# --- Listing ---
def save_listing(self, listing: Listing) -> None:
self._conn.execute(
"INSERT OR REPLACE INTO listings "
"(platform, platform_listing_id, title, price, currency, condition, "
"seller_platform_id, url, photo_urls, listing_age_days) "
"VALUES (?,?,?,?,?,?,?,?,?,?)",
(listing.platform, listing.platform_listing_id, listing.title,
listing.price, listing.currency, listing.condition,
listing.seller_platform_id, listing.url,
json.dumps(listing.photo_urls), listing.listing_age_days),
self.save_listings([listing])
def save_listings(self, listings: list[Listing]) -> None:
"""Upsert listings, preserving first_seen_at and price_at_first_seen on conflict.
Uses INSERT ... ON CONFLICT DO UPDATE (SQLite 3.24+) so row IDs are stable
across searches → trust_score FK references survive re-indexing.
times_seen and last_seen_at accumulate on every sighting.
"""
now = datetime.now(timezone.utc).isoformat()
self._conn.executemany(
"""
INSERT INTO listings
(platform, platform_listing_id, title, price, currency, condition,
seller_platform_id, url, photo_urls, listing_age_days, buying_format,
ends_at, first_seen_at, last_seen_at, times_seen, price_at_first_seen,
category_name)
VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,1,?,?)
ON CONFLICT(platform, platform_listing_id) DO UPDATE SET
title = excluded.title,
price = excluded.price,
condition = excluded.condition,
seller_platform_id = excluded.seller_platform_id,
url = excluded.url,
photo_urls = excluded.photo_urls,
listing_age_days = excluded.listing_age_days,
buying_format = excluded.buying_format,
ends_at = excluded.ends_at,
last_seen_at = excluded.last_seen_at,
times_seen = times_seen + 1,
category_name = COALESCE(excluded.category_name, category_name)
-- first_seen_at and price_at_first_seen intentionally preserved
""",
[
(l.platform, l.platform_listing_id, l.title, l.price, l.currency,
l.condition, l.seller_platform_id, l.url,
json.dumps(l.photo_urls), l.listing_age_days, l.buying_format, l.ends_at,
now, now, l.price, l.category_name)
for l in listings
],
)
# Record price snapshots — INSERT OR IGNORE means only price changes land
self._conn.executemany(
"""
INSERT OR IGNORE INTO listing_price_history (listing_id, price, captured_at)
SELECT id, ?, ? FROM listings
WHERE platform=? AND platform_listing_id=?
""",
[
(l.price, now, l.platform, l.platform_listing_id)
for l in listings
],
)
self._conn.commit()
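The upsert's key property is that `first_seen_at` and `price_at_first_seen` are simply omitted from the `DO UPDATE SET` clause, so the values from the first sighting persist while `price`, `last_seen_at`, and `times_seen` move. A reduced, runnable sketch of the same pattern (schema trimmed to the staging columns for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE listings (
        id INTEGER PRIMARY KEY,
        platform TEXT, platform_listing_id TEXT,
        price REAL,
        first_seen_at TEXT, last_seen_at TEXT,
        times_seen INTEGER NOT NULL DEFAULT 1,
        price_at_first_seen REAL,
        UNIQUE(platform, platform_listing_id)
    )
""")
upsert = """
    INSERT INTO listings
        (platform, platform_listing_id, price, first_seen_at, last_seen_at,
         times_seen, price_at_first_seen)
    VALUES (?,?,?,?,?,1,?)
    ON CONFLICT(platform, platform_listing_id) DO UPDATE SET
        price = excluded.price,
        last_seen_at = excluded.last_seen_at,
        times_seen = times_seen + 1
        -- first_seen_at / price_at_first_seen omitted, so preserved
"""
conn.execute(upsert, ("ebay", "123", 100.0, "day1", "day1", 100.0))  # first sighting
conn.execute(upsert, ("ebay", "123", 80.0, "day2", "day2", 80.0))    # repriced sighting
row = conn.execute(
    "SELECT price, first_seen_at, times_seen, price_at_first_seen "
    "FROM listings WHERE platform_listing_id='123'"
).fetchone()
# price and times_seen updated; first-seen provenance intact
```

Inside `DO UPDATE SET`, an unqualified column name (`times_seen`) refers to the existing row, while `excluded.` refers to the row that failed to insert, which is exactly what the increment relies on.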
def get_listings_staged(self, platform: str, platform_listing_ids: list[str]) -> dict[str, "Listing"]:
"""Bulk fetch listings by platform_listing_id, returning staging fields.
Returns a dict keyed by platform_listing_id. Used to hydrate freshly-normalised
listing objects after save_listings() so trust scoring sees times_seen,
first_seen_at, price_at_first_seen, and the DB-assigned id.
"""
if not platform_listing_ids:
return {}
placeholders = ",".join("?" * len(platform_listing_ids))
rows = self._conn.execute(
f"SELECT platform, platform_listing_id, title, price, currency, condition, "
f"seller_platform_id, url, photo_urls, listing_age_days, id, fetched_at, "
f"buying_format, ends_at, first_seen_at, last_seen_at, times_seen, price_at_first_seen, "
f"category_name "
f"FROM listings WHERE platform=? AND platform_listing_id IN ({placeholders})",
[platform] + list(platform_listing_ids),
).fetchall()
result: dict[str, Listing] = {}
for row in rows:
pid = row[1]
result[pid] = Listing(
*row[:8],
photo_urls=json.loads(row[8]),
listing_age_days=row[9],
id=row[10],
fetched_at=row[11],
buying_format=row[12] or "fixed_price",
ends_at=row[13],
first_seen_at=row[14],
last_seen_at=row[15],
times_seen=row[16] or 1,
price_at_first_seen=row[17],
category_name=row[18],
)
return result
def get_listing(self, platform: str, platform_listing_id: str) -> Optional[Listing]:
row = self._conn.execute(
"SELECT platform, platform_listing_id, title, price, currency, condition, "
"seller_platform_id, url, photo_urls, listing_age_days, id, fetched_at "
"seller_platform_id, url, photo_urls, listing_age_days, id, fetched_at, "
"buying_format, ends_at, first_seen_at, last_seen_at, times_seen, price_at_first_seen "
"FROM listings WHERE platform=? AND platform_listing_id=?",
(platform, platform_listing_id),
).fetchone()
@@ -72,6 +205,12 @@ class Store:
listing_age_days=row[9],
id=row[10],
fetched_at=row[11],
buying_format=row[12] or "fixed_price",
ends_at=row[13],
first_seen_at=row[14],
last_seen_at=row[15],
times_seen=row[16] or 1,
price_at_first_seen=row[17],
)
# --- MarketComp ---
@@ -86,6 +225,44 @@ class Store:
)
self._conn.commit()
# --- SavedSearch ---
def save_saved_search(self, s: SavedSearch) -> SavedSearch:
cur = self._conn.execute(
"INSERT INTO saved_searches (name, query, platform, filters_json) VALUES (?,?,?,?)",
(s.name, s.query, s.platform, s.filters_json),
)
self._conn.commit()
row = self._conn.execute(
"SELECT id, created_at FROM saved_searches WHERE id=?", (cur.lastrowid,)
).fetchone()
return SavedSearch(
name=s.name, query=s.query, platform=s.platform,
filters_json=s.filters_json, id=row[0], created_at=row[1],
)
def list_saved_searches(self) -> list[SavedSearch]:
rows = self._conn.execute(
"SELECT name, query, platform, filters_json, id, created_at, last_run_at "
"FROM saved_searches ORDER BY created_at DESC"
).fetchall()
return [
SavedSearch(name=r[0], query=r[1], platform=r[2], filters_json=r[3],
id=r[4], created_at=r[5], last_run_at=r[6])
for r in rows
]
def delete_saved_search(self, saved_id: int) -> None:
self._conn.execute("DELETE FROM saved_searches WHERE id=?", (saved_id,))
self._conn.commit()
def update_saved_search_last_run(self, saved_id: int) -> None:
self._conn.execute(
"UPDATE saved_searches SET last_run_at=? WHERE id=?",
(datetime.now(timezone.utc).isoformat(), saved_id),
)
self._conn.commit()
def get_market_comp(self, platform: str, query_hash: str) -> Optional[MarketComp]:
row = self._conn.execute(
"SELECT platform, query_hash, median_price, sample_count, expires_at, id, fetched_at "


@@ -12,6 +12,10 @@ class SearchFilters:
min_price: Optional[float] = None
condition: Optional[list[str]] = field(default_factory=list)
location_radius_km: Optional[int] = None
pages: int = 1 # number of result pages to fetch (scraper: 48 listings/page; Browse API: up to 200)
must_include: list[str] = field(default_factory=list) # client-side title filter
must_exclude: list[str] = field(default_factory=list) # forwarded to eBay -term AND client-side
category_id: Optional[str] = None # eBay category ID (e.g. "27386" = GPUs)
class PlatformAdapter(ABC):


@@ -1,16 +1,58 @@
"""eBay Browse API adapter."""
from __future__ import annotations
import hashlib
import logging
from dataclasses import replace
from datetime import datetime, timedelta, timezone
from typing import Optional
import requests
log = logging.getLogger(__name__)
_SHOPPING_BASE = "https://open.api.ebay.com/shopping"
# Rate limiting for Shopping API GetUserProfile calls.
# Enrichment is incremental — these caps spread API calls across multiple
# searches rather than bursting on first encounter with a new seller batch.
_SHOPPING_API_MAX_PER_SEARCH = 5 # sellers enriched per search call
_SHOPPING_API_INTER_REQUEST_DELAY = 0.5 # seconds between successive calls
_SELLER_ENRICH_TTL_HOURS = 24 # skip re-enrichment within this window
from app.db.models import Listing, Seller, MarketComp
from app.db.store import Store
from app.platforms import PlatformAdapter, SearchFilters
from app.platforms.ebay.auth import EbayTokenManager
from app.platforms.ebay.normaliser import normalise_listing, normalise_seller
_BROWSE_LIMIT = 200 # max items per Browse API page
_INSIGHTS_BASE = {
"production": "https://api.ebay.com/buy/marketplace_insights/v1_beta",
"sandbox": "https://api.sandbox.ebay.com/buy/marketplace_insights/v1_beta",
}
def _build_browse_query(base_query: str, or_groups: list[list[str]], must_exclude: list[str]) -> str:
"""Convert OR groups + exclusions into Browse API boolean query syntax.
Browse API uses SQL-like boolean: AND (implicit), OR (keyword), NOT (keyword).
Parentheses work as grouping operators.
Example: 'GPU (16gb OR 24gb OR 48gb) (nvidia OR rtx OR geforce) -"parts only"'
"""
parts = [base_query.strip()]
for group in or_groups:
clean = [t.strip() for t in group if t.strip()]
if len(clean) == 1:
parts.append(clean[0])
elif len(clean) > 1:
parts.append(f"({' OR '.join(clean)})")
for term in must_exclude:
term = term.strip()
if term:
# Use minus syntax (-term / -"phrase") — Browse API's NOT keyword
# over-filters dramatically in practice; minus works like web search negatives.
parts.append(f'-"{term}"' if " " in term else f"-{term}")
return " ".join(p for p in parts if p)
BROWSE_BASE = {
"production": "https://api.ebay.com/buy/browse/v1",
"sandbox": "https://api.sandbox.ebay.com/buy/browse/v1",
@@ -25,29 +67,146 @@ class EbayAdapter(PlatformAdapter):
def __init__(self, token_manager: EbayTokenManager, store: Store, env: str = "production"):
self._tokens = token_manager
self._store = store
self._env = env
self._browse_base = BROWSE_BASE[env]
def _headers(self) -> dict:
return {"Authorization": f"Bearer {self._tokens.get_token()}"}
def search(self, query: str, filters: SearchFilters) -> list[Listing]:
params: dict = {"q": query, "limit": 50}
filter_parts = []
# Build Browse API boolean query from OR groups + exclusions
browse_q = _build_browse_query(query, getattr(filters, "or_groups", []), filters.must_exclude)
filter_parts: list[str] = []
if filters.max_price:
filter_parts.append(f"price:[..{filters.max_price}],priceCurrency:USD")
if filters.min_price:
filter_parts.append(f"price:[{filters.min_price}..],priceCurrency:USD")
if filters.condition:
cond_map = {"new": "NEW", "used": "USED", "open box": "OPEN_BOX", "for parts": "FOR_PARTS_NOT_WORKING"}
cond_map = {
"new": "NEW", "used": "USED",
"open box": "OPEN_BOX", "for parts": "FOR_PARTS_NOT_WORKING",
}
ebay_conds = [cond_map[c] for c in filters.condition if c in cond_map]
if ebay_conds:
filter_parts.append(f"conditions:{{{','.join(ebay_conds)}}}")
if filter_parts:
params["filter"] = ",".join(filter_parts)
resp = requests.get(f"{self._browse_base}/item_summary/search",
headers=self._headers(), params=params)
resp.raise_for_status()
items = resp.json().get("itemSummaries", [])
return [normalise_listing(item) for item in items]
base_params: dict = {"q": browse_q, "limit": _BROWSE_LIMIT}
if filter_parts:
base_params["filter"] = ",".join(filter_parts)
if filters.category_id:
base_params["category_ids"] = filters.category_id
pages = max(1, filters.pages)
seen_ids: set[str] = set()
listings: list[Listing] = []
sellers_to_save: dict[str, Seller] = {}
for page in range(pages):
params = {**base_params, "offset": page * _BROWSE_LIMIT}
resp = requests.get(
f"{self._browse_base}/item_summary/search",
headers=self._headers(),
params=params,
)
resp.raise_for_status()
data = resp.json()
items = data.get("itemSummaries", [])
if not items:
break # no more results
for item in items:
listing = normalise_listing(item)
if listing.platform_listing_id not in seen_ids:
seen_ids.add(listing.platform_listing_id)
listings.append(listing)
# Extract inline seller data available in item_summary
seller_raw = item.get("seller", {})
if seller_raw.get("username") and seller_raw["username"] not in sellers_to_save:
sellers_to_save[seller_raw["username"]] = normalise_seller(seller_raw)
if not data.get("next"):
break # Browse API paginates via "next" href; absence = last page
if sellers_to_save:
self._store.save_sellers(list(sellers_to_save.values()))
# Enrich sellers missing account_age_days via Shopping API (fast HTTP, no Playwright).
# Capped at _SHOPPING_API_MAX_PER_SEARCH to avoid bursting the daily quota when
# many new sellers appear in a single search batch.
needs_age = [s.platform_seller_id for s in sellers_to_save.values()
if s.account_age_days is None]
if needs_age:
self.enrich_sellers_shopping_api(needs_age[:_SHOPPING_API_MAX_PER_SEARCH])
return listings
def enrich_sellers_shopping_api(self, usernames: list[str]) -> None:
"""Fetch RegistrationDate for sellers via Shopping API GetUserProfile.
Uses an app-level Bearer token; no user OAuth required. Silently skips
on rate limit (error 1.21) or any other failure so the search response
is never blocked. BTF scraping remains the fallback for the scraper adapter.
Rate limiting: _SHOPPING_API_INTER_REQUEST_DELAY between calls; sellers
enriched within _SELLER_ENRICH_TTL_HOURS are skipped (account age doesn't
change day to day). Callers should already cap the list length.
"""
token = self._tokens.get_token()
headers = {
"X-EBAY-API-IAF-TOKEN": f"Bearer {token}",
"User-Agent": "Mozilla/5.0",
}
cutoff = datetime.now(timezone.utc) - timedelta(hours=_SELLER_ENRICH_TTL_HOURS)
first = True
for username in usernames:
try:
# Skip recently enriched sellers — account age doesn't change daily.
seller = self._store.get_seller("ebay", username)
if seller and seller.fetched_at:
try:
ft = datetime.fromisoformat(seller.fetched_at.replace("Z", "+00:00"))
if ft.tzinfo is None:
ft = ft.replace(tzinfo=timezone.utc)
if ft > cutoff and seller.account_age_days is not None:
continue
except ValueError:
pass
if not first:
import time as _time
_time.sleep(_SHOPPING_API_INTER_REQUEST_DELAY)
first = False
resp = requests.get(
_SHOPPING_BASE,
headers=headers,
params={
"callname": "GetUserProfile",
"appid": self._tokens.client_id,
"siteid": "0",
"version": "967",
"UserID": username,
"responseencoding": "JSON",
},
timeout=10,
)
data = resp.json()
if data.get("Ack") != "Success":
errors = data.get("Errors", [])
if any(e.get("ErrorCode") == "1.21" for e in errors):
log.debug("Shopping API rate-limited for %s — BTF fallback", username)
continue
reg_date = data.get("User", {}).get("RegistrationDate")
if reg_date:
dt = datetime.fromisoformat(reg_date.replace("Z", "+00:00"))
age_days = (datetime.now(timezone.utc) - dt).days
seller = self._store.get_seller("ebay", username)
if seller:
self._store.save_seller(replace(seller, account_age_days=age_days))
log.debug("Shopping API: %s registered %d days ago", username, age_days)
except Exception as e:
log.debug("Shopping API enrich failed for %s: %s", username, e)
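The TTL guard above normalises eBay's Z-suffixed ISO 8601 timestamps before comparing, since `datetime.fromisoformat()` only accepts a trailing "Z" from Python 3.11 onward. A sketch of that check in isolation, with a hypothetical stored value:

```python
from datetime import datetime, timedelta, timezone

fetched_at = "2026-03-26T10:00:00Z"  # hypothetical seller.fetched_at value
# fromisoformat() rejects the "Z" suffix before Python 3.11, hence the replace()
ft = datetime.fromisoformat(fetched_at.replace("Z", "+00:00"))
if ft.tzinfo is None:  # defensive: pin naive timestamps to UTC
    ft = ft.replace(tzinfo=timezone.utc)
cutoff = datetime.now(timezone.utc) - timedelta(hours=24)
fresh = ft > cutoff  # fresh sellers are skipped; stale ones get re-enriched
```

Both operands of the comparison must be timezone-aware, otherwise Python raises `TypeError: can't compare offset-naive and offset-aware datetimes`.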
def get_seller(self, seller_platform_id: str) -> Optional[Seller]:
cached = self._store.get_seller("ebay", seller_platform_id)
@@ -69,30 +228,62 @@ class EbayAdapter(PlatformAdapter):
except Exception:
return None # Caller handles None gracefully (partial score)
def get_completed_sales(self, query: str) -> list[Listing]:
def get_completed_sales(self, query: str, pages: int = 1) -> list[Listing]:
query_hash = hashlib.md5(query.encode()).hexdigest()
cached = self._store.get_market_comp("ebay", query_hash)
if cached:
return [] # Comp data is used directly; return empty to signal cache hit
if self._store.get_market_comp("ebay", query_hash):
return [] # cache hit
prices: list[float] = []
try:
    # Marketplace Insights API returns sold/completed items — best source for comps.
    # Falls back gracefully to Browse API active listings if the endpoint is
    # unavailable (requires buy.marketplace.insights scope).
    insights_base = _INSIGHTS_BASE.get(self._env, _INSIGHTS_BASE["production"])
    resp = requests.get(
        f"{insights_base}/item_summary/search",
        headers=self._headers(),
        params={"q": query, "limit": 50, "filter": "buyingOptions:{FIXED_PRICE}"},
    )
    if resp.status_code in (403, 404):
        # 403 = scope not granted; 404 = endpoint not available for this app tier.
        # Both mean: fall back to active listing prices via Browse API.
        log.info("comps api: Marketplace Insights unavailable (%d), falling back to Browse API", resp.status_code)
        raise PermissionError("Marketplace Insights not available")
    resp.raise_for_status()
    items = resp.json().get("itemSummaries", [])
    prices = [float(i["lastSoldPrice"]["value"]) for i in items if "lastSoldPrice" in i]
    log.info("comps api: Marketplace Insights returned %d items, %d with lastSoldPrice", len(items), len(prices))
except PermissionError:
    # Fallback: use active listing prices (less accurate but always available)
    try:
        resp = requests.get(
            f"{self._browse_base}/item_summary/search",
            headers=self._headers(),
            params={"q": query, "limit": 50, "filter": "buyingOptions:{FIXED_PRICE}"},
        )
        resp.raise_for_status()
        items = resp.json().get("itemSummaries", [])
        prices = [float(i["price"]["value"]) for i in items if "price" in i]
        log.info("comps api: Browse API fallback returned %d items, %d with price", len(items), len(prices))
    except Exception:
        log.warning("comps api: Browse API fallback failed for %r", query, exc_info=True)
        return []
except Exception:
    log.warning("comps api: unexpected error for %r", query, exc_info=True)
    return []
if not prices:
    log.warning("comps api: 0 valid prices extracted — no comp saved for %r", query)
    return []
prices.sort()
n = len(prices)
median = (prices[n // 2 - 1] + prices[n // 2]) / 2 if n % 2 == 0 else prices[n // 2]
self._store.save_market_comp(MarketComp(
    platform="ebay",
    query_hash=query_hash,
    median_price=median,
    sample_count=n,
    expires_at=(datetime.now(timezone.utc) + timedelta(hours=6)).isoformat(),
))
return []

View file
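The even/odd median split used when saving a `MarketComp` agrees with the standard library; a standalone sketch (not the project's code, illustrative prices):

```python
import statistics

def comps_median(prices: list[float]) -> float:
    """Median with the same even/odd handling as the comps code above."""
    prices = sorted(prices)
    n = len(prices)
    return (prices[n // 2 - 1] + prices[n // 2]) / 2 if n % 2 == 0 else prices[n // 2]

# Agrees with statistics.median for both parities:
assert comps_median([950.0, 1000.0, 1200.0]) == statistics.median([950.0, 1000.0, 1200.0])
assert comps_median([900.0, 950.0, 1000.0, 1200.0]) == statistics.median([900.0, 950.0, 1000.0, 1200.0])
```

`statistics.median` could replace the hand-rolled version outright; the explicit form keeps the even/odd intent visible.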

@ -21,6 +21,10 @@ class EbayTokenManager:
self._token: Optional[str] = None
self._expires_at: float = 0.0
@property
def client_id(self) -> str:
return self._client_id
def get_token(self) -> str:
"""Return a valid access token, fetching or refreshing as needed."""
if self._token and time.time() < self._expires_at - 60:

View file

@ -2,6 +2,7 @@
from __future__ import annotations
import json
from datetime import datetime, timezone
from typing import Optional
from app.db.models import Listing, Seller
@ -25,6 +26,26 @@ def normalise_listing(raw: dict) -> Listing:
except ValueError:
pass
options = raw.get("buyingOptions", [])
if "AUCTION" in options:
buying_format = "auction"
elif "BEST_OFFER" in options:
buying_format = "best_offer"
else:
buying_format = "fixed_price"
ends_at = None
end_raw = raw.get("itemEndDate", "")
if end_raw:
try:
ends_at = datetime.fromisoformat(end_raw.replace("Z", "+00:00")).isoformat()
except ValueError:
pass
# Leaf category is categories[0] (most specific); parent path follows.
categories = raw.get("categories", [])
category_name: Optional[str] = categories[0]["categoryName"] if categories else None
seller = raw.get("seller", {})
return Listing(
platform="ebay",
@ -37,13 +58,16 @@ def normalise_listing(raw: dict) -> Listing:
url=raw.get("itemWebUrl", ""),
photo_urls=photos,
listing_age_days=listing_age_days,
buying_format=buying_format,
ends_at=ends_at,
category_name=category_name,
)
def normalise_seller(raw: dict) -> Seller:
feedback_pct = float(raw.get("feedbackPercentage", "0").strip("%")) / 100.0
account_age_days: Optional[int] = None # None = registrationDate not in API response
reg_date_raw = raw.get("registrationDate", "")
if reg_date_raw:
try:

View file
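The `replace("Z", "+00:00")` step in `normalise_listing` exists because `datetime.fromisoformat` only accepts a bare trailing `Z` from Python 3.11 onward; a standalone sketch of the pattern (hypothetical helper name):

```python
from datetime import datetime
from typing import Optional

def parse_ebay_timestamp(raw: str) -> Optional[str]:
    """Normalise an eBay ISO-8601 timestamp that ends in 'Z'."""
    if not raw:
        return None
    try:
        # 'Z' -> '+00:00' keeps fromisoformat working on Python < 3.11,
        # where the bare 'Z' suffix is rejected with ValueError.
        return datetime.fromisoformat(raw.replace("Z", "+00:00")).isoformat()
    except ValueError:
        return None

assert parse_ebay_timestamp("2026-03-26T23:37:09Z") == "2026-03-26T23:37:09+00:00"
assert parse_ebay_timestamp("not-a-date") is None
```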

@ -0,0 +1,85 @@
"""
Build eBay-compatible boolean search queries from OR groups.
eBay honors parenthetical OR groups in the _nkw search parameter:
(term1,term2,term3) must contain at least one of these terms
-term / -"phrase" must NOT contain this term / phrase
space between groups implicit AND
expand_queries() generates one eBay query per term in the smallest OR group,
using eBay's OR syntax for all remaining groups. This guarantees coverage even
if eBay's relevance ranking would suppress some matches in a single combined query.
Example:
base = "GPU"
or_groups = [["16gb","24gb","40gb","48gb"], ["nvidia","quadro","rtx","geforce","titan"]]
4 queries (one per memory size, brand group as eBay OR):
"GPU 16gb (nvidia,quadro,rtx,geforce,titan)"
"GPU 24gb (nvidia,quadro,rtx,geforce,titan)"
"GPU 40gb (nvidia,quadro,rtx,geforce,titan)"
"GPU 48gb (nvidia,quadro,rtx,geforce,titan)"
"""
from __future__ import annotations
def _group_to_ebay(group: list[str]) -> str:
"""Convert a list of alternatives to an eBay OR clause."""
clean = [t.strip() for t in group if t.strip()]
if not clean:
return ""
if len(clean) == 1:
return clean[0]
return f"({','.join(clean)})"
def build_ebay_query(base_query: str, or_groups: list[list[str]]) -> str:
"""
Build a single eBay _nkw query string using eBay's parenthetical OR syntax.
Exclusions are handled separately via SearchFilters.must_exclude.
"""
parts = [base_query.strip()]
for group in or_groups:
clause = _group_to_ebay(group)
if clause:
parts.append(clause)
return " ".join(p for p in parts if p)
def expand_queries(base_query: str, or_groups: list[list[str]]) -> list[str]:
"""
Expand OR groups into one eBay query per term in the smallest group,
using eBay's OR syntax for all remaining groups.
This guarantees every term in the pivot group is explicitly searched,
which prevents eBay's relevance engine from silently skipping rare variants.
Falls back to a single query when there are no OR groups.
"""
if not or_groups:
return [base_query.strip()]
# Pivot on the smallest group to minimise the number of Playwright calls
smallest_idx = min(range(len(or_groups)), key=lambda i: len(or_groups[i]))
pivot = or_groups[smallest_idx]
rest = [g for i, g in enumerate(or_groups) if i != smallest_idx]
queries = []
for term in pivot:
q = build_ebay_query(base_query, [[term]] + rest)
queries.append(q)
return queries
def parse_groups(raw: str) -> list[list[str]]:
"""
Parse a Groups-mode must_include string into nested OR groups.
Format: comma separates groups (AND), pipe separates alternatives within a group (OR).
"16gb|24gb|48gb, nvidia|rtx|geforce"
[["16gb","24gb","48gb"], ["nvidia","rtx","geforce"]]
"""
groups = []
for chunk in raw.split(","):
alts = [t.strip().lower() for t in chunk.split("|") if t.strip()]
if alts:
groups.append(alts)
return groups

View file
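The pivot-on-smallest-group expansion described above can be sketched standalone (logic reimplemented here for illustration; these names do not import the module):

```python
def build_query(base: str, groups: list[list[str]]) -> str:
    """Single eBay _nkw string using parenthetical OR syntax."""
    parts = [base.strip()]
    for g in groups:
        clean = [t.strip() for t in g if t.strip()]
        if clean:
            parts.append(clean[0] if len(clean) == 1 else f"({','.join(clean)})")
    return " ".join(p for p in parts if p)

def expand(base: str, groups: list[list[str]]) -> list[str]:
    """One query per term of the smallest group; the rest stay as OR clauses."""
    if not groups:
        return [base.strip()]
    pivot_idx = min(range(len(groups)), key=lambda i: len(groups[i]))
    rest = [g for i, g in enumerate(groups) if i != pivot_idx]
    return [build_query(base, [[t]] + rest) for t in groups[pivot_idx]]

queries = expand("GPU", [["16gb", "24gb"], ["nvidia", "rtx", "geforce"]])
# → ['GPU 16gb (nvidia,rtx,geforce)', 'GPU 24gb (nvidia,rtx,geforce)']
```

Pivoting on the smallest group keeps the query count (and therefore Playwright fetches) minimal while still making every rare variant an explicit search term.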

@ -0,0 +1,526 @@
"""Scraper-based eBay adapter — free tier, no API key required.
Data available from search results HTML (single page load):
title, price, condition, photos, URL
seller username, feedback count, feedback ratio
account registration date enriched async via BTF /itm/ scrape
category history enriched async via _ssn seller search page
This is the MIT discovery layer. EbayAdapter (paid/CF proxy) unlocks full trust scores.
"""
from __future__ import annotations
import hashlib
import itertools
import json
import logging
import re
import time
from concurrent.futures import ThreadPoolExecutor, as_completed
from datetime import datetime, timedelta, timezone
from typing import Optional
log = logging.getLogger(__name__)
from bs4 import BeautifulSoup
from app.db.models import Listing, MarketComp, Seller
from app.db.store import Store
from app.platforms import PlatformAdapter, SearchFilters
EBAY_SEARCH_URL = "https://www.ebay.com/sch/i.html"
EBAY_ITEM_URL = "https://www.ebay.com/itm/"
_HTML_CACHE_TTL = 300 # seconds — 5 minutes
_JOINED_RE = re.compile(r"Joined\s+(Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)\w*\s+(\d{4})", re.I)
_MONTH_MAP = {m: i+1 for i, m in enumerate(
["Jan","Feb","Mar","Apr","May","Jun","Jul","Aug","Sep","Oct","Nov","Dec"]
)}
# Module-level cache persists across per-request adapter instantiations.
# Keyed by URL; value is (html, expiry_timestamp).
_html_cache: dict[str, tuple[str, float]] = {}
# Cycle through display numbers :200:299 so concurrent/sequential Playwright
# calls don't collide on the Xvfb lock file from the previous run.
_display_counter = itertools.cycle(range(200, 300))
_HEADERS = {
"User-Agent": (
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
"(KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36"
),
"Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
"Accept-Language": "en-US,en;q=0.5",
"Accept-Encoding": "gzip, deflate, br",
"DNT": "1",
"Connection": "keep-alive",
"Upgrade-Insecure-Requests": "1",
}
_SELLER_RE = re.compile(r"^(.+?)\s+\(([0-9,]+)\)\s+([\d.]+)%")
_FEEDBACK_RE = re.compile(r"([\d.]+)%\s+positive\s+\(([0-9,]+)\)", re.I)
_PRICE_RE = re.compile(r"[\d,]+\.?\d*")
_ITEM_ID_RE = re.compile(r"/itm/(\d+)")
_TIME_LEFT_RE = re.compile(r"(?:(\d+)d\s*)?(?:(\d+)h\s*)?(?:(\d+)m\s*)?(?:(\d+)s\s*)?left", re.I)
_PARENS_COUNT_RE = re.compile(r"\((\d{1,6})\)")
# Maps title-keyword fragments → internal MetadataScorer category keys.
# Checked in order — first match wins. Broader terms intentionally listed last.
_CATEGORY_KEYWORDS: list[tuple[frozenset[str], str]] = [
(frozenset(["cell phone", "smartphone", "mobile phone"]), "CELL_PHONES"),
(frozenset(["video game", "gaming", "console", "playstation", "xbox", "nintendo"]), "VIDEO_GAMES"),
(frozenset(["computer", "tablet", "laptop", "notebook", "chromebook"]), "COMPUTERS_TABLETS"),
(frozenset(["electronic"]), "ELECTRONICS"),
]
def _classify_category_label(text: str) -> Optional[str]:
"""Map an eBay category label to an internal MetadataScorer key, or None."""
lower = text.lower()
for keywords, key in _CATEGORY_KEYWORDS:
if any(kw in lower for kw in keywords):
return key
return None
# ---------------------------------------------------------------------------
# Pure HTML parsing functions (unit-testable, no HTTP)
# ---------------------------------------------------------------------------
def _parse_price(text: str) -> float:
"""Extract first numeric value from price text.
Handles '$950.00', '$900.00 to $1,050.00', '$1,234.56/ea'.
Takes the lower bound for price ranges (conservative for trust scoring).
"""
m = _PRICE_RE.search(text.replace(",", ""))
return float(m.group()) if m else 0.0
def _parse_seller(text: str) -> tuple[str, int, float]:
"""Parse eBay seller-info text into (username, feedback_count, feedback_ratio).
Input format: 'tech_seller (1,234) 99.1% positive feedback'
Returns ('tech_seller', 1234, 0.991).
Falls back gracefully if the format doesn't match.
"""
text = text.strip()
m = _SELLER_RE.match(text)
if not m:
return (text.split()[0] if text else ""), 0, 0.0
return m.group(1).strip(), int(m.group(2).replace(",", "")), float(m.group(3)) / 100.0
def _parse_time_left(text: str) -> Optional[timedelta]:
"""Parse eBay time-left text into a timedelta.
Handles '3d 14h left', '14h 23m left', '23m 45s left'.
Returns None if text doesn't match (i.e. fixed-price listing).
"""
if not text:
return None
m = _TIME_LEFT_RE.search(text)
if not m or not any(m.groups()):
return None
days = int(m.group(1) or 0)
hours = int(m.group(2) or 0)
minutes = int(m.group(3) or 0)
seconds = int(m.group(4) or 0)
if days == hours == minutes == seconds == 0:
return None
return timedelta(days=days, hours=hours, minutes=minutes, seconds=seconds)
def _extract_seller_from_card(card) -> tuple[str, int, float]:
"""Extract (username, feedback_count, feedback_ratio) from an s-card element.
New eBay layout has seller username and feedback as separate su-styled-text spans.
We find the feedback span by regex, then take the immediately preceding text as username.
"""
texts = [s.get_text(strip=True) for s in card.select("span.su-styled-text") if s.get_text(strip=True)]
username, count, ratio = "", 0, 0.0
for i, t in enumerate(texts):
m = _FEEDBACK_RE.search(t)
if m:
ratio = float(m.group(1)) / 100.0
count = int(m.group(2).replace(",", ""))
# Username is the span just before the feedback span
if i > 0:
username = texts[i - 1].strip()
break
return username, count, ratio
def scrape_listings(html: str) -> list[Listing]:
"""Parse eBay search results HTML into Listing objects."""
soup = BeautifulSoup(html, "lxml")
results = []
for item in soup.select("li.s-card"):
# Skip promos: no data-listingid or title is "Shop on eBay"
platform_listing_id = item.get("data-listingid", "")
if not platform_listing_id:
continue
title_el = item.select_one("div.s-card__title")
if not title_el or "Shop on eBay" in title_el.get_text():
continue
link_el = item.select_one('a.s-card__link[href*="/itm/"]')
url = link_el["href"].split("?")[0] if link_el else ""
price_el = item.select_one("span.s-card__price")
price = _parse_price(price_el.get_text()) if price_el else 0.0
condition_el = item.select_one("div.s-card__subtitle")
condition = condition_el.get_text(strip=True).split("·")[0].strip().lower() if condition_el else ""
seller_username, _, _ = _extract_seller_from_card(item)
img_el = item.select_one("img.s-card__image")
photo_url = img_el.get("src") or img_el.get("data-src") or "" if img_el else ""
# Auction detection via time-left text patterns in card spans
time_remaining = None
for span in item.select("span.su-styled-text"):
t = span.get_text(strip=True)
td = _parse_time_left(t)
if td:
time_remaining = td
break
buying_format = "auction" if time_remaining is not None else "fixed_price"
ends_at = (datetime.now(timezone.utc) + time_remaining).isoformat() if time_remaining else None
# Strip eBay's screen-reader accessibility text injected into title links.
# get_text() is CSS-blind and picks up visually-hidden spans.
raw_title = title_el.get_text(separator=" ", strip=True)
title = re.sub(r"\s*Opens in a new window or tab\s*", "", raw_title, flags=re.IGNORECASE).strip()
results.append(Listing(
platform="ebay",
platform_listing_id=platform_listing_id,
title=title,
price=price,
currency="USD",
condition=condition,
seller_platform_id=seller_username,
url=url,
photo_urls=[photo_url] if photo_url else [],
listing_age_days=0,
buying_format=buying_format,
ends_at=ends_at,
))
return results
def scrape_sellers(html: str) -> dict[str, Seller]:
"""Extract Seller objects from search results HTML.
Returns a dict keyed by username. account_age_days and category_history_json
are left empty; they require a separate seller profile page fetch, which
would mean one extra HTTP request per seller. That data gap is what separates
the free (scraper) tier from the paid (API) tier.
"""
soup = BeautifulSoup(html, "lxml")
sellers: dict[str, Seller] = {}
for item in soup.select("li.s-card"):
if not item.get("data-listingid"):
continue
username, count, ratio = _extract_seller_from_card(item)
if username and username not in sellers:
sellers[username] = Seller(
platform="ebay",
platform_seller_id=username,
username=username,
account_age_days=None, # not fetched at scraper tier
feedback_count=count,
feedback_ratio=ratio,
category_history_json="{}", # not available from search HTML
)
return sellers
def scrape_seller_categories(html: str) -> dict[str, int]:
"""Parse category distribution from a seller's _ssn search page.
eBay renders category refinements in the left sidebar. We scan all
anchor-text blocks for recognisable category labels and accumulate
listing counts from the adjacent parenthetical "(N)" strings.
Returns a dict like {"ELECTRONICS": 45, "CELL_PHONES": 23}.
Empty dict = no recognisable categories found (score stays None).
"""
soup = BeautifulSoup(html, "lxml")
counts: dict[str, int] = {}
# eBay sidebar refinement links contain the category label and a count.
# Multiple layout variants exist — scan broadly and classify by keyword.
for el in soup.select("a[href*='_sacat='], li.x-refine__main__list--value a"):
text = el.get_text(separator=" ", strip=True)
key = _classify_category_label(text)
if not key:
continue
m = _PARENS_COUNT_RE.search(text)
count = int(m.group(1)) if m else 1
counts[key] = counts.get(key, 0) + count
return counts
# ---------------------------------------------------------------------------
# Adapter
# ---------------------------------------------------------------------------
class ScrapedEbayAdapter(PlatformAdapter):
"""
Scraper-based eBay adapter implementing PlatformAdapter with no API key.
Extracts seller feedback directly from search result cards no extra
per-seller page requests. The two unavailable signals (account_age,
category_history) cause TrustScorer to set score_is_partial=True.
"""
def __init__(self, store: Store, delay: float = 1.0):
self._store = store
self._delay = delay
def _fetch_url(self, url: str) -> str:
"""Core Playwright fetch — stealthed headed Chromium via Xvfb.
Shared by both search (_get) and BTF item-page enrichment (_fetch_item_html).
Results cached for _HTML_CACHE_TTL seconds.
"""
cached = _html_cache.get(url)
if cached and time.time() < cached[1]:
return cached[0]
time.sleep(self._delay)
import subprocess, os
display_num = next(_display_counter)
display = f":{display_num}"
xvfb = subprocess.Popen(
["Xvfb", display, "-screen", "0", "1280x800x24"],
stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
)
env = os.environ.copy()
env["DISPLAY"] = display
try:
from playwright.sync_api import sync_playwright # noqa: PLC0415 — lazy: only needed in Docker
from playwright_stealth import Stealth # noqa: PLC0415
with sync_playwright() as pw:
browser = pw.chromium.launch(
headless=False,
env=env,
args=["--no-sandbox", "--disable-dev-shm-usage"],
)
ctx = browser.new_context(
user_agent=_HEADERS["User-Agent"],
viewport={"width": 1280, "height": 800},
)
page = ctx.new_page()
Stealth().apply_stealth_sync(page)
page.goto(url, wait_until="domcontentloaded", timeout=30_000)
page.wait_for_timeout(2000) # let any JS challenges resolve
html = page.content()
browser.close()
finally:
xvfb.terminate()
xvfb.wait()
_html_cache[url] = (html, time.time() + _HTML_CACHE_TTL)
return html
def _get(self, params: dict) -> str:
"""Fetch eBay search results HTML. params → query string appended to EBAY_SEARCH_URL."""
# urlencode escapes spaces/quotes in _nkw values; raw f-string joining breaks on them.
from urllib.parse import urlencode  # noqa: PLC0415
url = EBAY_SEARCH_URL + "?" + urlencode(params)
return self._fetch_url(url)
def _fetch_item_html(self, item_id: str) -> str:
"""Fetch a single eBay listing page. /itm/ pages pass Kasada; /usr/ pages do not."""
return self._fetch_url(f"{EBAY_ITEM_URL}{item_id}")
@staticmethod
def _parse_joined_date(html: str) -> Optional[int]:
"""Parse 'Joined {Mon} {Year}' from a listing page BTF seller card.
Returns account_age_days (int) or None if the date is not found.
eBay renders this as a span.ux-textspans inside the seller section.
"""
m = _JOINED_RE.search(html)
if not m:
return None
month_str, year_str = m.group(1)[:3].capitalize(), m.group(2)
month = _MONTH_MAP.get(month_str)
if not month:
return None
try:
reg_date = datetime(int(year_str), month, 1, tzinfo=timezone.utc)
return (datetime.now(timezone.utc) - reg_date).days
except ValueError:
return None
def enrich_sellers_btf(
self,
seller_to_listing: dict[str, str],
max_workers: int = 2,
) -> None:
"""Background BTF enrichment — scrape /itm/ pages to fill in account_age_days.
seller_to_listing: {seller_platform_id -> platform_listing_id}
Only pass sellers whose account_age_days is None (unknown from API batch).
Caller limits the dict to new/stale sellers to avoid redundant scrapes.
Runs Playwright fetches in a thread pool (max_workers=2 by default to
avoid hammering Kasada). Updates seller records in the DB in-place.
Does not raise; per-seller failures are silently skipped so the main
search response is never blocked.
"""
def _enrich_one(item: tuple[str, str]) -> None:
seller_id, listing_id = item
try:
html = self._fetch_item_html(listing_id)
age_days = self._parse_joined_date(html)
if age_days is not None:
seller = self._store.get_seller("ebay", seller_id)
if seller:
from dataclasses import replace
updated = replace(seller, account_age_days=age_days)
self._store.save_seller(updated)
except Exception:
pass # non-fatal: partial score is better than a crashed enrichment
with ThreadPoolExecutor(max_workers=max_workers) as ex:
list(ex.map(_enrich_one, seller_to_listing.items()))
def enrich_sellers_categories(
self,
seller_platform_ids: list[str],
max_workers: int = 2,
) -> None:
"""Scrape _ssn seller pages to populate category_history_json.
Uses the same headed Playwright stack as search(); the _ssn=USERNAME
filter is just a query param on the standard search template, so it
passes Kasada identically. Silently skips on failure so the main
search response is never affected.
"""
def _enrich_one(seller_id: str) -> None:
try:
html = self._get({"_ssn": seller_id, "_sop": "12", "_ipg": "48"})
categories = scrape_seller_categories(html)
if categories:
seller = self._store.get_seller("ebay", seller_id)
if seller:
from dataclasses import replace
updated = replace(seller, category_history_json=json.dumps(categories))
self._store.save_seller(updated)
except Exception:
pass
with ThreadPoolExecutor(max_workers=max_workers) as ex:
list(ex.map(_enrich_one, seller_platform_ids))
def search(self, query: str, filters: SearchFilters) -> list[Listing]:
base_params: dict = {"_nkw": query, "_sop": "15", "_ipg": "48"}
if filters.category_id:
base_params["_sacat"] = filters.category_id
if filters.max_price:
base_params["_udhi"] = str(filters.max_price)
if filters.min_price:
base_params["_udlo"] = str(filters.min_price)
if filters.condition:
cond_map = {
"new": "1000", "used": "3000",
"open box": "2500", "for parts": "7000",
}
codes = [cond_map[c] for c in filters.condition if c in cond_map]
if codes:
base_params["LH_ItemCondition"] = "|".join(codes)
# Append negative keywords to the eBay query — eBay supports "-term" in _nkw natively.
# Multi-word phrases must be quoted: -"parts only" not -parts only (which splits the words).
if filters.must_exclude:
parts = []
for t in filters.must_exclude:
t = t.strip()
if not t:
continue
parts.append(f'-"{t}"' if " " in t else f"-{t}")
base_params["_nkw"] = f"{base_params['_nkw']} {' '.join(parts)}"
pages = max(1, filters.pages)
page_params = [{**base_params, "_pgn": str(p)} for p in range(1, pages + 1)]
with ThreadPoolExecutor(max_workers=min(pages, 3)) as ex:
htmls = list(ex.map(self._get, page_params))
seen_ids: set[str] = set()
listings: list[Listing] = []
sellers: dict[str, "Seller"] = {}
for html in htmls:
for listing in scrape_listings(html):
if listing.platform_listing_id not in seen_ids:
seen_ids.add(listing.platform_listing_id)
listings.append(listing)
sellers.update(scrape_sellers(html))
self._store.save_sellers(list(sellers.values()))
return listings
def get_seller(self, seller_platform_id: str) -> Optional[Seller]:
# Sellers are pre-populated during search(); no extra fetch needed
return self._store.get_seller("ebay", seller_platform_id)
def get_completed_sales(self, query: str, pages: int = 1) -> list[Listing]:
query_hash = hashlib.md5(query.encode()).hexdigest()
if self._store.get_market_comp("ebay", query_hash):
return [] # cache hit — comp already stored
base_params = {
"_nkw": query,
"LH_Sold": "1",
"LH_Complete": "1",
"_sop": "13", # sort by price+shipping, lowest first
"_ipg": "48",
}
pages = max(1, pages)
page_params = [{**base_params, "_pgn": str(p)} for p in range(1, pages + 1)]
log.info("comps scrape: fetching %d page(s) of sold listings for %r", pages, query)
try:
with ThreadPoolExecutor(max_workers=min(pages, 3)) as ex:
htmls = list(ex.map(self._get, page_params))
seen_ids: set[str] = set()
all_listings: list[Listing] = []
for html in htmls:
for listing in scrape_listings(html):
if listing.platform_listing_id not in seen_ids:
seen_ids.add(listing.platform_listing_id)
all_listings.append(listing)
prices = sorted(l.price for l in all_listings if l.price > 0)
if prices:
mid = len(prices) // 2
median = (prices[mid - 1] + prices[mid]) / 2 if len(prices) % 2 == 0 else prices[mid]
self._store.save_market_comp(MarketComp(
platform="ebay",
query_hash=query_hash,
median_price=median,
sample_count=len(prices),
expires_at=(datetime.now(timezone.utc) + timedelta(hours=6)).isoformat(),
))
log.info("comps scrape: saved market comp median=$%.2f from %d prices", median, len(prices))
else:
log.warning("comps scrape: %d listings parsed but 0 valid prices — no comp saved", len(all_listings))
return all_listings
except Exception:
log.warning("comps scrape: failed for %r", query, exc_info=True)
return []

View file
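The `_parse_time_left` helper in the scraper file above can be exercised in isolation; a sketch duplicating its regex (illustrative only):

```python
import re
from datetime import timedelta
from typing import Optional

_TIME_LEFT = re.compile(r"(?:(\d+)d\s*)?(?:(\d+)h\s*)?(?:(\d+)m\s*)?(?:(\d+)s\s*)?left", re.I)

def parse_time_left(text: str) -> Optional[timedelta]:
    """Return remaining auction time, or None for fixed-price listings."""
    m = _TIME_LEFT.search(text or "")
    if not m or not any(m.groups()):
        return None  # no "Nd Nh ... left" pattern → not an auction card
    d, h, mn, s = (int(g or 0) for g in m.groups())
    td = timedelta(days=d, hours=h, minutes=mn, seconds=s)
    return td if td else None  # all-zero durations treated as no signal

assert parse_time_left("3d 14h left") == timedelta(days=3, hours=14)
assert parse_time_left("Brand New · Buy It Now") is None
```

This is why auction detection needs no extra selector: the presence of any parseable time-left span is the signal.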

@ -4,6 +4,7 @@ from .aggregator import Aggregator
from app.db.models import Seller, Listing, TrustScore
from app.db.store import Store
import hashlib
import math
class TrustScorer:
@ -24,6 +25,16 @@ class TrustScorer:
comp = self._store.get_market_comp("ebay", query_hash)
market_median = comp.median_price if comp else None
# Coefficient of variation: stddev/mean across batch prices.
# None when fewer than 2 priced listings (can't compute variance).
_prices = [l.price for l in listings if l.price > 0]
if len(_prices) >= 2:
_mean = sum(_prices) / len(_prices)
_stddev = math.sqrt(sum((p - _mean) ** 2 for p in _prices) / len(_prices))
price_cv: float | None = _stddev / _mean if _mean > 0 else None
else:
price_cv = None
photo_url_sets = [l.photo_urls for l in listings]
duplicates = self._photo.check_duplicates(photo_url_sets)
@ -31,11 +42,19 @@ class TrustScorer:
for listing, is_dup in zip(listings, duplicates):
seller = self._store.get_seller("ebay", listing.seller_platform_id)
if seller:
signal_scores = self._meta.score(seller, market_median, listing.price, price_cv)
else:
signal_scores = {k: None for k in
["account_age", "feedback_count", "feedback_ratio",
"price_vs_market", "category_history"]}
trust = self._agg.aggregate(
signal_scores, is_dup, seller,
listing_id=listing.id or 0,
listing_title=listing.title,
times_seen=listing.times_seen,
first_seen_at=listing.first_seen_at,
price=listing.price,
price_at_first_seen=listing.price_at_first_seen,
)
scores.append(trust)
return scores

View file
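The coefficient-of-variation guard is easy to sanity-check standalone: a tight single-model batch stays well under the 0.6 threshold, while a mixed-generation batch crosses it and the price signal gets suppressed (illustrative prices, not real listings):

```python
import math
from typing import Optional

def price_cv(prices: list[float]) -> Optional[float]:
    """Population stddev / mean; None when variance can't be computed."""
    prices = [p for p in prices if p > 0]
    if len(prices) < 2:
        return None
    mean = sum(prices) / len(prices)
    stddev = math.sqrt(sum((p - mean) ** 2 for p in prices) / len(prices))
    return stddev / mean if mean > 0 else None

homogeneous = [950.0, 1000.0, 1050.0, 980.0]  # same model, tight spread
mixed = [200.0, 250.0, 1800.0, 2000.0]        # old ThinkPads mixed with new Legions
assert price_cv(homogeneous) < 0.6            # median trusted, signal kept
assert price_cv(mixed) > 0.6                  # suspicious_price suppressed
```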

@ -1,6 +1,7 @@
"""Composite score and red flag extraction."""
from __future__ import annotations
import json
from datetime import datetime, timezone
from typing import Optional
from app.db.models import Seller, TrustScore
@ -8,6 +9,55 @@ HARD_FILTER_AGE_DAYS = 7
HARD_FILTER_BAD_RATIO_MIN_COUNT = 20
HARD_FILTER_BAD_RATIO_THRESHOLD = 0.80
# Title keywords that suggest cosmetic damage or wear (free-tier title scan).
# Description-body scan (paid BSL feature) runs via BTF enrichment — not implemented yet.
_SCRATCH_DENT_KEYWORDS = frozenset([
# Explicit cosmetic damage
"scratch", "scratched", "scratches", "scuff", "scuffed",
"dent", "dented", "ding", "dinged",
"crack", "cracked", "chip", "chipped",
"damage", "damaged", "cosmetic damage",
"blemish", "wear", "worn", "worn in",
# Parts / condition catch-alls
"as is", "for parts", "parts only", "spares or repair", "parts or repair",
# Evasive redirects — seller hiding damage detail in listing body
"see description", "read description", "read listing", "see listing",
"see photos for", "see pics for", "see images for",
# Functional problem phrases (phrases > single words to avoid false positives)
"issue with", "issues with", "problem with", "problems with",
"not working", "stopped working", "doesn't work", "does not work",
"no power", "dead on arrival", "powers on but", "turns on but", "boots but",
"faulty", "broken screen", "broken hinge", "broken port",
# DIY / project / repair listings
"needs repair", "needs work", "needs tlc",
"project unit", "project item", "project laptop", "project phone",
"for repair", "sold as is",
])
def _has_damage_keywords(title: str) -> bool:
lower = title.lower()
return any(kw in lower for kw in _SCRATCH_DENT_KEYWORDS)
_LONG_ON_MARKET_MIN_SIGHTINGS = 5
_LONG_ON_MARKET_MIN_DAYS = 14
_PRICE_DROP_THRESHOLD = 0.20 # 20% below first-seen price
def _days_since(iso: Optional[str]) -> Optional[int]:
if not iso:
return None
try:
dt = datetime.fromisoformat(iso.replace("Z", "+00:00"))
# Normalize to naive UTC so both paths (timezone-aware ISO and SQLite
# CURRENT_TIMESTAMP naive strings) compare correctly.
if dt.tzinfo is not None:
dt = dt.replace(tzinfo=None)
return (datetime.now(timezone.utc).replace(tzinfo=None) - dt).days
except ValueError:
return None
class Aggregator:
def aggregate(
@ -16,15 +66,29 @@ class Aggregator:
photo_hash_duplicate: bool,
seller: Optional[Seller],
listing_id: int = 0,
listing_title: str = "",
times_seen: int = 1,
first_seen_at: Optional[str] = None,
price: float = 0.0,
price_at_first_seen: Optional[float] = None,
) -> TrustScore:
is_partial = any(v is None for v in signal_scores.values())
# Score only against signals that returned real data — treating "no data"
# as 0 conflates "bad signal" with "missing signal" and drags scores down
# unfairly when the API doesn't expose a field (e.g. registrationDate).
available = [v for v in signal_scores.values() if v is not None]
available_max = len(available) * 20
if available_max > 0:
composite = round((sum(available) / available_max) * 100)
else:
composite = 0
red_flags: list[str] = []
# Hard filters
if seller and seller.account_age_days is not None and seller.account_age_days < HARD_FILTER_AGE_DAYS:
red_flags.append("new_account")
if seller and (
seller.feedback_ratio < HARD_FILTER_BAD_RATIO_THRESHOLD
@ -33,14 +97,26 @@ class Aggregator:
red_flags.append("established_bad_actor")
# Soft flags
if seller and seller.account_age_days is not None and seller.account_age_days < 30:
red_flags.append("account_under_30_days")
if seller and seller.feedback_count < 10:
red_flags.append("low_feedback_count")
if signal_scores.get("price_vs_market") == 0: # only flag when data exists and price is genuinely <50% of market
red_flags.append("suspicious_price")
if photo_hash_duplicate:
red_flags.append("duplicate_photo")
if listing_title and _has_damage_keywords(listing_title):
red_flags.append("scratch_dent_mentioned")
# Staging DB signals
days_in_index = _days_since(first_seen_at)
if (times_seen >= _LONG_ON_MARKET_MIN_SIGHTINGS
and days_in_index is not None
and days_in_index >= _LONG_ON_MARKET_MIN_DAYS):
red_flags.append("long_on_market")
if (price_at_first_seen and price_at_first_seen > 0
and price < price_at_first_seen * (1 - _PRICE_DROP_THRESHOLD)):
red_flags.append("significant_price_drop")
return TrustScore(
listing_id=listing_id,

View file
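The available-signals normalisation above can be sketched in isolation: missing signals shrink the denominator instead of dragging the composite toward zero (hypothetical signal values):

```python
from typing import Optional

def composite(signals: dict[str, Optional[int]]) -> tuple[int, bool]:
    """Normalise 0-20 signals to a 0-100 score over available data only."""
    available = [v for v in signals.values() if v is not None]
    is_partial = len(available) < len(signals)
    max_score = len(available) * 20
    score = round(sum(available) / max_score * 100) if max_score else 0
    return score, is_partial

# All five signals present:
assert composite({"a": 20, "b": 20, "c": 10, "d": 20, "e": 20}) == (90, False)
# Two signals missing (scraper tier): same underlying data, fairer score
assert composite({"a": 20, "b": 20, "c": 10, "d": None, "e": None}) == (83, True)
```

Under the old sum-with-zeros scheme the second case would score 50/100; the seller hasn't changed, only the API's field coverage has.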

@ -6,6 +6,11 @@ from app.db.models import Seller
ELECTRONICS_CATEGORIES = {"ELECTRONICS", "COMPUTERS_TABLETS", "VIDEO_GAMES", "CELL_PHONES"}
# Coefficient of variation (stddev/mean) above which the price distribution is
# considered too heterogeneous to trust the market median for scam detection.
# e.g. "Lenovo RTX intel" mixes $200 old ThinkPads with $2000 Legions → CV ~1.0+
_HETEROGENEOUS_CV_THRESHOLD = 0.6
class MetadataScorer:
def score(
@ -13,12 +18,13 @@ class MetadataScorer:
seller: Seller,
market_median: Optional[float],
listing_price: float,
price_cv: Optional[float] = None,
) -> dict[str, Optional[int]]:
return {
"account_age": self._account_age(seller.account_age_days) if seller.account_age_days is not None else None,
"feedback_count": self._feedback_count(seller.feedback_count),
"feedback_ratio": self._feedback_ratio(seller.feedback_ratio, seller.feedback_count),
"price_vs_market": self._price_vs_market(listing_price, market_median, price_cv),
"category_history": self._category_history(seller.category_history_json),
}
@ -43,9 +49,11 @@ class MetadataScorer:
if ratio < 0.98: return 15
return 20
def _price_vs_market(self, price: float, median: Optional[float], price_cv: Optional[float] = None) -> Optional[int]:
if median is None: return None # data unavailable → aggregator sets score_is_partial
if median <= 0: return None
if price_cv is not None and price_cv > _HETEROGENEOUS_CV_THRESHOLD:
return None # mixed model/generation search — median is unreliable
ratio = price / median
if ratio < 0.50: return 0 # >50% below = scam
if ratio < 0.70: return 5 # >30% below = suspicious
@ -53,11 +61,13 @@ class MetadataScorer:
if ratio <= 1.20: return 20
return 15 # above market = still ok, just expensive
def _category_history(self, category_history_json: str) -> Optional[int]:
try:
history = json.loads(category_history_json)
except (ValueError, TypeError):
return None # unparseable → data unavailable
if not history:
return None # empty dict → no category data from this source
electronics_sales = sum(
v for k, v in history.items() if k in ELECTRONICS_CATEGORIES
)

View file
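A standalone sketch of the price-vs-market banding with CV suppression. Note the diff hunk elides some middle bands, so the behaviour between ratio 0.70 and 1.20 here is an approximation of the visible code only:

```python
from typing import Optional

_HETEROGENEOUS_CV_THRESHOLD = 0.6

def price_vs_market(price: float, median: Optional[float],
                    price_cv: Optional[float] = None) -> Optional[int]:
    """0-20 price signal; None means 'no usable comp data'."""
    if median is None or median <= 0:
        return None                  # no comp → aggregator marks score partial
    if price_cv is not None and price_cv > _HETEROGENEOUS_CV_THRESHOLD:
        return None                  # mixed-model batch → median unreliable
    ratio = price / median
    if ratio < 0.50:
        return 0                     # far below market = likely scam
    if ratio < 0.70:
        return 5                     # suspiciously cheap
    if ratio <= 1.20:
        return 20                    # near market (middle bands elided in the diff)
    return 15                        # above market: just expensive

assert price_vs_market(400.0, 1000.0) == 0                   # triggers suspicious_price
assert price_vs_market(900.0, 1000.0, price_cv=0.9) is None  # CV suppression wins
```

Returning `None` rather than 0 for the suppressed case matters: 0 would fire the `suspicious_price` flag, while `None` only sets `score_is_partial`.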

@ -11,6 +11,10 @@ try:
except ImportError:
_IMAGEHASH_AVAILABLE = False
# Module-level phash cache: url → hash string (or None on failure).
# Avoids re-downloading the same eBay CDN image on repeated searches.
_phash_cache: dict[str, Optional[str]] = {}
class PhotoScorer:
"""
@ -52,13 +56,17 @@ class PhotoScorer:
def _fetch_hash(self, url: str) -> Optional[str]:
if not url:
return None
if url in _phash_cache:
return _phash_cache[url]
try:
resp = requests.get(url, timeout=5, stream=True)
resp.raise_for_status()
img = Image.open(io.BytesIO(resp.content))
return str(imagehash.phash(img))
result: Optional[str] = str(imagehash.phash(img))
except Exception:
return None
result = None
_phash_cache[url] = result
return result
def _url_dedup(self, photo_urls_per_listing: list[list[str]]) -> list[bool]:
seen: set[str] = set()
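The cache memoises failures as well as successes, so a dead CDN URL is fetched at most once per process. The pattern in isolation, with the requests/imagehash work replaced by an injected `fetch` callable (`cached_hash` is an illustrative helper, not the module's API):

```python
from typing import Callable, Optional

# Module-level cache: url -> hash string, or None for a remembered failure.
_phash_cache: dict[str, Optional[str]] = {}

def cached_hash(url: str, fetch: Callable[[str], str]) -> Optional[str]:
    """Memoise per-URL hashes, treating exceptions as a cached None."""
    if not url:
        return None
    if url in _phash_cache:  # hit: includes previously failed URLs
        return _phash_cache[url]
    try:
        result: Optional[str] = fetch(url)
    except Exception:
        result = None  # negative cache: don't retry this URL this process
    _phash_cache[url] = result
    return result
```

Caching the None is the point: without it, every repeated search would re-download (and re-fail on) the same broken image URL.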


@@ -1,34 +1,88 @@
"""Main search + results page."""
from __future__ import annotations
import logging
import os
from pathlib import Path
import streamlit as st
from circuitforge_core.config import load_env
from app.db.store import Store
from app.platforms import SearchFilters
from app.platforms.ebay.auth import EbayTokenManager
from app.platforms.ebay.adapter import EbayAdapter
from app.platforms import PlatformAdapter, SearchFilters
from app.trust import TrustScorer
from app.ui.components.filters import build_filter_options, render_filter_sidebar, FilterState
from app.ui.components.listing_row import render_listing_row
from app.ui.components.easter_eggs import (
inject_steal_css, check_snipe_mode, render_snipe_mode_banner,
auction_hours_remaining,
)
log = logging.getLogger(__name__)
load_env(Path(".env"))
_DB_PATH = Path(os.environ.get("SNIPE_DB", "data/snipe.db"))
_DB_PATH.parent.mkdir(exist_ok=True)
def _get_adapter() -> EbayAdapter:
store = Store(_DB_PATH)
tokens = EbayTokenManager(
client_id=os.environ.get("EBAY_CLIENT_ID", ""),
client_secret=os.environ.get("EBAY_CLIENT_SECRET", ""),
env=os.environ.get("EBAY_ENV", "production"),
def _get_adapter(store: Store) -> PlatformAdapter:
"""Return the best available eBay adapter based on what's configured.
Auto-detects: if EBAY_CLIENT_ID + EBAY_CLIENT_SECRET are present, use the
full API adapter (all 5 trust signals). Otherwise fall back to the scraper
(3/5 signals, score_is_partial=True) and warn to logs so ops can see why
scores are partial without touching the UI.
"""
client_id = os.environ.get("EBAY_CLIENT_ID", "").strip()
client_secret = os.environ.get("EBAY_CLIENT_SECRET", "").strip()
if client_id and client_secret:
from app.platforms.ebay.adapter import EbayAdapter
from app.platforms.ebay.auth import EbayTokenManager
env = os.environ.get("EBAY_ENV", "production")
return EbayAdapter(EbayTokenManager(client_id, client_secret, env), store, env=env)
log.warning(
"EBAY_CLIENT_ID / EBAY_CLIENT_SECRET not set — "
"falling back to scraper (partial trust scores: account_age and "
"category_history signals unavailable). Set API credentials for full scoring."
)
return EbayAdapter(tokens, store, env=os.environ.get("EBAY_ENV", "production"))
from app.platforms.ebay.scraper import ScrapedEbayAdapter
return ScrapedEbayAdapter(store)
def _keyword_passes(title_lower: str, state: FilterState) -> bool:
"""Apply must_include / must_exclude keyword filtering against a lowercased title."""
include_raw = state.must_include.strip()
if include_raw:
mode = state.must_include_mode
if mode == "groups":
groups = [
[alt.strip().lower() for alt in g.split("|") if alt.strip()]
for g in include_raw.split(",")
if any(alt.strip() for alt in g.split("|"))
]
if not all(any(alt in title_lower for alt in group) for group in groups):
return False
elif mode == "any":
terms = [t.strip().lower() for t in include_raw.split(",") if t.strip()]
if not any(t in title_lower for t in terms):
return False
else: # "all"
terms = [t.strip().lower() for t in include_raw.split(",") if t.strip()]
if not all(t in title_lower for t in terms):
return False
exclude_raw = state.must_exclude.strip()
if exclude_raw:
terms = [t.strip().lower() for t in exclude_raw.split(",") if t.strip()]
if any(t in title_lower for t in terms):
return False
return True
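The "groups" mode is conjunctive normal form over the title: commas separate AND groups, pipes separate OR alternatives within a group. The parsing step as a standalone predicate (a sketch mirroring the diff's logic; `passes_groups` is an illustrative name):

```python
def passes_groups(title: str, expr: str) -> bool:
    """CNF keyword match: "a|b, c" means (a OR b) AND c, case-insensitive."""
    title_lower = title.lower()
    groups = [
        [alt.strip().lower() for alt in g.split("|") if alt.strip()]
        for g in expr.split(",")
        if any(alt.strip() for alt in g.split("|"))
    ]
    # Every group must be satisfied by at least one of its alternatives.
    return all(any(alt in title_lower for alt in group) for group in groups)
```

So `"founders|fe, 24gb"` matches "RTX 4090 Founders Edition 24GB" but not "RTX 4090 Gaming OC"; an empty expression yields no groups and passes everything.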
def _passes_filter(listing, trust, seller, state: FilterState) -> bool:
import json
if not _keyword_passes(listing.title.lower(), state):
return False
if trust and trust.composite_score < state.min_trust_score:
return False
if state.min_price and listing.price < state.min_price:
@@ -55,7 +109,12 @@ def _passes_filter(listing, trust, seller, state: FilterState) -> bool:
return True
def render() -> None:
def render(audio_enabled: bool = False) -> None:
inject_steal_css()
if check_snipe_mode():
render_snipe_mode_banner(audio_enabled)
st.title("🔍 Snipe — eBay Listing Search")
col_q, col_price, col_btn = st.columns([4, 2, 1])
@@ -68,9 +127,11 @@ def render() -> None:
st.info("Enter a search term and click Search.")
return
store = Store(_DB_PATH)
adapter = _get_adapter(store)
with st.spinner("Fetching listings..."):
try:
adapter = _get_adapter()
filters = SearchFilters(max_price=max_price if max_price > 0 else None)
listings = adapter.search(query, filters)
adapter.get_completed_sales(query) # warm the comps cache
@@ -82,7 +143,6 @@ def render() -> None:
st.warning("No listings found.")
return
store = Store(_DB_PATH)
for listing in listings:
store.save_listing(listing)
if listing.seller_platform_id:
@@ -97,14 +157,21 @@ def render() -> None:
opts = build_filter_options(pairs)
filter_state = render_filter_sidebar(pairs, opts)
sort_col = st.selectbox("Sort by", ["Trust score", "Price ↑", "Price ↓", "Newest"],
label_visibility="collapsed")
sort_col = st.selectbox(
"Sort by",
["Trust score", "Price ↑", "Price ↓", "Newest", "Ending soon"],
label_visibility="collapsed",
)
def sort_key(pair):
l, t = pair
if sort_col == "Trust score": return -(t.composite_score if t else 0)
if sort_col == "Price ↑": return l.price
if sort_col == "Price ↓": return -l.price
if sort_col == "Trust score": return -(t.composite_score if t else 0)
if sort_col == "Price ↑": return l.price
if sort_col == "Price ↓": return -l.price
if sort_col == "Ending soon":
h = auction_hours_remaining(l)
# Non-auctions sort to end; auctions sort ascending by time left
return (h if h is not None else float("inf"))
return l.listing_age_days
sorted_pairs = sorted(pairs, key=sort_key)
@@ -114,9 +181,14 @@ def render() -> None:
st.caption(f"{len(visible)} results · {hidden_count} hidden by filters")
import hashlib
query_hash = hashlib.md5(query.encode()).hexdigest()
comp = store.get_market_comp("ebay", query_hash)
market_price = comp.median_price if comp else None
for listing, trust in visible:
seller = store.get_seller("ebay", listing.seller_platform_id)
render_listing_row(listing, trust, seller)
render_listing_row(listing, trust, seller, market_price=market_price)
if hidden_count:
if st.button(f"Show {hidden_count} hidden results"):
@@ -124,4 +196,4 @@ def render() -> None:
for listing, trust in sorted_pairs:
if (listing.platform, listing.platform_listing_id) not in visible_ids:
seller = store.get_seller("ebay", listing.seller_platform_id)
render_listing_row(listing, trust, seller)
render_listing_row(listing, trust, seller, market_price=market_price)


@@ -0,0 +1,219 @@
"""Easter egg features for Snipe.
Three features:
1. Konami code Snipe Mode — JS detector sets ?snipe_mode=1 URL param,
Streamlit detects it on rerun. Audio is synthesised client-side via Web
Audio API (no bundled file; local-first friendly). Disabled by default
for accessibility / autoplay-policy reasons; requires explicit sidebar opt-in.
2. The Steal shimmer — a listing with trust ≥ 90, price 15–30 % below market,
and no suspicious_price flag gets a gold shimmer banner.
3. Auction de-emphasis — auctions with > 1 h remaining show a soft notice
because live prices are misleading until the final minutes.
"""
from __future__ import annotations
import json
from datetime import datetime, timezone
from typing import Optional
import streamlit as st
from app.db.models import Listing, TrustScore
# ---------------------------------------------------------------------------
# 1. Konami → Snipe Mode
# ---------------------------------------------------------------------------
_KONAMI_JS = """
<script>
(function () {
const SEQ = [38,38,40,40,37,39,37,39,66,65];
let idx = 0;
document.addEventListener('keydown', function (e) {
if (e.keyCode === SEQ[idx]) {
idx++;
if (idx === SEQ.length) {
idx = 0;
const url = new URL(window.location.href);
url.searchParams.set('snipe_mode', '1');
window.location.href = url.toString();
}
} else {
idx = (e.keyCode === SEQ[0]) ? 1 : 0;
}
});
})();
</script>
"""
_SNIPE_AUDIO_JS = """
<script>
(function () {
if (window.__snipeAudioPlayed) return;
window.__snipeAudioPlayed = true;
try {
const ctx = new (window.AudioContext || window.webkitAudioContext)();
// Short "sniper scope click" — high sine blip followed by a lower resonant hit
function blip(freq, start, dur, gain) {
const osc = ctx.createOscillator();
const env = ctx.createGain();
osc.connect(env); env.connect(ctx.destination);
osc.type = 'sine'; osc.frequency.setValueAtTime(freq, ctx.currentTime + start);
env.gain.setValueAtTime(0, ctx.currentTime + start);
env.gain.linearRampToValueAtTime(gain, ctx.currentTime + start + 0.01);
env.gain.exponentialRampToValueAtTime(0.0001, ctx.currentTime + start + dur);
osc.start(ctx.currentTime + start);
osc.stop(ctx.currentTime + start + dur + 0.05);
}
blip(880, 0.00, 0.08, 0.3);
blip(440, 0.10, 0.15, 0.2);
blip(220, 0.20, 0.25, 0.15);
} catch (e) { /* AudioContext blocked — fail silently */ }
})();
</script>
"""
_SNIPE_BANNER_CSS = """
<style>
@keyframes snipe-scan {
0% { background-position: -200% center; }
100% { background-position: 200% center; }
}
.snipe-mode-banner {
background: linear-gradient(
90deg,
#0d1117 0%, #0d1117 40%,
#39ff14 50%,
#0d1117 60%, #0d1117 100%
);
background-size: 200% auto;
animation: snipe-scan 2s linear infinite;
color: #39ff14;
font-family: monospace;
font-size: 13px;
letter-spacing: 0.15em;
padding: 6px 16px;
border-radius: 4px;
margin-bottom: 10px;
text-align: center;
text-shadow: 0 0 8px #39ff14;
}
</style>
"""
def inject_konami_detector() -> None:
"""Inject the JS Konami sequence detector into the page (once per load)."""
st.components.v1.html(_KONAMI_JS, height=0)
def check_snipe_mode() -> bool:
"""Return True if ?snipe_mode=1 is present in the URL query params."""
return st.query_params.get("snipe_mode", "") == "1"
def render_snipe_mode_banner(audio_enabled: bool) -> None:
"""Render the Snipe Mode activation banner and optionally play the audio cue."""
st.markdown(_SNIPE_BANNER_CSS, unsafe_allow_html=True)
st.markdown(
'<div class="snipe-mode-banner">🎯 SNIPE MODE ACTIVATED — TARGET ACQUIRED</div>',
unsafe_allow_html=True,
)
if audio_enabled:
st.components.v1.html(_SNIPE_AUDIO_JS, height=0)
# ---------------------------------------------------------------------------
# 2. The Steal shimmer
# ---------------------------------------------------------------------------
_STEAL_CSS = """
<style>
@keyframes steal-glow {
0% { box-shadow: 0 0 6px 1px rgba(255,215,0,0.5); }
50% { box-shadow: 0 0 18px 4px rgba(255,215,0,0.9); }
100% { box-shadow: 0 0 6px 1px rgba(255,215,0,0.5); }
}
.steal-banner {
background: linear-gradient(
90deg,
transparent 0%,
rgba(255,215,0,0.12) 30%,
rgba(255,215,0,0.35) 50%,
rgba(255,215,0,0.12) 70%,
transparent 100%
);
border: 1px solid rgba(255,215,0,0.6);
animation: steal-glow 2.2s ease-in-out infinite;
border-radius: 6px;
padding: 4px 12px;
font-size: 12px;
color: #ffd700;
font-weight: 600;
margin-bottom: 6px;
letter-spacing: 0.05em;
}
</style>
"""
def inject_steal_css() -> None:
"""Inject the steal-shimmer CSS (idempotent — Streamlit deduplicates)."""
st.markdown(_STEAL_CSS, unsafe_allow_html=True)
def is_steal(listing: Listing, trust: Optional[TrustScore], market_price: Optional[float]) -> bool:
"""Return True when this listing qualifies as 'The Steal'.
Criteria (all must hold):
- trust composite ≥ 90
- no suspicious_price flag
- price is 15–30 % below the market median
(deeper discounts are suspicious, not steals)
"""
if trust is None or market_price is None or market_price <= 0:
return False
if trust.composite_score < 90:
return False
flags = json.loads(trust.red_flags_json or "[]")
if "suspicious_price" in flags:
return False
discount = (market_price - listing.price) / market_price
return 0.15 <= discount <= 0.30
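Concretely: against a $1,000 market median, a $799 listing sits at a 20.1 % discount and falls inside the steal window, while $650 (35 % off) is excluded as suspicious rather than a bargain. A quick check of the window with hypothetical numbers:

```python
def discount(market: float, price: float) -> float:
    """Fractional discount of price relative to the market median."""
    return (market - price) / market

# Inside the 15-30 % window: qualifies (given trust >= 90 and no flags)
assert 0.15 <= discount(1000.0, 799.0) <= 0.30
# 35 % below market: too deep, filtered out as suspicious rather than a steal
assert not (0.15 <= discount(1000.0, 650.0) <= 0.30)
```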
def render_steal_banner() -> None:
"""Render the gold shimmer steal banner above a listing row."""
st.markdown(
'<div class="steal-banner">✦ THE STEAL — significantly below market, high trust</div>',
unsafe_allow_html=True,
)
# ---------------------------------------------------------------------------
# 3. Auction de-emphasis
# ---------------------------------------------------------------------------
def auction_hours_remaining(listing: Listing) -> Optional[float]:
"""Return hours remaining for an auction listing, or None for fixed-price / no data."""
if listing.buying_format != "auction" or not listing.ends_at:
return None
try:
ends = datetime.fromisoformat(listing.ends_at)
delta = ends - datetime.now(timezone.utc)
return max(delta.total_seconds() / 3600, 0.0)
except (ValueError, TypeError):
return None
def render_auction_notice(hours: float) -> None:
"""Render a soft de-emphasis notice for auctions with significant time remaining."""
if hours >= 1.0:
h = int(hours)
label = f"{h}h left" if h < 24 else f"{h // 24}d {h % 24}h left"
st.caption(
f"⏰ Auction · {label} — price not final until last few minutes"
)


@@ -33,6 +33,9 @@ class FilterState:
hide_marketing_photos: bool = False
hide_suspicious_price: bool = False
hide_duplicate_photos: bool = False
must_include: str = ""
must_include_mode: str = "all" # "all" | "any" | "groups"
must_exclude: str = ""
def build_filter_options(
@@ -78,6 +81,29 @@ def render_filter_sidebar(
st.sidebar.markdown("### Filters")
st.sidebar.caption(f"{len(pairs)} results")
st.sidebar.markdown("**Keywords**")
state.must_include_mode = st.sidebar.radio(
"Must include mode",
options=["all", "any", "groups"],
format_func=lambda m: {"all": "All (AND)", "any": "Any (OR)", "groups": "Groups (CNF)"}[m],
horizontal=True,
key="include_mode",
label_visibility="collapsed",
)
hint = {
"all": "Every term must appear",
"any": "At least one term must appear",
"groups": "Comma = AND · pipe | = OR within group",
}[state.must_include_mode]
state.must_include = st.sidebar.text_input(
"Must include", value="", placeholder="16gb, founders…" if state.must_include_mode != "groups" else "founders|fe, 16gb…",
key="must_include",
)
st.sidebar.caption(hint)
state.must_exclude = st.sidebar.text_input(
"Must exclude", value="", placeholder="broken, parts…", key="must_exclude",
)
state.min_trust_score = st.sidebar.slider("Min trust score", 0, 100, 0, key="min_trust")
st.sidebar.caption(
f"🟢 Safe (80+): {opts.score_bands['safe']} "


@@ -1,10 +1,15 @@
"""Render a single listing row with trust score, badges, and error states."""
from __future__ import annotations
import json
import streamlit as st
from app.db.models import Listing, TrustScore, Seller
from typing import Optional
import streamlit as st
from app.db.models import Listing, TrustScore, Seller
from app.ui.components.easter_eggs import (
is_steal, render_steal_banner, render_auction_notice, auction_hours_remaining,
)
def _score_colour(score: int) -> str:
if score >= 80: return "🟢"
@@ -29,7 +34,17 @@ def render_listing_row(
listing: Listing,
trust: Optional[TrustScore],
seller: Optional[Seller] = None,
market_price: Optional[float] = None,
) -> None:
# Easter egg: The Steal shimmer
if is_steal(listing, trust, market_price):
render_steal_banner()
# Auction de-emphasis (if > 1h remaining, price is not meaningful yet)
hours = auction_hours_remaining(listing)
if hours is not None:
render_auction_notice(hours)
col_img, col_info, col_score = st.columns([1, 5, 2])
with col_img:

compose.cloud.yml Normal file

@@ -0,0 +1,43 @@
# Snipe — cloud managed instance
# Project: snipe-cloud (docker compose -f compose.cloud.yml -p snipe-cloud ...)
# Web: http://127.0.0.1:8514 → menagerie.circuitforge.tech/snipe (via Caddy basicauth)
# API: internal only on snipe-cloud-net (no host port — only reachable via nginx)
#
# Usage: ./manage.sh cloud-start | cloud-stop | cloud-restart | cloud-status | cloud-logs | cloud-build
services:
api:
build:
context: ..
dockerfile: snipe/Dockerfile
restart: unless-stopped
env_file: .env
# No network_mode: host — isolated on snipe-cloud-net; nginx reaches it via 'api:8510'
volumes:
- /devl/snipe-cloud-data:/app/snipe/data
networks:
- snipe-cloud-net
web:
build:
context: .
dockerfile: docker/web/Dockerfile
args:
# Vite bakes these at image build time — changing them requires cloud-build.
# VITE_BASE_URL: app served under /snipe → asset URLs become /snipe/assets/...
# VITE_API_BASE: prepended to all /api/* fetch calls → /snipe/api/search
VITE_BASE_URL: /snipe
VITE_API_BASE: /snipe
restart: unless-stopped
ports:
- "8514:80" # Caddy (caddy-proxy container) reaches via host.docker.internal:8514
volumes:
- ./docker/web/nginx.cloud.conf:/etc/nginx/conf.d/default.conf:ro
networks:
- snipe-cloud-net
depends_on:
- api
networks:
snipe-cloud-net:
driver: bridge


@@ -1,8 +1,21 @@
services:
snipe:
api:
build:
context: ..
dockerfile: snipe/Dockerfile
network_mode: host
volumes:
- ../circuitforge-core:/app/circuitforge-core
- ./api:/app/snipe/api
- ./app:/app/snipe/app
- ./data:/app/snipe/data
- ./tests:/app/snipe/tests
environment:
- STREAMLIT_SERVER_RUN_ON_SAVE=true
- RELOAD=true
web:
build:
context: .
dockerfile: docker/web/Dockerfile
volumes:
- ./web/src:/app/src # not used at runtime but keeps override valid


@@ -1,10 +1,20 @@
services:
snipe:
api:
build:
context: ..
dockerfile: snipe/Dockerfile
ports:
- "8506:8506"
- "8510:8510"
env_file: .env
volumes:
- ./data:/app/snipe/data
web:
build:
context: .
dockerfile: docker/web/Dockerfile
ports:
- "8509:80"
restart: unless-stopped
depends_on:
- api

docker/web/Dockerfile Normal file

@@ -0,0 +1,22 @@
# Stage 1: build
FROM node:20-alpine AS build
WORKDIR /app
COPY web/package*.json ./
RUN npm ci --prefer-offline
COPY web/ ./
# Build-time env vars — Vite bakes these as static strings into the bundle.
# VITE_BASE_URL: URL prefix the app is served under (/ for dev, /snipe for cloud)
# VITE_API_BASE: prefix for all /api/* fetch calls (empty for dev, /snipe for cloud)
ARG VITE_BASE_URL=/
ARG VITE_API_BASE=
ENV VITE_BASE_URL=$VITE_BASE_URL
ENV VITE_API_BASE=$VITE_API_BASE
RUN npm run build
# Stage 2: serve
FROM nginx:alpine
COPY docker/web/nginx.conf /etc/nginx/conf.d/default.conf
COPY --from=build /app/dist /usr/share/nginx/html
EXPOSE 80


@@ -0,0 +1,34 @@
server {
listen 80;
server_name _;
root /usr/share/nginx/html;
index index.html;
# Proxy API requests to the FastAPI container via Docker bridge network.
# In cloud, 'api' resolves to the api service container — no host networking needed.
location /api/ {
proxy_pass http://api:8510;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $http_x_forwarded_proto;
}
# index.html — never cache; ensures clients always get the latest entry point
location = /index.html {
add_header Cache-Control "no-cache, no-store, must-revalidate";
try_files $uri /index.html;
}
# SPA fallback for all other routes
location / {
try_files $uri $uri/ /index.html;
}
# Cache static assets aggressively — content hash in filename guarantees freshness
location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff2?)$ {
expires 1y;
add_header Cache-Control "public, immutable";
}
}

docker/web/nginx.conf Normal file

@@ -0,0 +1,32 @@
server {
listen 80;
server_name _;
root /usr/share/nginx/html;
index index.html;
# Proxy API requests to the FastAPI backend container
location /api/ {
proxy_pass http://172.17.0.1:8510; # Docker host bridge IP — api runs with network_mode: host
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
}
# index.html — never cache; ensures clients always get the latest entry point
# after a deployment (chunks are content-hashed so they can be cached forever)
location = /index.html {
add_header Cache-Control "no-cache, no-store, must-revalidate";
try_files $uri /index.html;
}
# SPA fallback for all other routes
location / {
try_files $uri $uri/ /index.html;
}
# Cache static assets aggressively — content hash in filename guarantees freshness
location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff2?)$ {
expires 1y;
add_header Cache-Control "public, immutable";
}
}


@@ -2,11 +2,35 @@
set -euo pipefail
SERVICE=snipe
PORT=8506
PORT=8509 # Vue web UI (nginx) — dev
API_PORT=8510 # FastAPI — dev
CLOUD_PORT=8514 # Vue web UI (nginx) — cloud (menagerie.circuitforge.tech/snipe)
COMPOSE_FILE="compose.yml"
CLOUD_COMPOSE_FILE="compose.cloud.yml"
CLOUD_PROJECT="snipe-cloud"
usage() {
echo "Usage: $0 {start|stop|restart|status|logs|open|update}"
echo "Usage: $0 {start|stop|restart|status|logs|open|build|update|test"
echo " |cloud-start|cloud-stop|cloud-restart|cloud-status|cloud-logs|cloud-build}"
echo ""
echo "Dev:"
echo " start Build (if needed) and start all services"
echo " stop Stop and remove containers"
echo " restart Stop then start"
echo " status Show running containers"
echo " logs [svc] Follow logs (api | web — defaults to all)"
echo " open Open web UI in browser"
echo " build Rebuild Docker images without cache"
echo " update Pull latest images and rebuild"
echo " test Run pytest test suite in the api container"
echo ""
echo "Cloud (menagerie.circuitforge.tech/snipe):"
echo " cloud-start Build cloud images and start snipe-cloud project"
echo " cloud-stop Stop cloud instance"
echo " cloud-restart Stop then start cloud instance"
echo " cloud-status Show cloud containers"
echo " cloud-logs Follow cloud logs [api|web — defaults to all]"
echo " cloud-build Rebuild cloud images without cache (required after code changes)"
exit 1
}
@@ -16,29 +40,76 @@ shift || true
case "$cmd" in
start)
docker compose -f "$COMPOSE_FILE" up -d
echo "$SERVICE started on http://localhost:$PORT"
echo "$SERVICE started — web: http://localhost:$PORT api: http://localhost:$API_PORT"
;;
stop)
docker compose -f "$COMPOSE_FILE" down
docker compose -f "$COMPOSE_FILE" down --remove-orphans
;;
restart)
docker compose -f "$COMPOSE_FILE" down
docker compose -f "$COMPOSE_FILE" down --remove-orphans
docker compose -f "$COMPOSE_FILE" up -d
echo "$SERVICE restarted on http://localhost:$PORT"
echo "$SERVICE restarted — http://localhost:$PORT"
;;
status)
docker compose -f "$COMPOSE_FILE" ps
;;
logs)
docker compose -f "$COMPOSE_FILE" logs -f "${@:-$SERVICE}"
# logs [api|web] — default: all services
target="${1:-}"
if [[ -n "$target" ]]; then
docker compose -f "$COMPOSE_FILE" logs -f "$target"
else
docker compose -f "$COMPOSE_FILE" logs -f
fi
;;
open)
xdg-open "http://localhost:$PORT" 2>/dev/null || open "http://localhost:$PORT"
xdg-open "http://localhost:$PORT" 2>/dev/null || open "http://localhost:$PORT" 2>/dev/null || \
echo "Open http://localhost:$PORT in your browser"
;;
build)
docker compose -f "$COMPOSE_FILE" build --no-cache
echo "Build complete."
;;
update)
docker compose -f "$COMPOSE_FILE" pull
docker compose -f "$COMPOSE_FILE" up -d --build
echo "$SERVICE updated — http://localhost:$PORT"
;;
test)
echo "Running test suite..."
docker compose -f "$COMPOSE_FILE" exec api \
conda run -n job-seeker python -m pytest /app/snipe/tests/ -v "${@}"
;;
# ── Cloud commands ────────────────────────────────────────────────────────
cloud-start)
docker compose -f "$CLOUD_COMPOSE_FILE" -p "$CLOUD_PROJECT" up -d --build
echo "$SERVICE cloud started — https://menagerie.circuitforge.tech/snipe"
;;
cloud-stop)
docker compose -p "$CLOUD_PROJECT" down --remove-orphans
;;
cloud-restart)
docker compose -p "$CLOUD_PROJECT" down --remove-orphans
docker compose -f "$CLOUD_COMPOSE_FILE" -p "$CLOUD_PROJECT" up -d --build
echo "$SERVICE cloud restarted — https://menagerie.circuitforge.tech/snipe"
;;
cloud-status)
docker compose -p "$CLOUD_PROJECT" ps
;;
cloud-logs)
target="${1:-}"
if [[ -n "$target" ]]; then
docker compose -p "$CLOUD_PROJECT" logs -f "$target"
else
docker compose -p "$CLOUD_PROJECT" logs -f
fi
;;
cloud-build)
docker compose -f "$CLOUD_COMPOSE_FILE" -p "$CLOUD_PROJECT" build --no-cache
echo "Cloud build complete. Run './manage.sh cloud-restart' to deploy."
;;
*)
usage
;;


@@ -14,11 +14,18 @@ dependencies = [
"imagehash>=4.3",
"Pillow>=10.0",
"python-dotenv>=1.0",
"beautifulsoup4>=4.12",
"lxml>=5.0",
"fastapi>=0.111",
"uvicorn[standard]>=0.29",
"playwright>=1.44",
"playwright-stealth>=1.0",
"cryptography>=42.0",
]
[tool.setuptools.packages.find]
where = ["."]
include = ["app*"]
include = ["app*", "api*"]
[tool.pytest.ini_options]
testpaths = ["tests"]


@@ -15,5 +15,16 @@ if not wizard.is_configured():
wizard.run()
st.stop()
from app.ui.components.easter_eggs import inject_konami_detector
inject_konami_detector()
with st.sidebar:
st.divider()
audio_enabled = st.checkbox(
"🔊 Enable audio easter egg",
value=False,
help="Plays a synthesised sound on Konami code activation. Off by default.",
)
from app.ui.Search import render
render()
render(audio_enabled=audio_enabled)


@@ -1,4 +1,5 @@
import pytest
from datetime import datetime, timedelta, timezone
from pathlib import Path
from app.db.store import Store
from app.db.models import Listing, Seller, TrustScore, MarketComp
@@ -57,7 +58,7 @@ def test_save_and_get_market_comp(store):
query_hash="abc123",
median_price=1050.0,
sample_count=12,
expires_at="2026-03-26T00:00:00",
expires_at=(datetime.now(timezone.utc) + timedelta(hours=6)).isoformat(),
)
store.save_market_comp(comp)
result = store.get_market_comp("ebay", "abc123")


@@ -0,0 +1,292 @@
"""Tests for the scraper-based eBay adapter.
Uses a minimal HTML fixture mirroring eBay's current s-card markup.
No HTTP requests are made — all tests operate on the pure parsing functions.
"""
import pytest
from datetime import timedelta
from app.platforms.ebay.scraper import (
scrape_listings,
scrape_sellers,
_parse_price,
_parse_time_left,
_extract_seller_from_card,
)
from bs4 import BeautifulSoup
# ---------------------------------------------------------------------------
# Minimal eBay search results HTML fixture (li.s-card schema)
# ---------------------------------------------------------------------------
_EBAY_HTML = """
<html><body>
<ul class="srp-results">
<!-- Promo item: no data-listingid — must be skipped -->
<li class="s-card">
<div class="s-card__title">Shop on eBay</div>
</li>
<!-- Real listing 1: established seller, used, fixed price -->
<!-- Includes eBay's hidden accessibility span to test title stripping -->
<li class="s-card" data-listingid="123456789">
<div class="s-card__title">RTX 4090 Founders Edition GPU<span class="clipped">Opens in a new window or tab</span></div>
<a class="s-card__link" href="https://www.ebay.com/itm/123456789?somequery=1"></a>
<span class="s-card__price">$950.00</span>
<div class="s-card__subtitle">Used · Free shipping</div>
<img class="s-card__image" src="https://i.ebayimg.com/thumbs/1.jpg"/>
<span class="su-styled-text">techguy</span>
<span class="su-styled-text">99.1% positive (1,234)</span>
</li>
<!-- Real listing 2: price range, new, data-src photo -->
<li class="s-card" data-listingid="987654321">
<div class="s-card__title">RTX 4090 Gaming OC 24GB</div>
<a class="s-card__link" href="https://www.ebay.com/itm/987654321"></a>
<span class="s-card__price">$1,100.00 to $1,200.00</span>
<div class="s-card__subtitle">New · Free shipping</div>
<img class="s-card__image" data-src="https://i.ebayimg.com/thumbs/2.jpg" src=""/>
<span class="su-styled-text">gpu_warehouse</span>
<span class="su-styled-text">98.7% positive (450)</span>
</li>
<!-- Real listing 3: new account, suspicious price -->
<li class="s-card" data-listingid="555000111">
<div class="s-card__title">RTX 4090 BNIB Sealed</div>
<a class="s-card__link" href="https://www.ebay.com/itm/555000111"></a>
<span class="s-card__price">$499.00</span>
<div class="s-card__subtitle">New</div>
<img class="s-card__image" src="https://i.ebayimg.com/thumbs/3.jpg"/>
<span class="su-styled-text">new_user_2024</span>
<span class="su-styled-text">100.0% positive (2)</span>
</li>
</ul>
</body></html>
"""
_AUCTION_HTML = """
<html><body>
<ul class="srp-results">
<li class="s-card" data-listingid="777000999">
<div class="s-card__title">Vintage Leica M6 Camera Body</div>
<a class="s-card__link" href="https://www.ebay.com/itm/777000999"></a>
<span class="s-card__price">$450.00</span>
<div class="s-card__subtitle">Used</div>
<img class="s-card__image" src="https://i.ebayimg.com/thumbs/cam.jpg"/>
<span class="su-styled-text">camera_dealer</span>
<span class="su-styled-text">97.5% positive (800)</span>
<span class="su-styled-text">2h 30m left</span>
</li>
</ul>
</body></html>
"""
# ---------------------------------------------------------------------------
# _parse_price
# ---------------------------------------------------------------------------
class TestParsePrice:
def test_simple_price(self):
assert _parse_price("$950.00") == 950.0
def test_price_range_takes_lower_bound(self):
assert _parse_price("$900.00 to $1,050.00") == 900.0
def test_price_with_commas(self):
assert _parse_price("$1,100.00") == 1100.0
def test_price_per_ea(self):
assert _parse_price("$1,234.56/ea") == 1234.56
def test_empty_returns_zero(self):
assert _parse_price("") == 0.0
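The behaviours pinned down above (lower bound of price ranges, comma stripping, "/ea" suffix tolerance) fit a single regex. A sketch consistent with these tests, not necessarily the scraper's actual implementation:

```python
import re

_PRICE_RE = re.compile(r"\$([\d,]+(?:\.\d{2})?)")

def parse_price(text: str) -> float:
    """Parse an eBay price string; for ranges, take the first (lower) price."""
    m = _PRICE_RE.search(text or "")
    if not m:
        return 0.0  # missing/unparseable price sentinel, per the tests
    return float(m.group(1).replace(",", ""))
```

`re.search` naturally picks the first `$…` in "$900.00 to $1,050.00", which is why the range tests assert the lower bound.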
# ---------------------------------------------------------------------------
# _extract_seller_from_card
# ---------------------------------------------------------------------------
class TestExtractSellerFromCard:
def _card(self, html: str):
return BeautifulSoup(html, "lxml").select_one("li.s-card")
def test_standard_card(self):
card = self._card("""
<li class="s-card" data-listingid="1">
<span class="su-styled-text">techguy</span>
<span class="su-styled-text">99.1% positive (1,234)</span>
</li>""")
username, count, ratio = _extract_seller_from_card(card)
assert username == "techguy"
assert count == 1234
assert ratio == pytest.approx(0.991, abs=0.001)
def test_new_account(self):
card = self._card("""
<li class="s-card" data-listingid="2">
<span class="su-styled-text">new_user_2024</span>
<span class="su-styled-text">100.0% positive (2)</span>
</li>""")
username, count, ratio = _extract_seller_from_card(card)
assert username == "new_user_2024"
assert count == 2
assert ratio == pytest.approx(1.0, abs=0.001)
def test_no_feedback_span_returns_empty(self):
card = self._card("""
<li class="s-card" data-listingid="3">
<span class="su-styled-text">some_seller</span>
</li>""")
username, count, ratio = _extract_seller_from_card(card)
assert username == ""
assert count == 0
assert ratio == 0.0
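The seller extraction walks the `su-styled-text` spans and treats the span preceding the "N% positive (M)" pattern as the username. The text-level parsing in isolation (a sketch; `parse_feedback` is a hypothetical helper operating on span texts rather than BeautifulSoup nodes):

```python
import re
from typing import Tuple

_FEEDBACK_RE = re.compile(r"([\d.]+)%\s+positive\s+\(([\d,]+)\)")

def parse_feedback(span_texts: list[str]) -> Tuple[str, int, float]:
    """Return (username, feedback_count, feedback_ratio) from span texts in order."""
    for i, text in enumerate(span_texts):
        m = _FEEDBACK_RE.search(text)
        if m:
            username = span_texts[i - 1] if i > 0 else ""
            count = int(m.group(2).replace(",", ""))
            ratio = float(m.group(1)) / 100.0
            return username, count, ratio
    return "", 0, 0.0  # no feedback span: empty result, per the tests
```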
# ---------------------------------------------------------------------------
# _parse_time_left
# ---------------------------------------------------------------------------
class TestParseTimeLeft:
def test_days_and_hours(self):
assert _parse_time_left("3d 14h left") == timedelta(days=3, hours=14)
def test_hours_and_minutes(self):
assert _parse_time_left("14h 23m left") == timedelta(hours=14, minutes=23)
def test_minutes_and_seconds(self):
assert _parse_time_left("23m 45s left") == timedelta(minutes=23, seconds=45)
def test_days_only(self):
assert _parse_time_left("2d left") == timedelta(days=2)
def test_no_match_returns_none(self):
assert _parse_time_left("Buy It Now") is None
def test_empty_returns_none(self):
assert _parse_time_left("") is None
def test_all_zeros_returns_none(self):
assert _parse_time_left("0d 0h 0m 0s left") is None
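The tests above pin down the contract: any combination of d/h/m/s before "left", None for non-auction text, and None when everything is zero. A regex-based implementation consistent with that contract (a sketch, not necessarily the module's exact code):

```python
import re
from datetime import timedelta
from typing import Optional

_TIME_LEFT_RE = re.compile(
    r"(?:(\d+)d)?\s*(?:(\d+)h)?\s*(?:(\d+)m)?\s*(?:(\d+)s)?\s*left"
)

def parse_time_left(text: str) -> Optional[timedelta]:
    """Parse eBay "3d 14h left" strings into a timedelta, or None."""
    m = _TIME_LEFT_RE.search(text or "")
    if not m or not any(m.groups()):
        return None  # no "left" marker, or "left" with no numbers at all
    d, h, mi, s = (int(g) if g else 0 for g in m.groups())
    delta = timedelta(days=d, hours=h, minutes=mi, seconds=s)
    return delta if delta.total_seconds() > 0 else None  # "0d 0h 0m 0s" -> None
```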
# ---------------------------------------------------------------------------
# scrape_listings
# ---------------------------------------------------------------------------
class TestScrapeListings:
def test_skips_promo_without_listingid(self):
listings = scrape_listings(_EBAY_HTML)
titles = [l.title for l in listings]
assert "Shop on eBay" not in titles
def test_strips_ebay_accessibility_text_from_title(self):
"""eBay injects a hidden 'Opens in a new window or tab' span into title links
for screen readers. get_text() is CSS-blind, so we must strip it explicitly."""
listings = scrape_listings(_EBAY_HTML)
for listing in listings:
assert "Opens in a new window or tab" not in listing.title
# Verify the actual title content is preserved
assert listings[0].title == "RTX 4090 Founders Edition GPU"
def test_parses_three_real_listings(self):
assert len(scrape_listings(_EBAY_HTML)) == 3
def test_platform_listing_id_from_data_attribute(self):
listings = scrape_listings(_EBAY_HTML)
assert listings[0].platform_listing_id == "123456789"
assert listings[1].platform_listing_id == "987654321"
assert listings[2].platform_listing_id == "555000111"
def test_url_strips_query_string(self):
listings = scrape_listings(_EBAY_HTML)
assert "?" not in listings[0].url
assert listings[0].url == "https://www.ebay.com/itm/123456789"
def test_price_range_takes_lower(self):
assert scrape_listings(_EBAY_HTML)[1].price == 1100.0
def test_condition_extracted_and_lowercased(self):
listings = scrape_listings(_EBAY_HTML)
assert listings[0].condition == "used"
assert listings[1].condition == "new"
def test_photo_prefers_data_src_over_src(self):
# Listing 2 has data-src set, src is empty
assert scrape_listings(_EBAY_HTML)[1].photo_urls == ["https://i.ebayimg.com/thumbs/2.jpg"]
def test_photo_falls_back_to_src(self):
assert scrape_listings(_EBAY_HTML)[0].photo_urls == ["https://i.ebayimg.com/thumbs/1.jpg"]
def test_seller_platform_id_from_card(self):
listings = scrape_listings(_EBAY_HTML)
assert listings[0].seller_platform_id == "techguy"
assert listings[2].seller_platform_id == "new_user_2024"
def test_platform_is_ebay(self):
assert all(l.platform == "ebay" for l in scrape_listings(_EBAY_HTML))
def test_currency_is_usd(self):
assert all(l.currency == "USD" for l in scrape_listings(_EBAY_HTML))
def test_fixed_price_no_ends_at(self):
listings = scrape_listings(_EBAY_HTML)
assert all(l.ends_at is None for l in listings)
assert all(l.buying_format == "fixed_price" for l in listings)
def test_auction_sets_buying_format_and_ends_at(self):
listings = scrape_listings(_AUCTION_HTML)
assert len(listings) == 1
assert listings[0].buying_format == "auction"
assert listings[0].ends_at is not None
def test_empty_html_returns_empty_list(self):
assert scrape_listings("<html><body></body></html>") == []
# ---------------------------------------------------------------------------
# scrape_sellers
# ---------------------------------------------------------------------------
class TestScrapeSellers:
    def test_extracts_three_sellers(self):
        assert len(scrape_sellers(_EBAY_HTML)) == 3

    def test_feedback_count_and_ratio(self):
        sellers = scrape_sellers(_EBAY_HTML)
        assert sellers["techguy"].feedback_count == 1234
        assert sellers["techguy"].feedback_ratio == pytest.approx(0.991, abs=0.001)

    def test_deduplicates_sellers(self):
        # Same seller appearing in two cards should only produce one Seller object
        html = """<html><body><ul>
            <li class="s-card" data-listingid="1">
              <div class="s-card__title">Item A</div>
              <a class="s-card__link" href="https://www.ebay.com/itm/1"></a>
              <span class="su-styled-text">repeatguy</span>
              <span class="su-styled-text">99.0% positive (500)</span>
            </li>
            <li class="s-card" data-listingid="2">
              <div class="s-card__title">Item B</div>
              <a class="s-card__link" href="https://www.ebay.com/itm/2"></a>
              <span class="su-styled-text">repeatguy</span>
              <span class="su-styled-text">99.0% positive (500)</span>
            </li>
        </ul></body></html>"""
        sellers = scrape_sellers(html)
        assert len(sellers) == 1
        assert "repeatguy" in sellers

    def test_account_age_is_none(self):
        """account_age_days is None from scraper tier — causes score_is_partial=True."""
        sellers = scrape_sellers(_EBAY_HTML)
        assert all(s.account_age_days is None for s in sellers.values())

    def test_category_history_always_empty(self):
        """category_history_json is '{}' from scraper — causes score_is_partial=True."""
        sellers = scrape_sellers(_EBAY_HTML)
        assert all(s.category_history_json == "{}" for s in sellers.values())

    def test_platform_is_ebay(self):
        sellers = scrape_sellers(_EBAY_HTML)
        assert all(s.platform == "ebay" for s in sellers.values())


@@ -50,3 +50,46 @@ def test_partial_score_flagged_when_signals_missing():
    }
    result = agg.aggregate(scores, photo_hash_duplicate=False, seller=None)
    assert result.score_is_partial is True


def test_suspicious_price_not_flagged_when_market_data_absent():
    """None price_vs_market (no market comp) must NOT trigger suspicious_price.

    Regression guard: clean[] replaces None with 0, so naive `clean[...] == 0`
    would fire even when the signal is simply unavailable.
    """
    agg = Aggregator()
    scores = {
        "account_age": 15, "feedback_count": 15,
        "feedback_ratio": 20, "price_vs_market": None,  # no market data
        "category_history": 0,
    }
    result = agg.aggregate(scores, photo_hash_duplicate=False, seller=None)
    assert "suspicious_price" not in result.red_flags_json


def test_suspicious_price_flagged_when_price_genuinely_low():
    """price_vs_market == 0 (explicitly, meaning >50% below median) → flag fires."""
    agg = Aggregator()
    scores = {
        "account_age": 15, "feedback_count": 15,
        "feedback_ratio": 20, "price_vs_market": 0,  # price is scam-level low
        "category_history": 0,
    }
    result = agg.aggregate(scores, photo_hash_duplicate=False, seller=None)
    assert "suspicious_price" in result.red_flags_json


def test_new_account_not_flagged_when_age_absent():
    """account_age_days=None (scraper tier) must NOT trigger new_account or account_under_30_days."""
    agg = Aggregator()
    scores = {k: 10 for k in ["account_age", "feedback_count",
                              "feedback_ratio", "price_vs_market", "category_history"]}
    scraper_seller = Seller(
        platform="ebay", platform_seller_id="u", username="u",
        account_age_days=None,  # not fetched at scraper tier
        feedback_count=50, feedback_ratio=0.99, category_history_json="{}",
    )
    result = agg.aggregate(scores, photo_hash_duplicate=False, seller=scraper_seller)
    assert "new_account" not in result.red_flags_json
    assert "account_under_30_days" not in result.red_flags_json


@@ -0,0 +1,122 @@
"""Tests for easter egg helpers (pure logic — no Streamlit calls)."""
from __future__ import annotations

import json
from datetime import datetime, timedelta, timezone

import pytest

from app.db.models import Listing, TrustScore
from app.ui.components.easter_eggs import is_steal, auction_hours_remaining


def _listing(**kwargs) -> Listing:
    defaults = dict(
        platform="ebay",
        platform_listing_id="1",
        title="Test",
        price=800.0,
        currency="USD",
        condition="used",
        seller_platform_id="seller1",
        url="https://ebay.com/itm/1",
        buying_format="fixed_price",
        ends_at=None,
    )
    defaults.update(kwargs)
    return Listing(**defaults)


def _trust(score: int, flags: list[str] | None = None) -> TrustScore:
    return TrustScore(
        listing_id=1,
        composite_score=score,
        account_age_score=20,
        feedback_count_score=20,
        feedback_ratio_score=20,
        price_vs_market_score=20,
        category_history_score=score - 80 if score >= 80 else 0,
        red_flags_json=json.dumps(flags or []),
    )
# ---------------------------------------------------------------------------
# is_steal
# ---------------------------------------------------------------------------
class TestIsSteal:
    def test_qualifies_when_high_trust_and_20_pct_below(self):
        listing = _listing(price=840.0)  # 16% below 1000
        trust = _trust(92)
        assert is_steal(listing, trust, market_price=1000.0) is True

    def test_fails_when_trust_below_90(self):
        listing = _listing(price=840.0)
        trust = _trust(89)
        assert is_steal(listing, trust, market_price=1000.0) is False

    def test_fails_when_discount_too_deep(self):
        # 35% below market — suspicious, not a steal
        listing = _listing(price=650.0)
        trust = _trust(95)
        assert is_steal(listing, trust, market_price=1000.0) is False

    def test_fails_when_discount_too_shallow(self):
        # 10% below market — not enough of a deal
        listing = _listing(price=900.0)
        trust = _trust(95)
        assert is_steal(listing, trust, market_price=1000.0) is False

    def test_fails_when_suspicious_price_flag(self):
        listing = _listing(price=840.0)
        trust = _trust(92, flags=["suspicious_price"])
        assert is_steal(listing, trust, market_price=1000.0) is False

    def test_fails_when_no_market_price(self):
        listing = _listing(price=840.0)
        trust = _trust(92)
        assert is_steal(listing, trust, market_price=None) is False

    def test_fails_when_no_trust(self):
        listing = _listing(price=840.0)
        assert is_steal(listing, None, market_price=1000.0) is False

    def test_boundary_15_pct(self):
        listing = _listing(price=850.0)  # exactly 15% below 1000
        trust = _trust(92)
        assert is_steal(listing, trust, market_price=1000.0) is True

    def test_boundary_30_pct(self):
        listing = _listing(price=700.0)  # exactly 30% below 1000
        trust = _trust(92)
        assert is_steal(listing, trust, market_price=1000.0) is True
# ---------------------------------------------------------------------------
# auction_hours_remaining
# ---------------------------------------------------------------------------
class TestAuctionHoursRemaining:
    def _auction_listing(self, hours_ahead: float) -> Listing:
        ends = (datetime.now(timezone.utc) + timedelta(hours=hours_ahead)).isoformat()
        return _listing(buying_format="auction", ends_at=ends)

    def test_returns_hours_for_active_auction(self):
        listing = self._auction_listing(3.0)
        h = auction_hours_remaining(listing)
        assert h is not None
        assert 2.9 < h < 3.1

    def test_returns_none_for_fixed_price(self):
        listing = _listing(buying_format="fixed_price")
        assert auction_hours_remaining(listing) is None

    def test_returns_none_when_no_ends_at(self):
        listing = _listing(buying_format="auction", ends_at=None)
        assert auction_hours_remaining(listing) is None

    def test_returns_zero_for_ended_auction(self):
        ends = (datetime.now(timezone.utc) - timedelta(hours=1)).isoformat()
        listing = _listing(buying_format="auction", ends_at=ends)
        h = auction_hours_remaining(listing)
        assert h == 0.0

20
web/index.html Normal file

@@ -0,0 +1,20 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<!-- Emoji favicon: target reticle — inline SVG to avoid a separate file -->
<link rel="icon" href="data:image/svg+xml,<svg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 100 100'><text y='.9em' font-size='90'>🎯</text></svg>" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Snipe</title>
<!-- Inline background prevents blank flash before CSS bundle loads -->
<!-- Matches --color-surface dark tactical theme from theme.css -->
<style>
html, body { margin: 0; background: #0d1117; min-height: 100vh; }
</style>
</head>
<body>
<!-- Mount target only — App.vue root must NOT use id="app". Gotcha #1. -->
<div id="app"></div>
<script type="module" src="/src/main.ts"></script>
</body>
</html>

4966
web/package-lock.json generated Normal file

File diff suppressed because it is too large

39
web/package.json Normal file

@@ -0,0 +1,39 @@
{
"name": "snipe-web",
"private": true,
"version": "0.0.0",
"type": "module",
"scripts": {
"dev": "vite",
"build": "vue-tsc && vite build",
"preview": "vite preview",
"test": "vitest run",
"test:watch": "vitest"
},
"dependencies": {
"@fontsource/atkinson-hyperlegible": "^5.2.8",
"@fontsource/fraunces": "^5.2.9",
"@fontsource/jetbrains-mono": "^5.2.8",
"@heroicons/vue": "^2.2.0",
"@vueuse/core": "^14.2.1",
"@vueuse/integrations": "^14.2.1",
"animejs": "^4.3.6",
"pinia": "^3.0.4",
"vue": "^3.5.25",
"vue-router": "^5.0.3"
},
"devDependencies": {
"@types/node": "^24.10.1",
"@unocss/preset-attributify": "^66.6.4",
"@unocss/preset-wind": "^66.6.4",
"@vitejs/plugin-vue": "^6.0.2",
"@vue/test-utils": "^2.4.6",
"@vue/tsconfig": "^0.8.1",
"jsdom": "^28.1.0",
"typescript": "~5.9.3",
"unocss": "^66.6.4",
"vite": "^7.3.1",
"vitest": "^4.0.18",
"vue-tsc": "^3.1.5"
}
}

94
web/src/App.vue Normal file

@@ -0,0 +1,94 @@
<template>
  <!-- Root uses .app-root class, NOT id="app"; index.html owns #app.
       Nested #app elements cause ambiguous CSS specificity. Gotcha #1. -->
  <div class="app-root" :class="{ 'rich-motion': motion.rich.value }">
    <!-- Skip-to-content link (screen reader / keyboard nav): must come
         before the nav so keyboard users can actually skip it -->
    <a href="#main-content" class="skip-link">Skip to main content</a>
    <AppNav />
    <main class="app-main" id="main-content" tabindex="-1">
      <RouterView />
    </main>
  </div>
</template>
<script setup lang="ts">
import { onMounted } from 'vue'
import { RouterView } from 'vue-router'
import { useMotion } from './composables/useMotion'
import { useSnipeMode } from './composables/useSnipeMode'
import { useKonamiCode } from './composables/useKonamiCode'
import AppNav from './components/AppNav.vue'
const motion = useMotion()
const { activate, restore } = useSnipeMode()
useKonamiCode(activate)
onMounted(() => {
restore() // re-apply snipe mode from localStorage on hard reload
})
</script>
<style>
/* Global resets — unscoped, applied once to document */
*, *::before, *::after {
box-sizing: border-box;
margin: 0;
padding: 0;
}
html {
font-family: var(--font-body, sans-serif);
color: var(--color-text, #e6edf3);
background: var(--color-surface, #0d1117);
overflow-x: clip; /* no BFC side effects. Gotcha #3. */
}
body {
min-height: 100dvh; /* dynamic viewport — mobile chrome-aware. Gotcha #13. */
overflow-x: hidden;
}
#app { min-height: 100dvh; }
/* Layout root — sidebar pushes content right on desktop */
.app-root {
display: flex;
min-height: 100dvh;
}
/* Main content area */
.app-main {
flex: 1;
min-width: 0; /* prevents flex blowout */
/* Desktop: offset by sidebar width */
margin-left: var(--sidebar-width, 220px);
}
/* Skip-to-content link — visible only on keyboard focus */
.skip-link {
position: absolute;
top: -999px;
left: var(--space-4);
background: var(--app-primary);
color: var(--color-text-inverse);
padding: var(--space-2) var(--space-4);
border-radius: var(--radius-md);
font-weight: 600;
z-index: 9999;
text-decoration: none;
transition: top 0ms;
}
.skip-link:focus {
top: var(--space-4);
}
/* Mobile: no sidebar margin, add bottom tab bar clearance */
@media (max-width: 1023px) {
.app-main {
margin-left: 0;
padding-bottom: calc(56px + env(safe-area-inset-bottom));
}
}
</style>

227
web/src/assets/theme.css Normal file

@@ -0,0 +1,227 @@
/* assets/theme.css: CENTRAL THEME FILE for Snipe
   Dark tactical theme: near-black surfaces, amber accent, trust-signal colours.
   ALL color/font/spacing tokens live here, nowhere else.
   Snipe Mode easter egg: activated by Konami code (cf-snipe-mode in localStorage).
*/
/* ── Snipe — dark tactical (default — always dark) ─ */
:root {
/* Brand — amber target reticle */
--app-primary: #f59e0b;
--app-primary-hover: #d97706;
--app-primary-light: rgba(245, 158, 11, 0.12);
/* Surfaces — near-black GitHub-dark inspired */
--color-surface: #0d1117;
--color-surface-2: #161b22;
--color-surface-raised: #1c2129;
/* Borders */
--color-border: #30363d;
--color-border-light: #21262d;
/* Text */
--color-text: #e6edf3;
--color-text-muted: #8b949e;
--color-text-inverse: #0d1117;
/* Trust signal colours */
--trust-high: #3fb950; /* composite_score >= 80 — green */
--trust-mid: #d29922; /* composite_score 50-79 — amber */
--trust-low: #f85149; /* composite_score < 50 — red */
/* Semantic */
--color-success: #3fb950;
--color-error: #f85149;
--color-warning: #d29922;
--color-info: #58a6ff;
/* Typography */
--font-display: 'Fraunces', Georgia, serif;
--font-body: 'Atkinson Hyperlegible', system-ui, sans-serif;
--font-mono: 'JetBrains Mono', 'Fira Code', monospace;
/* Spacing scale */
--space-1: 0.25rem;
--space-2: 0.5rem;
--space-3: 0.75rem;
--space-4: 1rem;
--space-6: 1.5rem;
--space-8: 2rem;
--space-12: 3rem;
--space-16: 4rem;
--space-24: 6rem;
/* Radii */
--radius-sm: 0.25rem;
--radius-md: 0.5rem;
--radius-lg: 1rem;
--radius-full: 9999px;
/* Shadows — dark base */
--shadow-sm: 0 1px 3px rgba(0, 0, 0, 0.4), 0 1px 2px rgba(0, 0, 0, 0.3);
--shadow-md: 0 4px 12px rgba(0, 0, 0, 0.5), 0 2px 4px rgba(0, 0, 0, 0.3);
--shadow-lg: 0 10px 30px rgba(0, 0, 0, 0.6), 0 4px 8px rgba(0, 0, 0, 0.3);
/* Transitions */
--transition: 200ms ease;
--transition-slow: 400ms ease;
/* Layout */
--sidebar-width: 220px;
}
/* ── Snipe Mode easter egg theme ─────────────────── */
/* Activated by Konami code; stored as 'cf-snipe-mode' in localStorage */
/* Applied: document.documentElement.dataset.snipeMode = 'active' */
[data-snipe-mode="active"] {
--app-primary: #ff6b35;
--app-primary-hover: #ff4500;
--app-primary-light: rgba(255, 107, 53, 0.15);
--color-surface: #050505;
--color-surface-2: #0a0a0a;
--color-surface-raised: #0f0f0f;
--color-border: #ff6b3530;
--color-border-light: #ff6b3518;
--color-text: #ff9970;
--color-text-muted: #ff6b3580;
/* Glow variants for snipe mode UI */
--snipe-glow-xs: rgba(255, 107, 53, 0.08);
--snipe-glow-sm: rgba(255, 107, 53, 0.15);
--snipe-glow-md: rgba(255, 107, 53, 0.4);
--shadow-sm: 0 1px 3px rgba(255, 107, 53, 0.08);
--shadow-md: 0 4px 12px rgba(255, 107, 53, 0.12);
--shadow-lg: 0 10px 30px rgba(255, 107, 53, 0.18);
}
/* ── Base resets ─────────────────────────────────── */
*, *::before, *::after { box-sizing: border-box; }
html {
font-family: var(--font-body);
color: var(--color-text);
background: var(--color-surface);
scroll-behavior: smooth;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
body { margin: 0; min-height: 100vh; }
h1, h2, h3, h4, h5, h6 {
font-family: var(--font-display);
color: var(--app-primary);
line-height: 1.2;
margin: 0;
}
/* Focus visible — keyboard nav — accessibility requirement */
:focus-visible {
outline: 2px solid var(--app-primary);
outline-offset: 3px;
border-radius: var(--radius-sm);
}
/* Respect reduced motion */
@media (prefers-reduced-motion: reduce) {
*, *::before, *::after {
animation-duration: 0.01ms !important;
transition-duration: 0.01ms !important;
}
}
/* ── Utility: screen reader only ────────────────── */
.sr-only {
position: absolute;
width: 1px;
height: 1px;
padding: 0;
margin: -1px;
overflow: hidden;
clip: rect(0, 0, 0, 0);
white-space: nowrap;
border: 0;
}
/* Steal shimmer animation
Applied to ListingCard when listing qualifies as a steal:
composite_score >= 80 AND price < marketPrice * 0.8
The shimmer sweeps left-to-right across the card border.
*/
@keyframes steal-shimmer {
0% { background-position: -200% center; }
100% { background-position: 200% center; }
}
.steal-card {
border: 1.5px solid transparent;
background-clip: padding-box;
position: relative;
}
.steal-card::before {
content: '';
position: absolute;
inset: -1.5px;
border-radius: inherit;
background: linear-gradient(
90deg,
var(--trust-high) 0%,
#7ee787 40%,
var(--app-primary) 60%,
var(--trust-high) 100%
);
background-size: 200% auto;
animation: steal-shimmer 2.4s linear infinite;
z-index: -1;
}
/* Auction de-emphasis
   Auctions with >1h remaining have fluid prices, so we de-emphasise
   the card and current price to avoid anchoring on a misleading figure.
*/
.listing-card--auction {
opacity: 0.72;
border-color: var(--color-border-light);
}
.listing-card--auction:hover {
opacity: 1;
}
.auction-price--live {
opacity: 0.55;
font-style: italic;
}
.auction-badge {
display: inline-flex;
align-items: center;
gap: var(--space-1);
padding: 2px var(--space-2);
border-radius: var(--radius-full);
background: var(--color-warning);
color: var(--color-text-inverse);
font-family: var(--font-mono);
font-size: 0.7rem;
font-weight: 700;
letter-spacing: 0.04em;
}
.fixed-price-badge {
display: inline-flex;
align-items: center;
gap: var(--space-1);
padding: 2px var(--space-2);
border-radius: var(--radius-full);
background: var(--color-surface-raised);
color: var(--color-text-muted);
border: 1px solid var(--color-border);
font-size: 0.7rem;
font-weight: 600;
}


@@ -0,0 +1,245 @@
<template>
<!-- Desktop: persistent sidebar (≥1024px) -->
<!-- Mobile: bottom tab bar (<1024px) -->
<nav class="app-sidebar" role="navigation" aria-label="Main navigation">
<!-- Brand -->
<div class="sidebar__brand">
<RouterLink to="/" class="sidebar__logo">
<span class="sidebar__target" aria-hidden="true">🎯</span>
<span class="sidebar__wordmark">Snipe</span>
</RouterLink>
</div>
<!-- Nav links -->
<ul class="sidebar__links" role="list">
<li v-for="link in navLinks" :key="link.to">
<RouterLink
:to="link.to"
class="sidebar__link"
active-class="sidebar__link--active"
:aria-label="link.label"
>
<component :is="link.icon" class="sidebar__icon" aria-hidden="true" />
<span class="sidebar__label">{{ link.label }}</span>
</RouterLink>
</li>
</ul>
<!-- Snipe mode exit (shows when active) -->
<div v-if="isSnipeMode" class="sidebar__snipe-exit">
<button class="sidebar__snipe-btn" @click="deactivate">
Exit snipe mode
</button>
</div>
<!-- Settings at bottom -->
<div class="sidebar__footer">
<RouterLink to="/settings" class="sidebar__link sidebar__link--footer" active-class="sidebar__link--active">
<Cog6ToothIcon class="sidebar__icon" aria-hidden="true" />
<span class="sidebar__label">Settings</span>
</RouterLink>
</div>
</nav>
<!-- Mobile bottom tab bar -->
<nav class="app-tabbar" role="navigation" aria-label="Main navigation">
<ul class="tabbar__links" role="list">
<li v-for="link in mobileLinks" :key="link.to">
<RouterLink
:to="link.to"
class="tabbar__link"
active-class="tabbar__link--active"
:aria-label="link.label"
>
<component :is="link.icon" class="tabbar__icon" aria-hidden="true" />
<span class="tabbar__label">{{ link.label }}</span>
</RouterLink>
</li>
</ul>
</nav>
</template>
<script setup lang="ts">
import { computed } from 'vue'
import { RouterLink } from 'vue-router'
import {
MagnifyingGlassIcon,
BookmarkIcon,
Cog6ToothIcon,
} from '@heroicons/vue/24/outline'
import { useSnipeMode } from '../composables/useSnipeMode'
const { active: isSnipeMode, deactivate } = useSnipeMode()
const navLinks = computed(() => [
{ to: '/', icon: MagnifyingGlassIcon, label: 'Search' },
{ to: '/saved', icon: BookmarkIcon, label: 'Saved' },
])
const mobileLinks = [
{ to: '/', icon: MagnifyingGlassIcon, label: 'Search' },
{ to: '/saved', icon: BookmarkIcon, label: 'Saved' },
{ to: '/settings', icon: Cog6ToothIcon, label: 'Settings' },
]
</script>
<style scoped>
/* ── Sidebar (desktop ≥1024px) ──────────────────────── */
.app-sidebar {
position: fixed;
top: 0;
left: 0;
bottom: 0;
width: var(--sidebar-width);
display: flex;
flex-direction: column;
background: var(--color-surface-2);
border-right: 1px solid var(--color-border);
z-index: 100;
padding: var(--space-4) 0;
}
.sidebar__brand {
padding: 0 var(--space-4) var(--space-4);
border-bottom: 1px solid var(--color-border-light);
margin-bottom: var(--space-3);
}
.sidebar__logo {
display: flex;
align-items: center;
gap: var(--space-2);
text-decoration: none;
}
.sidebar__target {
font-size: 1.5rem;
line-height: 1;
flex-shrink: 0;
}
.sidebar__wordmark {
font-family: var(--font-display);
font-weight: 700;
font-size: 1.35rem;
color: var(--app-primary);
letter-spacing: -0.01em;
}
.sidebar__links {
flex: 1;
list-style: none;
margin: 0;
padding: 0 var(--space-3);
display: flex;
flex-direction: column;
gap: var(--space-1);
overflow-y: auto;
}
.sidebar__link {
display: flex;
align-items: center;
gap: var(--space-3);
padding: var(--space-3) var(--space-4);
border-radius: var(--radius-md);
color: var(--color-text-muted);
text-decoration: none;
font-size: 0.875rem;
font-weight: 500;
min-height: 44px; /* WCAG 2.5.5 touch target */
transition:
background 150ms ease,
color 150ms ease;
}
.sidebar__link:hover {
background: var(--app-primary-light);
color: var(--app-primary);
}
.sidebar__link--active {
background: var(--app-primary-light);
color: var(--app-primary);
font-weight: 600;
}
.sidebar__icon {
width: 1.25rem;
height: 1.25rem;
flex-shrink: 0;
}
/* Snipe mode exit button */
.sidebar__snipe-exit {
padding: var(--space-3);
border-top: 1px solid var(--color-border-light);
}
.sidebar__snipe-btn {
width: 100%;
padding: var(--space-2) var(--space-3);
background: transparent;
border: 1px solid var(--app-primary);
border-radius: var(--radius-md);
color: var(--app-primary);
font-family: var(--font-mono);
font-size: 0.75rem;
cursor: pointer;
transition: background 150ms ease, color 150ms ease;
}
.sidebar__snipe-btn:hover {
background: var(--app-primary);
color: var(--color-surface);
}
.sidebar__footer {
padding: var(--space-3) var(--space-3) 0;
border-top: 1px solid var(--color-border-light);
}
/* ── Mobile tab bar (<1024px) ───────────────────────── */
.app-tabbar {
display: none;
position: fixed;
bottom: 0;
left: 0;
right: 0;
background: var(--color-surface-2);
border-top: 1px solid var(--color-border);
z-index: 100;
padding-bottom: env(safe-area-inset-bottom);
}
.tabbar__links {
display: flex;
list-style: none;
margin: 0;
padding: 0;
}
.tabbar__link {
flex: 1;
display: flex;
flex-direction: column;
align-items: center;
justify-content: center;
gap: 2px;
padding: var(--space-2) var(--space-1);
min-height: 56px; /* WCAG 2.5.5 touch target */
color: var(--color-text-muted);
text-decoration: none;
font-size: 10px;
transition: color 150ms ease;
}
.tabbar__link--active { color: var(--app-primary); }
.tabbar__icon { width: 1.5rem; height: 1.5rem; }
/* ── Responsive ─────────────────────────────────────── */
@media (max-width: 1023px) {
.app-sidebar { display: none; }
.app-tabbar { display: block; }
}
</style>


@@ -0,0 +1,518 @@
<template>
<article
class="listing-card"
:class="{
'steal-card': isSteal,
'listing-card--auction': isAuction && hoursRemaining !== null && hoursRemaining > 1,
}"
>
<!-- Thumbnail -->
<div class="card__thumb">
<img
v-if="listing.photo_urls.length && !imgFailed"
:src="listing.photo_urls[0]"
:alt="listing.title"
class="card__img"
loading="lazy"
@error="imgFailed = true"
/>
<div v-if="!listing.photo_urls.length || imgFailed" class="card__img-placeholder" aria-hidden="true">
📷
</div>
</div>
<!-- Main info -->
<div class="card__body">
<!-- Title row -->
<a :href="listing.url" target="_blank" rel="noopener noreferrer" class="card__title">
{{ listing.title }}
</a>
<!-- Format + condition badges -->
<div class="card__badges">
<span v-if="isAuction" class="auction-badge" :title="auctionEndsLabel">
{{ auctionCountdown }}
</span>
<span v-else class="fixed-price-badge">Fixed Price</span>
<span v-if="listing.buying_format === 'best_offer'" class="fixed-price-badge">Best Offer</span>
<span class="card__condition">{{ conditionLabel }}</span>
</div>
<!-- Seller info -->
<p class="card__seller" v-if="seller">
<span class="card__seller-name">{{ seller.username }}</span>
· {{ seller.feedback_count }} feedback
· {{ (seller.feedback_ratio * 100).toFixed(1) }}%
· {{ accountAgeLabel }}
</p>
<p class="card__seller" v-else>
<span class="card__seller-name">{{ listing.seller_platform_id }}</span>
<span class="card__seller-unavail">· seller data unavailable</span>
</p>
<!-- Red flag badges -->
<div v-if="redFlags.length" class="card__flags" role="list" aria-label="Risk flags">
<span
v-for="flag in redFlags"
:key="flag"
class="card__flag-badge"
role="listitem"
>
{{ flagLabel(flag) }}
</span>
</div>
<p v-if="pendingSignalNames.length" class="card__score-pending">
Updating: {{ pendingSignalNames.join(', ') }}
</p>
<p v-if="!trust" class="card__partial-warning">
Could not score this listing
</p>
</div>
<!-- Score + price column -->
<div class="card__score-col">
<!-- Trust score badge -->
<div
class="card__trust"
:class="[trustClass, { 'card__trust--partial': trust?.score_is_partial }]"
:title="trustBadgeTitle"
>
<span class="card__trust-num">{{ trust?.composite_score ?? '?' }}</span>
<span class="card__trust-label">Trust</span>
<!-- Signal dots: one per scoring signal, grey = pending -->
<span v-if="trust" class="card__signal-dots" aria-hidden="true">
<span
v-for="dot in signalDots"
:key="dot.key"
class="card__signal-dot"
:class="dot.pending ? 'card__signal-dot--pending' : 'card__signal-dot--present'"
:title="dot.label"
/>
</span>
<!-- Jump the queue: force enrichment for this seller -->
<button
v-if="pendingSignalNames.length"
class="card__enrich-btn"
:class="{ 'card__enrich-btn--spinning': enriching, 'card__enrich-btn--error': enrichError }"
:title="enrichError ? 'Enrichment failed — try again' : 'Refresh score now'"
:disabled="enriching"
@click.stop="onEnrich"
>{{ enrichError ? '✗' : '↻' }}</button>
</div>
<!-- Price -->
<div class="card__price-wrap">
<span
class="card__price"
:class="{ 'auction-price--live': isAuction && hoursRemaining !== null && hoursRemaining > 1 }"
>
{{ formattedPrice }}
</span>
<span v-if="marketPrice && isSteal" class="card__steal-label">
🎯 Steal
</span>
<span v-if="marketPrice" class="card__market-price" title="Median market price">
market ~{{ formattedMarket }}
</span>
</div>
</div>
</article>
</template>
<script setup lang="ts">
import { computed, ref } from 'vue'
import type { Listing, TrustScore, Seller } from '../stores/search'
import { useSearchStore } from '../stores/search'
const props = defineProps<{
listing: Listing
trust: TrustScore | null
seller: Seller | null
marketPrice: number | null
}>()
const store = useSearchStore()
const enriching = ref(false)
const enrichError = ref(false)
async function onEnrich() {
if (enriching.value) return
enriching.value = true
enrichError.value = false
try {
await store.enrichSeller(props.listing.seller_platform_id, props.listing.platform_listing_id)
} catch {
enrichError.value = true
} finally {
enriching.value = false
}
}
const imgFailed = ref(false)
// Computed helpers
const isAuction = computed(() => props.listing.buying_format === 'auction')
const hoursRemaining = computed<number | null>(() => {
if (!props.listing.ends_at) return null
const ms = new Date(props.listing.ends_at).getTime() - Date.now()
return ms > 0 ? ms / 3_600_000 : 0
})
const auctionCountdown = computed(() => {
const h = hoursRemaining.value
if (h === null) return 'Auction'
if (h <= 0) return 'Ended'
if (h < 1) return `${Math.round(h * 60)}m left`
if (h < 24) return `${h.toFixed(1)}h left`
return `${Math.floor(h / 24)}d left`
})
const auctionEndsLabel = computed(() =>
props.listing.ends_at
? `Ends ${new Date(props.listing.ends_at).toLocaleString()}`
: 'Auction',
)
const conditionLabel = computed(() => {
const map: Record<string, string> = {
new: 'New',
like_new: 'Like New',
very_good: 'Very Good',
good: 'Good',
acceptable: 'Acceptable',
for_parts: 'For Parts',
}
return map[props.listing.condition] ?? props.listing.condition
})
const accountAgeLabel = computed(() => {
if (!props.seller) return ''
const days = props.seller.account_age_days
if (days == null) return 'member'
if (days >= 365) return `${Math.floor(days / 365)}yr member`
return `${days}d member`
})
const redFlags = computed<string[]>(() => {
try {
return JSON.parse(props.trust?.red_flags_json ?? '[]')
} catch {
return []
}
})
function flagLabel(flag: string): string {
const labels: Record<string, string> = {
new_account: '✗ New account',
account_under_30_days: '⚠ Account <30d',
low_feedback_count: '⚠ Low feedback',
suspicious_price: '✗ Suspicious price',
duplicate_photo: '✗ Duplicate photo',
established_bad_actor: '✗ Bad actor',
marketing_photo: '✗ Marketing photo',
}
return labels[flag] ?? flag
}
const trustClass = computed(() => {
const s = props.trust?.composite_score
if (s == null) return 'card__trust--unknown'
if (s >= 80) return 'card__trust--high'
if (s >= 50) return 'card__trust--mid'
return 'card__trust--low'
})
interface SignalDot { key: string; label: string; pending: boolean }
const signalDots = computed<SignalDot[]>(() => {
const agePending = props.seller?.account_age_days == null
const catPending = !props.seller || props.seller.category_history_json === '{}'
const mktPending = props.marketPrice == null
return [
{ key: 'feedback_count', label: 'Feedback count', pending: false },
{ key: 'feedback_ratio', label: 'Feedback ratio', pending: false },
{ key: 'account_age', label: agePending ? 'Account age: pending' : 'Account age', pending: agePending },
{ key: 'price_vs_market', label: mktPending ? 'Market price: pending' : 'vs market price', pending: mktPending },
{ key: 'category_history', label: catPending ? 'Category history: pending' : 'Category history', pending: catPending },
]
})
const pendingSignalNames = computed<string[]>(() => {
if (!props.trust?.score_is_partial) return []
return signalDots.value.filter(d => d.pending).map(d => d.label.replace(': pending', ''))
})
const trustBadgeTitle = computed(() => {
const base = `Trust score: ${props.trust?.composite_score ?? '?'}/100`
if (!pendingSignalNames.value.length) return base
return `${base} · pending: ${pendingSignalNames.value.join(', ')} (search again to update)`
})
const isSteal = computed(() => {
const s = props.trust?.composite_score
if (!s || s < 80) return false
if (!props.marketPrice) return false
return props.listing.price < props.marketPrice * 0.8
})
const formattedPrice = computed(() => {
const sym = props.listing.currency === 'USD' ? '$' : props.listing.currency + ' '
return `${sym}${props.listing.price.toLocaleString('en-US', { minimumFractionDigits: 0, maximumFractionDigits: 2 })}`
})
const formattedMarket = computed(() => {
if (!props.marketPrice) return ''
return `$${props.marketPrice.toLocaleString('en-US', { maximumFractionDigits: 0 })}`
})
</script>
<style scoped>
.listing-card {
display: grid;
grid-template-columns: 80px 1fr auto;
gap: var(--space-3);
padding: var(--space-4);
background: var(--color-surface-2);
border: 1px solid var(--color-border);
border-radius: var(--radius-lg);
position: relative;
overflow: hidden;
transition: border-color 150ms ease, box-shadow 150ms ease;
}
.listing-card:hover {
border-color: var(--app-primary);
box-shadow: var(--shadow-md);
}
/* Thumbnail */
.card__thumb {
width: 80px;
height: 80px;
border-radius: var(--radius-md);
overflow: hidden;
flex-shrink: 0;
background: var(--color-surface-raised);
display: flex;
align-items: center;
justify-content: center;
}
.card__img {
width: 100%;
height: 100%;
object-fit: cover;
}
.card__img-placeholder {
font-size: 2rem;
opacity: 0.4;
}
/* Body */
.card__body {
min-width: 0;
display: flex;
flex-direction: column;
gap: var(--space-1);
}
.card__title {
font-weight: 600;
font-size: 0.9375rem;
color: var(--color-text);
text-decoration: none;
line-height: 1.4;
display: -webkit-box;
-webkit-line-clamp: 2;
-webkit-box-orient: vertical;
overflow: hidden;
}
.card__title:hover { color: var(--app-primary); text-decoration: underline; }
.card__badges {
display: flex;
flex-wrap: wrap;
gap: var(--space-1);
align-items: center;
}
.card__condition {
font-size: 0.75rem;
color: var(--color-text-muted);
padding: 2px var(--space-2);
border: 1px solid var(--color-border);
border-radius: var(--radius-full);
}
.card__seller {
font-size: 0.8125rem;
color: var(--color-text-muted);
margin: 0;
}
.card__seller-name { color: var(--color-text); font-weight: 500; }
.card__seller-unavail { font-style: italic; }
.card__flags {
display: flex;
flex-wrap: wrap;
gap: var(--space-1);
}
.card__flag-badge {
background: rgba(248, 81, 73, 0.15);
color: var(--color-error);
border: 1px solid rgba(248, 81, 73, 0.3);
padding: 1px var(--space-2);
border-radius: var(--radius-sm);
font-size: 0.6875rem;
font-weight: 600;
}
.card__partial-warning {
font-size: 0.75rem;
color: var(--color-warning);
margin: 0;
}
.card__score-pending {
font-size: 0.7rem;
color: var(--color-text-muted);
margin: 0;
font-style: italic;
}
/* Score + price column */
.card__score-col {
display: flex;
flex-direction: column;
align-items: flex-end;
gap: var(--space-2);
min-width: 72px;
}
.card__trust {
display: flex;
flex-direction: column;
align-items: center;
padding: var(--space-1) var(--space-2);
border-radius: var(--radius-md);
border: 1.5px solid currentColor;
min-width: 52px;
}
.card__trust-num {
font-family: var(--font-mono);
font-size: 1.1rem;
font-weight: 700;
line-height: 1;
}
.card__trust-label {
font-size: 0.625rem;
font-weight: 600;
text-transform: uppercase;
letter-spacing: 0.05em;
opacity: 0.8;
}
.card__trust--high { color: var(--trust-high); }
.card__trust--mid { color: var(--trust-mid); }
.card__trust--low { color: var(--trust-low); }
.card__trust--unknown { color: var(--color-text-muted); }
.card__trust--partial {
animation: trust-pulse 2.5s ease-in-out infinite;
}
@keyframes trust-pulse {
0%, 100% { opacity: 1; }
50% { opacity: 0.55; }
}
.card__signal-dots {
display: flex;
gap: 3px;
margin-top: 4px;
justify-content: center;
}
.card__signal-dot {
width: 5px;
height: 5px;
border-radius: 50%;
flex-shrink: 0;
}
.card__signal-dot--present { background: currentColor; opacity: 0.7; }
.card__signal-dot--pending { background: var(--color-border); opacity: 1; }
.card__enrich-btn {
margin-top: 4px;
background: none;
border: 1px solid currentColor;
border-radius: var(--radius-sm);
color: currentColor;
cursor: pointer;
font-size: 0.65rem;
line-height: 1;
opacity: 0.6;
padding: 1px 4px;
transition: opacity 150ms ease;
}
.card__enrich-btn:hover:not(:disabled) { opacity: 1; }
.card__enrich-btn:disabled { cursor: default; }
.card__enrich-btn--spinning { animation: enrich-spin 0.8s linear infinite; }
.card__enrich-btn--error { color: var(--color-error); opacity: 1; }
@keyframes enrich-spin {
from { transform: rotate(0deg); }
to { transform: rotate(360deg); }
}
.card__price-wrap {
display: flex;
flex-direction: column;
align-items: flex-end;
gap: 2px;
}
.card__price {
font-family: var(--font-mono);
font-size: 1.1rem;
font-weight: 700;
color: var(--color-text);
}
.card__steal-label {
font-size: 0.7rem;
font-weight: 700;
color: var(--trust-high);
text-transform: uppercase;
letter-spacing: 0.06em;
}
.card__market-price {
font-size: 0.7rem;
color: var(--color-text-muted);
font-family: var(--font-mono);
}
/* Mobile: stack vertically */
@media (max-width: 600px) {
.listing-card {
grid-template-columns: 60px 1fr;
grid-template-rows: auto auto;
}
.card__score-col {
grid-column: 1 / -1;
flex-direction: row;
justify-content: space-between;
align-items: center;
min-width: unset;
padding-top: var(--space-2);
border-top: 1px solid var(--color-border);
}
}
</style>
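The trust-tier cutoffs and the "steal" heuristic in the component above reduce to two pure functions; this is a sketch extracted for clarity (function names are hypothetical, thresholds taken from the `trustClass` and `isSteal` computeds):

```typescript
// Pure sketches of ListingCard's trust-tier and steal logic.
// trustTier mirrors the >=80 / >=50 cutoffs; isSteal requires a
// high-trust listing priced below 80% of the market comp.
type Tier = 'high' | 'mid' | 'low' | 'unknown'

function trustTier(score: number | null | undefined): Tier {
  if (score == null) return 'unknown'
  if (score >= 80) return 'high'
  if (score >= 50) return 'mid'
  return 'low'
}

function isSteal(
  score: number | null | undefined,
  price: number,
  marketPrice: number | null,
): boolean {
  if (score == null || score < 80) return false // only high-trust listings qualify
  if (marketPrice == null) return false         // no comp, no verdict
  return price < marketPrice * 0.8
}
```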

@@ -0,0 +1,32 @@
import { onMounted, onUnmounted } from 'vue'
const KONAMI = [
'ArrowUp', 'ArrowUp',
'ArrowDown', 'ArrowDown',
'ArrowLeft', 'ArrowRight',
'ArrowLeft', 'ArrowRight',
'b', 'a',
]
/**
* Listens for the Konami code sequence on the document and calls `onActivate`
* when the full sequence is entered. Works identically to Peregrine's pattern.
*/
export function useKonamiCode(onActivate: () => void) {
let pos = 0
function handleKey(e: KeyboardEvent) {
if (e.key === KONAMI[pos]) {
pos++
if (pos === KONAMI.length) {
pos = 0
onActivate()
}
} else {
pos = e.key === KONAMI[0] ? 1 : 0
}
}
onMounted(() => document.addEventListener('keydown', handleKey))
onUnmounted(() => document.removeEventListener('keydown', handleKey))
}

@@ -0,0 +1,30 @@
import { computed, ref } from 'vue'
// Snipe-namespaced localStorage entry
const LS_MOTION = 'cf-snipe-rich-motion'
// OS-level prefers-reduced-motion — checked once at module load
const OS_REDUCED = typeof window !== 'undefined'
? window.matchMedia('(prefers-reduced-motion: reduce)').matches
: false
// Reactive ref so toggling localStorage triggers re-reads in the same session
const _richOverride = ref(
typeof window !== 'undefined'
? localStorage.getItem(LS_MOTION)
: null,
)
export function useMotion() {
// null/missing = default ON; 'false' = explicitly disabled by user
const rich = computed(() =>
!OS_REDUCED && _richOverride.value !== 'false',
)
function setRich(enabled: boolean) {
localStorage.setItem(LS_MOTION, enabled ? 'true' : 'false')
_richOverride.value = enabled ? 'true' : 'false'
}
return { rich, setRich }
}
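The precedence useMotion implements — OS-level prefers-reduced-motion always wins, otherwise rich motion defaults on unless the stored value is the string 'false' — reduces to one boolean expression. A standalone sketch (function name hypothetical):

```typescript
// Decision table for useMotion's `rich` computed:
//   osReduced=true          -> always false (OS setting wins)
//   stored === 'false'      -> false (user opted out)
//   stored null or 'true'   -> true  (default ON)
function richMotion(osReduced: boolean, stored: string | null): boolean {
  return !osReduced && stored !== 'false'
}
```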

@@ -0,0 +1,82 @@
import { ref } from 'vue'
const LS_KEY = 'cf-snipe-mode'
const DATA_ATTR = 'snipeMode'
// Module-level ref so state is shared across all callers
const active = ref(false)
/**
* Snipe Mode easter egg activated by Konami code.
*
* When active:
* - Sets data-snipe-mode="active" on <html> (triggers CSS theme override in theme.css)
* - Persists to localStorage
* - Plays a snipe sound via Web Audio API (if audioEnabled is true)
*
* Audio synthesis mirrors the Streamlit version:
* 1. High-frequency sine blip (targeting beep)
* 2. Lower resonant hit with decay (impact)
*/
export function useSnipeMode(audioEnabled = true) {
function _playSnipeSound() {
if (!audioEnabled) return
try {
const ctx = new AudioContext()
// Phase 1: targeting blip — short high sine
const blip = ctx.createOscillator()
const blipGain = ctx.createGain()
blip.type = 'sine'
blip.frequency.setValueAtTime(880, ctx.currentTime)
blip.frequency.linearRampToValueAtTime(1200, ctx.currentTime + 0.05)
blipGain.gain.setValueAtTime(0.25, ctx.currentTime)
blipGain.gain.linearRampToValueAtTime(0, ctx.currentTime + 0.08)
blip.connect(blipGain)
blipGain.connect(ctx.destination)
blip.start(ctx.currentTime)
blip.stop(ctx.currentTime + 0.08)
// Phase 2: resonant hit — lower freq with exponential decay
const hit = ctx.createOscillator()
const hitGain = ctx.createGain()
hit.type = 'sine'
hit.frequency.setValueAtTime(440, ctx.currentTime + 0.08)
hit.frequency.exponentialRampToValueAtTime(110, ctx.currentTime + 0.45)
hitGain.gain.setValueAtTime(0.4, ctx.currentTime + 0.08)
hitGain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + 0.5)
hit.connect(hitGain)
hitGain.connect(ctx.destination)
hit.start(ctx.currentTime + 0.08)
hit.stop(ctx.currentTime + 0.5)
// Close context after sound finishes
setTimeout(() => ctx.close(), 600)
} catch {
// Web Audio API unavailable — silently skip
}
}
function activate() {
active.value = true
document.documentElement.dataset[DATA_ATTR] = 'active'
localStorage.setItem(LS_KEY, 'active')
_playSnipeSound()
}
function deactivate() {
active.value = false
delete document.documentElement.dataset[DATA_ATTR]
localStorage.removeItem(LS_KEY)
}
/** Re-apply from localStorage on hard reload (call from App.vue onMounted). */
function restore() {
if (localStorage.getItem(LS_KEY) === 'active') {
active.value = true
document.documentElement.dataset[DATA_ATTR] = 'active'
}
}
return { active, activate, deactivate, restore }
}

web/src/main.ts
@@ -0,0 +1,23 @@
import { createApp } from 'vue'
import { createPinia } from 'pinia'
import { router } from './router'
// Self-hosted fonts — no Google Fonts CDN (privacy requirement)
import '@fontsource/fraunces/400.css'
import '@fontsource/fraunces/700.css'
import '@fontsource/atkinson-hyperlegible/400.css'
import '@fontsource/atkinson-hyperlegible/700.css'
import '@fontsource/jetbrains-mono/400.css'
import 'virtual:uno.css'
import './assets/theme.css'
import App from './App.vue'
// Manual scroll restoration — prevents browser from jumping to last position on SPA nav
if ('scrollRestoration' in history) history.scrollRestoration = 'manual'
const app = createApp(App)
app.use(createPinia())
app.use(router)
app.mount('#app')

web/src/router/index.ts
@@ -0,0 +1,13 @@
import { createRouter, createWebHistory } from 'vue-router'
import SearchView from '../views/SearchView.vue'
export const router = createRouter({
history: createWebHistory(import.meta.env.BASE_URL),
routes: [
{ path: '/', component: SearchView },
{ path: '/listing/:id', component: () => import('../views/ListingView.vue') },
{ path: '/saved', component: () => import('../views/SavedSearchesView.vue') },
// Catch-all — FastAPI serves index.html for all unknown routes (SPA mode)
{ path: '/:pathMatch(.*)*', redirect: '/' },
],
})

@@ -0,0 +1,56 @@
import { defineStore } from 'pinia'
import { ref } from 'vue'
import type { SavedSearch, SearchFilters } from './search'
export type { SavedSearch }
const apiBase = (import.meta.env.VITE_API_BASE as string) ?? ''
export const useSavedSearchesStore = defineStore('savedSearches', () => {
const items = ref<SavedSearch[]>([])
const loading = ref(false)
const error = ref<string | null>(null)
async function fetchAll() {
loading.value = true
error.value = null
try {
const res = await fetch(`${apiBase}/api/saved-searches`)
if (!res.ok) throw new Error(`${res.status} ${res.statusText}`)
const data = await res.json() as { saved_searches: SavedSearch[] }
items.value = data.saved_searches
} catch (e) {
error.value = e instanceof Error ? e.message : 'Failed to load saved searches'
} finally {
loading.value = false
}
}
async function create(name: string, query: string, filters: SearchFilters): Promise<SavedSearch> {
// Strip per-run fields before persisting
const { pages: _pages, ...persistable } = filters
const res = await fetch(`${apiBase}/api/saved-searches`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ name, query, filters_json: JSON.stringify(persistable) }),
})
if (!res.ok) throw new Error(`Save failed: ${res.status} ${res.statusText}`)
const created = await res.json() as SavedSearch
items.value = [created, ...items.value]
return created
}
async function remove(id: number) {
await fetch(`${apiBase}/api/saved-searches/${id}`, { method: 'DELETE' })
items.value = items.value.filter(s => s.id !== id)
}
async function markRun(id: number) {
// Fire-and-forget — don't block navigation on this
fetch(`${apiBase}/api/saved-searches/${id}/run`, { method: 'PATCH' }).catch(() => {})
const item = items.value.find(s => s.id === id)
if (item) item.last_run_at = new Date().toISOString()
}
return { items, loading, error, fetchAll, create, remove, markRun }
})
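`create()` strips per-run fields before persisting, and SavedSearchesView later hands `filters_json` back via the URL. The round trip can be sketched as two small helpers — the interface here is trimmed to a few fields for illustration:

```typescript
// Round-trip sketch for the filters_json blob: per-run fields (pages)
// are dropped before persisting; stored JSON is parsed back defensively
// when a saved search is re-run.
interface Filters {
  maxPrice?: number
  mustExclude?: string
  pages?: number // per-run field — never persisted
}

function toFiltersJson(filters: Filters): string {
  const { pages: _pages, ...persistable } = filters
  return JSON.stringify(persistable)
}

function fromFiltersJson(json: string): Filters {
  try {
    return JSON.parse(json) as Filters
  } catch {
    return {} // malformed blob degrades to "no filters"
  }
}
```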

web/src/stores/search.ts
@@ -0,0 +1,199 @@
import { defineStore } from 'pinia'
import { ref } from 'vue'
// ── Domain types (mirror app/db/models.py) ───────────────────────────────────
export interface Listing {
id: number | null
platform: string
platform_listing_id: string
title: string
price: number
currency: string
condition: string
seller_platform_id: string
url: string
photo_urls: string[]
listing_age_days: number
buying_format: 'fixed_price' | 'auction' | 'best_offer'
ends_at: string | null
fetched_at: string | null
trust_score_id: number | null
}
export interface TrustScore {
id: number | null
listing_id: number
composite_score: number // 0–100
account_age_score: number // 0–20
feedback_count_score: number // 0–20
feedback_ratio_score: number // 0–20
price_vs_market_score: number // 0–20
category_history_score: number // 0–20
photo_hash_duplicate: boolean
photo_analysis_json: string | null
red_flags_json: string // JSON array of flag strings
score_is_partial: boolean
scored_at: string | null
}
export interface Seller {
id: number | null
platform: string
platform_seller_id: string
username: string
account_age_days: number | null
feedback_count: number
feedback_ratio: number // 0.0–1.0
category_history_json: string
fetched_at: string | null
}
export type MustIncludeMode = 'all' | 'any' | 'groups'
export interface SavedSearch {
id: number
name: string
query: string
platform: string
filters_json: string // JSON blob of SearchFilters subset
created_at: string | null
last_run_at: string | null
}
export interface SearchFilters {
minTrustScore?: number
minPrice?: number
maxPrice?: number
conditions?: string[]
minAccountAgeDays?: number
minFeedbackCount?: number
minFeedbackRatio?: number
hideNewAccounts?: boolean
hideSuspiciousPrice?: boolean
hideDuplicatePhotos?: boolean
hideScratchDent?: boolean
hideLongOnMarket?: boolean
hidePriceDrop?: boolean
pages?: number // number of eBay result pages to fetch (48 listings/page, default 1)
mustInclude?: string // term string; client-side title filter; semantics set by mustIncludeMode
mustIncludeMode?: MustIncludeMode // 'all' = AND, 'any' = OR, 'groups' = CNF (pipe = OR within group)
mustExclude?: string // comma-separated; forwarded to eBay -term AND client-side
categoryId?: string // eBay category ID (e.g. "27386" = Graphics/Video Cards)
adapter?: 'auto' | 'api' | 'scraper' // override adapter selection
}
// ── Store ────────────────────────────────────────────────────────────────────
export const useSearchStore = defineStore('search', () => {
const query = ref('')
const results = ref<Listing[]>([])
const trustScores = ref<Map<string, TrustScore>>(new Map()) // key: platform_listing_id
const sellers = ref<Map<string, Seller>>(new Map()) // key: platform_seller_id
const marketPrice = ref<number | null>(null)
const adapterUsed = ref<'api' | 'scraper' | null>(null)
const loading = ref(false)
const error = ref<string | null>(null)
let _abort: AbortController | null = null
function cancelSearch() {
_abort?.abort()
_abort = null
loading.value = false
}
async function search(q: string, filters: SearchFilters = {}) {
// Cancel any in-flight search before starting a new one
_abort?.abort()
_abort = new AbortController()
const signal = _abort.signal
query.value = q
loading.value = true
error.value = null
try {
// TODO: switch to POST /api/search with a full { query: q, filters } body —
// currently only the filter fields encoded as query params below reach the API
// VITE_API_BASE is '' in dev; '/snipe' under menagerie (baked at build time by Vite)
const apiBase = (import.meta.env.VITE_API_BASE as string) ?? ''
const params = new URLSearchParams({ q })
if (filters.maxPrice != null) params.set('max_price', String(filters.maxPrice))
if (filters.minPrice != null) params.set('min_price', String(filters.minPrice))
if (filters.pages != null && filters.pages > 1) params.set('pages', String(filters.pages))
if (filters.mustInclude?.trim()) params.set('must_include', filters.mustInclude.trim())
if (filters.mustIncludeMode) params.set('must_include_mode', filters.mustIncludeMode)
if (filters.mustExclude?.trim()) params.set('must_exclude', filters.mustExclude.trim())
if (filters.categoryId?.trim()) params.set('category_id', filters.categoryId.trim())
if (filters.adapter && filters.adapter !== 'auto') params.set('adapter', filters.adapter)
const res = await fetch(`${apiBase}/api/search?${params}`, { signal })
if (!res.ok) throw new Error(`Search failed: ${res.status} ${res.statusText}`)
const data = await res.json() as {
listings: Listing[]
trust_scores: Record<string, TrustScore>
sellers: Record<string, Seller>
market_price: number | null
adapter_used: 'api' | 'scraper'
}
results.value = data.listings ?? []
trustScores.value = new Map(Object.entries(data.trust_scores ?? {}))
sellers.value = new Map(Object.entries(data.sellers ?? {}))
marketPrice.value = data.market_price ?? null
adapterUsed.value = data.adapter_used ?? null
} catch (e) {
if (e instanceof DOMException && e.name === 'AbortError') {
// User cancelled — clear loading but don't surface as an error
results.value = []
} else {
error.value = e instanceof Error ? e.message : 'Unknown error'
results.value = []
}
} finally {
loading.value = false
_abort = null
}
}
async function enrichSeller(sellerUsername: string, listingId: string): Promise<void> {
const apiBase = (import.meta.env.VITE_API_BASE as string) ?? ''
const params = new URLSearchParams({
seller: sellerUsername,
listing_id: listingId,
query: query.value,
})
const res = await fetch(`${apiBase}/api/enrich?${params}`, { method: 'POST' })
if (!res.ok) throw new Error(`Enrich failed: ${res.status} ${res.statusText}`)
const data = await res.json() as {
trust_score: TrustScore | null
seller: Seller | null
}
if (data.trust_score) trustScores.value.set(listingId, data.trust_score)
if (data.seller) sellers.value.set(sellerUsername, data.seller)
}
function clearResults() {
results.value = []
trustScores.value = new Map()
sellers.value = new Map()
marketPrice.value = null
error.value = null
}
return {
query,
results,
trustScores,
sellers,
marketPrice,
adapterUsed,
loading,
error,
search,
cancelSearch,
enrichSeller,
clearResults,
}
})
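The three `mustIncludeMode` semantics documented in `SearchFilters` ('all' = AND, 'any' = OR, 'groups' = CNF with '|' as OR within comma-separated groups) imply a client-side title matcher along these lines — a sketch under those assumptions, not the app's actual filter code:

```typescript
type Mode = 'all' | 'any' | 'groups'

// Title matcher for the documented mustInclude semantics:
//   'all'    -> every comma-separated term must appear
//   'any'    -> at least one term must appear
//   'groups' -> CNF: comma separates AND-groups, '|' is OR within a group
function titleMatches(title: string, mustInclude: string, mode: Mode): boolean {
  const t = title.toLowerCase()
  const terms = mustInclude
    .split(',')
    .map(s => s.trim().toLowerCase())
    .filter(Boolean)
  if (!terms.length) return true
  switch (mode) {
    case 'all':
      return terms.every(term => t.includes(term))
    case 'any':
      return terms.some(term => t.includes(term))
    case 'groups':
      return terms.every(group =>
        group.split('|').map(s => s.trim()).some(alt => alt !== '' && t.includes(alt)),
      )
  }
}
```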

web/src/test-setup.ts
@@ -0,0 +1,2 @@
// Vitest global test setup
// Add any test utilities, global mocks, or imports here.

@@ -0,0 +1,55 @@
<template>
<div class="listing-view">
<div class="placeholder">
<span class="placeholder__icon" aria-hidden="true">🎯</span>
<h1 class="placeholder__title">Listing Detail</h1>
<p class="placeholder__body">Coming soon full listing detail view with trust score breakdown, photo analysis, and seller history.</p>
<RouterLink to="/" class="placeholder__back">← Back to Search</RouterLink>
</div>
</div>
</template>
<script setup lang="ts">
import { RouterLink } from 'vue-router'
</script>
<style scoped>
.listing-view {
display: flex;
align-items: center;
justify-content: center;
min-height: 60dvh;
padding: var(--space-8);
}
.placeholder {
display: flex;
flex-direction: column;
align-items: center;
gap: var(--space-4);
text-align: center;
max-width: 480px;
}
.placeholder__icon { font-size: 3rem; }
.placeholder__title {
font-family: var(--font-display);
font-size: 1.5rem;
color: var(--app-primary);
}
.placeholder__body {
color: var(--color-text-muted);
line-height: 1.6;
}
.placeholder__back {
color: var(--app-primary);
text-decoration: none;
font-weight: 600;
transition: opacity 150ms ease;
}
.placeholder__back:hover { opacity: 0.75; }
</style>

@@ -0,0 +1,218 @@
<template>
<div class="saved-view">
<header class="saved-header">
<h1 class="saved-title">Saved Searches</h1>
</header>
<div v-if="store.loading" class="saved-state">
<p class="saved-state-text">Loading…</p>
</div>
<div v-else-if="store.error" class="saved-state saved-state--error" role="alert">
{{ store.error }}
</div>
<div v-else-if="!store.items.length" class="saved-state">
<span class="saved-state-icon" aria-hidden="true">🔖</span>
<p class="saved-state-text">No saved searches yet.</p>
<p class="saved-state-hint">Run a search and click <strong>Save</strong> to bookmark it here.</p>
<RouterLink to="/" class="saved-back">→ Go to Search</RouterLink>
</div>
<ul v-else class="saved-list" role="list">
<li v-for="item in store.items" :key="item.id" class="saved-card">
<div class="saved-card-body">
<p class="saved-card-name">{{ item.name }}</p>
<p class="saved-card-query">
<span class="saved-card-q-label">q:</span>
{{ item.query }}
</p>
<p class="saved-card-meta">
<span v-if="item.last_run_at">Last run {{ formatDate(item.last_run_at) }}</span>
<span v-else>Never run</span>
· Saved {{ formatDate(item.created_at) }}
</p>
</div>
<div class="saved-card-actions">
<button class="saved-run-btn" type="button" @click="onRun(item)">
Run
</button>
<button
class="saved-delete-btn"
type="button"
:aria-label="`Delete saved search: ${item.name}`"
@click="onDelete(item.id)"
>
✕
</button>
</div>
</li>
</ul>
</div>
</template>
<script setup lang="ts">
import { onMounted } from 'vue'
import { useRouter, RouterLink } from 'vue-router'
import { useSavedSearchesStore } from '../stores/savedSearches'
import type { SavedSearch } from '../stores/savedSearches'
const store = useSavedSearchesStore()
const router = useRouter()
onMounted(() => store.fetchAll())
function formatDate(iso: string | null): string {
if (!iso) return '—'
const d = new Date(iso)
return d.toLocaleDateString(undefined, { month: 'short', day: 'numeric', year: 'numeric' })
}
async function onRun(item: SavedSearch) {
store.markRun(item.id)
const query: Record<string, string> = { q: item.query }
if (item.filters_json && item.filters_json !== '{}') query.filters = item.filters_json
router.push({ path: '/', query })
}
async function onDelete(id: number) {
await store.remove(id)
}
</script>
<style scoped>
.saved-view {
display: flex;
flex-direction: column;
min-height: 100dvh;
}
.saved-header {
padding: var(--space-6);
border-bottom: 1px solid var(--color-border);
background: var(--color-surface-2);
}
.saved-title {
font-family: var(--font-display);
font-size: 1.25rem;
color: var(--color-text);
}
/* Empty / loading / error state */
.saved-state {
display: flex;
flex-direction: column;
align-items: center;
gap: var(--space-4);
padding: var(--space-16) var(--space-4);
text-align: center;
}
.saved-state--error { color: var(--color-error); }
.saved-state-icon { font-size: 2.5rem; }
.saved-state-text { color: var(--color-text-muted); font-size: 0.9375rem; margin: 0; }
.saved-state-hint { color: var(--color-text-muted); font-size: 0.875rem; margin: 0; }
.saved-back {
color: var(--app-primary);
text-decoration: none;
font-weight: 600;
font-size: 0.875rem;
}
.saved-back:hover { opacity: 0.75; }
/* Card list */
.saved-list {
list-style: none;
padding: var(--space-6);
display: flex;
flex-direction: column;
gap: var(--space-3);
max-width: 720px;
}
.saved-card {
display: flex;
align-items: center;
gap: var(--space-4);
padding: var(--space-4) var(--space-5);
background: var(--color-surface-2);
border: 1px solid var(--color-border);
border-radius: var(--radius-md);
transition: border-color 150ms ease;
}
.saved-card:hover { border-color: var(--app-primary); }
.saved-card-body { flex: 1; min-width: 0; }
.saved-card-name {
font-weight: 600;
font-size: 0.9375rem;
color: var(--color-text);
margin: 0 0 var(--space-1);
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
}
.saved-card-query {
font-family: var(--font-mono);
font-size: 0.75rem;
color: var(--app-primary);
margin: 0 0 var(--space-1);
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
}
.saved-card-q-label {
color: var(--color-text-muted);
margin-right: var(--space-1);
}
.saved-card-meta {
font-size: 0.75rem;
color: var(--color-text-muted);
margin: 0;
}
.saved-card-actions {
display: flex;
align-items: center;
gap: var(--space-2);
flex-shrink: 0;
}
.saved-run-btn {
padding: var(--space-2) var(--space-4);
background: var(--app-primary);
border: none;
border-radius: var(--radius-md);
color: var(--color-text-inverse);
font-family: var(--font-body);
font-size: 0.875rem;
font-weight: 600;
cursor: pointer;
transition: background 150ms ease;
}
.saved-run-btn:hover { background: var(--app-primary-hover); }
.saved-delete-btn {
padding: var(--space-2);
background: transparent;
border: 1px solid var(--color-border);
border-radius: var(--radius-md);
color: var(--color-text-muted);
font-size: 0.75rem;
line-height: 1;
cursor: pointer;
transition: border-color 150ms ease, color 150ms ease;
min-width: 28px;
}
.saved-delete-btn:hover { border-color: var(--color-error); color: var(--color-error); }
@media (max-width: 767px) {
.saved-header { padding: var(--space-4); }
.saved-list { padding: var(--space-4); }
.saved-card { flex-direction: column; align-items: flex-start; gap: var(--space-3); }
.saved-card-actions { width: 100%; justify-content: flex-end; }
}
</style>

web/src/views/SearchView.vue
(File diff suppressed because it is too large.)

web/tsconfig.app.json
@@ -0,0 +1,14 @@
{
"extends": "@vue/tsconfig/tsconfig.dom.json",
"compilerOptions": {
"tsBuildInfoFile": "./node_modules/.tmp/tsconfig.app.tsbuildinfo",
"types": ["vite/client"],
"strict": true,
"noUnusedLocals": true,
"noUnusedParameters": true,
"erasableSyntaxOnly": true,
"noFallthroughCasesInSwitch": true,
"noUncheckedSideEffectImports": true
},
"include": ["src/**/*.ts", "src/**/*.tsx", "src/**/*.vue"]
}

web/tsconfig.json
@@ -0,0 +1,7 @@
{
"files": [],
"references": [
{ "path": "./tsconfig.app.json" },
{ "path": "./tsconfig.node.json" }
]
}

web/tsconfig.node.json
@@ -0,0 +1,22 @@
{
"compilerOptions": {
"tsBuildInfoFile": "./node_modules/.tmp/tsconfig.node.tsbuildinfo",
"target": "ES2023",
"lib": ["ES2023"],
"module": "ESNext",
"types": ["node"],
"skipLibCheck": true,
"moduleResolution": "bundler",
"allowImportingTsExtensions": true,
"verbatimModuleSyntax": true,
"moduleDetection": "force",
"noEmit": true,
"strict": true,
"noUnusedLocals": true,
"noUnusedParameters": true,
"erasableSyntaxOnly": true,
"noFallthroughCasesInSwitch": true,
"noUncheckedSideEffectImports": true
},
"include": ["vite.config.ts", "uno.config.ts"]
}

web/uno.config.ts
@@ -0,0 +1,13 @@
import { defineConfig, presetWind, presetAttributify } from 'unocss'
export default defineConfig({
presets: [
presetWind(),
// prefixedOnly: avoids false-positive CSS for bare attribute names like "h2", "grid",
// "shadow" in source files. Use <div un-flex> not <div flex>. Gotcha #4.
presetAttributify({ prefix: 'un-', prefixedOnly: true }),
],
// Snipe-specific theme tokens are defined as CSS custom properties in
// src/assets/theme.css — see that file for the full dark tactical palette.
// UnoCSS config is kept minimal; all colour decisions use var(--...) tokens.
})

web/vite.config.ts
@@ -0,0 +1,27 @@
import { defineConfig } from 'vite'
import vue from '@vitejs/plugin-vue'
import UnoCSS from 'unocss/vite'
export default defineConfig({
plugins: [vue(), UnoCSS()],
base: process.env.VITE_BASE_URL ?? '/',
build: {
// 16-char content hash prevents filename collisions that break immutable caching
rollupOptions: {
output: {
hashCharacters: 'base64',
entryFileNames: 'assets/[name]-[hash:16].js',
chunkFileNames: 'assets/[name]-[hash:16].js',
assetFileNames: 'assets/[name]-[hash:16].[ext]',
},
},
},
server: {
host: '0.0.0.0',
port: 5174,
proxy: {
'/api': {
target: 'http://localhost:8510',
changeOrigin: true,
},
},
},
test: {
environment: 'jsdom',
globals: true,
setupFiles: ['./src/test-setup.ts'],
},
})