Compare commits


No commits in common. "main" and "feature/feedback-button" have entirely different histories.

15 changed files with 159 additions and 813 deletions

View file

@@ -47,35 +47,6 @@ SNIPE_DB=data/snipe.db
# HEIMDALL_URL=https://license.circuitforge.tech
# HEIMDALL_ADMIN_TOKEN=
# ── eBay Affiliate (optional) ─────────────────────────────────────────────────
# Set to your eBay Partner Network (EPN) campaign ID to earn commissions on
# listing click-throughs. Leave blank for clean /itm/ URLs (no tracking).
# Register at https://partnernetwork.ebay.com — self-hosted users can use their
# own ID; the CF cloud instance uses CF's campaign ID (disclosed in the UI).
# EBAY_AFFILIATE_CAMPAIGN_ID=
# ── LLM inference (vision / photo analysis) ──────────────────────────────────
# circuitforge-core LLMRouter auto-detects backends from these env vars
# (no llm.yaml required). Backends are tried in this priority order:
# 1. ANTHROPIC_API_KEY → Claude API (cloud; requires Paid tier key)
# 2. OPENAI_API_KEY → OpenAI-compatible endpoint
# 3. OLLAMA_HOST → local Ollama (default: http://localhost:11434)
# Leave all unset to disable LLM features (photo analysis won't run).
# ANTHROPIC_API_KEY=
# ANTHROPIC_MODEL=claude-haiku-4-5-20251001
# OPENAI_API_KEY=
# OPENAI_BASE_URL=https://api.openai.com/v1
# OPENAI_MODEL=gpt-4o-mini
# OLLAMA_HOST=http://localhost:11434
# OLLAMA_MODEL=llava:7b
# CF Orchestrator — managed inference for Paid+ cloud users (internal use only).
# Self-hosted users leave this unset; it has no effect without a valid allocation token.
# CF_ORCH_URL=https://orch.circuitforge.tech
# ── In-app feedback (beta) ────────────────────────────────────────────────────
# When set, a feedback FAB appears in the UI and routes submissions to Forgejo.
# Leave unset to silently hide the button (demo/offline deployments).

View file

@@ -2,51 +2,7 @@
> *Part of the Circuit Forge LLC "AI for the tasks you hate most" suite.*
**Status:** Active — eBay listing intelligence MVP complete (search, trust scoring, affiliate links, feedback FAB, vision task scheduling). Auction sniping engine and multi-platform support are next.
## Quick install (self-hosted)
**Requirements:** Docker with Compose plugin, Git. No API keys needed to get started.
```bash
# One-line install — clones to ~/snipe by default
bash <(curl -fsSL https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/raw/branch/main/install.sh)
# Or clone manually and run the script:
git clone https://git.opensourcesolarpunk.com/Circuit-Forge/snipe.git
bash snipe/install.sh
```
Then open **http://localhost:8509**.
### Manual setup (if you prefer)
Snipe's API image is built from a parent context that includes `circuitforge-core`. Both repos must sit as siblings in the same directory:
```
workspace/
├── snipe/ ← this repo
└── circuitforge-core/ ← required sibling
```
```bash
mkdir snipe-workspace && cd snipe-workspace
git clone https://git.opensourcesolarpunk.com/Circuit-Forge/snipe.git
git clone https://git.opensourcesolarpunk.com/Circuit-Forge/circuitforge-core.git
cd snipe
cp .env.example .env # edit if you have eBay API credentials (optional)
./manage.sh start
```
### Optional: eBay API credentials
Snipe works without any credentials using its Playwright scraper fallback. Adding eBay API credentials unlocks faster searches and inline seller account age (no extra scrape needed):
1. Register at [developer.ebay.com](https://developer.ebay.com/my/keys)
2. Copy your Production **App ID** and **Cert ID** into `.env`
3. Restart: `./manage.sh restart`
---
**Status:** Active — eBay listing search + seller trust scoring MVP complete. Auction sniping engine and multi-platform support are next.
## What it does
@@ -112,20 +68,6 @@ Scans listing titles for signals the item may have undisclosed damage or problems
- **On-demand**: ↻ button on any listing card triggers `POST /api/enrich` — runs enrichment and re-scores without waiting for a second search
- **Category history**: derived from the seller's accumulated listing data (Browse API `categories` field); improves with every search, no extra API calls
### Affiliate link builder
Listing cards surface eBay affiliate-wrapped URLs. Uses `circuitforge_core.affiliates.wrap_url` — resolution order: user opted out → plain URL; user has BYOK affiliate ID → their ID; CF env var set (`EBAY_AFFILIATE_ID`) → CF's ID; otherwise plain URL. Users can configure their own eBay Partner Network ID or opt out entirely in Settings.
Disclosure tooltip appears on first encounter per-session and on each wrapped link (per-retailer copy from `get_disclosure_text`).
### Feedback FAB
In-app feedback button (bottom-right FAB) opens a modal: title, description, optional screenshot. Posts to the CF feedback endpoint. Status probed on load; FAB hidden if endpoint unreachable.
### Vision task scheduling
Photo condition assessment tasks queued through `circuitforge_core.tasks.TaskScheduler` — VRAM-aware slot management shared with any other LLM workloads on the same host. Runs moondream2 locally (free tier) or Claude vision (paid/cloud). Results stored per-listing and update the trust score card.
### Market price comparison
Completed sales fetched via eBay Marketplace Insights API (with Browse API fallback for app tiers that don't have Insights access). Median stored per query hash, used to score `price_vs_market` across all listings in a search.

View file

@@ -3,27 +3,26 @@ from __future__ import annotations
import dataclasses
import hashlib
import json as _json
import logging
import os
import queue as _queue
import uuid
from concurrent.futures import ThreadPoolExecutor
from contextlib import asynccontextmanager
from pathlib import Path
import asyncio
import csv
import io
import platform as _platform
import subprocess
from datetime import datetime, timezone
from typing import Literal
from fastapi import Depends, FastAPI, HTTPException, Request, UploadFile, File
import requests as _requests
from fastapi import Depends, FastAPI, HTTPException, UploadFile, File
from fastapi.responses import StreamingResponse
from pydantic import BaseModel
from fastapi.middleware.cors import CORSMiddleware
from circuitforge_core.config import load_env
from circuitforge_core.affiliates import wrap_url as _wrap_affiliate_url
from circuitforge_core.api import make_feedback_router as _make_feedback_router
from app.db.store import Store
from app.db.models import SavedSearch as SavedSearchModel, ScammerEntry
from app.platforms import SearchFilters
@@ -38,12 +37,6 @@ from api.ebay_webhook import router as ebay_webhook_router
load_env(Path(".env"))
log = logging.getLogger(__name__)
# ── SSE update registry ───────────────────────────────────────────────────────
# Maps session_id → SimpleQueue of update events.
# SimpleQueue is always thread-safe; no asyncio loop needed to write from threads.
# Keys are cleaned up when the SSE stream ends (client disconnect or timeout).
_update_queues: dict[str, _queue.SimpleQueue] = {}
@asynccontextmanager
async def _lifespan(app: FastAPI):
@@ -76,7 +69,6 @@ def _ebay_creds() -> tuple[str, str, str]:
client_secret = (os.environ.get("EBAY_CERT_ID") or os.environ.get("EBAY_CLIENT_SECRET", "")).strip()
return client_id, client_secret, env
app = FastAPI(title="Snipe API", version="0.1.0", lifespan=_lifespan)
app.include_router(ebay_webhook_router)
@@ -87,12 +79,6 @@ app.add_middleware(
allow_headers=["*"],
)
_feedback_router = _make_feedback_router(
repo="Circuit-Forge/snipe",
product="snipe",
)
app.include_router(_feedback_router, prefix="/api/feedback")
@app.get("/api/health")
def health():
@@ -119,34 +105,31 @@ def _trigger_scraper_enrichment(
listings: list,
shared_store: Store,
shared_db: Path,
user_db: Path | None = None,
query: str = "",
session_id: str | None = None,
) -> None:
"""Fire-and-forget background enrichment for missing seller signals.
Two enrichment passes run in the same daemon thread:
Two enrichment passes run sequentially in the same daemon thread:
1. BTF (/itm/ pages) fills account_age_days for sellers where it is None.
2. _ssn search pages fills category_history_json for sellers with no history.
When session_id is provided, pushes re-scored trust score updates to the
SSE queue after each pass so the frontend can update scores live.
The main response returns immediately; enriched data lands in the DB for
future searches. Uses ScrapedEbayAdapter's Playwright stack regardless of
which adapter was used for the main search (Shopping API handles age for
the API adapter inline; BTF is the fallback for no-creds / scraper mode).
shared_store: used for pre-flight seller checks (same-thread reads).
shared_db: path passed to background thread (sqlite3 is not thread-safe).
user_db: path to per-user listings/trust_scores DB (same as shared_db in local mode).
query: original search query used for market comp lookup during re-score.
session_id: SSE session key; if set, updates are pushed to _update_queues[session_id].
shared_db: path passed to the background thread, which creates its own
Store (sqlite3 connections are not thread-safe).
"""
# Caps per search: limits Playwright sessions launched in the background so we
# don't hammer Kasada or spin up dozens of Xvfb instances after a large search.
# Remaining sellers get enriched incrementally on subsequent searches.
_BTF_MAX_PER_SEARCH = 3
_CAT_MAX_PER_SEARCH = 3
needs_btf: dict[str, str] = {}
needs_categories: list[str] = []
# Map seller_id → [listings] for this search so we know what to re-score
seller_listing_map: dict[str, list] = {}
for listing in listings:
sid = listing.seller_platform_id
if not sid:
@@ -154,7 +137,6 @@ def _trigger_scraper_enrichment(
seller = shared_store.get_seller("ebay", sid)
if not seller:
continue
seller_listing_map.setdefault(sid, []).append(listing)
if ((seller.account_age_days is None or seller.feedback_count == 0)
and sid not in needs_btf
and len(needs_btf) < _BTF_MAX_PER_SEARCH):
@@ -165,8 +147,6 @@
needs_categories.append(sid)
if not needs_btf and not needs_categories:
if session_id and session_id in _update_queues:
_update_queues[session_id].put(None) # sentinel — nothing to enrich
return
log.info(
@@ -174,55 +154,17 @@
len(needs_btf), len(needs_categories),
)
def _push_updates(enriched_seller_ids: list[str]) -> None:
"""Re-score listings for enriched sellers and push updates to SSE queue."""
if not session_id or session_id not in _update_queues:
return
q = _update_queues[session_id]
thread_shared = Store(shared_db)
thread_user = Store(user_db or shared_db)
scorer = TrustScorer(thread_shared)
comp = thread_shared.get_market_comp("ebay", hashlib.md5(query.encode()).hexdigest())
market_price = comp.median_price if comp else None
for sid in enriched_seller_ids:
seller = thread_shared.get_seller("ebay", sid)
if not seller:
continue
affected = seller_listing_map.get(sid, [])
if not affected:
continue
new_scores = scorer.score_batch(affected, query)
thread_user.save_trust_scores(new_scores)
for listing, ts in zip(affected, new_scores):
if ts is None:
continue
q.put({
"platform_listing_id": listing.platform_listing_id,
"trust_score": dataclasses.asdict(ts),
"seller": dataclasses.asdict(seller),
"market_price": market_price,
})
def _run():
try:
enricher = ScrapedEbayAdapter(Store(shared_db))
if needs_btf:
enricher.enrich_sellers_btf(needs_btf, max_workers=2)
log.info("BTF enrichment complete for %d sellers", len(needs_btf))
_push_updates(list(needs_btf.keys()))
if needs_categories:
enricher.enrich_sellers_categories(needs_categories, max_workers=2)
log.info("Category enrichment complete for %d sellers", len(needs_categories))
# Re-score only sellers not already covered by BTF push
cat_only = [s for s in needs_categories if s not in needs_btf]
if cat_only:
_push_updates(cat_only)
except Exception as e:
log.warning("Scraper enrichment failed: %s", e)
finally:
# Sentinel: tells SSE stream the enrichment thread is done
if session_id and session_id in _update_queues:
_update_queues[session_id].put(None)
import threading
t = threading.Thread(target=_run, daemon=True)
@@ -419,15 +361,9 @@ def search(
listings = [staged.get(l.platform_listing_id, l) for l in listings]
# BTF enrichment: scrape /itm/ pages for sellers missing account_age_days.
# Runs in the background so it doesn't delay the response. A session_id is
# generated so the frontend can open an SSE stream and receive live score
# updates as enrichment completes.
session_id = str(uuid.uuid4())
_update_queues[session_id] = _queue.SimpleQueue()
_trigger_scraper_enrichment(
listings, shared_store, shared_db,
user_db=user_db, query=q, session_id=session_id,
)
# Runs in the background so it doesn't delay the response; next search of
# the same sellers will have full scores.
_trigger_scraper_enrichment(listings, shared_store, shared_db)
scorer = TrustScorer(shared_store)
trust_scores_list = scorer.score_batch(listings, q)
@@ -459,19 +395,12 @@ def search(
and shared_store.get_seller("ebay", listing.seller_platform_id)
}
def _serialize_listing(l: object) -> dict:
d = dataclasses.asdict(l)
d["url"] = _wrap_affiliate_url(d["url"], retailer="ebay")
return d
return {
"listings": [_serialize_listing(l) for l in listings],
"listings": [dataclasses.asdict(l) for l in listings],
"trust_scores": trust_map,
"sellers": seller_map,
"market_price": market_price,
"adapter_used": adapter_used,
"affiliate_active": bool(os.environ.get("EBAY_AFFILIATE_CAMPAIGN_ID", "").strip()),
"session_id": session_id,
}
@@ -569,73 +498,6 @@ def enrich_seller(
}
# ── SSE live score updates ────────────────────────────────────────────────────
@app.get("/api/updates/{session_id}")
async def stream_updates(session_id: str, request: Request):
"""Server-Sent Events stream for live trust score updates.
Opens after a search when any listings have score_is_partial=true.
Streams re-scored trust score payloads as enrichment completes, then
sends a 'done' event and closes.
Each event payload:
{ platform_listing_id, trust_score, seller, market_price }
Closes automatically after 90 seconds (worst-case Playwright enrichment).
The client should also close on 'done' event.
"""
if session_id not in _update_queues:
raise HTTPException(status_code=404, detail="Unknown session_id")
q = _update_queues[session_id]
deadline = asyncio.get_event_loop().time() + 90.0
heartbeat_interval = 15.0
next_heartbeat = asyncio.get_event_loop().time() + heartbeat_interval
async def generate():
nonlocal next_heartbeat
try:
while asyncio.get_event_loop().time() < deadline:
if await request.is_disconnected():
break
# Drain all available updates (non-blocking)
while True:
try:
item = q.get_nowait()
except _queue.Empty:
break
if item is None:
# Sentinel: enrichment thread is done
yield "event: done\ndata: {}\n\n"
return
yield f"data: {_json.dumps(item)}\n\n"
# Heartbeat to keep the connection alive through proxies
now = asyncio.get_event_loop().time()
if now >= next_heartbeat:
yield ": heartbeat\n\n"
next_heartbeat = now + heartbeat_interval
await asyncio.sleep(0.5)
# Timeout reached
yield "event: done\ndata: {\"reason\": \"timeout\"}\n\n"
finally:
_update_queues.pop(session_id, None)
return StreamingResponse(
generate(),
media_type="text/event-stream",
headers={
"Cache-Control": "no-cache",
"X-Accel-Buffering": "no", # nginx: disable proxy buffering for SSE
"Connection": "keep-alive",
},
)
# ── Saved Searches ────────────────────────────────────────────────────────────
class SavedSearchCreate(BaseModel):
@@ -775,3 +637,117 @@ async def import_blocklist(
return {"imported": imported, "errors": errors}
# ── Feedback ────────────────────────────────────────────────────────────────
# Creates Forgejo issues from in-app beta feedback.
# Silently disabled when FORGEJO_API_TOKEN is not set.
_FEEDBACK_LABEL_COLORS = {
"beta-feedback": "#0075ca",
"needs-triage": "#e4e669",
"bug": "#d73a4a",
"feature-request": "#a2eeef",
"question": "#d876e3",
}
def _fb_headers() -> dict:
token = os.environ.get("FORGEJO_API_TOKEN", "")
return {"Authorization": f"token {token}", "Content-Type": "application/json"}
def _ensure_feedback_labels(names: list[str]) -> list[int]:
base = os.environ.get("FORGEJO_API_URL", "https://git.opensourcesolarpunk.com/api/v1")
repo = os.environ.get("FORGEJO_REPO", "Circuit-Forge/snipe")
resp = _requests.get(f"{base}/repos/{repo}/labels", headers=_fb_headers(), timeout=10)
existing = {lb["name"]: lb["id"] for lb in resp.json()} if resp.ok else {}
ids: list[int] = []
for name in names:
if name in existing:
ids.append(existing[name])
else:
r = _requests.post(
f"{base}/repos/{repo}/labels",
headers=_fb_headers(),
json={"name": name, "color": _FEEDBACK_LABEL_COLORS.get(name, "#ededed")},
timeout=10,
)
if r.ok:
ids.append(r.json()["id"])
return ids
class FeedbackRequest(BaseModel):
title: str
description: str
type: Literal["bug", "feature", "other"] = "other"
repro: str = ""
view: str = "unknown"
submitter: str = ""
class FeedbackResponse(BaseModel):
issue_number: int
issue_url: str
@app.get("/api/feedback/status")
def feedback_status() -> dict:
"""Return whether feedback submission is configured on this instance."""
demo = os.environ.get("DEMO_MODE", "false").lower() in ("1", "true", "yes")
return {"enabled": bool(os.environ.get("FORGEJO_API_TOKEN")) and not demo}
@app.post("/api/feedback", response_model=FeedbackResponse)
def submit_feedback(payload: FeedbackRequest) -> FeedbackResponse:
"""File a Forgejo issue from in-app feedback."""
token = os.environ.get("FORGEJO_API_TOKEN", "")
if not token:
raise HTTPException(status_code=503, detail="Feedback disabled: FORGEJO_API_TOKEN not configured.")
if os.environ.get("DEMO_MODE", "false").lower() in ("1", "true", "yes"):
raise HTTPException(status_code=403, detail="Feedback disabled in demo mode.")
try:
version = subprocess.check_output(
["git", "describe", "--tags", "--always"],
cwd=Path(__file__).resolve().parents[1], text=True, timeout=5,
).strip()
except Exception:
version = "dev"
_TYPE_LABELS = {"bug": "🐛 Bug", "feature": "✨ Feature Request", "other": "💬 Other"}
body_lines = [
f"## {_TYPE_LABELS.get(payload.type, '💬 Other')}",
"",
payload.description,
"",
]
if payload.type == "bug" and payload.repro:
body_lines += ["### Reproduction Steps", "", payload.repro, ""]
body_lines += [
"### Context", "",
f"- **view:** {payload.view}",
f"- **version:** {version}",
f"- **platform:** {_platform.platform()}",
f"- **timestamp:** {datetime.now(timezone.utc).isoformat().replace('+00:00', 'Z')}",
"",
]
if payload.submitter:
body_lines += ["---", f"*Submitted by: {payload.submitter}*"]
labels = ["beta-feedback", "needs-triage",
{"bug": "bug", "feature": "feature-request"}.get(payload.type, "question")]
base = os.environ.get("FORGEJO_API_URL", "https://git.opensourcesolarpunk.com/api/v1")
repo = os.environ.get("FORGEJO_REPO", "Circuit-Forge/snipe")
label_ids = _ensure_feedback_labels(labels)
resp = _requests.post(
f"{base}/repos/{repo}/issues",
headers=_fb_headers(),
json={"title": payload.title, "body": "\n".join(body_lines), "labels": label_ids},
timeout=15,
)
if not resp.ok:
raise HTTPException(status_code=502, detail=f"Forgejo error: {resp.text[:200]}")
data = resp.json()
return FeedbackResponse(issue_number=data["number"], issue_url=data["html_url"])
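The label selection in `submit_feedback` above reduces to a small mapping, restated here standalone for clarity (`feedback_labels` is a name introduced for this sketch):

```python
def feedback_labels(fb_type: str) -> list[str]:
    """Every issue gets the two base labels plus one type label,
    falling back to 'question' for unrecognized types."""
    type_label = {"bug": "bug", "feature": "feature-request"}.get(fb_type, "question")
    return ["beta-feedback", "needs-triage", type_label]
```

`_ensure_feedback_labels` then resolves these names to Forgejo label IDs, creating any that do not yet exist in the repo.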

View file

@@ -52,7 +52,6 @@ class TrustScorer:
signal_scores, is_dup, seller,
listing_id=listing.id or 0,
listing_title=listing.title,
listing_condition=listing.condition,
times_seen=listing.times_seen,
first_seen_at=listing.first_seen_at,
price=listing.price,

View file

@@ -23,9 +23,8 @@ _SCRATCH_DENT_KEYWORDS = frozenset([
"crack", "cracked", "chip", "chipped",
"damage", "damaged", "cosmetic damage",
"blemish", "wear", "worn", "worn in",
# Parts / condition catch-alls (also matches eBay condition field strings verbatim)
# Parts / condition catch-alls
"as is", "for parts", "parts only", "spares or repair", "parts or repair",
"parts/repair", "parts or not working", "not working",
# Evasive redirects — seller hiding damage detail in listing body
"see description", "read description", "read listing", "see listing",
"see photos for", "see pics for", "see images for",
@@ -73,7 +72,6 @@ class Aggregator:
seller: Optional[Seller],
listing_id: int = 0,
listing_title: str = "",
listing_condition: str = "",
times_seen: int = 1,
first_seen_at: Optional[str] = None,
price: float = 0.0,
@@ -139,9 +137,7 @@ class Aggregator:
)
if photo_hash_duplicate and not is_established_retailer:
red_flags.append("duplicate_photo")
if (listing_title and _has_damage_keywords(listing_title)) or (
listing_condition and _has_damage_keywords(listing_condition)
):
if listing_title and _has_damage_keywords(listing_title):
red_flags.append("scratch_dent_mentioned")
# Staging DB signals

View file

@@ -1,17 +1,21 @@
# compose.override.yml — dev-only additions (auto-applied by Docker Compose in dev).
# Safe to delete on a self-hosted machine — compose.yml is self-contained.
#
# What this adds over compose.yml:
# - Live source mounts so code changes take effect without rebuilding images
# - RELOAD=true to enable uvicorn --reload for the API
# - NOTE: circuitforge-core is NOT mounted here — use `./manage.sh build` to
# pick up cf-core changes. Mounting it as a bind volume would break self-hosted
# installs that don't have the sibling directory.
services:
api:
build:
context: ..
dockerfile: snipe/Dockerfile
network_mode: host
volumes:
- ../circuitforge-core:/app/circuitforge-core
- ./api:/app/snipe/api
- ./app:/app/snipe/app
- ./data:/app/snipe/data
- ./tests:/app/snipe/tests
environment:
- RELOAD=true
web:
build:
context: .
dockerfile: docker/web/Dockerfile
volumes:
- ./web/src:/app/src # not used at runtime but keeps override valid

View file

@@ -3,14 +3,11 @@ services:
build:
context: ..
dockerfile: snipe/Dockerfile
# Host networking lets nginx (in the web container) reach the API at
# 172.17.0.1:8510 (the Docker bridge gateway). Required — nginx.conf
# is baked into the image and hard-codes that address.
network_mode: host
ports:
- "8510:8510"
env_file: .env
volumes:
- ./data:/app/snipe/data
restart: unless-stopped
web:
build:

View file

@@ -1,226 +0,0 @@
#!/usr/bin/env bash
# Snipe — self-hosted install script
#
# Supports two install paths:
# Docker (recommended) — everything in containers, no system Python deps required
# No-Docker — conda or venv + direct uvicorn, for machines without Docker
#
# Usage:
# bash install.sh # installs to ~/snipe
# bash install.sh /opt/snipe # custom install directory
# bash install.sh ~/snipe --no-docker # force no-Docker path even if Docker present
#
# Requirements (Docker path): Docker with Compose plugin, Git
# Requirements (no-Docker path): Python 3.11+, Node.js 20+, Git, xvfb (system)
set -euo pipefail
INSTALL_DIR="${1:-$HOME/snipe}"
FORCE_NO_DOCKER="${2:-}"
FORGEJO="https://git.opensourcesolarpunk.com/Circuit-Forge"
CONDA_ENV="cf"
info() { echo " [snipe] $*"; }
ok() { echo "$*"; }
warn() { echo "! $*"; }
fail() { echo "$*" >&2; exit 1; }
hr() { echo "────────────────────────────────────────────────────────"; }
echo ""
echo " Snipe — self-hosted installer"
echo " Install directory: $INSTALL_DIR"
echo ""
# ── Detect capabilities ──────────────────────────────────────────────────────
HAS_DOCKER=false
HAS_CONDA=false
HAS_PYTHON=false
HAS_NODE=false
docker compose version >/dev/null 2>&1 && HAS_DOCKER=true
conda --version >/dev/null 2>&1 && HAS_CONDA=true
python3 --version >/dev/null 2>&1 && HAS_PYTHON=true
node --version >/dev/null 2>&1 && HAS_NODE=true
command -v git >/dev/null 2>&1 || fail "Git is required. Install with: sudo apt-get install git"
# Honour --no-docker flag
[[ "$FORCE_NO_DOCKER" == "--no-docker" ]] && HAS_DOCKER=false
if $HAS_DOCKER; then
INSTALL_PATH="docker"
ok "Docker found — using Docker install path (recommended)"
elif $HAS_PYTHON; then
INSTALL_PATH="python"
warn "Docker not found — using no-Docker path (conda or venv)"
else
fail "Docker or Python 3.11+ is required. Install Docker: https://docs.docker.com/get-docker/"
fi
# ── Clone repos ──────────────────────────────────────────────────────────────
# compose.yml and the Dockerfile both use context: .. (parent directory), so
# snipe/ and circuitforge-core/ must be siblings inside INSTALL_DIR.
SNIPE_DIR="$INSTALL_DIR/snipe"
CORE_DIR="$INSTALL_DIR/circuitforge-core"
if [[ -d "$SNIPE_DIR" ]]; then
info "Snipe already cloned — pulling latest..."
git -C "$SNIPE_DIR" pull --ff-only
else
info "Cloning Snipe..."
mkdir -p "$INSTALL_DIR"
git clone "$FORGEJO/snipe.git" "$SNIPE_DIR"
fi
ok "Snipe → $SNIPE_DIR"
if [[ -d "$CORE_DIR" ]]; then
info "circuitforge-core already cloned — pulling latest..."
git -C "$CORE_DIR" pull --ff-only
else
info "Cloning circuitforge-core (shared library)..."
git clone "$FORGEJO/circuitforge-core.git" "$CORE_DIR"
fi
ok "circuitforge-core → $CORE_DIR"
# ── Configure environment ────────────────────────────────────────────────────
ENV_FILE="$SNIPE_DIR/.env"
if [[ ! -f "$ENV_FILE" ]]; then
cp "$SNIPE_DIR/.env.example" "$ENV_FILE"
# Safe defaults for local installs — no eBay registration, no Heimdall
sed -i 's/^EBAY_WEBHOOK_VERIFY_SIGNATURES=true/EBAY_WEBHOOK_VERIFY_SIGNATURES=false/' "$ENV_FILE"
ok ".env created from .env.example"
echo ""
info "Snipe works out of the box with no API keys."
info "Add EBAY_APP_ID / EBAY_CERT_ID later for faster searches (optional)."
echo ""
else
info ".env already exists — skipping (delete it to reset)"
fi
cd "$SNIPE_DIR"
# ── Docker install path ───────────────────────────────────────────────────────
if [[ "$INSTALL_PATH" == "docker" ]]; then
info "Building Docker images (~1 GB download on first run)..."
docker compose build
info "Starting Snipe..."
docker compose up -d
echo ""
ok "Snipe is running!"
hr
echo " Web UI: http://localhost:8509"
echo " API: http://localhost:8510/docs"
echo ""
echo " Manage: cd $SNIPE_DIR && ./manage.sh {start|stop|restart|logs|test}"
hr
echo ""
exit 0
fi
# ── No-Docker install path ───────────────────────────────────────────────────
# System deps: Xvfb is required for Playwright (Kasada bypass via headed Chromium)
if ! command -v Xvfb >/dev/null 2>&1; then
info "Installing Xvfb (required for eBay scraper)..."
if command -v apt-get >/dev/null 2>&1; then
sudo apt-get install -y --no-install-recommends xvfb
elif command -v dnf >/dev/null 2>&1; then
sudo dnf install -y xorg-x11-server-Xvfb
elif command -v brew >/dev/null 2>&1; then
warn "macOS: Xvfb not available. The scraper fallback may fail."
warn "Add eBay API credentials to .env to use the API adapter instead."
else
warn "Could not install Xvfb automatically. Install it with your package manager."
fi
fi
# ── Python environment setup ─────────────────────────────────────────────────
if $HAS_CONDA; then
info "Setting up conda environment '$CONDA_ENV'..."
if conda env list | grep -q "^$CONDA_ENV "; then
info "Conda env '$CONDA_ENV' already exists — updating..."
conda run -n "$CONDA_ENV" pip install --quiet -e "$CORE_DIR"
conda run -n "$CONDA_ENV" pip install --quiet -e "$SNIPE_DIR"
else
conda create -n "$CONDA_ENV" python=3.11 -y
conda run -n "$CONDA_ENV" pip install --quiet -e "$CORE_DIR"
conda run -n "$CONDA_ENV" pip install --quiet -e "$SNIPE_DIR"
fi
conda run -n "$CONDA_ENV" playwright install chromium
conda run -n "$CONDA_ENV" playwright install-deps chromium
PYTHON_RUN="conda run -n $CONDA_ENV"
ok "Conda environment '$CONDA_ENV' ready"
else
info "Setting up Python venv at $SNIPE_DIR/.venv ..."
python3 -m venv "$SNIPE_DIR/.venv"
"$SNIPE_DIR/.venv/bin/pip" install --quiet -e "$CORE_DIR"
"$SNIPE_DIR/.venv/bin/pip" install --quiet -e "$SNIPE_DIR"
"$SNIPE_DIR/.venv/bin/playwright" install chromium
"$SNIPE_DIR/.venv/bin/playwright" install-deps chromium
PYTHON_RUN="$SNIPE_DIR/.venv/bin"
ok "Python venv ready at $SNIPE_DIR/.venv"
fi
# ── Frontend ─────────────────────────────────────────────────────────────────
if $HAS_NODE; then
info "Building Vue frontend..."
cd "$SNIPE_DIR/web"
npm ci --prefer-offline --silent
npm run build
cd "$SNIPE_DIR"
ok "Frontend built → web/dist/"
else
warn "Node.js not found — skipping frontend build."
warn "Install Node.js 20+ from https://nodejs.org and re-run install.sh to build the UI."
warn "Until then, you can access the API directly at http://localhost:8510/docs"
fi
# ── Write start/stop scripts ─────────────────────────────────────────────────
cat > "$SNIPE_DIR/start-local.sh" << 'STARTSCRIPT'
#!/usr/bin/env bash
# Start Snipe without Docker (API only — run from the snipe/ directory)
set -euo pipefail
cd "$(dirname "$0")"
if [[ -f .venv/bin/uvicorn ]]; then
UVICORN=".venv/bin/uvicorn"
elif command -v conda >/dev/null 2>&1 && conda env list | grep -q "^cf "; then
UVICORN="conda run -n cf uvicorn"
else
echo "No Python env found. Run install.sh first." >&2; exit 1
fi
mkdir -p data
echo "Starting Snipe API on http://localhost:8510 ..."
$UVICORN api.main:app --host 0.0.0.0 --port 8510 "${@}"
STARTSCRIPT
chmod +x "$SNIPE_DIR/start-local.sh"
# Frontend serving (if built)
cat > "$SNIPE_DIR/serve-ui.sh" << 'UISCRIPT'
#!/usr/bin/env bash
# Serve the pre-built Vue frontend on port 8509 (dev only — use nginx for production)
cd "$(dirname "$0")/web/dist"
python3 -m http.server 8509
UISCRIPT
chmod +x "$SNIPE_DIR/serve-ui.sh"
echo ""
ok "Snipe installed (no-Docker mode)"
hr
echo " Start API: cd $SNIPE_DIR && ./start-local.sh"
echo " Serve UI: cd $SNIPE_DIR && ./serve-ui.sh (separate terminal)"
echo " API docs: http://localhost:8510/docs"
echo " Web UI: http://localhost:8509 (after ./serve-ui.sh)"
echo ""
echo " For production, point nginx at web/dist/ and proxy /api/ to localhost:8510"
hr
echo ""

View file

@@ -78,7 +78,7 @@ case "$cmd" in
test)
echo "Running test suite..."
docker compose -f "$COMPOSE_FILE" exec api \
python -m pytest /app/snipe/tests/ -v "${@}"
conda run -n job-seeker python -m pytest /app/snipe/tests/ -v "${@}"
;;
# ── Cloud commands ────────────────────────────────────────────────────────

View file

@@ -8,7 +8,7 @@ version = "0.1.0"
description = "Auction listing monitor and trust scorer"
requires-python = ">=3.11"
dependencies = [
"circuitforge-core>=0.8.0",
"circuitforge-core",
"streamlit>=1.32",
"requests>=2.31",
"imagehash>=4.3",

View file

@@ -1,134 +0,0 @@
"""Tests for the shared feedback router (circuitforge-core) mounted in snipe."""
from __future__ import annotations
from collections.abc import Callable
from unittest.mock import MagicMock, patch
from fastapi import FastAPI
from fastapi.testclient import TestClient
from circuitforge_core.api.feedback import make_feedback_router
# ── Test app factory ──────────────────────────────────────────────────────────
def _make_client(demo_mode_fn: Callable[[], bool] | None = None) -> TestClient:
app = FastAPI()
router = make_feedback_router(
repo="Circuit-Forge/snipe",
product="snipe",
demo_mode_fn=demo_mode_fn,
)
app.include_router(router, prefix="/api/feedback")
return TestClient(app)
# ── GET /api/feedback/status ──────────────────────────────────────────────────
def test_status_disabled_when_no_token(monkeypatch):
monkeypatch.delenv("FORGEJO_API_TOKEN", raising=False)
monkeypatch.delenv("DEMO_MODE", raising=False)
client = _make_client(demo_mode_fn=lambda: False)
res = client.get("/api/feedback/status")
assert res.status_code == 200
assert res.json() == {"enabled": False}
def test_status_enabled_when_token_set(monkeypatch):
monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
client = _make_client(demo_mode_fn=lambda: False)
res = client.get("/api/feedback/status")
assert res.status_code == 200
assert res.json() == {"enabled": True}
def test_status_disabled_in_demo_mode(monkeypatch):
monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
demo = True
client = _make_client(demo_mode_fn=lambda: demo)
res = client.get("/api/feedback/status")
assert res.status_code == 200
assert res.json() == {"enabled": False}
# ── POST /api/feedback ────────────────────────────────────────────────────────
def test_submit_returns_503_when_no_token(monkeypatch):
monkeypatch.delenv("FORGEJO_API_TOKEN", raising=False)
client = _make_client(demo_mode_fn=lambda: False)
res = client.post("/api/feedback", json={
"title": "Test", "description": "desc", "type": "bug",
})
assert res.status_code == 503
def test_submit_returns_403_in_demo_mode(monkeypatch):
monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
demo = True
client = _make_client(demo_mode_fn=lambda: demo)
res = client.post("/api/feedback", json={
"title": "Test", "description": "desc", "type": "bug",
})
assert res.status_code == 403
def test_submit_creates_issue(monkeypatch):
monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
label_response = MagicMock()
label_response.ok = True
label_response.json.return_value = [
{"id": 1, "name": "beta-feedback"},
{"id": 2, "name": "needs-triage"},
{"id": 3, "name": "bug"},
]
issue_response = MagicMock()
issue_response.ok = True
issue_response.json.return_value = {
"number": 7,
"html_url": "https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/7",
}
client = _make_client(demo_mode_fn=lambda: False)
with patch("circuitforge_core.api.feedback.requests.get", return_value=label_response), \
patch("circuitforge_core.api.feedback.requests.post", return_value=issue_response):
res = client.post("/api/feedback", json={
"title": "Listing scores wrong",
"description": "Trust score shows 0 when seller has 1000 feedback",
"type": "bug",
"repro": "1. Search for anything\n2. Check trust score",
"tab": "search",
})
assert res.status_code == 200
data = res.json()
assert data["issue_number"] == 7
assert data["issue_url"] == "https://git.opensourcesolarpunk.com/Circuit-Forge/snipe/issues/7"
def test_submit_returns_502_on_forgejo_error(monkeypatch):
monkeypatch.setenv("FORGEJO_API_TOKEN", "test-token")
label_response = MagicMock()
label_response.ok = True
label_response.json.return_value = [
{"id": 1, "name": "beta-feedback"},
{"id": 2, "name": "needs-triage"},
{"id": 3, "name": "question"},
]
bad_response = MagicMock()
bad_response.ok = False
bad_response.text = "internal server error"
client = _make_client(demo_mode_fn=lambda: False)
with patch("circuitforge_core.api.feedback.requests.get", return_value=label_response), \
patch("circuitforge_core.api.feedback.requests.post", return_value=bad_response):
res = client.post("/api/feedback", json={
"title": "Oops", "description": "desc", "type": "other",
})
assert res.status_code == 502
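For orientation, the enable/disable behaviour these status tests pin down reduces to one predicate. The sketch below is a hypothetical mirror of that logic, not the actual `circuitforge_core.api.feedback` implementation (the function name is an assumption):

```python
import os

def feedback_enabled(demo_mode: bool) -> bool:
    # Mirrors /api/feedback/status as the tests describe it: the feedback
    # button is always hidden in demo mode, and otherwise requires a
    # FORGEJO_API_TOKEN so submissions can reach Forgejo.
    if demo_mode:
        return False
    return bool(os.environ.get("FORGEJO_API_TOKEN"))
```

The same predicate explains the POST status codes: 403 when demo mode wins, 503 when only the token is missing.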

@@ -80,45 +80,6 @@ def test_suspicious_price_flagged_when_price_genuinely_low():
assert "suspicious_price" in result.red_flags_json
def test_scratch_dent_flagged_from_title_slash_variant():
"""Title containing 'parts/repair' (slash variant, no 'or') must trigger scratch_dent_mentioned."""
agg = Aggregator()
scores = {k: 15 for k in ["account_age", "feedback_count",
"feedback_ratio", "price_vs_market", "category_history"]}
result = agg.aggregate(
scores, photo_hash_duplicate=False, seller=None,
listing_title="Generic Widget XL - Parts/Repair",
)
assert "scratch_dent_mentioned" in result.red_flags_json
def test_scratch_dent_flagged_from_condition_field():
"""eBay formal condition 'for parts or not working' must trigger scratch_dent_mentioned
even when the listing title contains no damage keywords."""
agg = Aggregator()
scores = {k: 15 for k in ["account_age", "feedback_count",
"feedback_ratio", "price_vs_market", "category_history"]}
result = agg.aggregate(
scores, photo_hash_duplicate=False, seller=None,
listing_title="Generic Widget XL",
listing_condition="for parts or not working",
)
assert "scratch_dent_mentioned" in result.red_flags_json
def test_scratch_dent_not_flagged_for_clean_listing():
"""Clean title + 'New' condition must NOT trigger scratch_dent_mentioned."""
agg = Aggregator()
scores = {k: 15 for k in ["account_age", "feedback_count",
"feedback_ratio", "price_vs_market", "category_history"]}
result = agg.aggregate(
scores, photo_hash_duplicate=False, seller=None,
listing_title="Generic Widget XL",
listing_condition="new",
)
assert "scratch_dent_mentioned" not in result.red_flags_json
def test_new_account_not_flagged_when_age_absent():
"""account_age_days=None (scraper tier) must NOT trigger new_account or account_under_30_days."""
agg = Aggregator()

@@ -120,13 +120,10 @@ export const useSearchStore = defineStore('search', () => {
)
const marketPrice = ref<number | null>(cached?.marketPrice ?? null)
const adapterUsed = ref<'api' | 'scraper' | null>(cached?.adapterUsed ?? null)
const affiliateActive = ref<boolean>(false)
const loading = ref(false)
const error = ref<string | null>(null)
const enriching = ref(false) // true while SSE stream is open
let _abort: AbortController | null = null
let _sse: EventSource | null = null
function cancelSearch() {
_abort?.abort()
@@ -167,8 +164,6 @@ export const useSearchStore = defineStore('search', () => {
sellers: Record<string, Seller>
market_price: number | null
adapter_used: 'api' | 'scraper'
affiliate_active: boolean
session_id: string | null
}
results.value = data.listings ?? []
@@ -176,7 +171,6 @@ export const useSearchStore = defineStore('search', () => {
sellers.value = new Map(Object.entries(data.sellers ?? {}))
marketPrice.value = data.market_price ?? null
adapterUsed.value = data.adapter_used ?? null
affiliateActive.value = data.affiliate_active ?? false
saveCache({
query: q,
results: results.value,
@@ -185,12 +179,6 @@ export const useSearchStore = defineStore('search', () => {
marketPrice: marketPrice.value,
adapterUsed: adapterUsed.value,
})
// Open SSE stream if any scores are partial and a session_id was provided
const hasPartial = Object.values(data.trust_scores ?? {}).some(ts => ts.score_is_partial)
if (data.session_id && hasPartial) {
_openUpdates(data.session_id, apiBase)
}
} catch (e) {
if (e instanceof DOMException && e.name === 'AbortError') {
// User cancelled — clear loading but don't surface as an error
@@ -205,57 +193,6 @@ export const useSearchStore = defineStore('search', () => {
}
}
function closeUpdates() {
if (_sse) {
_sse.close()
_sse = null
}
enriching.value = false
}
function _openUpdates(sessionId: string, apiBase: string) {
closeUpdates() // close any previous stream
enriching.value = true
const es = new EventSource(`${apiBase}/api/updates/${sessionId}`)
_sse = es
es.onmessage = (e) => {
try {
const update = JSON.parse(e.data) as {
platform_listing_id: string
trust_score: TrustScore
seller: Record<string, unknown>
market_price: number | null
}
if (update.platform_listing_id && update.trust_score) {
trustScores.value = new Map(trustScores.value)
trustScores.value.set(update.platform_listing_id, update.trust_score)
}
if (update.seller) {
const s = update.seller as Seller
if (s.platform_seller_id) {
sellers.value = new Map(sellers.value)
sellers.value.set(s.platform_seller_id, s)
}
}
if (update.market_price != null) {
marketPrice.value = update.market_price
}
} catch {
// malformed event — ignore
}
}
es.addEventListener('done', () => {
closeUpdates()
})
es.onerror = () => {
closeUpdates()
}
}
async function enrichSeller(sellerUsername: string, listingId: string): Promise<void> {
const apiBase = (import.meta.env.VITE_API_BASE as string) ?? ''
const params = new URLSearchParams({
@@ -288,14 +225,11 @@ export const useSearchStore = defineStore('search', () => {
sellers,
marketPrice,
adapterUsed,
affiliateActive,
loading,
enriching,
error,
search,
cancelSearch,
enrichSeller,
closeUpdates,
clearResults,
}
})
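The `_openUpdates` handler above consumes one JSON payload per enriched listing, plus a terminal `done` event that tells the client to close the stream. A server-side sketch of that wire format (field names match the TypeScript parser; the listing ID, score, and prices are made-up example values):

```python
import json

# One enrichment event in the shape _openUpdates expects; values are
# illustrative, not real eBay data.
event = {
    "platform_listing_id": "123456789012",
    "trust_score": {"score": 72, "score_is_partial": False},
    "seller": {"platform_seller_id": "example_seller", "feedback_count": 1000},
    "market_price": 42.50,
}

# SSE frames: an unnamed frame fires es.onmessage; a named 'done' frame
# fires the addEventListener('done', ...) handler that closes the stream.
data_frame = f"data: {json.dumps(event)}\n\n"
done_frame = "event: done\ndata: {}\n\n"
```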

@@ -91,17 +91,6 @@
aria-label="Search filters"
>
<!-- Clear all filters only shown when at least one filter is active -->
<button
v-if="activeFilterCount > 0"
type="button"
class="filter-clear-btn"
@click="resetFilters"
aria-label="Clear all filters"
>
Clear filters ({{ activeFilterCount }})
</button>
<!-- eBay Search Parameters -->
<!-- These are sent to eBay. Changes require a new search to take effect. -->
<h2 class="filter-section-heading filter-section-heading--search">
@@ -307,16 +296,8 @@
<span v-if="hiddenCount > 0" class="results-hidden">
· {{ hiddenCount }} hidden by filters
</span>
<span v-if="store.affiliateActive" class="affiliate-disclosure">
· Links may include an affiliate code
</span>
</p>
<div class="toolbar-actions">
<!-- Live enrichment indicator visible while SSE stream is open -->
<span v-if="store.enriching" class="enriching-badge" aria-live="polite" title="Scores updating as seller data arrives">
<span class="enriching-dot" aria-hidden="true"></span>
Updating scores
</span>
<label for="sort-select" class="sr-only">Sort by</label>
<select id="sort-select" v-model="sortBy" class="sort-select">
<option v-for="opt in SORT_OPTIONS" :key="opt.value" :value="opt.value">
@@ -421,7 +402,7 @@ onMounted(() => {
// Filters
const DEFAULT_FILTERS: SearchFilters = {
const filters = reactive<SearchFilters>({
minTrustScore: 0,
minPrice: undefined,
maxPrice: undefined,
@@ -440,13 +421,7 @@ const DEFAULT_FILTERS: SearchFilters = {
mustExclude: '',
categoryId: '',
adapter: 'auto' as 'auto' | 'api' | 'scraper',
}
const filters = reactive<SearchFilters>({ ...DEFAULT_FILTERS })
function resetFilters() {
Object.assign(filters, DEFAULT_FILTERS)
}
})
// Parse comma-separated keyword strings into trimmed, lowercase, non-empty term arrays
const parsedMustInclude = computed(() =>
@@ -789,27 +764,6 @@ async function onSearch() {
margin-bottom: var(--space-2);
}
/* Clear all filters button */
.filter-clear-btn {
display: flex;
align-items: center;
gap: var(--space-1);
width: 100%;
padding: var(--space-1) var(--space-2);
margin-bottom: var(--space-2);
background: color-mix(in srgb, var(--color-red, #ef4444) 12%, transparent);
color: var(--color-red, #ef4444);
border: 1px solid color-mix(in srgb, var(--color-red, #ef4444) 30%, transparent);
border-radius: var(--radius-sm);
font-size: 0.75rem;
font-weight: 600;
cursor: pointer;
transition: background 0.15s, color 0.15s;
}
.filter-clear-btn:hover {
background: color-mix(in srgb, var(--color-red, #ef4444) 22%, transparent);
}
/* Section headings that separate eBay Search params from local filters */
.filter-section-heading {
font-size: 0.6875rem;
@@ -1075,7 +1029,6 @@
}
.results-hidden { color: var(--color-warning); }
.affiliate-disclosure { color: var(--color-text-muted, #8b949e); font-size: 0.8em; }
.toolbar-actions {
display: flex;
@@ -1084,33 +1037,6 @@
flex-wrap: wrap;
}
.enriching-badge {
display: inline-flex;
align-items: center;
gap: var(--space-1);
padding: var(--space-1) var(--space-2);
background: color-mix(in srgb, var(--app-primary) 10%, transparent);
border: 1px solid color-mix(in srgb, var(--app-primary) 30%, transparent);
border-radius: var(--radius-full, 9999px);
color: var(--app-primary);
font-size: 0.75rem;
font-weight: 500;
white-space: nowrap;
}
.enriching-dot {
width: 6px;
height: 6px;
border-radius: 50%;
background: var(--app-primary);
animation: enriching-pulse 1.2s ease-in-out infinite;
}
@keyframes enriching-pulse {
0%, 100% { opacity: 1; transform: scale(1); }
50% { opacity: 0.4; transform: scale(0.7); }
}
.save-btn {
display: flex;
align-items: center;