LLM query builder: chat UI to describe what to find, AI builds the search #29

Closed
opened 2026-04-09 08:49:57 -07:00 by pyr0ball · 0 comments
Owner

Feature

Add a chat/natural-language input panel to the search UI (Premium tier) where the user describes what they want in plain English. An LLM translates the description into a fully-populated Snipe search: query, must_include OR groups, must_exclude, price range, and category.

Motivation

The OR-group filter system is powerful but non-obvious — users have to know eBay search syntax, GPU model families, what to exclude, etc. This surfaces that power without requiring users to learn it.

Example

"I want bare laptop motherboards with a decent GPU for running local LLMs, working condition, under $250"

LLM output:

```json
{
  "query": "laptop motherboard",
  "must_include": "rtx 3060|rtx 3070|rtx 3080|rx 6700m|rx 6800m",
  "must_include_mode": "groups",
  "must_exclude": "broken,cracked,no post,for parts,untested,screen,chassis",
  "max_price": 250,
  "category_id": "177946"
}
```
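To make the field semantics concrete, here is a minimal sketch of applying filters like these to listing titles. The semantics are assumed from the example above (a simplification of "groups" mode with a single OR group): `must_include` is pipe-separated OR alternatives, `must_exclude` is a comma-separated blocklist, and `max_price` caps the listing price. The `matches` helper is hypothetical, not an existing Snipe function.

```python
# Hypothetical sketch of how generated filters could gate listings.
# Assumed semantics: must_include = pipe-separated OR alternatives
# (single group), must_exclude = comma-separated disqualifying terms.

def matches(listing: dict, filters: dict) -> bool:
    title = listing["title"].lower()
    includes = [t.strip() for t in filters.get("must_include", "").split("|") if t.strip()]
    excludes = [t.strip() for t in filters.get("must_exclude", "").split(",") if t.strip()]

    # At least one OR alternative must appear in the title.
    if includes and not any(t in title for t in includes):
        return False
    # Any excluded term disqualifies the listing.
    if any(t in title for t in excludes):
        return False
    # Enforce the price ceiling if one was generated.
    max_price = filters.get("max_price")
    if max_price is not None and listing["price"] > max_price:
        return False
    return True

filters = {
    "must_include": "rtx 3060|rtx 3070|rtx 3080",
    "must_exclude": "broken,for parts",
    "max_price": 250,
}
print(matches({"title": "Laptop motherboard RTX 3070, tested", "price": 180}, filters))  # True
print(matches({"title": "RTX 3080 board for parts", "price": 100}, filters))             # False
```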

Design

  • Collapsible panel above the search bar (not a separate page)
  • User types a description, hits Enter or a "Build" button
  • Shows the generated filters in the existing filter UI before running (user can tweak)
  • Streaming response preferred so it feels fast
  • Local LLM (BYOK) at the Paid tier; cloud LLM for Premium
  • The snipe-consumer skill (Claude Code) already encodes this domain knowledge and can serve as the prompt foundation
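Since the generated filters are shown in the existing filter UI before running, the LLM's raw output needs a validation pass first. A minimal sketch, assuming the keys and types from the example above (the allowed-key set and error handling here are illustrative, not an existing Snipe API):

```python
import json

# Hypothetical validation step before pre-filling the filter UI.
# Keys/types are taken from the example JSON in this issue.
ALLOWED_KEYS = {"query", "must_include", "must_include_mode",
                "must_exclude", "max_price", "category_id"}

def parse_llm_filters(raw: str) -> dict:
    """Parse and sanity-check the LLM's JSON; raise on anything unexpected."""
    data = json.loads(raw)
    unknown = set(data) - ALLOWED_KEYS
    if unknown:
        raise ValueError(f"unexpected keys from LLM: {sorted(unknown)}")
    if not isinstance(data.get("query"), str) or not data["query"].strip():
        raise ValueError("query must be a non-empty string")
    if "max_price" in data and not isinstance(data["max_price"], (int, float)):
        raise ValueError("max_price must be a number")
    return data
```

On failure the panel would surface the error instead of populating filters, so a malformed LLM response never silently triggers a search.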

Tier

Premium (local BYOK unlockable at Paid). Core search remains fully manual for Free users.

Reference: Circuit-Forge/snipe#29