docs: update tier-system reference with BYOK policy + demo user.yaml

docs/reference/tier-system.md:
  - Rewritten tier table: free tier now described as "AI unlocks with BYOK"
  - New BYOK section explaining the policy and rationale
  - Feature gate table gains BYOK-unlocks? column
  - API reference updated: can_use, tier_label, has_configured_llm with examples
  - "Adding a new feature gate" guide updated to cover BYOK_UNLOCKABLE

demo/config/user.yaml:
  - Reformatted by YAML linter; added dismissed_banners for demo UX
pyr0ball 2026-03-02 13:22:10 -08:00
parent ebb82b7ca7
commit f1194cacc9
2 changed files with 111 additions and 85 deletions


@ -1,51 +1,44 @@
# Demo user profile — pre-seeded for the public menagerie demo.
# No real personal information. All AI features are disabled (DEMO_MODE=true).
name: "Demo User"
email: "demo@circuitforge.tech"
phone: ""
linkedin: ""
career_summary: >
  Experienced software engineer with a background in full-stack development,
  cloud infrastructure, and data pipelines. Passionate about building tools
  that help people navigate complex systems.
nda_companies: []
mission_preferences:
  music: ""
  animal_welfare: ""
  education: ""
  social_impact: "Want my work to reach people who need it most."
  health: ""
candidate_voice: "Clear, direct, and human. Focuses on impact over jargon."
candidate_accessibility_focus: false
candidate_lgbtq_focus: false
candidate_voice: Clear, direct, and human. Focuses on impact over jargon.
career_summary: 'Experienced software engineer with a background in full-stack development,
  cloud infrastructure, and data pipelines. Passionate about building tools that help
  people navigate complex systems.'
dev_tier_override: null
wizard_complete: true
wizard_step: 0
dismissed_banners: []
docs_dir: "/docs"
ollama_models_dir: "~/models/ollama"
vllm_models_dir: "~/models/vllm"
inference_profile: "remote"
dismissed_banners:
- connect_cloud
- setup_email
docs_dir: /docs
email: demo@circuitforge.tech
inference_profile: remote
linkedin: ''
mission_preferences:
  animal_welfare: ''
  education: ''
  health: ''
  music: ''
  social_impact: Want my work to reach people who need it most.
name: Demo User
nda_companies: []
ollama_models_dir: ~/models/ollama
phone: ''
services:
  streamlit_port: 8501
  ollama_host: localhost
  ollama_port: 11434
  ollama_ssl: false
  ollama_ssl_verify: true
  vllm_host: localhost
  vllm_port: 8000
  vllm_ssl: false
  vllm_ssl_verify: true
  searxng_host: searxng
  searxng_port: 8080
  searxng_ssl: false
  searxng_ssl_verify: true
  streamlit_port: 8501
  vllm_host: localhost
  vllm_port: 8000
  vllm_ssl: false
  vllm_ssl_verify: true
tier: free
vllm_models_dir: ~/models/vllm
wizard_complete: true
wizard_step: 0


@ -12,41 +12,53 @@ free < paid < premium
| Tier | Description |
|------|-------------|
| `free` | Core discovery pipeline, resume matching, and basic UI — no LLM features |
| `paid` | All AI features: cover letters, research, email, integrations, calendar, notifications |
| `free` | Core discovery pipeline, resume matching, basic UI. AI features unlock with BYOK. |
| `paid` | Managed cloud LLM (no key required), integrations, calendar, notifications |
| `premium` | Adds fine-tuning and multi-user support |
---
## BYOK — Bring Your Own Key
If you configure any LLM backend in `config/llm.yaml` — local (ollama, vllm) **or** an external API key (Anthropic, OpenAI, etc.) — **all pure LLM-call features unlock automatically**, regardless of your subscription tier.
The paid tier gives you access to CircuitForge's managed cloud inference. It does not gate your ability to use AI when you're providing the compute yourself.
Features that unlock with BYOK are listed in `BYOK_UNLOCKABLE` in `tiers.py`. Features that depend on CircuitForge-operated infrastructure (integrations, email classifier training, fine-tuned models) remain tier-gated.
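How the tier ladder and `BYOK_UNLOCKABLE` compose can be sketched roughly like this (an illustrative reconstruction, not the actual `tiers.py` source — names and ordering of checks may differ in the real implementation):

```python
# Illustrative sketch of BYOK-aware gating — not the actual tiers.py source.
TIER_ORDER = {"free": 0, "paid": 1, "premium": 2}

FEATURES = {
    "company_research": "paid",      # pure LLM call — BYOK-unlockable
    "notion_sync": "paid",           # CF-operated integration — tier-gated only
    "model_fine_tuning": "premium",  # CF infrastructure — tier-gated only
}

BYOK_UNLOCKABLE = frozenset({"company_research"})

def can_use(tier: str, feature: str, has_byok: bool = False) -> bool:
    required = FEATURES.get(feature)
    if required is None:
        return True   # ungated features are available to everyone
    if has_byok and feature in BYOK_UNLOCKABLE:
        return True   # the user supplies the compute, so the gate opens
    if tier not in TIER_ORDER:
        return False  # invalid tier strings never unlock anything
    return TIER_ORDER[tier] >= TIER_ORDER[required]
```

Note that BYOK short-circuits the tier comparison only for features in `BYOK_UNLOCKABLE`; everything else still climbs the `free < paid < premium` ladder.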
---
## Feature Gate Table
Features listed here require a minimum tier. Features not in this table are available to all tiers (free by default).
### Wizard LLM generation
| Feature key | Minimum tier | Description |
|-------------|-------------|-------------|
| `llm_career_summary` | paid | LLM-assisted career summary generation in the wizard |
| `llm_expand_bullets` | paid | LLM expansion of resume bullet points |
| `llm_suggest_skills` | paid | LLM skill suggestions from resume content |
| `llm_voice_guidelines` | premium | LLM writing voice and tone guidelines |
| `llm_job_titles` | paid | LLM-suggested job title variations for search |
| `llm_keywords_blocklist` | paid | LLM-suggested blocklist keywords |
| `llm_mission_notes` | paid | LLM-generated mission alignment notes |
| Feature key | Minimum tier | BYOK unlocks? | Description |
|-------------|-------------|---------------|-------------|
| `llm_career_summary` | paid | ✅ yes | LLM-assisted career summary generation in the wizard |
| `llm_expand_bullets` | paid | ✅ yes | LLM expansion of resume bullet points |
| `llm_suggest_skills` | paid | ✅ yes | LLM skill suggestions from resume content |
| `llm_voice_guidelines` | premium | ✅ yes | LLM writing voice and tone guidelines |
| `llm_job_titles` | paid | ✅ yes | LLM-suggested job title variations for search |
| `llm_mission_notes` | paid | ✅ yes | LLM-generated mission alignment notes |
| `llm_keywords_blocklist` | paid | ❌ no | Orchestration pipeline over background keyword data |
### App features
| Feature key | Minimum tier | Description |
|-------------|-------------|-------------|
| `company_research` | paid | Auto-generated company research briefs pre-interview |
| `interview_prep` | paid | Live reference sheet and practice Q&A during calls |
| `email_classifier` | paid | IMAP email sync with LLM classification |
| `survey_assistant` | paid | Culture-fit survey Q&A helper (text + screenshot) |
| `model_fine_tuning` | premium | Cover letter model fine-tuning on personal writing |
| `shared_cover_writer_model` | paid | Access to shared fine-tuned cover letter model |
| `multi_user` | premium | Multiple user profiles on one instance |
| Feature key | Minimum tier | BYOK unlocks? | Description |
|-------------|-------------|---------------|-------------|
| `company_research` | paid | ✅ yes | Auto-generated company research briefs pre-interview |
| `interview_prep` | paid | ✅ yes | Live reference sheet and practice Q&A during calls |
| `survey_assistant` | paid | ✅ yes | Culture-fit survey Q&A helper (text + screenshot) |
| `email_classifier` | paid | ❌ no | IMAP email sync with LLM classification (training pipeline) |
| `model_fine_tuning` | premium | ❌ no | Cover letter model fine-tuning on personal writing |
| `shared_cover_writer_model` | paid | ❌ no | Access to shared fine-tuned cover letter model (CF infra) |
| `multi_user` | premium | ❌ no | Multiple user profiles on one instance |
### Integrations (paid)
### Integrations
Integrations depend on CircuitForge-operated infrastructure and are **not** BYOK-unlockable.
| Feature key | Minimum tier | Description |
|-------------|-------------|-------------|
@ -73,31 +85,46 @@ The following integrations are free for all tiers and are not in the `FEATURES`
## API Reference
### `can_use(tier, feature) -> bool`
### `can_use(tier, feature, has_byok=False) -> bool`
Returns `True` if the given tier has access to the feature.
Returns `True` if the given tier has access to the feature. Pass `has_byok=has_configured_llm()` to apply BYOK unlock logic.
```python
from app.wizard.tiers import can_use
from app.wizard.tiers import can_use, has_configured_llm
can_use("free", "company_research") # False
can_use("paid", "company_research") # True
can_use("premium", "company_research") # True
byok = has_configured_llm()
can_use("free", "unknown_feature") # True — ungated features return True
can_use("invalid", "company_research") # False — invalid tier string
can_use("free", "company_research") # False — no LLM configured
can_use("free", "company_research", has_byok=True) # True — BYOK unlocks it
can_use("paid", "company_research") # True
can_use("free", "notion_sync", has_byok=True) # False — integration, not BYOK-unlockable
can_use("free", "unknown_feature") # True — ungated features return True
can_use("invalid", "company_research") # False — invalid tier string
```
### `tier_label(feature) -> str`
### `has_configured_llm(config_path=None) -> bool`
Returns a display badge string for locked features, or `""` if the feature is free or unknown.
Returns `True` if at least one non-vision LLM backend is enabled in `config/llm.yaml`. Local backends (ollama, vllm) and external API keys both count.
```python
from app.wizard.tiers import has_configured_llm
has_configured_llm() # True if any backend is enabled and not vision_service
```
### `tier_label(feature, has_byok=False) -> str`
Returns a display badge string for locked features, or `""` if the feature is free, unlocked, or BYOK-accessible.
```python
from app.wizard.tiers import tier_label
tier_label("company_research") # "🔒 Paid"
tier_label("model_fine_tuning") # "⭐ Premium"
tier_label("job_discovery") # "" (ungated)
tier_label("company_research") # "🔒 Paid"
tier_label("company_research", has_byok=True) # "" (BYOK unlocks, no label shown)
tier_label("model_fine_tuning") # "⭐ Premium"
tier_label("notion_sync", has_byok=True) # "🔒 Paid" (BYOK doesn't unlock integrations)
tier_label("job_discovery") # "" (ungated)
```
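The badge selection above can be sketched as a thin layer over the same gate data (illustrative only — not the actual `tiers.py` source):

```python
# Illustrative sketch of badge selection — not the actual tiers.py source.
FEATURES = {"company_research": "paid", "model_fine_tuning": "premium"}
BYOK_UNLOCKABLE = frozenset({"company_research"})
BADGES = {"paid": "🔒 Paid", "premium": "⭐ Premium"}

def tier_label_sketch(feature: str, has_byok: bool = False) -> str:
    required = FEATURES.get(feature)
    if required is None:
        return ""  # ungated or unknown feature → no badge
    if has_byok and feature in BYOK_UNLOCKABLE:
        return ""  # BYOK-accessible → no badge shown
    return BADGES.get(required, "")
```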
---
@ -120,36 +147,42 @@ dev_tier_override: premium # overrides tier locally for testing
## Adding a New Feature Gate
1. Add the feature to `FEATURES` in `app/wizard/tiers.py`:
1. Add the feature to `FEATURES` in `app/wizard/tiers.py`. If it's a pure LLM call that should unlock with BYOK, also add it to `BYOK_UNLOCKABLE`:
```python
FEATURES: dict[str, str] = {
    # ...existing entries...
    "my_new_feature": "paid",  # or "free" | "premium"
    "my_new_llm_feature": "paid",
}
BYOK_UNLOCKABLE: frozenset[str] = frozenset({
    # ...existing entries...
    "my_new_llm_feature",  # add here if it's a pure LLM call
})
```
2. Guard the feature in the UI:
2. Guard the feature in the UI, passing `has_byok`:
```python
from app.wizard.tiers import can_use, tier_label
from scripts.user_profile import UserProfile
from app.wizard.tiers import can_use, tier_label, has_configured_llm
user = UserProfile()
if can_use(user.tier, "my_new_feature"):
_byok = has_configured_llm()
if can_use(user.tier, "my_new_llm_feature", has_byok=_byok):
    # show the feature
    pass
else:
    st.info(f"My New Feature requires a {tier_label('my_new_feature').replace('🔒 ', '').replace('⭐ ', '')} plan.")
    st.info("Requires a paid plan or a configured LLM backend.")
```
3. Add a test in `tests/test_tiers.py`:
3. Add tests in `tests/test_wizard_tiers.py` covering both the tier gate and BYOK unlock:
```python
def test_my_new_feature_requires_paid():
    assert can_use("free", "my_new_feature") is False
    assert can_use("paid", "my_new_feature") is True
    assert can_use("premium", "my_new_feature") is True

def test_my_new_feature_requires_paid_without_byok():
    assert can_use("free", "my_new_llm_feature") is False
    assert can_use("paid", "my_new_llm_feature") is True

def test_my_new_feature_byok_unlocks():
    assert can_use("free", "my_new_llm_feature", has_byok=True) is True
```
---