chore: initial commit — kiwi Phase 2 complete
Pantry tracker app with:
- FastAPI backend + Vue 3 SPA frontend
- SQLite via circuitforge-core (migrations 001-005)
- Inventory CRUD, barcode scan, receipt OCR pipeline
- Expiry prediction (deterministic + LLM fallback)
- CF-core tier system integration
- Cloud session support (menagerie)
Commit: 8cbde774e5
91 changed files with 15250 additions and 0 deletions
37  .env.example  Normal file
@@ -0,0 +1,37 @@
# Kiwi — environment variables
# Copy to .env and fill in values

# API
API_PREFIX=/api/v1
CORS_ORIGINS=http://localhost:5173,http://localhost:8509

# Storage
DATA_DIR=./data

# Database (defaults to DATA_DIR/kiwi.db)
# DB_PATH=./data/kiwi.db

# Processing
USE_GPU=true
GPU_MEMORY_LIMIT=6144
MAX_CONCURRENT_JOBS=4
MIN_QUALITY_SCORE=50.0

# Feature flags
ENABLE_OCR=false

# Runtime
DEBUG=false
CLOUD_MODE=false
DEMO_MODE=false

# Cloud mode (set in compose.cloud.yml; also set here for reference)
# CLOUD_DATA_ROOT=/devl/kiwi-cloud-data
# KIWI_DB=data/kiwi.db  # local-mode DB path override

# Heimdall license server (required for cloud tier resolution)
# HEIMDALL_URL=https://license.circuitforge.tech
# HEIMDALL_ADMIN_TOKEN=

# Directus JWT (must match cf-directus SECRET env var)
# DIRECTUS_JWT_SECRET=
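CORS_ORIGINS above is a single comma-separated string. The settings module that consumes it is not part of this excerpt, so the exact parsing is an assumption; a minimal sketch of how such a value is typically split into the list FastAPI's CORS middleware expects:

```python
def parse_cors_origins(raw: str) -> list[str]:
    """Split a comma-separated CORS_ORIGINS value, dropping blanks and whitespace."""
    return [origin.strip() for origin in raw.split(",") if origin.strip()]

origins = parse_cors_origins("http://localhost:5173,http://localhost:8509")
# → ["http://localhost:5173", "http://localhost:8509"]
```

A trailing comma or stray spaces in the `.env` value are harmless with this shape, which is why the filter on the stripped value is worth keeping.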
21  .gitignore  vendored  Normal file
@@ -0,0 +1,21 @@

# Superpowers brainstorming artifacts
.superpowers/

# Git worktrees
.worktrees/

# Python bytecode
__pycache__/
*.pyc
*.pyo

# Environment files (keep .env.example)
.env

# Node modules
node_modules/
dist/

# Data directories
data/
26  Dockerfile  Normal file
@@ -0,0 +1,26 @@
FROM continuumio/miniconda3:latest

WORKDIR /app

# Install system dependencies for OpenCV + pyzbar
RUN apt-get update && apt-get install -y --no-install-recommends \
    libzbar0 libgl1 libglib2.0-0 \
    && rm -rf /var/lib/apt/lists/*

# Install circuitforge-core from sibling directory (compose sets context: ..)
COPY circuitforge-core/ ./circuitforge-core/
RUN conda run -n base pip install --no-cache-dir -e ./circuitforge-core

# Create kiwi conda env and install app
COPY kiwi/environment.yml .
RUN conda env create -f environment.yml

COPY kiwi/ ./kiwi/
# Install cf-core into the kiwi env BEFORE installing kiwi (kiwi lists it as a dep)
RUN conda run -n kiwi pip install --no-cache-dir -e /app/circuitforge-core
WORKDIR /app/kiwi
RUN conda run -n kiwi pip install --no-cache-dir -e .

EXPOSE 8512
CMD ["conda", "run", "--no-capture-output", "-n", "kiwi", \
     "uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8512"]
7  PRIVACY.md  Normal file
@@ -0,0 +1,7 @@
# Privacy Policy

CircuitForge LLC's privacy policy applies to this product and is published at:

**<https://circuitforge.tech/privacy>**

Last reviewed: March 2026.
66  README.md  Normal file
@@ -0,0 +1,66 @@
# 🥝 Kiwi

> *Part of the CircuitForge LLC "AI for the tasks the system made hard on purpose" suite.*

**Pantry tracking and leftover recipe suggestions.**

Scan barcodes, photograph receipts, and get recipe ideas based on what you already have — before it expires.

**Status:** Pre-alpha · CircuitForge LLC

---

## What it does

- **Inventory tracking** — add items by barcode scan, receipt upload, or manual entry
- **Expiry alerts** — know what's about to go bad
- **Receipt OCR** — extract line items from receipt photos automatically (Paid tier)
- **Recipe suggestions** — LLM-powered ideas based on what's expiring (Paid tier, BYOK-unlockable)
- **Leftover mode** — prioritize nearly-expired items in recipe ranking (Premium tier)

## Stack

- **Frontend:** Vue 3 SPA (Vite + TypeScript)
- **Backend:** FastAPI + SQLite (via `circuitforge-core`)
- **Auth:** CF session cookie → Directus JWT (cloud mode)
- **Licensing:** Heimdall (free tier auto-provisioned at signup)

## Running locally

```bash
cp .env.example .env
./manage.sh build
./manage.sh start
# Web: http://localhost:8511
# API: http://localhost:8512
```

## Cloud instance

```bash
./manage.sh cloud-build
./manage.sh cloud-start
# Served at menagerie.circuitforge.tech/kiwi (JWT-gated)
```

## Tiers

| Feature | Free | Paid | Premium |
|---------|------|------|---------|
| Inventory CRUD | ✓ | ✓ | ✓ |
| Barcode scan | ✓ | ✓ | ✓ |
| Receipt upload | ✓ | ✓ | ✓ |
| Expiry alerts | ✓ | ✓ | ✓ |
| CSV export | ✓ | ✓ | ✓ |
| Receipt OCR | BYOK | ✓ | ✓ |
| Recipe suggestions | BYOK | ✓ | ✓ |
| Meal planning | — | ✓ | ✓ |
| Multi-household | — | — | ✓ |
| Leftover mode | — | — | ✓ |

BYOK = bring your own LLM backend (configure `~/.config/circuitforge/llm.yaml`)

## License

- Discovery/pipeline layer: MIT
- AI features: BSL 1.1 (free for personal non-commercial self-hosting)
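The tier matrix in the README is enforced server-side by `app.tiers.can_use` (imported later in `ocr.py`). That module is not included in this excerpt, so the following is only a sketch of how such a gate could encode the table, with the feature names and tier order assumed from the README:

```python
# Hypothetical sketch of a tier gate consistent with the README table;
# the real app/tiers.py may be structured differently.
_FREE = {"inventory_crud", "barcode_scan", "receipt_upload", "expiry_alerts", "csv_export"}
_BYOK_UNLOCKABLE = {"receipt_ocr", "recipe_suggestions"}
_PAID = _FREE | _BYOK_UNLOCKABLE | {"meal_planning"}
_PREMIUM = _PAID | {"multi_household", "leftover_mode"}

_TIERS = {"free": _FREE, "paid": _PAID, "premium": _PREMIUM}

def can_use(feature: str, tier: str, has_byok: bool = False) -> bool:
    """True if the tier (or a configured BYOK backend) unlocks the feature."""
    allowed = _TIERS.get(tier, _FREE)  # unknown tiers fall back to free
    if feature in allowed:
        return True
    return has_byok and feature in _BYOK_UNLOCKABLE

can_use("receipt_ocr", "free")                 # False
can_use("receipt_ocr", "free", has_byok=True)  # True
```

Encoding the matrix as supersets (`_PAID` includes `_FREE`) keeps the gate a single membership test and makes BYOK a narrow escape hatch rather than a parallel tier.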
7  app/__init__.py  Normal file
@@ -0,0 +1,7 @@
# app/__init__.py
"""
Kiwi: Pantry tracking and leftover recipe suggestions.
"""

__version__ = "0.1.0"
__author__ = "Alan 'pyr0ball' Weinstock"
5  app/api/__init__.py  Normal file
@@ -0,0 +1,5 @@
# app/api/__init__.py
"""
API package for Kiwi.
Contains all API routes and endpoint handlers.
"""
4  app/api/endpoints/__init__.py  Normal file
@@ -0,0 +1,4 @@
# app/api/endpoints/__init__.py
"""
API endpoint implementations for Kiwi.
"""
47  app/api/endpoints/export.py  Normal file
@@ -0,0 +1,47 @@
"""Export endpoints — CSV/Excel of receipt and inventory data."""
from __future__ import annotations

import asyncio
import csv
import io

from fastapi import APIRouter, Depends
from fastapi.responses import StreamingResponse

from app.db.session import get_store
from app.db.store import Store

router = APIRouter(prefix="/export", tags=["export"])


@router.get("/receipts/csv")
async def export_receipts_csv(store: Store = Depends(get_store)):
    receipts = await asyncio.to_thread(store.list_receipts, 1000, 0)
    output = io.StringIO()
    fields = ["id", "filename", "status", "created_at", "updated_at"]
    writer = csv.DictWriter(output, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(receipts)
    output.seek(0)
    return StreamingResponse(
        iter([output.getvalue()]),
        media_type="text/csv",
        headers={"Content-Disposition": "attachment; filename=receipts.csv"},
    )


@router.get("/inventory/csv")
async def export_inventory_csv(store: Store = Depends(get_store)):
    items = await asyncio.to_thread(store.list_inventory)
    output = io.StringIO()
    fields = ["id", "product_name", "barcode", "category", "quantity", "unit",
              "location", "expiration_date", "status", "created_at"]
    writer = csv.DictWriter(output, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(items)
    output.seek(0)
    return StreamingResponse(
        iter([output.getvalue()]),
        media_type="text/csv",
        headers={"Content-Disposition": "attachment; filename=inventory.csv"},
    )
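Both export endpoints share one pattern: build the CSV in memory with `csv.DictWriter`, where `extrasaction="ignore"` silently drops any row keys not listed in `fields`, then stream the finished string. The core of that pattern can be exercised without FastAPI — a minimal sketch with made-up rows shaped like the store's output:

```python
import csv
import io

def rows_to_csv(rows: list[dict], fields: list[str]) -> str:
    """Serialize dict rows to CSV, ignoring keys not listed in fields."""
    output = io.StringIO()
    writer = csv.DictWriter(output, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)
    return output.getvalue()

# Hypothetical rows; the real ones come from store.list_inventory()
rows = [
    {"id": 1, "product_name": "Milk", "status": "available", "internal": "x"},
    {"id": 2, "product_name": "Eggs", "status": "consumed", "internal": "y"},
]
csv_text = rows_to_csv(rows, ["id", "product_name", "status"])
# the "internal" key never reaches the CSV thanks to extrasaction="ignore"
```

Without `extrasaction="ignore"`, `DictWriter` raises `ValueError` on the first extra key, so the flag is what lets the endpoints export a stable column subset of whatever the store returns.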
14  app/api/endpoints/health.py  Normal file
@@ -0,0 +1,14 @@
# app/api/endpoints/health.py
from fastapi import APIRouter

router = APIRouter()


@router.get("/")
async def health_check():
    return {"status": "ok", "service": "kiwi-api"}


@router.get("/ping")
async def ping():
    return {"ping": "pong"}
394  app/api/endpoints/inventory.py  Normal file
@@ -0,0 +1,394 @@
"""Inventory API endpoints — products, items, barcode scanning, tags, stats."""

from __future__ import annotations

import asyncio
import uuid
from pathlib import Path
from typing import Any, Dict, List, Optional

import aiofiles
from fastapi import APIRouter, Depends, File, Form, HTTPException, UploadFile, status
from pydantic import BaseModel

from app.cloud_session import CloudUser, get_session
from app.db.session import get_store
from app.db.store import Store
from app.models.schemas.inventory import (
    BarcodeScanResponse,
    InventoryItemCreate,
    InventoryItemResponse,
    InventoryItemUpdate,
    InventoryStats,
    ProductCreate,
    ProductResponse,
    ProductUpdate,
    TagCreate,
    TagResponse,
)

router = APIRouter()


# ── Products ──────────────────────────────────────────────────────────────────

@router.post("/products", response_model=ProductResponse, status_code=status.HTTP_201_CREATED)
async def create_product(body: ProductCreate, store: Store = Depends(get_store)):
    product, _ = await asyncio.to_thread(
        store.get_or_create_product,
        body.name,
        body.barcode,
        brand=body.brand,
        category=body.category,
        description=body.description,
        image_url=body.image_url,
        nutrition_data=body.nutrition_data,
        source=body.source,
        source_data=body.source_data,
    )
    return ProductResponse.model_validate(product)


@router.get("/products", response_model=List[ProductResponse])
async def list_products(store: Store = Depends(get_store)):
    products = await asyncio.to_thread(store.list_products)
    return [ProductResponse.model_validate(p) for p in products]


@router.get("/products/{product_id}", response_model=ProductResponse)
async def get_product(product_id: int, store: Store = Depends(get_store)):
    product = await asyncio.to_thread(store.get_product, product_id)
    if not product:
        raise HTTPException(status_code=404, detail="Product not found")
    return ProductResponse.model_validate(product)


@router.get("/products/barcode/{barcode}", response_model=ProductResponse)
async def get_product_by_barcode(barcode: str, store: Store = Depends(get_store)):
    product = await asyncio.to_thread(
        store._fetch_one, "SELECT * FROM products WHERE barcode = ?", (barcode,)
    )
    if not product:
        raise HTTPException(status_code=404, detail="Product not found")
    return ProductResponse.model_validate(product)


@router.patch("/products/{product_id}", response_model=ProductResponse)
async def update_product(
    product_id: int, body: ProductUpdate, store: Store = Depends(get_store)
):
    updates = body.model_dump(exclude_none=True)
    if not updates:
        product = await asyncio.to_thread(store.get_product, product_id)
    else:
        import json

        sets = ", ".join(f"{k} = ?" for k in updates)
        values = []
        for k, v in updates.items():
            values.append(json.dumps(v) if isinstance(v, dict) else v)
        values.append(product_id)
        await asyncio.to_thread(
            store.conn.execute,
            f"UPDATE products SET {sets}, updated_at = datetime('now') WHERE id = ?",
            values,
        )
        store.conn.commit()
        product = await asyncio.to_thread(store.get_product, product_id)
    if not product:
        raise HTTPException(status_code=404, detail="Product not found")
    return ProductResponse.model_validate(product)


@router.delete("/products/{product_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_product(product_id: int, store: Store = Depends(get_store)):
    existing = await asyncio.to_thread(store.get_product, product_id)
    if not existing:
        raise HTTPException(status_code=404, detail="Product not found")
    await asyncio.to_thread(
        store.conn.execute, "DELETE FROM products WHERE id = ?", (product_id,)
    )
    store.conn.commit()


# ── Inventory items ───────────────────────────────────────────────────────────

@router.post("/items", response_model=InventoryItemResponse, status_code=status.HTTP_201_CREATED)
async def create_inventory_item(body: InventoryItemCreate, store: Store = Depends(get_store)):
    item = await asyncio.to_thread(
        store.add_inventory_item,
        body.product_id,
        body.location,
        quantity=body.quantity,
        unit=body.unit,
        sublocation=body.sublocation,
        purchase_date=str(body.purchase_date) if body.purchase_date else None,
        expiration_date=str(body.expiration_date) if body.expiration_date else None,
        notes=body.notes,
        source=body.source,
    )
    return InventoryItemResponse.model_validate(item)


@router.get("/items", response_model=List[InventoryItemResponse])
async def list_inventory_items(
    location: Optional[str] = None,
    item_status: str = "available",
    store: Store = Depends(get_store),
):
    items = await asyncio.to_thread(store.list_inventory, location, item_status)
    return [InventoryItemResponse.model_validate(i) for i in items]


@router.get("/items/expiring", response_model=List[InventoryItemResponse])
async def get_expiring_items(days: int = 7, store: Store = Depends(get_store)):
    items = await asyncio.to_thread(store.expiring_soon, days)
    return [InventoryItemResponse.model_validate(i) for i in items]


@router.get("/items/{item_id}", response_model=InventoryItemResponse)
async def get_inventory_item(item_id: int, store: Store = Depends(get_store)):
    item = await asyncio.to_thread(store.get_inventory_item, item_id)
    if not item:
        raise HTTPException(status_code=404, detail="Inventory item not found")
    return InventoryItemResponse.model_validate(item)


@router.patch("/items/{item_id}", response_model=InventoryItemResponse)
async def update_inventory_item(
    item_id: int, body: InventoryItemUpdate, store: Store = Depends(get_store)
):
    updates = body.model_dump(exclude_none=True)
    if "purchase_date" in updates and updates["purchase_date"]:
        updates["purchase_date"] = str(updates["purchase_date"])
    if "expiration_date" in updates and updates["expiration_date"]:
        updates["expiration_date"] = str(updates["expiration_date"])
    item = await asyncio.to_thread(store.update_inventory_item, item_id, **updates)
    if not item:
        raise HTTPException(status_code=404, detail="Inventory item not found")
    return InventoryItemResponse.model_validate(item)


@router.post("/items/{item_id}/consume", response_model=InventoryItemResponse)
async def consume_item(item_id: int, store: Store = Depends(get_store)):
    from datetime import datetime, timezone

    item = await asyncio.to_thread(
        store.update_inventory_item,
        item_id,
        status="consumed",
        consumed_at=datetime.now(timezone.utc).isoformat(),
    )
    if not item:
        raise HTTPException(status_code=404, detail="Inventory item not found")
    return InventoryItemResponse.model_validate(item)


@router.delete("/items/{item_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_inventory_item(item_id: int, store: Store = Depends(get_store)):
    existing = await asyncio.to_thread(store.get_inventory_item, item_id)
    if not existing:
        raise HTTPException(status_code=404, detail="Inventory item not found")
    await asyncio.to_thread(
        store.conn.execute, "DELETE FROM inventory_items WHERE id = ?", (item_id,)
    )
    store.conn.commit()


# ── Barcode scanning ──────────────────────────────────────────────────────────

class BarcodeScanTextRequest(BaseModel):
    barcode: str
    location: str = "pantry"
    quantity: float = 1.0
    auto_add_to_inventory: bool = True


@router.post("/scan/text", response_model=BarcodeScanResponse)
async def scan_barcode_text(
    body: BarcodeScanTextRequest,
    store: Store = Depends(get_store),
    session: CloudUser = Depends(get_session),
):
    """Scan a barcode from a text string (e.g. from a hardware scanner or manual entry)."""
    from app.services.expiration_predictor import ExpirationPredictor
    from app.services.openfoodfacts import OpenFoodFactsService

    off = OpenFoodFactsService()
    predictor = ExpirationPredictor()
    product_info = await off.lookup_product(body.barcode)
    inventory_item = None
    result_product = None

    if product_info and body.auto_add_to_inventory:
        product, _ = await asyncio.to_thread(
            store.get_or_create_product,
            product_info.get("name", body.barcode),
            body.barcode,
            brand=product_info.get("brand"),
            category=product_info.get("category"),
            nutrition_data=product_info.get("nutrition_data", {}),
            source="openfoodfacts",
            source_data=product_info,
        )
        exp = predictor.predict_expiration(
            product_info.get("category", ""),
            body.location,
            product_name=product_info.get("name", body.barcode),
            tier=session.tier,
            has_byok=session.has_byok,
        )
        inventory_item = await asyncio.to_thread(
            store.add_inventory_item,
            product["id"], body.location,
            quantity=body.quantity,
            expiration_date=str(exp) if exp else None,
            source="barcode_scan",
        )
        result_product = ProductResponse.model_validate(product)

    return BarcodeScanResponse(
        success=True,
        barcodes_found=1,
        results=[{
            "barcode": body.barcode,
            "barcode_type": "text",
            "product": result_product,
            "inventory_item": InventoryItemResponse.model_validate(inventory_item) if inventory_item else None,
            "added_to_inventory": inventory_item is not None,
            "message": "Added to inventory" if inventory_item else "Product not found in database",
        }],
        message="Barcode processed",
    )


@router.post("/scan", response_model=BarcodeScanResponse)
async def scan_barcode_image(
    file: UploadFile = File(...),
    auto_add_to_inventory: bool = Form(True),
    location: str = Form("pantry"),
    quantity: float = Form(1.0),
    store: Store = Depends(get_store),
    session: CloudUser = Depends(get_session),
):
    """Scan a barcode from an uploaded image. Requires Phase 2 scanner integration."""
    temp_dir = Path("/tmp/kiwi_barcode_scans")
    temp_dir.mkdir(parents=True, exist_ok=True)
    temp_file = temp_dir / f"{uuid.uuid4()}_{file.filename}"
    try:
        async with aiofiles.open(temp_file, "wb") as f:
            await f.write(await file.read())
        from app.services.barcode_scanner import BarcodeScanner
        from app.services.expiration_predictor import ExpirationPredictor
        from app.services.openfoodfacts import OpenFoodFactsService

        barcodes = await asyncio.to_thread(BarcodeScanner().scan_image, temp_file)
        if not barcodes:
            return BarcodeScanResponse(
                success=False, barcodes_found=0, results=[],
                message="No barcodes detected in image"
            )

        off = OpenFoodFactsService()
        predictor = ExpirationPredictor()
        results = []
        for bc in barcodes:
            code = bc["data"]
            product_info = await off.lookup_product(code)
            product = None  # only set when the product is created below
            inventory_item = None
            if product_info and auto_add_to_inventory:
                product, _ = await asyncio.to_thread(
                    store.get_or_create_product,
                    product_info.get("name", code),
                    code,
                    brand=product_info.get("brand"),
                    category=product_info.get("category"),
                    nutrition_data=product_info.get("nutrition_data", {}),
                    source="openfoodfacts",
                    source_data=product_info,
                )
                exp = predictor.predict_expiration(
                    product_info.get("category", ""),
                    location,
                    product_name=product_info.get("name", code),
                    tier=session.tier,
                    has_byok=session.has_byok,
                )
                inventory_item = await asyncio.to_thread(
                    store.add_inventory_item,
                    product["id"], location,
                    quantity=quantity,
                    expiration_date=str(exp) if exp else None,
                    source="barcode_scan",
                )
            results.append({
                "barcode": code,
                "barcode_type": bc.get("type", "unknown"),
                # guard on `product` (not product_info): with auto_add_to_inventory=False
                # a successful lookup would otherwise reference an unbound name
                "product": ProductResponse.model_validate(product) if product else None,
                "inventory_item": InventoryItemResponse.model_validate(inventory_item) if inventory_item else None,
                "added_to_inventory": inventory_item is not None,
                "message": "Added to inventory" if inventory_item else "Barcode scanned",
            })
        return BarcodeScanResponse(
            success=True, barcodes_found=len(barcodes), results=results,
            message=f"Processed {len(barcodes)} barcode(s)"
        )
    finally:
        if temp_file.exists():
            temp_file.unlink()


# ── Tags ──────────────────────────────────────────────────────────────────────

@router.post("/tags", response_model=TagResponse, status_code=status.HTTP_201_CREATED)
async def create_tag(body: TagCreate, store: Store = Depends(get_store)):
    cur = await asyncio.to_thread(
        store.conn.execute,
        "INSERT INTO tags (name, slug, description, color, category) VALUES (?,?,?,?,?) RETURNING *",
        (body.name, body.slug, body.description, body.color, body.category),
    )
    row = cur.fetchone()  # fetch the RETURNING row before committing
    store.conn.commit()
    return TagResponse.model_validate(store._row_to_dict(row))


@router.get("/tags", response_model=List[TagResponse])
async def list_tags(
    category: Optional[str] = None, store: Store = Depends(get_store)
):
    if category:
        tags = await asyncio.to_thread(
            store._fetch_all, "SELECT * FROM tags WHERE category = ? ORDER BY name", (category,)
        )
    else:
        tags = await asyncio.to_thread(
            store._fetch_all, "SELECT * FROM tags ORDER BY name"
        )
    return [TagResponse.model_validate(t) for t in tags]


# ── Stats ─────────────────────────────────────────────────────────────────────

@router.get("/stats", response_model=InventoryStats)
async def get_inventory_stats(store: Store = Depends(get_store)):
    def _stats():
        rows = store._fetch_all(
            """SELECT status, location, COUNT(*) as cnt
               FROM inventory_items GROUP BY status, location"""
        )
        total = sum(r["cnt"] for r in rows)
        available = sum(r["cnt"] for r in rows if r["status"] == "available")
        expired = sum(r["cnt"] for r in rows if r["status"] == "expired")
        expiring = len(store.expiring_soon(7))
        locations = {}
        for r in rows:
            if r["status"] == "available":
                locations[r["location"]] = locations.get(r["location"], 0) + r["cnt"]
        return {
            "total_items": total,
            "available_items": available,
            "expiring_soon": expiring,
            "expired_items": expired,
            "locations": locations,
        }

    return InventoryStats.model_validate(await asyncio.to_thread(_stats))
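The `_stats` helper in the stats endpoint reduces `GROUP BY status, location` rows in Python rather than in SQL. That reduction is easy to check in isolation — a sketch over hypothetical rows (the real rows come from `store._fetch_all`):

```python
def summarize(rows: list[dict]) -> dict:
    """Collapse GROUP BY (status, location) counts into totals and per-location counts."""
    total = sum(r["cnt"] for r in rows)
    available = sum(r["cnt"] for r in rows if r["status"] == "available")
    expired = sum(r["cnt"] for r in rows if r["status"] == "expired")
    locations: dict[str, int] = {}
    for r in rows:
        # only available items count toward the per-location breakdown
        if r["status"] == "available":
            locations[r["location"]] = locations.get(r["location"], 0) + r["cnt"]
    return {"total": total, "available": available, "expired": expired, "locations": locations}

rows = [
    {"status": "available", "location": "pantry", "cnt": 3},
    {"status": "available", "location": "fridge", "cnt": 2},
    {"status": "expired", "location": "fridge", "cnt": 1},
]
stats = summarize(rows)
# → {"total": 6, "available": 5, "expired": 1, "locations": {"pantry": 3, "fridge": 2}}
```

Doing the final aggregation in Python keeps the SQL to one grouped scan while still letting the endpoint derive several shapes (totals, per-status, per-location) from the same row set.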
233
app/api/endpoints/ocr.py
Normal file
233
app/api/endpoints/ocr.py
Normal file
|
|
@ -0,0 +1,233 @@
|
|||
"""OCR status, trigger, and approval endpoints."""
|
||||
from __future__ import annotations
|
||||
|
||||
import asyncio
|
||||
import json
|
||||
import logging
|
||||
from datetime import date
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
from fastapi import APIRouter, BackgroundTasks, Depends, HTTPException
|
||||
|
||||
from app.cloud_session import CloudUser, get_session
|
||||
from app.core.config import settings
|
||||
from app.db.session import get_store
|
||||
from app.db.store import Store
|
||||
from app.models.schemas.receipt import (
|
||||
ApproveOCRRequest,
|
||||
ApproveOCRResponse,
|
||||
ApprovedInventoryItem,
|
||||
)
|
||||
from app.services.expiration_predictor import ExpirationPredictor
|
||||
from app.tiers import can_use
|
||||
from app.utils.units import normalize_to_metric
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
router = APIRouter()
|
||||
|
||||
|
||||
# ── Status ────────────────────────────────────────────────────────────────────
|
||||
|
||||
@router.get("/{receipt_id}/ocr/status")
|
||||
async def get_ocr_status(receipt_id: int, store: Store = Depends(get_store)):
|
||||
receipt = await asyncio.to_thread(store.get_receipt, receipt_id)
|
||||
if not receipt:
|
||||
raise HTTPException(status_code=404, detail="Receipt not found")
|
||||
rd = await asyncio.to_thread(
|
||||
store._fetch_one,
|
||||
"SELECT id, processing_time FROM receipt_data WHERE receipt_id = ?",
|
||||
(receipt_id,),
|
||||
)
|
||||
return {
|
||||
"receipt_id": receipt_id,
|
||||
"status": receipt["status"],
|
||||
"ocr_complete": rd is not None,
|
||||
"ocr_enabled": settings.ENABLE_OCR,
|
||||
}
|
||||
|
||||
|
||||
# ── Trigger ───────────────────────────────────────────────────────────────────
|
||||
|
||||
@router.post("/{receipt_id}/ocr/trigger")
|
||||
async def trigger_ocr(
|
||||
receipt_id: int,
|
||||
background_tasks: BackgroundTasks,
|
||||
store: Store = Depends(get_store),
|
||||
session: CloudUser = Depends(get_session),
|
||||
):
|
||||
"""Manually trigger OCR processing for an already-uploaded receipt."""
|
||||
if not can_use("receipt_ocr", session.tier, session.has_byok):
|
||||
raise HTTPException(
|
||||
status_code=403,
|
||||
detail="Receipt OCR requires Paid tier or a configured local LLM backend (BYOK).",
|
||||
)
|
||||
if not settings.ENABLE_OCR:
|
||||
raise HTTPException(status_code=503, detail="OCR not enabled on this server.")
|
||||
|
||||
receipt = await asyncio.to_thread(store.get_receipt, receipt_id)
|
||||
if not receipt:
|
||||
raise HTTPException(status_code=404, detail="Receipt not found")
|
||||
if receipt["status"] == "processing":
|
||||
raise HTTPException(status_code=409, detail="OCR already in progress for this receipt.")
|
||||
|
||||
image_path = Path(receipt["original_path"])
|
||||
if not image_path.exists():
|
||||
raise HTTPException(status_code=404, detail="Image file not found on disk.")
|
||||
|
||||
async def _run() -> None:
|
||||
try:
|
||||
await asyncio.to_thread(store.update_receipt_status, receipt_id, "processing")
|
||||
from app.services.receipt_service import ReceiptService
|
||||
await ReceiptService(store).process(receipt_id, image_path)
|
||||
except Exception as exc:
|
||||
logger.exception("OCR pipeline failed for receipt %s", receipt_id)
|
||||
await asyncio.to_thread(store.update_receipt_status, receipt_id, "error", str(exc))
|
||||
|
||||
background_tasks.add_task(_run)
|
||||
return {"receipt_id": receipt_id, "status": "queued"}
|
||||
|
||||
|
||||
# ── Data ──────────────────────────────────────────────────────────────────────
|
||||
|
||||
@router.get("/{receipt_id}/ocr/data")
|
||||
async def get_ocr_data(receipt_id: int, store: Store = Depends(get_store)):
|
||||
rd = await asyncio.to_thread(
|
||||
store._fetch_one,
|
||||
"SELECT * FROM receipt_data WHERE receipt_id = ?",
|
||||
(receipt_id,),
|
||||
)
|
||||
if not rd:
|
||||
raise HTTPException(status_code=404, detail="No OCR data for this receipt")
|
||||
return rd
|
||||
|
||||
|
||||
# ── Approve ───────────────────────────────────────────────────────────────────
|
||||
|
||||
@router.post("/{receipt_id}/ocr/approve", response_model=ApproveOCRResponse)
|
||||
async def approve_ocr_items(
|
||||
receipt_id: int,
|
||||
body: ApproveOCRRequest,
|
||||
store: Store = Depends(get_store),
|
||||
session: CloudUser = Depends(get_session),
|
||||
):
|
||||
"""Commit reviewed OCR line items into inventory.
|
||||
|
||||
Reads items from receipt_data, optionally filtered by item_indices,
|
||||
and creates inventory entries. Receipt status moves to 'processed'.
|
||||
"""
|
||||
receipt = await asyncio.to_thread(store.get_receipt, receipt_id)
|
||||
if not receipt:
|
||||
raise HTTPException(status_code=404, detail="Receipt not found")
|
||||
if receipt["status"] not in ("staged", "processed"):
|
||||
raise HTTPException(
|
||||
status_code=409,
|
||||
            detail=f"Receipt is not staged for approval (status={receipt['status']}).",
        )

    rd = await asyncio.to_thread(
        store._fetch_one,
        "SELECT items, transaction_date FROM receipt_data WHERE receipt_id = ?",
        (receipt_id,),
    )
    if not rd:
        raise HTTPException(status_code=404, detail="No OCR data found for this receipt.")

    raw_items: list[dict[str, Any]] = json.loads(rd["items"] or "[]")
    if not raw_items:
        raise HTTPException(status_code=422, detail="No items found in OCR data.")

    # Filter to requested indices, or use all
    if body.item_indices is not None:
        invalid = [i for i in body.item_indices if i >= len(raw_items) or i < 0]
        if invalid:
            raise HTTPException(
                status_code=422,
                detail=f"Item indices out of range: {invalid} (receipt has {len(raw_items)} items)",
            )
        selected = [(i, raw_items[i]) for i in body.item_indices]
        skipped = len(raw_items) - len(selected)
    else:
        selected = list(enumerate(raw_items))
        skipped = 0

    created = await asyncio.to_thread(
        _commit_items, store, receipt_id, selected, body.location, rd.get("transaction_date")
    )

    await asyncio.to_thread(store.update_receipt_status, receipt_id, "processed")

    return ApproveOCRResponse(
        receipt_id=receipt_id,
        approved=len(created),
        skipped=skipped,
        inventory_items=created,
    )


def _commit_items(
    store: Store,
    receipt_id: int,
    selected: list[tuple[int, dict[str, Any]]],
    location: str,
    transaction_date: str | None,
) -> list[ApprovedInventoryItem]:
    """Create product + inventory entries for approved OCR line items.

    Runs synchronously inside asyncio.to_thread.
    """
    predictor = ExpirationPredictor()

    purchase_date: date | None = None
    if transaction_date:
        try:
            purchase_date = date.fromisoformat(transaction_date)
        except ValueError:
            logger.warning("Could not parse transaction_date %r", transaction_date)

    created: list[ApprovedInventoryItem] = []

    for _idx, item in selected:
        name = (item.get("name") or "").strip()
        if not name:
            logger.debug("Skipping nameless item at index %d", _idx)
            continue

        category = (item.get("category") or "").strip()
        quantity = float(item.get("quantity") or 1.0)

        raw_unit = (item.get("unit") or "each").strip()
        metric_qty, base_unit = normalize_to_metric(quantity, raw_unit)

        product, _ = store.get_or_create_product(
            name,
            category=category or None,
            source="receipt_ocr",
        )

        exp = predictor.predict_expiration(
            category, location,
            purchase_date=purchase_date,
            product_name=name,
        )

        inv = store.add_inventory_item(
            product["id"],
            location,
            quantity=metric_qty,
            unit=base_unit,
            receipt_id=receipt_id,
            purchase_date=str(purchase_date) if purchase_date else None,
            expiration_date=str(exp) if exp else None,
            source="receipt_ocr",
        )

        created.append(ApprovedInventoryItem(
            inventory_id=inv["id"],
            product_name=name,
            quantity=quantity,
            location=location,
            expiration_date=str(exp) if exp else None,
        ))

    return created
110
app/api/endpoints/receipts.py
Normal file
@@ -0,0 +1,110 @@
|
|||
"""Receipt upload, OCR, and quality endpoints."""

from __future__ import annotations

import asyncio
import uuid
from pathlib import Path
from typing import List

import aiofiles
from fastapi import APIRouter, BackgroundTasks, Depends, File, HTTPException, UploadFile

from app.cloud_session import CloudUser, get_session
from app.core.config import settings
from app.db.session import get_store
from app.db.store import Store
from app.models.schemas.receipt import ReceiptResponse
from app.models.schemas.quality import QualityAssessment
from app.tiers import can_use

router = APIRouter()


async def _save_upload(file: UploadFile, dest_dir: Path) -> Path:
    dest = dest_dir / f"{uuid.uuid4()}_{file.filename}"
    async with aiofiles.open(dest, "wb") as f:
        await f.write(await file.read())
    return dest


@router.post("/", response_model=ReceiptResponse, status_code=201)
async def upload_receipt(
    background_tasks: BackgroundTasks,
    file: UploadFile = File(...),
    store: Store = Depends(get_store),
    session: CloudUser = Depends(get_session),
):
    settings.ensure_dirs()
    saved = await _save_upload(file, settings.UPLOAD_DIR)
    receipt = await asyncio.to_thread(
        store.create_receipt, file.filename, str(saved)
    )
    # Only queue OCR if the feature is enabled server-side AND the user's tier allows it.
    # Check tier here, not inside the background task — once dispatched it can't be cancelled.
    ocr_allowed = settings.ENABLE_OCR and can_use("receipt_ocr", session.tier, session.has_byok)
    if ocr_allowed:
        background_tasks.add_task(_process_receipt_ocr, receipt["id"], saved, store)
    return ReceiptResponse.model_validate(receipt)


@router.post("/batch", response_model=List[ReceiptResponse], status_code=201)
async def upload_receipts_batch(
    background_tasks: BackgroundTasks,
    files: List[UploadFile] = File(...),
    store: Store = Depends(get_store),
    session: CloudUser = Depends(get_session),
):
    settings.ensure_dirs()
    ocr_allowed = settings.ENABLE_OCR and can_use("receipt_ocr", session.tier, session.has_byok)
    results = []
    for file in files:
        saved = await _save_upload(file, settings.UPLOAD_DIR)
        receipt = await asyncio.to_thread(
            store.create_receipt, file.filename, str(saved)
        )
        if ocr_allowed:
            background_tasks.add_task(_process_receipt_ocr, receipt["id"], saved, store)
        results.append(ReceiptResponse.model_validate(receipt))
    return results


@router.get("/{receipt_id}", response_model=ReceiptResponse)
async def get_receipt(receipt_id: int, store: Store = Depends(get_store)):
    receipt = await asyncio.to_thread(store.get_receipt, receipt_id)
    if not receipt:
        raise HTTPException(status_code=404, detail="Receipt not found")
    return ReceiptResponse.model_validate(receipt)


@router.get("/", response_model=List[ReceiptResponse])
async def list_receipts(
    limit: int = 50, offset: int = 0, store: Store = Depends(get_store)
):
    receipts = await asyncio.to_thread(store.list_receipts, limit, offset)
    return [ReceiptResponse.model_validate(r) for r in receipts]


@router.get("/{receipt_id}/quality", response_model=QualityAssessment)
async def get_receipt_quality(receipt_id: int, store: Store = Depends(get_store)):
    qa = await asyncio.to_thread(
        store._fetch_one,
        "SELECT * FROM quality_assessments WHERE receipt_id = ?",
        (receipt_id,),
    )
    if not qa:
        raise HTTPException(status_code=404, detail="Quality assessment not found")
    return QualityAssessment.model_validate(qa)


async def _process_receipt_ocr(receipt_id: int, image_path: Path, store: Store) -> None:
    """Background task: run OCR pipeline on an uploaded receipt."""
    try:
        await asyncio.to_thread(store.update_receipt_status, receipt_id, "processing")
        from app.services.receipt_service import ReceiptService
        service = ReceiptService(store)
        await service.process(receipt_id, image_path)
    except Exception as exc:
        await asyncio.to_thread(
            store.update_receipt_status, receipt_id, "error", str(exc)
        )
10
app/api/routes.py
Normal file
@@ -0,0 +1,10 @@
from fastapi import APIRouter
from app.api.endpoints import health, receipts, export, inventory, ocr

api_router = APIRouter()

api_router.include_router(health.router, prefix="/health", tags=["health"])
api_router.include_router(receipts.router, prefix="/receipts", tags=["receipts"])
api_router.include_router(ocr.router, prefix="/receipts", tags=["ocr"])  # OCR endpoints under /receipts
api_router.include_router(export.router, tags=["export"])  # No prefix, uses /export in the router
api_router.include_router(inventory.router, prefix="/inventory", tags=["inventory"])
196
app/cloud_session.py
Normal file
@@ -0,0 +1,196 @@
"""Cloud session resolution for Kiwi FastAPI.

Local mode (CLOUD_MODE unset/false): returns a local CloudUser with no auth
checks, full tier access, and DB path pointing to settings.DB_PATH.

Cloud mode (CLOUD_MODE=true): validates the cf_session JWT injected by Caddy
as X-CF-Session, resolves user_id, auto-provisions a free Heimdall license on
first visit, fetches the tier, and returns a per-user DB path.

FastAPI usage:

    @app.get("/api/v1/inventory/items")
    def list_items(session: CloudUser = Depends(get_session)):
        store = Store(session.db)
        ...
"""
from __future__ import annotations

import logging
import os
import re
import time
from dataclasses import dataclass
from pathlib import Path

import jwt as pyjwt
import requests
import yaml
from fastapi import Depends, HTTPException, Request

log = logging.getLogger(__name__)

# ── Config ────────────────────────────────────────────────────────────────────

CLOUD_MODE: bool = os.environ.get("CLOUD_MODE", "").lower() in ("1", "true", "yes")
CLOUD_DATA_ROOT: Path = Path(os.environ.get("CLOUD_DATA_ROOT", "/devl/kiwi-cloud-data"))
DIRECTUS_JWT_SECRET: str = os.environ.get("DIRECTUS_JWT_SECRET", "")
HEIMDALL_URL: str = os.environ.get("HEIMDALL_URL", "https://license.circuitforge.tech")
HEIMDALL_ADMIN_TOKEN: str = os.environ.get("HEIMDALL_ADMIN_TOKEN", "")

_LOCAL_KIWI_DB: Path = Path(os.environ.get("KIWI_DB", "data/kiwi.db"))

_TIER_CACHE: dict[str, tuple[str, float]] = {}
_TIER_CACHE_TTL = 300  # 5 minutes

TIERS = ["free", "paid", "premium", "ultra"]


# ── Domain ────────────────────────────────────────────────────────────────────

@dataclass(frozen=True)
class CloudUser:
    user_id: str    # Directus UUID, or "local"
    tier: str       # free | paid | premium | ultra | local
    db: Path        # per-user SQLite DB path
    has_byok: bool  # True if a configured LLM backend is present in llm.yaml


# ── JWT validation ─────────────────────────────────────────────────────────────

def _extract_session_token(header_value: str) -> str:
    m = re.search(r'(?:^|;)\s*cf_session=([^;]+)', header_value)
    return m.group(1).strip() if m else header_value.strip()
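The header can arrive either as a bare token or as a full Cookie string, and the regex handles both. A quick standalone check of the same pattern, stdlib only (the token values are made up):

```python
import re


def extract_session_token(header_value: str) -> str:
    # Same pattern as _extract_session_token: prefer a cf_session cookie value,
    # otherwise treat the whole header as the token.
    m = re.search(r'(?:^|;)\s*cf_session=([^;]+)', header_value)
    return m.group(1).strip() if m else header_value.strip()


bare = extract_session_token("eyJhbGciOi.payload.sig")
from_cookie = extract_session_token("theme=dark; cf_session=tok123; lang=en")
# bare -> "eyJhbGciOi.payload.sig", from_cookie -> "tok123"
```

Note the `(?:^|;)` anchor: a cookie named, say, `xcf_session` will not match, only an exact `cf_session` at the start of the string or after a `;`.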


def validate_session_jwt(token: str) -> str:
    """Validate cf_session JWT and return the Directus user_id."""
    try:
        payload = pyjwt.decode(
            token,
            DIRECTUS_JWT_SECRET,
            algorithms=["HS256"],
            options={"require": ["id", "exp"]},
        )
        return payload["id"]
    except Exception as exc:
        log.debug("JWT validation failed: %s", exc)
        raise HTTPException(status_code=401, detail="Session invalid or expired")


# ── Heimdall integration ──────────────────────────────────────────────────────

def _ensure_provisioned(user_id: str) -> None:
    if not HEIMDALL_ADMIN_TOKEN:
        return
    try:
        requests.post(
            f"{HEIMDALL_URL}/admin/provision",
            json={"directus_user_id": user_id, "product": "kiwi", "tier": "free"},
            headers={"Authorization": f"Bearer {HEIMDALL_ADMIN_TOKEN}"},
            timeout=5,
        )
    except Exception as exc:
        log.warning("Heimdall provision failed for user %s: %s", user_id, exc)


def _fetch_cloud_tier(user_id: str) -> str:
    now = time.monotonic()
    cached = _TIER_CACHE.get(user_id)
    if cached and (now - cached[1]) < _TIER_CACHE_TTL:
        return cached[0]

    if not HEIMDALL_ADMIN_TOKEN:
        return "free"
    try:
        resp = requests.post(
            f"{HEIMDALL_URL}/admin/cloud/resolve",
            json={"directus_user_id": user_id, "product": "kiwi"},
            headers={"Authorization": f"Bearer {HEIMDALL_ADMIN_TOKEN}"},
            timeout=5,
        )
        tier = resp.json().get("tier", "free") if resp.ok else "free"
    except Exception as exc:
        log.warning("Heimdall tier resolve failed for user %s: %s", user_id, exc)
        tier = "free"

    _TIER_CACHE[user_id] = (tier, now)
    return tier


def _user_db_path(user_id: str) -> Path:
    path = CLOUD_DATA_ROOT / user_id / "kiwi.db"
    path.parent.mkdir(parents=True, exist_ok=True)
    return path


# ── BYOK detection ────────────────────────────────────────────────────────────

_LLM_CONFIG_PATH = Path.home() / ".config" / "circuitforge" / "llm.yaml"


def _detect_byok(config_path: Path = _LLM_CONFIG_PATH) -> bool:
    """Return True if at least one enabled non-vision LLM backend is configured.

    Reads the same llm.yaml that LLMRouter uses. Local (Ollama, vLLM) and
    API-key backends both count — the policy is "user is supplying compute",
    regardless of where that compute lives.
    """
    try:
        with open(config_path) as f:
            cfg = yaml.safe_load(f) or {}
        return any(
            b.get("enabled", True) and b.get("type") != "vision_service"
            for b in cfg.get("backends", {}).values()
        )
    except Exception:
        return False


# ── FastAPI dependency ────────────────────────────────────────────────────────

def get_session(request: Request) -> CloudUser:
    """FastAPI dependency — resolves the current user from the request.

    Local mode: fully-privileged "local" user pointing at local DB.
    Cloud mode: validates X-CF-Session JWT, provisions license, resolves tier.
    """
    has_byok = _detect_byok()

    if not CLOUD_MODE:
        return CloudUser(user_id="local", tier="local", db=_LOCAL_KIWI_DB, has_byok=has_byok)

    raw_header = (
        request.headers.get("x-cf-session", "")
        or request.headers.get("cookie", "")
    )
    if not raw_header:
        raise HTTPException(status_code=401, detail="Not authenticated")

    token = _extract_session_token(raw_header)
    if not token:
        raise HTTPException(status_code=401, detail="Not authenticated")

    user_id = validate_session_jwt(token)
    _ensure_provisioned(user_id)
    tier = _fetch_cloud_tier(user_id)
    return CloudUser(user_id=user_id, tier=tier, db=_user_db_path(user_id), has_byok=has_byok)


def require_tier(min_tier: str):
    """Dependency factory — raises 403 if tier is below min_tier."""
    min_idx = TIERS.index(min_tier)

    def _check(session: CloudUser = Depends(get_session)) -> CloudUser:
        if session.tier == "local":
            return session
        try:
            if TIERS.index(session.tier) < min_idx:
                raise HTTPException(
                    status_code=403,
                    detail=f"This feature requires {min_tier} tier or above.",
                )
        except ValueError:
            raise HTTPException(status_code=403, detail="Unknown tier.")
        return session

    return _check
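The ordering comparison behind require_tier can be exercised without FastAPI. `tier_allows` is an illustrative standalone helper mirroring the same TIERS list and the same rules ("local" bypasses the check; unknown tiers are treated as insufficient):

```python
TIERS = ["free", "paid", "premium", "ultra"]


def tier_allows(user_tier: str, min_tier: str) -> bool:
    """True if user_tier meets or exceeds min_tier in the TIERS ordering."""
    if user_tier == "local":
        # Local mode is fully privileged, exactly as in require_tier.
        return True
    try:
        return TIERS.index(user_tier) >= TIERS.index(min_tier)
    except ValueError:
        # Unknown tier names fail closed (the endpoint would return 403).
        return False
```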
5
app/core/__init__.py
Normal file
@@ -0,0 +1,5 @@
# app/core/__init__.py
"""
Core components for Kiwi.
Contains configuration, dependencies, and other core functionality.
"""
59
app/core/config.py
Normal file
@@ -0,0 +1,59 @@
"""
Kiwi application config.
Uses circuitforge-core for env loading; no pydantic-settings dependency.
"""
from __future__ import annotations

import os
from pathlib import Path

from circuitforge_core.config.settings import load_env

# Load .env from the repo root (two levels up from app/core/)
_ROOT = Path(__file__).resolve().parents[2]
load_env(_ROOT / ".env")


class Settings:
    # API
    API_PREFIX: str = os.environ.get("API_PREFIX", "/api/v1")
    PROJECT_NAME: str = "Kiwi — Pantry Intelligence"

    # CORS
    CORS_ORIGINS: list[str] = [
        o.strip()
        for o in os.environ.get("CORS_ORIGINS", "").split(",")
        if o.strip()
    ]

    # File storage
    DATA_DIR: Path = Path(os.environ.get("DATA_DIR", str(_ROOT / "data")))
    UPLOAD_DIR: Path = DATA_DIR / "uploads"
    PROCESSING_DIR: Path = DATA_DIR / "processing"
    ARCHIVE_DIR: Path = DATA_DIR / "archive"

    # Database
    DB_PATH: Path = Path(os.environ.get("DB_PATH", str(DATA_DIR / "kiwi.db")))

    # Processing
    MAX_CONCURRENT_JOBS: int = int(os.environ.get("MAX_CONCURRENT_JOBS", "4"))
    USE_GPU: bool = os.environ.get("USE_GPU", "true").lower() in ("1", "true", "yes")
    GPU_MEMORY_LIMIT: int = int(os.environ.get("GPU_MEMORY_LIMIT", "6144"))

    # Quality
    MIN_QUALITY_SCORE: float = float(os.environ.get("MIN_QUALITY_SCORE", "50.0"))

    # Feature flags
    ENABLE_OCR: bool = os.environ.get("ENABLE_OCR", "false").lower() in ("1", "true", "yes")

    # Runtime
    DEBUG: bool = os.environ.get("DEBUG", "false").lower() in ("1", "true", "yes")
    CLOUD_MODE: bool = os.environ.get("CLOUD_MODE", "false").lower() in ("1", "true", "yes")
    DEMO_MODE: bool = os.environ.get("DEMO_MODE", "false").lower() in ("1", "true", "yes")

    def ensure_dirs(self) -> None:
        for d in (self.UPLOAD_DIR, self.PROCESSING_DIR, self.ARCHIVE_DIR):
            d.mkdir(parents=True, exist_ok=True)


settings = Settings()
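The truthy-string parse repeated above (`.lower() in ("1", "true", "yes")`) could be factored into a tiny helper. `env_bool` is illustrative only, not part of the config module:

```python
import os


def env_bool(name: str, default: str = "false") -> bool:
    """Parse an env var the same way Settings does: '1'/'true'/'yes' are truthy."""
    return os.environ.get(name, default).lower() in ("1", "true", "yes")


os.environ["ENABLE_OCR"] = "TRUE"
flag = env_bool("ENABLE_OCR")                  # case-insensitive -> True
missing = env_bool("NOT_SET_ANYWHERE_12345")   # falls back to default "false" -> False
```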
1
app/db/__init__.py
Normal file
@@ -0,0 +1 @@
# DB package — use app.db.store.Store for all database access
1
app/db/base.py
Normal file
@@ -0,0 +1 @@
# Replaced by app.db.store — SQLAlchemy removed in favour of CF-core SQLite stack
32
app/db/migrations/001_initial_schema.sql
Normal file
@@ -0,0 +1,32 @@
-- Migration 001: receipts + quality assessments (ported from Alembic f31d9044277e)

CREATE TABLE IF NOT EXISTS receipts (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    filename TEXT NOT NULL,
    original_path TEXT NOT NULL,
    processed_path TEXT,
    status TEXT NOT NULL DEFAULT 'uploaded'
        CHECK (status IN ('uploaded', 'processing', 'processed', 'error')),
    error TEXT,
    metadata TEXT NOT NULL DEFAULT '{}',
    created_at TEXT NOT NULL DEFAULT (datetime('now')),
    updated_at TEXT NOT NULL DEFAULT (datetime('now'))
);

CREATE INDEX IF NOT EXISTS idx_receipts_status ON receipts (status);
CREATE INDEX IF NOT EXISTS idx_receipts_created_at ON receipts (created_at DESC);

CREATE TABLE IF NOT EXISTS quality_assessments (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    receipt_id INTEGER NOT NULL UNIQUE
        REFERENCES receipts (id) ON DELETE CASCADE,
    overall_score REAL NOT NULL CHECK (overall_score >= 0 AND overall_score <= 100),
    is_acceptable INTEGER NOT NULL DEFAULT 0,
    metrics TEXT NOT NULL DEFAULT '{}',
    improvement_suggestions TEXT NOT NULL DEFAULT '[]',
    created_at TEXT NOT NULL DEFAULT (datetime('now'))
);

CREATE INDEX IF NOT EXISTS idx_quality_receipt_id ON quality_assessments (receipt_id);
CREATE INDEX IF NOT EXISTS idx_quality_score ON quality_assessments (overall_score);
CREATE INDEX IF NOT EXISTS idx_quality_acceptable ON quality_assessments (is_acceptable);
53
app/db/migrations/002_inventory_and_products.sql
Normal file
@@ -0,0 +1,53 @@
-- Migration 002: products + inventory items (ported from Alembic 8fc1bf4e7a91)

CREATE TABLE IF NOT EXISTS products (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    barcode TEXT UNIQUE,
    name TEXT NOT NULL,
    brand TEXT,
    category TEXT,
    description TEXT,
    image_url TEXT,
    nutrition_data TEXT NOT NULL DEFAULT '{}',
    source TEXT NOT NULL DEFAULT 'manual'
        CHECK (source IN ('openfoodfacts', 'manual', 'receipt_ocr')),
    source_data TEXT NOT NULL DEFAULT '{}',
    created_at TEXT NOT NULL DEFAULT (datetime('now')),
    updated_at TEXT NOT NULL DEFAULT (datetime('now'))
);

CREATE INDEX IF NOT EXISTS idx_products_barcode ON products (barcode);
CREATE INDEX IF NOT EXISTS idx_products_name ON products (name);
CREATE INDEX IF NOT EXISTS idx_products_category ON products (category);
CREATE INDEX IF NOT EXISTS idx_products_source ON products (source);

CREATE TABLE IF NOT EXISTS inventory_items (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    product_id INTEGER NOT NULL
        REFERENCES products (id) ON DELETE RESTRICT,
    receipt_id INTEGER
        REFERENCES receipts (id) ON DELETE SET NULL,
    quantity REAL NOT NULL DEFAULT 1 CHECK (quantity > 0),
    unit TEXT NOT NULL DEFAULT 'count',
    location TEXT NOT NULL,
    sublocation TEXT,
    purchase_date TEXT,
    expiration_date TEXT,
    status TEXT NOT NULL DEFAULT 'available'
        CHECK (status IN ('available', 'consumed', 'expired', 'discarded')),
    consumed_at TEXT,
    notes TEXT,
    source TEXT NOT NULL DEFAULT 'manual'
        CHECK (source IN ('barcode_scan', 'manual', 'receipt', 'receipt_ocr')),
    created_at TEXT NOT NULL DEFAULT (datetime('now')),
    updated_at TEXT NOT NULL DEFAULT (datetime('now'))
);

CREATE INDEX IF NOT EXISTS idx_inventory_product ON inventory_items (product_id);
CREATE INDEX IF NOT EXISTS idx_inventory_receipt ON inventory_items (receipt_id);
CREATE INDEX IF NOT EXISTS idx_inventory_status ON inventory_items (status);
CREATE INDEX IF NOT EXISTS idx_inventory_location ON inventory_items (location);
CREATE INDEX IF NOT EXISTS idx_inventory_expiration ON inventory_items (expiration_date);
CREATE INDEX IF NOT EXISTS idx_inventory_created ON inventory_items (created_at DESC);
CREATE INDEX IF NOT EXISTS idx_inventory_active_loc ON inventory_items (status, location)
    WHERE status = 'available';
38
app/db/migrations/003_receipt_data.sql
Normal file
@@ -0,0 +1,38 @@
-- Migration 003: OCR receipt data table (ported from Alembic 54cddaf4f4e2)

CREATE TABLE IF NOT EXISTS receipt_data (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    receipt_id INTEGER NOT NULL UNIQUE
        REFERENCES receipts (id) ON DELETE CASCADE,
    merchant_name TEXT,
    merchant_address TEXT,
    merchant_phone TEXT,
    merchant_email TEXT,
    merchant_website TEXT,
    merchant_tax_id TEXT,
    transaction_date TEXT,
    transaction_time TEXT,
    receipt_number TEXT,
    register_number TEXT,
    cashier_name TEXT,
    transaction_id TEXT,
    items TEXT NOT NULL DEFAULT '[]',
    subtotal REAL,
    tax REAL,
    discount REAL,
    tip REAL,
    total REAL,
    payment_method TEXT,
    amount_paid REAL,
    change_given REAL,
    raw_text TEXT,
    confidence_scores TEXT NOT NULL DEFAULT '{}',
    warnings TEXT NOT NULL DEFAULT '[]',
    processing_time REAL,
    created_at TEXT NOT NULL DEFAULT (datetime('now')),
    updated_at TEXT NOT NULL DEFAULT (datetime('now'))
);

CREATE INDEX IF NOT EXISTS idx_receipt_data_receipt_id ON receipt_data (receipt_id);
CREATE INDEX IF NOT EXISTS idx_receipt_data_merchant ON receipt_data (merchant_name);
CREATE INDEX IF NOT EXISTS idx_receipt_data_date ON receipt_data (transaction_date);
23
app/db/migrations/004_tagging_system.sql
Normal file
@@ -0,0 +1,23 @@
-- Migration 004: tags + product_tags join table (ported from Alembic 14f688cde2ca)

CREATE TABLE IF NOT EXISTS tags (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    name TEXT NOT NULL UNIQUE,
    slug TEXT NOT NULL UNIQUE,
    description TEXT,
    color TEXT,
    category TEXT CHECK (category IN ('food_type', 'dietary', 'allergen', 'custom') OR category IS NULL),
    created_at TEXT NOT NULL DEFAULT (datetime('now')),
    updated_at TEXT NOT NULL DEFAULT (datetime('now'))
);

CREATE INDEX IF NOT EXISTS idx_tags_name ON tags (name);
CREATE INDEX IF NOT EXISTS idx_tags_slug ON tags (slug);
CREATE INDEX IF NOT EXISTS idx_tags_category ON tags (category);

CREATE TABLE IF NOT EXISTS product_tags (
    product_id INTEGER NOT NULL REFERENCES products (id) ON DELETE CASCADE,
    tag_id INTEGER NOT NULL REFERENCES tags (id) ON DELETE CASCADE,
    created_at TEXT NOT NULL DEFAULT (datetime('now')),
    PRIMARY KEY (product_id, tag_id)
);
36
app/db/migrations/005_receipt_staged_status.sql
Normal file
@@ -0,0 +1,36 @@
-- Migration 005: Add 'staged' and 'low_quality' to receipts status constraint.
--
-- SQLite does not support ALTER TABLE to modify CHECK constraints.
-- Pattern: create new table → copy data → drop old → rename.

PRAGMA foreign_keys = OFF;

CREATE TABLE receipts_new (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    filename TEXT NOT NULL,
    original_path TEXT NOT NULL,
    processed_path TEXT,
    status TEXT NOT NULL DEFAULT 'uploaded'
        CHECK (status IN (
            'uploaded',
            'processing',
            'processed',
            'staged',
            'low_quality',
            'error'
        )),
    error TEXT,
    metadata TEXT NOT NULL DEFAULT '{}',
    created_at TEXT NOT NULL DEFAULT (datetime('now')),
    updated_at TEXT NOT NULL DEFAULT (datetime('now'))
);

INSERT INTO receipts_new SELECT * FROM receipts;

DROP TABLE receipts;

ALTER TABLE receipts_new RENAME TO receipts;

CREATE INDEX IF NOT EXISTS idx_receipts_status ON receipts (status);
CREATE INDEX IF NOT EXISTS idx_receipts_created_at ON receipts (created_at DESC);

PRAGMA foreign_keys = ON;
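The create → copy → drop → rename pattern can be verified end to end with the stdlib sqlite3 module. This sketch uses a trimmed two-column table rather than the full receipts schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    PRAGMA foreign_keys = OFF;

    CREATE TABLE receipts (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        status TEXT NOT NULL DEFAULT 'uploaded'
            CHECK (status IN ('uploaded', 'processed'))
    );
    INSERT INTO receipts (status) VALUES ('uploaded');

    -- Rebuild with a widened CHECK, exactly as migration 005 does.
    CREATE TABLE receipts_new (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        status TEXT NOT NULL DEFAULT 'uploaded'
            CHECK (status IN ('uploaded', 'processed', 'staged'))
    );
    INSERT INTO receipts_new SELECT * FROM receipts;
    DROP TABLE receipts;
    ALTER TABLE receipts_new RENAME TO receipts;

    PRAGMA foreign_keys = ON;
""")

conn.execute("INSERT INTO receipts (status) VALUES ('staged')")  # now allowed
rows = conn.execute("SELECT status FROM receipts ORDER BY id").fetchall()
# rows -> [('uploaded',), ('staged',)]
```

Note that the column lists of the old and new tables must match for `INSERT INTO … SELECT *` to succeed, which is why the migration carries every receipts column across.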
577
app/db/models.py
Normal file
@@ -0,0 +1,577 @@
"""
|
||||
REMOVED — schema is now managed by plain SQL migrations in app/db/migrations/.
|
||||
This file is kept for historical reference only. Nothing imports it.
|
||||
"""
|
||||
# fmt: off # noqa — dead file, not linted
|
||||
|
||||
from sqlalchemy import (
|
||||
Column,
|
||||
String,
|
||||
Text,
|
||||
Boolean,
|
||||
Numeric,
|
||||
DateTime,
|
||||
Date,
|
||||
ForeignKey,
|
||||
CheckConstraint,
|
||||
Index,
|
||||
Table,
|
||||
)
|
||||
from sqlalchemy.dialects.postgresql import UUID, JSONB
|
||||
from sqlalchemy.orm import relationship
|
||||
from sqlalchemy.sql import func
|
||||
from datetime import datetime
|
||||
import uuid
|
||||
|
||||
from app.db.base import Base
|
||||
|
||||
|
||||
# Association table for many-to-many relationship between products and tags
|
||||
product_tags = Table(
|
||||
"product_tags",
|
||||
Base.metadata,
|
||||
Column(
|
||||
"product_id",
|
||||
UUID(as_uuid=True),
|
||||
ForeignKey("products.id", ondelete="CASCADE"),
|
||||
primary_key=True,
|
||||
),
|
||||
Column(
|
||||
"tag_id",
|
||||
UUID(as_uuid=True),
|
||||
ForeignKey("tags.id", ondelete="CASCADE"),
|
||||
primary_key=True,
|
||||
),
|
||||
Column(
|
||||
"created_at",
|
||||
DateTime(timezone=True),
|
||||
nullable=False,
|
||||
server_default=func.now(),
|
||||
),
|
||||
)
|
||||
|
||||
|
||||
class Receipt(Base):
|
||||
"""
|
||||
Receipt model - stores receipt metadata and processing status.
|
||||
|
||||
Corresponds to the 'receipts' table in the database schema.
|
||||
"""
|
||||
|
||||
__tablename__ = "receipts"
|
||||
|
||||
# Primary Key
|
||||
id = Column(
|
||||
UUID(as_uuid=True),
|
||||
primary_key=True,
|
||||
default=uuid.uuid4,
|
||||
server_default=func.gen_random_uuid(),
|
||||
)
|
||||
|
||||
# File Information
|
||||
filename = Column(String(255), nullable=False)
|
||||
original_path = Column(Text, nullable=False)
|
||||
processed_path = Column(Text, nullable=True)
|
||||
|
||||
# Processing Status
|
||||
status = Column(
|
||||
String(50),
|
||||
nullable=False,
|
||||
default="uploaded",
|
||||
server_default="uploaded",
|
||||
)
|
||||
error = Column(Text, nullable=True)
|
||||
|
||||
# Metadata (JSONB for flexibility)
|
||||
# Using 'receipt_metadata' to avoid conflict with SQLAlchemy's metadata attribute
|
||||
receipt_metadata = Column("metadata", JSONB, nullable=False, default={}, server_default="{}")
|
||||
|
||||
# Timestamps
|
||||
created_at = Column(
|
||||
DateTime(timezone=True),
|
||||
nullable=False,
|
||||
default=datetime.utcnow,
|
||||
server_default=func.now(),
|
||||
)
|
||||
updated_at = Column(
|
||||
DateTime(timezone=True),
|
||||
nullable=False,
|
||||
default=datetime.utcnow,
|
||||
server_default=func.now(),
|
||||
onupdate=func.now(),
|
||||
)
|
||||
|
||||
# Relationships
|
||||
quality_assessment = relationship(
|
||||
"QualityAssessment",
|
||||
back_populates="receipt",
|
||||
uselist=False, # One-to-one relationship
|
||||
cascade="all, delete-orphan",
|
||||
)
|
||||
receipt_data = relationship(
|
||||
"ReceiptData",
|
||||
back_populates="receipt",
|
||||
uselist=False, # One-to-one relationship
|
||||
cascade="all, delete-orphan",
|
||||
)
|
||||
|
||||
# Constraints and Indexes
|
||||
__table_args__ = (
|
||||
CheckConstraint(
|
||||
"status IN ('uploaded', 'processing', 'processed', 'error')",
|
||||
name="receipts_status_check",
|
||||
),
|
||||
# Indexes will be created after table definition
|
||||
)
|
||||
|
||||
def __repr__(self) -> str:
|
||||
return f"<Receipt(id={self.id}, filename={self.filename}, status={self.status})>"
|
||||
|
||||
|
||||
# Create indexes for Receipt table
|
||||
Index("idx_receipts_status", Receipt.status)
|
||||
Index("idx_receipts_created_at", Receipt.created_at.desc())
|
||||
Index("idx_receipts_metadata", Receipt.receipt_metadata, postgresql_using="gin")
|
||||
|
||||
|
||||
class QualityAssessment(Base):
|
||||
"""
|
||||
Quality Assessment model - stores quality evaluation results.
|
||||
|
||||
One-to-one relationship with Receipt.
|
||||
Corresponds to the 'quality_assessments' table in the database schema.
|
||||
"""
|
||||
|
||||
__tablename__ = "quality_assessments"
|
||||
|
||||
# Primary Key
|
||||
id = Column(
|
||||
UUID(as_uuid=True),
|
||||
primary_key=True,
|
||||
default=uuid.uuid4,
|
||||
server_default=func.gen_random_uuid(),
|
||||
)
|
||||
|
||||
# Foreign Key (1:1 with receipts)
|
||||
receipt_id = Column(
|
||||
UUID(as_uuid=True),
|
||||
ForeignKey("receipts.id", ondelete="CASCADE"),
|
||||
nullable=False,
|
||||
unique=True,
|
||||
)
|
||||
|
||||
# Quality Scores
|
||||
overall_score = Column(Numeric(5, 2), nullable=False)
|
||||
is_acceptable = Column(Boolean, nullable=False, default=False, server_default="false")
|
||||
|
||||
# Detailed Metrics (JSONB)
|
||||
metrics = Column(JSONB, nullable=False, default={}, server_default="{}")
|
||||
|
||||
# Improvement Suggestions
|
||||
improvement_suggestions = Column(JSONB, nullable=False, default=[], server_default="[]")
|
||||
|
||||
# Timestamp
|
||||
created_at = Column(
|
||||
DateTime(timezone=True),
|
||||
nullable=False,
|
||||
default=datetime.utcnow,
|
||||
server_default=func.now(),
|
||||
)
|
||||
|
||||
# Relationships
|
||||
receipt = relationship("Receipt", back_populates="quality_assessment")
|
||||
|
||||
# Constraints
|
||||
__table_args__ = (
|
||||
CheckConstraint(
|
||||
"overall_score >= 0 AND overall_score <= 100",
|
||||
name="quality_assessments_score_range",
|
||||
),
|
||||
Index("idx_quality_assessments_receipt_id", "receipt_id"),
|
||||
Index("idx_quality_assessments_score", "overall_score"),
|
||||
Index("idx_quality_assessments_acceptable", "is_acceptable"),
|
||||
Index("idx_quality_assessments_metrics", "metrics", postgresql_using="gin"),
|
||||
)
|
||||
|
||||
def __repr__(self) -> str:
|
||||
return (
|
||||
f"<QualityAssessment(id={self.id}, receipt_id={self.receipt_id}, "
|
||||
f"score={self.overall_score}, acceptable={self.is_acceptable})>"
|
||||
)
|
||||
|
||||
|
||||
class Product(Base):
    """
    Product model - stores product catalog information.

    Products can come from:
    - Barcode scans (OpenFoodFacts API)
    - Manual user entries
    - Future: OCR extraction from receipts

    One product can have many inventory items.
    """

    __tablename__ = "products"

    # Primary Key
    id = Column(
        UUID(as_uuid=True),
        primary_key=True,
        default=uuid.uuid4,
        server_default=func.gen_random_uuid(),
    )

    # Identifiers
    barcode = Column(String(50), unique=True, nullable=True)  # UPC/EAN code

    # Product Information
    name = Column(String(500), nullable=False)
    brand = Column(String(255), nullable=True)
    category = Column(String(255), nullable=True)

    # Additional Details
    description = Column(Text, nullable=True)
    image_url = Column(Text, nullable=True)

    # Nutritional Data (JSONB for flexibility)
    nutrition_data = Column(JSONB, nullable=False, default={}, server_default="{}")

    # Source Tracking
    source = Column(
        String(50),
        nullable=False,
        default="manual",
        server_default="manual",
    )  # 'openfoodfacts', 'manual', 'receipt_ocr'
    source_data = Column(JSONB, nullable=False, default={}, server_default="{}")

    # Timestamps
    created_at = Column(
        DateTime(timezone=True),
        nullable=False,
        default=datetime.utcnow,
        server_default=func.now(),
    )
    updated_at = Column(
        DateTime(timezone=True),
        nullable=False,
        default=datetime.utcnow,
        server_default=func.now(),
        onupdate=func.now(),
    )

    # Relationships
    inventory_items = relationship(
        "InventoryItem",
        back_populates="product",
        cascade="all, delete-orphan",
    )
    tags = relationship(
        "Tag",
        secondary=product_tags,
        back_populates="products",
    )

    # Constraints
    __table_args__ = (
        CheckConstraint(
            "source IN ('openfoodfacts', 'manual', 'receipt_ocr')",
            name="products_source_check",
        ),
    )

    def __repr__(self) -> str:
        return f"<Product(id={self.id}, name={self.name}, barcode={self.barcode})>"


class Tag(Base):
    """
    Tag model - stores tags/labels for organizing products.

    Tags can be used to categorize products by:
    - Food type (dairy, meat, vegetables, fruit, etc.)
    - Dietary restrictions (vegan, gluten-free, kosher, halal, etc.)
    - Allergens (contains nuts, contains dairy, etc.)
    - Custom user categories

    Many-to-many relationship with products.
    """

    __tablename__ = "tags"

    # Primary Key
    id = Column(
        UUID(as_uuid=True),
        primary_key=True,
        default=uuid.uuid4,
        server_default=func.gen_random_uuid(),
    )

    # Tag Information
    name = Column(String(100), nullable=False, unique=True)
    slug = Column(String(100), nullable=False, unique=True)  # URL-safe version
    description = Column(Text, nullable=True)
    color = Column(String(7), nullable=True)  # Hex color code for UI (#FF5733)

    # Category (optional grouping)
    category = Column(String(50), nullable=True)  # 'food_type', 'dietary', 'allergen', 'custom'

    # Timestamps
    created_at = Column(
        DateTime(timezone=True),
        nullable=False,
        default=datetime.utcnow,
        server_default=func.now(),
    )
    updated_at = Column(
        DateTime(timezone=True),
        nullable=False,
        default=datetime.utcnow,
        server_default=func.now(),
        onupdate=func.now(),
    )

    # Relationships
    products = relationship(
        "Product",
        secondary=product_tags,
        back_populates="tags",
    )

    # Constraints
    # NOTE: "category IN (..., NULL)" would never reject anything (a non-match
    # yields NULL, and a CHECK that evaluates to NULL passes), so the NULL case
    # is spelled out explicitly.
    __table_args__ = (
        CheckConstraint(
            "category IS NULL OR category IN ('food_type', 'dietary', 'allergen', 'custom')",
            name="tags_category_check",
        ),
    )

    def __repr__(self) -> str:
        return f"<Tag(id={self.id}, name={self.name}, category={self.category})>"


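The `tags_category_check` constraint has a NULL subtlety worth calling out: in SQL, `category IN (..., NULL)` evaluates to NULL for any non-matching value, and a CHECK that evaluates to NULL is treated as a pass, so that form rejects nothing. The nullable column needs `category IS NULL OR category IN (...)`. A runnable sqlite3 sketch of the difference (illustrative table, not the app schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# The IN (..., NULL) form lets any value through: the CHECK evaluates to
# NULL (unknown) rather than FALSE, and NULL is not a violation.
conn.execute(
    "CREATE TABLE bad (category TEXT CHECK (category IN ('dietary', NULL)))"
)
conn.execute("INSERT INTO bad VALUES ('bogus')")  # accepted, constraint is useless

# The explicit form rejects invalid values while still allowing NULL.
conn.execute(
    "CREATE TABLE good (category TEXT "
    "CHECK (category IS NULL OR category IN ('dietary')))"
)
conn.execute("INSERT INTO good VALUES (NULL)")  # allowed
try:
    conn.execute("INSERT INTO good VALUES ('bogus')")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
```

The same three-valued logic applies on PostgreSQL, which is why the constraint above uses the `IS NULL OR` form.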
class InventoryItem(Base):
    """
    Inventory Item model - tracks individual items in user's inventory.

    Links to a Product and adds user-specific information like
    quantity, location, expiration date, etc.
    """

    __tablename__ = "inventory_items"

    # Primary Key
    id = Column(
        UUID(as_uuid=True),
        primary_key=True,
        default=uuid.uuid4,
        server_default=func.gen_random_uuid(),
    )

    # Foreign Keys
    product_id = Column(
        UUID(as_uuid=True),
        ForeignKey("products.id", ondelete="RESTRICT"),
        nullable=False,
    )
    receipt_id = Column(
        UUID(as_uuid=True),
        ForeignKey("receipts.id", ondelete="SET NULL"),
        nullable=True,
    )

    # Quantity
    quantity = Column(Numeric(10, 2), nullable=False, default=1)
    unit = Column(String(50), nullable=False, default="count", server_default="count")

    # Location
    location = Column(String(100), nullable=False)
    sublocation = Column(String(255), nullable=True)

    # Dates
    purchase_date = Column(Date, nullable=True)
    expiration_date = Column(Date, nullable=True)

    # Status
    status = Column(
        String(50),
        nullable=False,
        default="available",
        server_default="available",
    )
    consumed_at = Column(DateTime(timezone=True), nullable=True)

    # Notes
    notes = Column(Text, nullable=True)

    # Source Tracking
    source = Column(
        String(50),
        nullable=False,
        default="manual",
        server_default="manual",
    )  # 'barcode_scan', 'manual', 'receipt'

    # Timestamps
    created_at = Column(
        DateTime(timezone=True),
        nullable=False,
        default=datetime.utcnow,
        server_default=func.now(),
    )
    updated_at = Column(
        DateTime(timezone=True),
        nullable=False,
        default=datetime.utcnow,
        server_default=func.now(),
        onupdate=func.now(),
    )

    # Relationships
    product = relationship("Product", back_populates="inventory_items")
    receipt = relationship("Receipt")

    # Constraints
    __table_args__ = (
        CheckConstraint(
            "status IN ('available', 'consumed', 'expired', 'discarded')",
            name="inventory_items_status_check",
        ),
        CheckConstraint(
            "source IN ('barcode_scan', 'manual', 'receipt')",
            name="inventory_items_source_check",
        ),
        CheckConstraint(
            "quantity > 0",
            name="inventory_items_quantity_positive",
        ),
    )

    def __repr__(self) -> str:
        return (
            f"<InventoryItem(id={self.id}, product_id={self.product_id}, "
            f"quantity={self.quantity}, location={self.location}, status={self.status})>"
        )


# Create indexes for Product table
Index("idx_products_barcode", Product.barcode)
Index("idx_products_name", Product.name)
Index("idx_products_category", Product.category)
Index("idx_products_source", Product.source)
Index("idx_products_nutrition_data", Product.nutrition_data, postgresql_using="gin")

# Create indexes for Tag table
Index("idx_tags_name", Tag.name)
Index("idx_tags_slug", Tag.slug)
Index("idx_tags_category", Tag.category)

# Create indexes for InventoryItem table
Index("idx_inventory_items_product", InventoryItem.product_id)
Index("idx_inventory_items_receipt", InventoryItem.receipt_id)
Index("idx_inventory_items_status", InventoryItem.status)
Index("idx_inventory_items_location", InventoryItem.location)
Index("idx_inventory_items_expiration", InventoryItem.expiration_date)
Index("idx_inventory_items_created", InventoryItem.created_at.desc())
# Composite index for common query: active items by location
Index(
    "idx_inventory_items_active_by_location",
    InventoryItem.status,
    InventoryItem.location,
    postgresql_where=(InventoryItem.status == "available"),
)


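The composite partial index above targets the hot path, listing available items in one location, without paying to index consumed or discarded rows. A stdlib sqlite3 sketch of the same idea (illustrative table and index names, not the app's migration schema), using `EXPLAIN QUERY PLAN` to confirm the planner picks the partial index when the query's predicate implies the index's `WHERE` clause:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE items (id INTEGER PRIMARY KEY, status TEXT, location TEXT)"
)
# Partial index mirroring idx_inventory_items_active_by_location.
conn.execute(
    "CREATE INDEX idx_active ON items (status, location) "
    "WHERE status = 'available'"
)
# The literal status = 'available' lets SQLite prove the partial index applies.
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM items WHERE status = 'available' AND location = ?",
    ("fridge",),
).fetchall()
detail = " ".join(row[-1] for row in plan)
```

A query with a bound `status` parameter instead of the literal would not be able to use the partial index, which is why `list_inventory`-style helpers benefit from the literal form when they only serve available items.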
class ReceiptData(Base):
    """
    Receipt Data model - stores OCR-extracted structured data from receipts.

    One-to-one relationship with Receipt.
    Stores merchant info, transaction details, line items, and totals.
    """

    __tablename__ = "receipt_data"

    # Primary Key
    id = Column(
        UUID(as_uuid=True),
        primary_key=True,
        default=uuid.uuid4,
        server_default=func.gen_random_uuid(),
    )

    # Foreign Key (1:1 with receipts)
    receipt_id = Column(
        UUID(as_uuid=True),
        ForeignKey("receipts.id", ondelete="CASCADE"),
        nullable=False,
        unique=True,
    )

    # Merchant Information
    merchant_name = Column(String(500), nullable=True)
    merchant_address = Column(Text, nullable=True)
    merchant_phone = Column(String(50), nullable=True)
    merchant_email = Column(String(255), nullable=True)
    merchant_website = Column(String(255), nullable=True)
    merchant_tax_id = Column(String(100), nullable=True)

    # Transaction Information
    transaction_date = Column(Date, nullable=True)
    transaction_time = Column(String(20), nullable=True)  # Store as string for flexibility
    receipt_number = Column(String(100), nullable=True)
    register_number = Column(String(50), nullable=True)
    cashier_name = Column(String(255), nullable=True)
    transaction_id = Column(String(100), nullable=True)

    # Line Items (JSONB array)
    items = Column(JSONB, nullable=False, default=[], server_default="[]")

    # Financial Totals
    subtotal = Column(Numeric(12, 2), nullable=True)
    tax = Column(Numeric(12, 2), nullable=True)
    discount = Column(Numeric(12, 2), nullable=True)
    tip = Column(Numeric(12, 2), nullable=True)
    total = Column(Numeric(12, 2), nullable=True)
    payment_method = Column(String(100), nullable=True)
    amount_paid = Column(Numeric(12, 2), nullable=True)
    change_given = Column(Numeric(12, 2), nullable=True)

    # OCR Metadata
    raw_text = Column(Text, nullable=True)  # Full OCR text output
    confidence_scores = Column(JSONB, nullable=False, default={}, server_default="{}")
    warnings = Column(JSONB, nullable=False, default=[], server_default="[]")
    processing_time = Column(Numeric(8, 3), nullable=True)  # seconds

    # Timestamps
    created_at = Column(
        DateTime(timezone=True),
        nullable=False,
        default=datetime.utcnow,
        server_default=func.now(),
    )
    updated_at = Column(
        DateTime(timezone=True),
        nullable=False,
        default=datetime.utcnow,
        server_default=func.now(),
        onupdate=func.now(),
    )

    # Relationships
    receipt = relationship("Receipt", back_populates="receipt_data")

    def __repr__(self) -> str:
        return (
            f"<ReceiptData(id={self.id}, receipt_id={self.receipt_id}, "
            f"merchant={self.merchant_name}, total={self.total})>"
        )


# Create indexes for ReceiptData table
Index("idx_receipt_data_receipt_id", ReceiptData.receipt_id)
Index("idx_receipt_data_merchant", ReceiptData.merchant_name)
Index("idx_receipt_data_date", ReceiptData.transaction_date)
Index("idx_receipt_data_items", ReceiptData.items, postgresql_using="gin")
Index("idx_receipt_data_confidence", ReceiptData.confidence_scores, postgresql_using="gin")
23
app/db/session.py
Normal file
@@ -0,0 +1,23 @@
"""
FastAPI dependency that provides a Store instance per request.

Local mode: opens a Store at settings.DB_PATH.
Cloud mode: opens a Store at the per-user DB path from the CloudUser session.
"""
from __future__ import annotations

from typing import Generator

from fastapi import Depends

from app.cloud_session import CloudUser, get_session
from app.db.store import Store


def get_store(session: CloudUser = Depends(get_session)) -> Generator[Store, None, None]:
    """FastAPI dependency — yields a Store for the current user, closes on completion."""
    store = Store(session.db)
    try:
        yield store
    finally:
        store.close()
262
app/db/store.py
Normal file
@@ -0,0 +1,262 @@
"""
SQLite data store for Kiwi.
Uses circuitforge-core for connection management and migrations.
"""
from __future__ import annotations

import json
import sqlite3
from pathlib import Path
from typing import Any

from circuitforge_core.db.base import get_connection
from circuitforge_core.db.migrations import run_migrations

MIGRATIONS_DIR = Path(__file__).parent / "migrations"


class Store:
    def __init__(self, db_path: Path, key: str = "") -> None:
        self.conn: sqlite3.Connection = get_connection(db_path, key)
        self.conn.execute("PRAGMA journal_mode=WAL")
        self.conn.execute("PRAGMA foreign_keys=ON")
        run_migrations(self.conn, MIGRATIONS_DIR)

    def close(self) -> None:
        self.conn.close()

    # ── helpers ───────────────────────────────────────────────────────────

    def _row_to_dict(self, row: sqlite3.Row) -> dict[str, Any]:
        d = dict(row)
        # Deserialise any TEXT columns that contain JSON
        for key in ("metadata", "nutrition_data", "source_data", "items",
                    "metrics", "improvement_suggestions", "confidence_scores",
                    "warnings"):
            if key in d and isinstance(d[key], str):
                try:
                    d[key] = json.loads(d[key])
                except (json.JSONDecodeError, TypeError):
                    pass
        return d

    def _fetch_one(self, sql: str, params: tuple = ()) -> dict[str, Any] | None:
        self.conn.row_factory = sqlite3.Row
        row = self.conn.execute(sql, params).fetchone()
        return self._row_to_dict(row) if row else None

    def _fetch_all(self, sql: str, params: tuple = ()) -> list[dict[str, Any]]:
        self.conn.row_factory = sqlite3.Row
        rows = self.conn.execute(sql, params).fetchall()
        return [self._row_to_dict(r) for r in rows]

    def _dump(self, value: Any) -> str:
        """Serialise a Python object to a JSON string for storage."""
        return json.dumps(value)

    # ── receipts ──────────────────────────────────────────────────────────

    def _insert_returning(self, sql: str, params: tuple = ()) -> dict[str, Any]:
        """Execute an INSERT ... RETURNING * and return the new row as a dict.

        Fetches the row BEFORE committing — SQLite requires the cursor to be
        fully consumed before the transaction is committed."""
        self.conn.row_factory = sqlite3.Row
        cur = self.conn.execute(sql, params)
        row = self._row_to_dict(cur.fetchone())
        self.conn.commit()
        return row

    def create_receipt(self, filename: str, original_path: str) -> dict[str, Any]:
        return self._insert_returning(
            "INSERT INTO receipts (filename, original_path) VALUES (?, ?) RETURNING *",
            (filename, original_path),
        )

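The fetch-before-commit order in `_insert_returning` matters: with sqlite3, the row produced by `INSERT ... RETURNING *` must be read off the cursor before the transaction commits. A standalone sketch of the pattern (illustrative table; `RETURNING` needs SQLite 3.35+, so it is version-guarded here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("CREATE TABLE receipts (id INTEGER PRIMARY KEY, filename TEXT)")

new_row = None
if sqlite3.sqlite_version_info >= (3, 35, 0):  # RETURNING support
    cur = conn.execute(
        "INSERT INTO receipts (filename) VALUES (?) RETURNING *",
        ("groceries.jpg",),
    )
    new_row = dict(cur.fetchone())  # consume the cursor BEFORE commit()
    conn.commit()
```

On supported versions `new_row` carries the generated primary key alongside the inserted values, which is what lets `create_receipt` return the full row in one round trip.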
    def get_receipt(self, receipt_id: int) -> dict[str, Any] | None:
        return self._fetch_one("SELECT * FROM receipts WHERE id = ?", (receipt_id,))

    def list_receipts(self, limit: int = 50, offset: int = 0) -> list[dict[str, Any]]:
        return self._fetch_all(
            "SELECT * FROM receipts ORDER BY created_at DESC LIMIT ? OFFSET ?",
            (limit, offset),
        )

    def update_receipt_status(self, receipt_id: int, status: str,
                              error: str | None = None) -> None:
        self.conn.execute(
            "UPDATE receipts SET status = ?, error = ?, updated_at = datetime('now') WHERE id = ?",
            (status, error, receipt_id),
        )
        self.conn.commit()

    def update_receipt_metadata(self, receipt_id: int, metadata: dict) -> None:
        self.conn.execute(
            "UPDATE receipts SET metadata = ?, updated_at = datetime('now') WHERE id = ?",
            (self._dump(metadata), receipt_id),
        )
        self.conn.commit()

    # ── quality assessments ───────────────────────────────────────────────

    def upsert_quality_assessment(self, receipt_id: int, overall_score: float,
                                  is_acceptable: bool, metrics: dict,
                                  suggestions: list) -> dict[str, Any]:
        self.conn.execute(
            """INSERT INTO quality_assessments
                   (receipt_id, overall_score, is_acceptable, metrics, improvement_suggestions)
               VALUES (?, ?, ?, ?, ?)
               ON CONFLICT (receipt_id) DO UPDATE SET
                   overall_score = excluded.overall_score,
                   is_acceptable = excluded.is_acceptable,
                   metrics = excluded.metrics,
                   improvement_suggestions = excluded.improvement_suggestions""",
            (receipt_id, overall_score, int(is_acceptable),
             self._dump(metrics), self._dump(suggestions)),
        )
        self.conn.commit()
        return self._fetch_one(
            "SELECT * FROM quality_assessments WHERE receipt_id = ?", (receipt_id,)
        )

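The upsert above relies on SQLite's `ON CONFLICT (col) DO UPDATE SET ... excluded.col` form, which requires a UNIQUE constraint on the conflict target; re-assessing a receipt then overwrites the existing row instead of failing. A minimal stdlib demonstration (illustrative table, not the app's migration schema):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
# UNIQUE(receipt_id) is what makes ON CONFLICT (receipt_id) legal.
conn.execute(
    "CREATE TABLE qa (receipt_id INTEGER UNIQUE, score REAL, metrics TEXT)"
)
sql = (
    "INSERT INTO qa (receipt_id, score, metrics) VALUES (?, ?, ?) "
    "ON CONFLICT (receipt_id) DO UPDATE SET "
    "score = excluded.score, metrics = excluded.metrics"
)
conn.execute(sql, (1, 55.0, json.dumps({"blur": 0.4})))
conn.execute(sql, (1, 82.5, json.dumps({"blur": 0.1})))  # second call updates in place
row = conn.execute("SELECT score, metrics FROM qa WHERE receipt_id = 1").fetchone()
```

`excluded` refers to the row that failed to insert, so the update always applies the newest values without a separate SELECT-then-UPDATE round trip.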
    # ── products ──────────────────────────────────────────────────────────

    def get_or_create_product(self, name: str, barcode: str | None = None,
                              **kwargs) -> tuple[dict[str, Any], bool]:
        """Returns (product, created). Looks up by barcode first, then name."""
        if barcode:
            existing = self._fetch_one(
                "SELECT * FROM products WHERE barcode = ?", (barcode,)
            )
            if existing:
                return existing, False

        existing = self._fetch_one("SELECT * FROM products WHERE name = ?", (name,))
        if existing:
            return existing, False

        row = self._insert_returning(
            """INSERT INTO products (name, barcode, brand, category, description,
                                     image_url, nutrition_data, source, source_data)
               VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?) RETURNING *""",
            (
                name, barcode,
                kwargs.get("brand"), kwargs.get("category"),
                kwargs.get("description"), kwargs.get("image_url"),
                self._dump(kwargs.get("nutrition_data", {})),
                kwargs.get("source", "manual"),
                self._dump(kwargs.get("source_data", {})),
            ),
        )
        return row, True

    def get_product(self, product_id: int) -> dict[str, Any] | None:
        return self._fetch_one("SELECT * FROM products WHERE id = ?", (product_id,))

    def list_products(self) -> list[dict[str, Any]]:
        return self._fetch_all("SELECT * FROM products ORDER BY name")

    # ── inventory ─────────────────────────────────────────────────────────

    def add_inventory_item(self, product_id: int, location: str,
                           quantity: float = 1.0, unit: str = "count",
                           **kwargs) -> dict[str, Any]:
        return self._insert_returning(
            """INSERT INTO inventory_items
                   (product_id, receipt_id, quantity, unit, location, sublocation,
                    purchase_date, expiration_date, notes, source)
               VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?) RETURNING *""",
            (
                product_id, kwargs.get("receipt_id"),
                quantity, unit, location, kwargs.get("sublocation"),
                kwargs.get("purchase_date"), kwargs.get("expiration_date"),
                kwargs.get("notes"), kwargs.get("source", "manual"),
            ),
        )

    def get_inventory_item(self, item_id: int) -> dict[str, Any] | None:
        return self._fetch_one(
            """SELECT i.*, p.name as product_name, p.barcode, p.category
               FROM inventory_items i
               JOIN products p ON p.id = i.product_id
               WHERE i.id = ?""",
            (item_id,),
        )

    def list_inventory(self, location: str | None = None,
                       status: str = "available") -> list[dict[str, Any]]:
        if location:
            return self._fetch_all(
                """SELECT i.*, p.name as product_name, p.barcode, p.category
                   FROM inventory_items i
                   JOIN products p ON p.id = i.product_id
                   WHERE i.status = ? AND i.location = ?
                   ORDER BY i.expiration_date ASC NULLS LAST""",
                (status, location),
            )
        return self._fetch_all(
            """SELECT i.*, p.name as product_name, p.barcode, p.category
               FROM inventory_items i
               JOIN products p ON p.id = i.product_id
               WHERE i.status = ?
               ORDER BY i.expiration_date ASC NULLS LAST""",
            (status,),
        )

    def update_inventory_item(self, item_id: int, **kwargs) -> dict[str, Any] | None:
        allowed = {"quantity", "unit", "location", "sublocation",
                   "expiration_date", "status", "notes", "consumed_at"}
        updates = {k: v for k, v in kwargs.items() if k in allowed}
        if not updates:
            return self.get_inventory_item(item_id)
        sets = ", ".join(f"{k} = ?" for k in updates)
        values = list(updates.values()) + [item_id]
        self.conn.execute(
            f"UPDATE inventory_items SET {sets}, updated_at = datetime('now') WHERE id = ?",
            values,
        )
        self.conn.commit()
        return self.get_inventory_item(item_id)

    def expiring_soon(self, days: int = 7) -> list[dict[str, Any]]:
        return self._fetch_all(
            """SELECT i.*, p.name as product_name, p.category
               FROM inventory_items i
               JOIN products p ON p.id = i.product_id
               WHERE i.status = 'available'
                 AND i.expiration_date IS NOT NULL
                 AND date(i.expiration_date) <= date('now', ? || ' days')
               ORDER BY i.expiration_date ASC""",
            (str(days),),
        )

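The `date('now', ? || ' days')` expression in `expiring_soon` builds a SQLite date modifier from the bound parameter, so `days=7` becomes `date('now', '7 days')`, the date seven days from now. A self-contained sketch of that comparison (illustrative table and rows):

```python
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE it (name TEXT, expiration_date TEXT)")
today = date.today()
conn.executemany(
    "INSERT INTO it VALUES (?, ?)",
    [
        ("milk", (today + timedelta(days=2)).isoformat()),   # inside the window
        ("rice", (today + timedelta(days=300)).isoformat()),  # well outside it
    ],
)
days = 7
# '?' || ' days' turns the integer parameter into a modifier string like '7 days'.
soon = conn.execute(
    "SELECT name FROM it WHERE date(expiration_date) <= date('now', ? || ' days')",
    (str(days),),
).fetchall()
```

Note that `'now'` in SQLite is UTC, so dates written from local time can shift by a day at the window's edge; the ORM models sidestep this by storing timezone-aware timestamps.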
    # ── receipt_data ──────────────────────────────────────────────────────

    def upsert_receipt_data(self, receipt_id: int, data: dict) -> dict[str, Any]:
        fields = [
            "merchant_name", "merchant_address", "merchant_phone", "merchant_email",
            "merchant_website", "merchant_tax_id", "transaction_date", "transaction_time",
            "receipt_number", "register_number", "cashier_name", "transaction_id",
            "items", "subtotal", "tax", "discount", "tip", "total",
            "payment_method", "amount_paid", "change_given",
            "raw_text", "confidence_scores", "warnings", "processing_time",
        ]
        json_fields = {"items", "confidence_scores", "warnings"}
        cols = ", ".join(fields)
        placeholders = ", ".join("?" for _ in fields)
        values = [
            self._dump(data.get(f)) if f in json_fields and data.get(f) is not None
            else data.get(f)
            for f in fields
        ]
        self.conn.execute(
            f"""INSERT INTO receipt_data (receipt_id, {cols})
                VALUES (?, {placeholders})
                ON CONFLICT (receipt_id) DO UPDATE SET
                    {', '.join(f'{f} = excluded.{f}' for f in fields)},
                    updated_at = datetime('now')""",
            [receipt_id] + values,
        )
        self.conn.commit()
        return self._fetch_one(
            "SELECT * FROM receipt_data WHERE receipt_id = ?", (receipt_id,)
        )
44
app/main.py
Normal file
@@ -0,0 +1,44 @@
#!/usr/bin/env python
# app/main.py

import logging
from contextlib import asynccontextmanager

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from app.api.routes import api_router
from app.core.config import settings

logger = logging.getLogger(__name__)


@asynccontextmanager
async def lifespan(app: FastAPI):
    logger.info("Starting Kiwi API...")
    settings.ensure_dirs()
    yield
    logger.info("Kiwi API shutting down.")


app = FastAPI(
    title=settings.PROJECT_NAME,
    description="Pantry tracking + leftover recipe suggestions",
    version="0.1.0",
    lifespan=lifespan,
)

app.add_middleware(
    CORSMiddleware,
    allow_origins=settings.CORS_ORIGINS,
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

app.include_router(api_router, prefix=settings.API_PREFIX)


@app.get("/")
async def root():
    return {"service": "kiwi-api", "docs": "/docs"}
5
app/models/__init__.py
Normal file
@@ -0,0 +1,5 @@
# app/models/__init__.py
"""
Data models for Kiwi.
Contains domain models and Pydantic schemas.
"""
5
app/models/domain/__init__.py
Normal file
@@ -0,0 +1,5 @@
# app/models/domain/__init__.py
"""
Domain models for Kiwi.
These represent the core business entities.
"""
4
app/models/schemas/__init__.py
Normal file
@@ -0,0 +1,4 @@
from app.models.schemas.receipt import ReceiptResponse
from app.models.schemas.quality import QualityAssessment

__all__ = ["ReceiptResponse", "QualityAssessment"]
143
app/models/schemas/inventory.py
Normal file
@@ -0,0 +1,143 @@
"""Pydantic schemas for inventory management (integer IDs, SQLite-compatible)."""

from __future__ import annotations

from datetime import date, datetime
from typing import Any, Dict, List, Optional

from pydantic import BaseModel, Field


# ── Tags ──────────────────────────────────────────────────────────────────────

class TagCreate(BaseModel):
    name: str = Field(..., max_length=100)
    slug: str = Field(..., max_length=100)
    description: Optional[str] = None
    color: Optional[str] = Field(None, max_length=7)
    category: Optional[str] = None


class TagResponse(BaseModel):
    id: int
    name: str
    slug: str
    description: Optional[str]
    color: Optional[str]
    category: Optional[str]
    created_at: str
    updated_at: str

    model_config = {"from_attributes": True}


# ── Products ──────────────────────────────────────────────────────────────────

class ProductCreate(BaseModel):
    name: str = Field(..., max_length=500)
    barcode: Optional[str] = Field(None, max_length=50)
    brand: Optional[str] = None
    category: Optional[str] = None
    description: Optional[str] = None
    image_url: Optional[str] = None
    nutrition_data: Dict[str, Any] = Field(default_factory=dict)
    source: str = "manual"
    source_data: Dict[str, Any] = Field(default_factory=dict)


class ProductUpdate(BaseModel):
    name: Optional[str] = None
    brand: Optional[str] = None
    category: Optional[str] = None
    description: Optional[str] = None
    image_url: Optional[str] = None
    nutrition_data: Optional[Dict[str, Any]] = None


class ProductResponse(BaseModel):
    id: int
    barcode: Optional[str]
    name: str
    brand: Optional[str]
    category: Optional[str]
    description: Optional[str]
    image_url: Optional[str]
    nutrition_data: Dict[str, Any]
    source: str
    created_at: str
    updated_at: str

    model_config = {"from_attributes": True}


# ── Inventory Items ───────────────────────────────────────────────────────────

class InventoryItemCreate(BaseModel):
    product_id: int
    quantity: float = Field(default=1.0, gt=0)
    unit: str = "count"
    location: str
    sublocation: Optional[str] = None
    purchase_date: Optional[date] = None
    expiration_date: Optional[date] = None
    notes: Optional[str] = None
    source: str = "manual"


class InventoryItemUpdate(BaseModel):
    quantity: Optional[float] = Field(None, gt=0)
    unit: Optional[str] = None
    location: Optional[str] = None
    sublocation: Optional[str] = None
    expiration_date: Optional[date] = None
    status: Optional[str] = None
    notes: Optional[str] = None


class InventoryItemResponse(BaseModel):
    id: int
    product_id: int
    product_name: Optional[str] = None
    barcode: Optional[str] = None
    category: Optional[str] = None
    quantity: float
    unit: str
    location: str
    sublocation: Optional[str]
    purchase_date: Optional[str]
    expiration_date: Optional[str]
    status: str
    notes: Optional[str]
    source: str
    created_at: str
    updated_at: str

    model_config = {"from_attributes": True}


# ── Barcode scan ──────────────────────────────────────────────────────────────

class BarcodeScanResult(BaseModel):
    barcode: str
    barcode_type: str
    product: Optional[ProductResponse]
    inventory_item: Optional[InventoryItemResponse]
    added_to_inventory: bool
    message: str


class BarcodeScanResponse(BaseModel):
    success: bool
    barcodes_found: int
    results: List[BarcodeScanResult]
    message: str


# ── Stats ─────────────────────────────────────────────────────────────────────

class InventoryStats(BaseModel):
    total_items: int
    available_items: int
    expiring_soon: int
    expired_items: int
    locations: Dict[str, int]
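`TagCreate` expects the caller to supply the URL-safe `slug` alongside the display name. A minimal stdlib sketch of deriving one from a tag name (the `slugify` helper is illustrative, not part of the app):

```python
import re
import unicodedata

def slugify(name: str) -> str:
    """Lowercase, ASCII-fold, and hyphenate a tag name for the slug column."""
    # Decompose accented characters, then drop anything outside ASCII.
    ascii_name = (
        unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode()
    )
    # Collapse every run of non-alphanumerics into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", ascii_name.lower()).strip("-")
    return slug[:100]  # slug column is String(100)
```

For example, `slugify("Gluten-Free")` yields `"gluten-free"`, matching the unique `slug` column's constraints.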
138
app/models/schemas/ocr.py
Normal file
@@ -0,0 +1,138 @@
#!/usr/bin/env python
"""
Pydantic schemas for OCR data models.
"""

from datetime import datetime, date, time
from typing import Optional, List, Dict, Any
from uuid import UUID
from pydantic import BaseModel, Field, validator


class MerchantInfo(BaseModel):
    """Merchant/store information from receipt."""
    name: Optional[str] = None
    address: Optional[str] = None
    phone: Optional[str] = None
    email: Optional[str] = None
    website: Optional[str] = None
    tax_id: Optional[str] = None


class TransactionInfo(BaseModel):
    """Transaction details from receipt."""
    date: Optional[date] = None
    time: Optional[time] = None
    receipt_number: Optional[str] = None
    register: Optional[str] = None
    cashier: Optional[str] = None
    transaction_id: Optional[str] = None


class ReceiptItem(BaseModel):
    """Individual line item from receipt."""
    name: str
    quantity: float = 1.0
    unit_price: Optional[float] = None
    total_price: float
    category: Optional[str] = None
    tax_code: Optional[str] = None
    discount: Optional[float] = 0.0
    barcode: Optional[str] = None
    notes: Optional[str] = None


class ReceiptTotals(BaseModel):
    """Financial totals from receipt."""
    subtotal: float
    tax: Optional[float] = 0.0
    discount: Optional[float] = 0.0
    tip: Optional[float] = 0.0
    total: float
    payment_method: Optional[str] = None
    amount_paid: Optional[float] = None
    change: Optional[float] = 0.0
    calculated_subtotal: Optional[float] = None  # For validation


class ConfidenceScores(BaseModel):
    """Confidence scores for extracted data."""
    overall: float = Field(ge=0.0, le=1.0)
    merchant: Optional[float] = Field(default=0.5, ge=0.0, le=1.0)
    items: Optional[float] = Field(default=0.5, ge=0.0, le=1.0)
    totals: Optional[float] = Field(default=0.5, ge=0.0, le=1.0)
    transaction: Optional[float] = Field(default=0.5, ge=0.0, le=1.0)


class OCRResult(BaseModel):
    """Complete OCR extraction result."""
    merchant: MerchantInfo
    transaction: TransactionInfo
    items: List[ReceiptItem]
    totals: ReceiptTotals
    confidence: ConfidenceScores
    raw_text: Optional[str] = None
    warnings: List[str] = Field(default_factory=list)
    processing_time: Optional[float] = None  # seconds


class ReceiptDataCreate(BaseModel):
    """Schema for creating receipt data."""
    receipt_id: UUID
    merchant_name: Optional[str] = None
    merchant_address: Optional[str] = None
    merchant_phone: Optional[str] = None
    transaction_date: Optional[date] = None
    transaction_time: Optional[time] = None
    receipt_number: Optional[str] = None
    items: List[Dict[str, Any]] = Field(default_factory=list)
    subtotal: Optional[float] = None
    tax: Optional[float] = None
    tip: Optional[float] = None
    total: Optional[float] = None
    payment_method: Optional[str] = None
    raw_text: Optional[str] = None
    confidence_scores: Optional[Dict[str, float]] = None
    warnings: List[str] = Field(default_factory=list)


class ReceiptDataResponse(BaseModel):
    """Schema for receipt data response."""
    id: UUID
    receipt_id: UUID
    merchant_name: Optional[str]
    merchant_address: Optional[str]
    merchant_phone: Optional[str]
    transaction_date: Optional[date]
    transaction_time: Optional[time]
    receipt_number: Optional[str]
    items: List[Dict[str, Any]]
    subtotal: Optional[float]
    tax: Optional[float]
|
||||
tip: Optional[float]
|
||||
total: Optional[float]
|
||||
payment_method: Optional[str]
|
||||
raw_text: Optional[str]
|
||||
confidence_scores: Optional[Dict[str, float]]
|
||||
warnings: List[str]
|
||||
created_at: datetime
|
||||
updated_at: datetime
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
|
||||
|
||||
class OCRStatusResponse(BaseModel):
|
||||
"""OCR processing status response."""
|
||||
receipt_id: UUID
|
||||
ocr_completed: bool
|
||||
has_data: bool
|
||||
confidence: Optional[float] = None
|
||||
item_count: Optional[int] = None
|
||||
warnings: List[str] = Field(default_factory=list)
|
||||
|
||||
|
||||
class OCRTriggerRequest(BaseModel):
|
||||
"""Request to trigger OCR processing."""
|
||||
force_reprocess: bool = False
|
||||
use_quantization: bool = False
|
||||
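The `calculated_subtotal` field on `ReceiptTotals` is marked "For validation". A minimal sketch of how such a cross-check might work, assuming a tolerance-based comparison between the summed line items and the printed subtotal (the `validate_totals` helper and its tolerance are illustrative, not code from this repo):

```python
def validate_totals(item_totals, subtotal, tolerance=0.02):
    """Compare summed line-item totals against the receipt's printed subtotal.

    Returns (calculated_subtotal, warning); warning is None when they agree.
    """
    calculated = round(sum(item_totals), 2)
    if abs(calculated - subtotal) > tolerance:
        return calculated, (
            f"Subtotal mismatch: items sum to {calculated}, receipt says {subtotal}"
        )
    return calculated, None


# Line items agree with the printed subtotal:
calc, warning = validate_totals([2.50, 3.99, 1.01], 7.50)
# → (7.5, None)
```

A result like this could populate `calculated_subtotal` and append any mismatch message to `OCRResult.warnings`.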
17  app/models/schemas/quality.py  Normal file
@@ -0,0 +1,17 @@

"""Quality assessment schemas (integer IDs, SQLite-compatible)."""
from __future__ import annotations

from typing import Any, Dict, List
from pydantic import BaseModel


class QualityAssessment(BaseModel):
    id: int
    receipt_id: int
    overall_score: float
    is_acceptable: bool
    metrics: Dict[str, Any] = {}
    improvement_suggestions: List[str] = []
    created_at: str

    model_config = {"from_attributes": True}
46  app/models/schemas/receipt.py  Normal file
@@ -0,0 +1,46 @@

"""Receipt schemas (integer IDs, SQLite-compatible)."""
from __future__ import annotations

from typing import Any, Dict, List, Optional
from pydantic import BaseModel, Field


class ReceiptResponse(BaseModel):
    id: int
    filename: str
    status: str
    error: Optional[str] = None
    metadata: Dict[str, Any] = {}
    created_at: str
    updated_at: str

    model_config = {"from_attributes": True}


class ApproveOCRRequest(BaseModel):
    """Approve staged OCR items for inventory population.

    item_indices: which items (by 0-based index) to approve.
    Omit or pass null to approve all items.
    location: pantry location for created inventory items.
    """
    item_indices: Optional[List[int]] = Field(
        default=None,
        description="0-based indices of items to approve. Null = approve all.",
    )
    location: str = Field(default="pantry")


class ApprovedInventoryItem(BaseModel):
    inventory_id: int
    product_name: str
    quantity: float
    location: str
    expiration_date: Optional[str] = None


class ApproveOCRResponse(BaseModel):
    receipt_id: int
    approved: int
    skipped: int
    inventory_items: List[ApprovedInventoryItem]
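The `item_indices` contract on `ApproveOCRRequest` (null approves everything, otherwise a 0-based selection) reduces to a small filtering step. A sketch under the assumption that out-of-range indices are silently ignored; `select_items` is an illustrative helper, not code from the service layer:

```python
from typing import List, Optional


def select_items(items: List[dict], item_indices: Optional[List[int]]) -> List[dict]:
    """Apply ApproveOCRRequest.item_indices: None means approve all items."""
    if item_indices is None:
        return list(items)
    # Drop out-of-range indices; preserve original receipt order.
    valid = {i for i in item_indices if 0 <= i < len(items)}
    return [items[i] for i in sorted(valid)]


staged = [{"name": "milk"}, {"name": "bread"}, {"name": "eggs"}]
select_items(staged, None)        # all three items
select_items(staged, [2, 0, 99])  # milk and eggs; out-of-range 99 is ignored
```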
8  app/services/__init__.py  Normal file
@@ -0,0 +1,8 @@

# app/services/__init__.py
"""
Business logic services for Kiwi.
"""

from app.services.receipt_service import ReceiptService

__all__ = ["ReceiptService"]
365  app/services/barcode_scanner.py  Normal file
@@ -0,0 +1,365 @@

"""
Barcode scanning service using pyzbar.

This module provides functionality to detect and decode barcodes
from images (UPC, EAN, QR codes, etc.).
"""

import cv2
import numpy as np
from pyzbar import pyzbar
from pathlib import Path
from typing import List, Dict, Any, Optional
import logging

logger = logging.getLogger(__name__)


class BarcodeScanner:
    """
    Service for scanning barcodes from images.

    Supports various barcode formats:
    - UPC-A, UPC-E
    - EAN-8, EAN-13
    - Code 39, Code 128
    - QR codes
    - And more via pyzbar/libzbar
    """

    def scan_image(self, image_path: Path) -> List[Dict[str, Any]]:
        """
        Scan an image for barcodes.

        Args:
            image_path: Path to the image file

        Returns:
            List of detected barcodes, each as a dictionary with:
            - data: Barcode data (string)
            - type: Barcode type (e.g., 'EAN13', 'QRCODE')
            - quality: Quality score (0-100)
            - rect: Bounding box (x, y, width, height)
        """
        try:
            # Read image
            image = cv2.imread(str(image_path))
            if image is None:
                logger.error(f"Failed to load image: {image_path}")
                return []

            # Convert to grayscale for better detection
            gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

            # Try multiple preprocessing techniques and rotations for better detection
            barcodes = []

            # 1. Try on original grayscale
            barcodes.extend(self._detect_barcodes(gray, image))

            # 2. Try with adaptive thresholding (helps with poor lighting)
            if not barcodes:
                thresh = cv2.adaptiveThreshold(
                    gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                    cv2.THRESH_BINARY, 11, 2
                )
                barcodes.extend(self._detect_barcodes(thresh, image))

            # 3. Try with sharpening (helps with blurry images)
            if not barcodes:
                kernel = np.array([[-1, -1, -1],
                                   [-1, 9, -1],
                                   [-1, -1, -1]])
                sharpened = cv2.filter2D(gray, -1, kernel)
                barcodes.extend(self._detect_barcodes(sharpened, image))

            # 4. Try rotations if still no barcodes found (handles tilted/rotated barcodes)
            if not barcodes:
                logger.info("No barcodes found in standard orientation, trying rotations...")
                # Try incremental angles: 30°, 60°, 90° (covers the 0-90° range).
                # 0° was already tried; 180° reads the same as 0°, and 270° the same as 90°.
                for angle in [30, 60, 90]:
                    rotated_gray = self._rotate_image(gray, angle)
                    rotated_color = self._rotate_image(image, angle)
                    detected = self._detect_barcodes(rotated_gray, rotated_color)
                    if detected:
                        logger.info(f"Found barcode(s) at {angle}° rotation")
                        barcodes.extend(detected)
                        break  # Stop after first successful rotation

            # Remove duplicates (same data)
            unique_barcodes = self._deduplicate_barcodes(barcodes)

            logger.info(f"Found {len(unique_barcodes)} barcode(s) in {image_path}")
            return unique_barcodes

        except Exception as e:
            logger.error(f"Error scanning image {image_path}: {e}")
            return []

    def _detect_barcodes(
        self,
        image: np.ndarray,
        original_image: np.ndarray
    ) -> List[Dict[str, Any]]:
        """
        Detect barcodes in a preprocessed image.

        Args:
            image: Preprocessed image (grayscale)
            original_image: Original color image (for quality assessment)

        Returns:
            List of detected barcodes
        """
        detected = pyzbar.decode(image)
        barcodes = []

        for barcode in detected:
            # Decode barcode data
            barcode_data = barcode.data.decode("utf-8")
            barcode_type = barcode.type

            # Get bounding box
            rect = barcode.rect
            bbox = {
                "x": rect.left,
                "y": rect.top,
                "width": rect.width,
                "height": rect.height,
            }

            # Assess quality of barcode region
            quality = self._assess_barcode_quality(original_image, bbox)

            barcodes.append({
                "data": barcode_data,
                "type": barcode_type,
                "quality": quality,
                "rect": bbox,
            })

        return barcodes

    def _assess_barcode_quality(
        self,
        image: np.ndarray,
        bbox: Dict[str, int]
    ) -> int:
        """
        Assess the quality of a detected barcode.

        Args:
            image: Original image
            bbox: Bounding box of barcode

        Returns:
            Quality score (0-100)
        """
        try:
            # Extract barcode region
            x, y, w, h = bbox["x"], bbox["y"], bbox["width"], bbox["height"]

            # Add padding
            pad = 10
            y1 = max(0, y - pad)
            y2 = min(image.shape[0], y + h + pad)
            x1 = max(0, x - pad)
            x2 = min(image.shape[1], x + w + pad)

            region = image[y1:y2, x1:x2]

            if region.size == 0:
                return 50

            # Convert to grayscale if needed
            if len(region.shape) == 3:
                region = cv2.cvtColor(region, cv2.COLOR_BGR2GRAY)

            # Calculate sharpness (Laplacian variance)
            laplacian_var = cv2.Laplacian(region, cv2.CV_64F).var()
            sharpness_score = min(100, laplacian_var / 10)  # Normalize

            # Calculate contrast
            min_val, max_val = region.min(), region.max()
            contrast = (max_val - min_val) / 255.0 * 100

            # Calculate size score (larger is better, up to a point)
            area = w * h
            size_score = min(100, area / 100)  # Normalize

            # Weighted average
            quality = (sharpness_score * 0.4 + contrast * 0.4 + size_score * 0.2)

            return int(quality)

        except Exception as e:
            logger.warning(f"Error assessing barcode quality: {e}")
            return 50

    def _rotate_image(self, image: np.ndarray, angle: float) -> np.ndarray:
        """
        Rotate an image by a given angle.

        Args:
            image: Input image
            angle: Rotation angle in degrees (any angle, but optimized for 90° increments)

        Returns:
            Rotated image
        """
        # Use fast optimized rotation for common angles
        if angle == 90:
            return cv2.rotate(image, cv2.ROTATE_90_CLOCKWISE)
        elif angle == 180:
            return cv2.rotate(image, cv2.ROTATE_180)
        elif angle == 270:
            return cv2.rotate(image, cv2.ROTATE_90_COUNTERCLOCKWISE)
        elif angle == 0:
            return image
        else:
            # For arbitrary angles, use affine transformation
            (h, w) = image.shape[:2]
            center = (w // 2, h // 2)

            # Get rotation matrix
            M = cv2.getRotationMatrix2D(center, angle, 1.0)

            # Calculate new bounding dimensions
            cos = np.abs(M[0, 0])
            sin = np.abs(M[0, 1])
            new_w = int((h * sin) + (w * cos))
            new_h = int((h * cos) + (w * sin))

            # Adjust rotation matrix for new dimensions
            M[0, 2] += (new_w / 2) - center[0]
            M[1, 2] += (new_h / 2) - center[1]

            # Perform rotation
            return cv2.warpAffine(image, M, (new_w, new_h),
                                  flags=cv2.INTER_CUBIC,
                                  borderMode=cv2.BORDER_REPLICATE)

    def _deduplicate_barcodes(
        self,
        barcodes: List[Dict[str, Any]]
    ) -> List[Dict[str, Any]]:
        """
        Remove duplicate barcodes (same data).

        If the same barcode is detected multiple times, keep the
        instance with the highest quality score.

        Args:
            barcodes: List of detected barcodes

        Returns:
            Deduplicated list
        """
        seen = {}
        for barcode in barcodes:
            data = barcode["data"]
            if data not in seen or barcode["quality"] > seen[data]["quality"]:
                seen[data] = barcode

        return list(seen.values())

    def scan_from_bytes(self, image_bytes: bytes) -> List[Dict[str, Any]]:
        """
        Scan barcodes from image bytes (uploaded file).

        Args:
            image_bytes: Image data as bytes

        Returns:
            List of detected barcodes
        """
        try:
            # Convert bytes to numpy array
            nparr = np.frombuffer(image_bytes, np.uint8)
            image = cv2.imdecode(nparr, cv2.IMREAD_COLOR)

            if image is None:
                logger.error("Failed to decode image from bytes")
                return []

            # Convert to grayscale
            gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

            # Try multiple approaches for better detection
            barcodes = []

            # 1. Try original orientation
            barcodes.extend(self._detect_barcodes(gray, image))

            # 2. Try with adaptive thresholding
            if not barcodes:
                thresh = cv2.adaptiveThreshold(
                    gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                    cv2.THRESH_BINARY, 11, 2
                )
                barcodes.extend(self._detect_barcodes(thresh, image))

            # 3. Try rotations if still no barcodes found
            if not barcodes:
                logger.info("No barcodes found in uploaded image, trying rotations...")
                # Try incremental angles: 30°, 60°, 90° (covers the 0-90° range)
                for angle in [30, 60, 90]:
                    rotated_gray = self._rotate_image(gray, angle)
                    rotated_color = self._rotate_image(image, angle)
                    detected = self._detect_barcodes(rotated_gray, rotated_color)
                    if detected:
                        logger.info(f"Found barcode(s) in uploaded image at {angle}° rotation")
                        barcodes.extend(detected)
                        break

            return self._deduplicate_barcodes(barcodes)

        except Exception as e:
            logger.error(f"Error scanning image from bytes: {e}")
            return []

    def validate_barcode(self, barcode: str, barcode_type: str) -> bool:
        """
        Validate a barcode using check digits (for EAN/UPC).

        Args:
            barcode: Barcode string
            barcode_type: Type of barcode (e.g., 'EAN13', 'UPCA')

        Returns:
            True if valid, False otherwise
        """
        if barcode_type in ["EAN13", "UPCA"]:
            return self._validate_ean13(barcode)
        elif barcode_type == "EAN8":
            return self._validate_ean8(barcode)

        # For other types, assume valid if detected
        return True

    def _validate_ean13(self, barcode: str) -> bool:
        """Validate EAN-13 barcode using check digit."""
        if len(barcode) != 13 or not barcode.isdigit():
            return False

        # Calculate check digit
        odd_sum = sum(int(barcode[i]) for i in range(0, 12, 2))
        even_sum = sum(int(barcode[i]) for i in range(1, 12, 2))
        total = odd_sum + (even_sum * 3)
        check_digit = (10 - (total % 10)) % 10

        return int(barcode[12]) == check_digit

    def _validate_ean8(self, barcode: str) -> bool:
        """Validate EAN-8 barcode using check digit."""
        if len(barcode) != 8 or not barcode.isdigit():
            return False

        # Calculate check digit
        odd_sum = sum(int(barcode[i]) for i in range(1, 7, 2))
        even_sum = sum(int(barcode[i]) for i in range(0, 7, 2))
        total = (odd_sum * 3) + even_sum
        check_digit = (10 - (total % 10)) % 10

        return int(barcode[7]) == check_digit
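The EAN-13 check-digit rule that `_validate_ean13` implements (digits at even 0-based positions weighted 1, odd positions weighted 3, check digit makes the weighted sum a multiple of 10) can be exercised standalone:

```python
def ean13_check_digit(first12: str) -> int:
    """Compute the 13th digit for a 12-digit EAN-13 payload."""
    odd_sum = sum(int(first12[i]) for i in range(0, 12, 2))   # positions 0, 2, ..., 10
    even_sum = sum(int(first12[i]) for i in range(1, 12, 2))  # positions 1, 3, ..., 11
    return (10 - (odd_sum + even_sum * 3) % 10) % 10


# "4006381333931" is a commonly cited valid EAN-13:
# weighted sum of the first 12 digits is 89, so the check digit is 1.
assert ean13_check_digit("400638133393") == 1
```

A full barcode is then valid exactly when its last digit equals this computed value, which is what the scanner's validator checks.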
306  app/services/expiration_predictor.py  Normal file
@@ -0,0 +1,306 @@

"""
Expiration Date Prediction Service.

Predicts expiration dates for food items based on category and storage location.
Fast path: deterministic lookup table (USDA FoodKeeper / FDA guidelines).
Fallback path: LLMRouter — only fires for unknown products when the tier allows
it and an LLM backend is configured.
"""

import logging
import re
from datetime import date, timedelta
from typing import Optional, List

from circuitforge_core.llm.router import LLMRouter
from app.tiers import can_use

logger = logging.getLogger(__name__)


class ExpirationPredictor:
    """Predict expiration dates based on product category and storage location."""

    # Default shelf life in days by category and location
    # Sources: USDA FoodKeeper app, FDA guidelines
    SHELF_LIFE = {
        # Dairy
        'dairy': {'fridge': 7, 'freezer': 90},
        'milk': {'fridge': 7, 'freezer': 90},
        'cheese': {'fridge': 21, 'freezer': 180},
        'yogurt': {'fridge': 14, 'freezer': 60},
        'butter': {'fridge': 30, 'freezer': 365},
        'cream': {'fridge': 5, 'freezer': 60},
        # Meat & Poultry
        'meat': {'fridge': 3, 'freezer': 180},
        'beef': {'fridge': 3, 'freezer': 270},
        'pork': {'fridge': 3, 'freezer': 180},
        'lamb': {'fridge': 3, 'freezer': 270},
        'poultry': {'fridge': 2, 'freezer': 270},
        'chicken': {'fridge': 2, 'freezer': 270},
        'turkey': {'fridge': 2, 'freezer': 270},
        'ground_meat': {'fridge': 2, 'freezer': 120},
        # Seafood
        'fish': {'fridge': 2, 'freezer': 180},
        'seafood': {'fridge': 2, 'freezer': 180},
        'shrimp': {'fridge': 2, 'freezer': 180},
        'salmon': {'fridge': 2, 'freezer': 180},
        # Eggs
        'eggs': {'fridge': 35, 'freezer': None},
        # Produce
        'vegetables': {'fridge': 7, 'pantry': 5, 'freezer': 270},
        'fruits': {'fridge': 7, 'pantry': 5, 'freezer': 365},
        'leafy_greens': {'fridge': 5, 'freezer': 270},
        'berries': {'fridge': 5, 'freezer': 270},
        'apples': {'fridge': 30, 'pantry': 14},
        'bananas': {'pantry': 5, 'fridge': 7},
        'citrus': {'fridge': 21, 'pantry': 7},
        # Bread & Bakery
        'bread': {'pantry': 5, 'freezer': 90},
        'bakery': {'pantry': 3, 'fridge': 7, 'freezer': 90},
        # Frozen
        'frozen_foods': {'freezer': 180},
        'frozen_vegetables': {'freezer': 270},
        'frozen_fruit': {'freezer': 365},
        'ice_cream': {'freezer': 60},
        # Pantry Staples
        'canned_goods': {'pantry': 730, 'cabinet': 730},
        'dry_goods': {'pantry': 365, 'cabinet': 365},
        'pasta': {'pantry': 730, 'cabinet': 730},
        'rice': {'pantry': 730, 'cabinet': 730},
        'flour': {'pantry': 180, 'cabinet': 180},
        'sugar': {'pantry': 730, 'cabinet': 730},
        'cereal': {'pantry': 180, 'cabinet': 180},
        'chips': {'pantry': 90, 'cabinet': 90},
        'cookies': {'pantry': 90, 'cabinet': 90},
        # Condiments
        'condiments': {'fridge': 90, 'pantry': 180},
        'ketchup': {'fridge': 180, 'pantry': 365},
        'mustard': {'fridge': 365, 'pantry': 365},
        'mayo': {'fridge': 60, 'pantry': 180},
        'salad_dressing': {'fridge': 90, 'pantry': 180},
        'soy_sauce': {'fridge': 730, 'pantry': 730},
        # Beverages
        'beverages': {'fridge': 14, 'pantry': 180},
        'juice': {'fridge': 7, 'freezer': 90},
        'soda': {'fridge': 270, 'pantry': 270},
        'water': {'fridge': 365, 'pantry': 365},
        # Other
        'deli_meat': {'fridge': 5, 'freezer': 60},
        'leftovers': {'fridge': 4, 'freezer': 90},
        'prepared_foods': {'fridge': 4, 'freezer': 90},
    }

    CATEGORY_KEYWORDS = {
        'milk': ['milk', 'whole milk', '2% milk', 'skim milk', 'almond milk', 'oat milk', 'soy milk'],
        'cheese': ['cheese', 'cheddar', 'mozzarella', 'swiss', 'parmesan', 'feta', 'gouda'],
        'yogurt': ['yogurt', 'greek yogurt', 'yoghurt'],
        'butter': ['butter', 'margarine'],
        'cream': ['cream', 'heavy cream', 'whipping cream', 'sour cream'],
        'eggs': ['eggs', 'egg'],
        'beef': ['beef', 'steak', 'roast', 'brisket', 'ribeye', 'sirloin'],
        'pork': ['pork', 'bacon', 'ham', 'sausage', 'pork chop'],
        'chicken': ['chicken', 'chicken breast', 'chicken thigh', 'chicken wings'],
        'turkey': ['turkey', 'turkey breast', 'ground turkey'],
        'ground_meat': ['ground beef', 'ground pork', 'ground chicken', 'hamburger'],
        'fish': ['fish', 'cod', 'tilapia', 'halibut'],
        'salmon': ['salmon'],
        'shrimp': ['shrimp', 'prawns'],
        'leafy_greens': ['lettuce', 'spinach', 'kale', 'arugula', 'mixed greens', 'salad'],
        'berries': ['strawberries', 'blueberries', 'raspberries', 'blackberries'],
        'apples': ['apple', 'apples'],
        'bananas': ['banana', 'bananas'],
        'citrus': ['orange', 'lemon', 'lime', 'grapefruit', 'tangerine'],
        'bread': ['bread', 'loaf', 'baguette', 'roll', 'bagel', 'bun'],
        'bakery': ['muffin', 'croissant', 'donut', 'danish', 'pastry'],
        'deli_meat': ['deli', 'sliced turkey', 'sliced ham', 'lunch meat', 'cold cuts'],
        'frozen_vegetables': ['frozen veg', 'frozen corn', 'frozen peas', 'frozen broccoli'],
        'frozen_fruit': ['frozen berries', 'frozen mango', 'frozen strawberries'],
        'ice_cream': ['ice cream', 'gelato', 'frozen yogurt'],
        'pasta': ['pasta', 'spaghetti', 'penne', 'macaroni', 'noodles'],
        'rice': ['rice', 'brown rice', 'white rice', 'jasmine'],
        'cereal': ['cereal', 'granola', 'oatmeal'],
        'chips': ['chips', 'crisps', 'tortilla chips'],
        'cookies': ['cookies', 'biscuits', 'crackers'],
        'ketchup': ['ketchup', 'catsup'],
        'mustard': ['mustard'],
        'mayo': ['mayo', 'mayonnaise', 'miracle whip'],
        'salad_dressing': ['salad dressing', 'ranch', 'italian dressing', 'vinaigrette'],
        'soy_sauce': ['soy sauce', 'tamari'],
        'juice': ['juice', 'orange juice', 'apple juice'],
        'soda': ['soda', 'pop', 'cola', 'sprite', 'pepsi', 'coke'],
    }

    def __init__(self) -> None:
        self._router: Optional[LLMRouter] = None
        try:
            self._router = LLMRouter()
        except FileNotFoundError:
            logger.debug("LLM config not found — expiry LLM fallback disabled")
        except Exception as e:
            logger.warning("LLMRouter init failed (%s) — expiry LLM fallback disabled", e)

    # ── Public API ────────────────────────────────────────────────────────────

    def predict_expiration(
        self,
        category: Optional[str],
        location: str,
        purchase_date: Optional[date] = None,
        product_name: Optional[str] = None,
        tier: str = "free",
        has_byok: bool = False,
    ) -> Optional[date]:
        """
        Predict an expiration date.

        Fast path: deterministic lookup table.
        Fallback: LLM query when the table has no match, the tier allows it,
        and a backend is configured. Returns None rather than crashing if
        inference fails.
        """
        if not purchase_date:
            purchase_date = date.today()

        days = self._lookup_days(category, location)

        if days is None and product_name and self._router and can_use("expiry_llm_matching", tier, has_byok):
            days = self._llm_predict_days(product_name, category, location)

        if days is None:
            return None
        return purchase_date + timedelta(days=days)

    def get_category_from_product(
        self,
        product_name: str,
        product_category: Optional[str] = None,
        tags: Optional[List[str]] = None,
    ) -> Optional[str]:
        """Determine category from product name, existing category, and tags."""
        if product_category:
            cat = product_category.lower().strip()
            if cat in self.SHELF_LIFE:
                return cat
            for key in self.SHELF_LIFE:
                if key in cat or cat in key:
                    return key

        if tags:
            for tag in tags:
                t = tag.lower().strip()
                if t in self.SHELF_LIFE:
                    return t

        name = product_name.lower().strip()
        for category, keywords in self.CATEGORY_KEYWORDS.items():
            if any(kw in name for kw in keywords):
                return category

        for words, fallback in [
            (['meat', 'beef', 'pork', 'chicken'], 'meat'),
            (['vegetable', 'veggie', 'produce'], 'vegetables'),
            (['fruit'], 'fruits'),
            (['dairy'], 'dairy'),
            (['frozen'], 'frozen_foods'),
        ]:
            if any(w in name for w in words):
                return fallback

        return 'dry_goods'

    def get_shelf_life_info(self, category: str, location: str) -> Optional[int]:
        """Shelf life in days for a given category + location, or None."""
        return self.SHELF_LIFE.get(category.lower().strip(), {}).get(location)

    def list_categories(self) -> List[str]:
        return list(self.SHELF_LIFE.keys())

    def list_locations(self) -> List[str]:
        locations: set[str] = set()
        for shelf_life in self.SHELF_LIFE.values():
            locations.update(shelf_life.keys())
        return sorted(locations)

    # ── Private helpers ───────────────────────────────────────────────────────

    def _lookup_days(self, category: Optional[str], location: str) -> Optional[int]:
        """Pure deterministic lookup — no I/O."""
        if not category:
            return None
        cat = category.lower().strip()
        if cat not in self.SHELF_LIFE:
            for key in self.SHELF_LIFE:
                if key in cat or cat in key:
                    cat = key
                    break
            else:
                return None

        days = self.SHELF_LIFE[cat].get(location)
        if days is None:
            for loc in ('fridge', 'pantry', 'freezer', 'cabinet'):
                days = self.SHELF_LIFE[cat].get(loc)
                if days is not None:
                    break
        return days

    def _llm_predict_days(
        self,
        product_name: str,
        category: Optional[str],
        location: str,
    ) -> Optional[int]:
        """
        Ask the LLM how many days this product keeps in the given location.

        The prompt is deliberately strict and concise, since this fires on
        every unknown barcode scan: it gives enough context to reason about
        food safety, pins the output format (a single integer, nothing else),
        and asks for the conservative (shorter) estimate when uncertain.

        Parameters:
            product_name — e.g. "Trader Joe's Organic Tempeh"
            category — best guess from get_category_from_product(), may be None
            location — "fridge" | "freezer" | "pantry" | "cabinet"
        """
        assert self._router is not None

        system = (
            "You are a food safety expert. Given a food product name, an optional "
            "category hint, and a storage location, respond with ONLY a single "
            "integer: the number of days the product typically remains safe to eat "
            "from purchase when stored as specified. No explanation, no units, no "
            "punctuation — just the integer. When uncertain, give the conservative "
            "(shorter) estimate."
        )

        parts = [f"Product: {product_name}"]
        if category:
            parts.append(f"Category: {category}")
        parts.append(f"Storage location: {location}")
        parts.append("Days until expiry from purchase:")
        prompt = "\n".join(parts)

        try:
            raw = self._router.complete(prompt, system=system, max_tokens=16)
            match = re.search(r'\b(\d+)\b', raw)
            if match:
                days = int(match.group(1))
                # Sanity cap: >5 years is implausible for a perishable unknown to
                # the deterministic table. If the LLM returns something absurd,
                # fall back to None rather than storing a misleading date.
                if days > 1825:
                    logger.warning(
                        "LLM returned implausible shelf life (%d days) for %r — discarding",
                        days, product_name,
                    )
                    return None
                logger.debug(
                    "LLM shelf life for %r in %s: %d days", product_name, location, days
                )
                return days
            return None
        except Exception as e:
            logger.warning("LLM expiry prediction failed for %r: %s", product_name, e)
            return None
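The deterministic fast path above reduces to a dictionary lookup plus date arithmetic. A condensed standalone sketch, using a two-entry subset of the SHELF_LIFE table (the `predict` helper is illustrative; the real service adds fuzzy category matching and location fallbacks):

```python
from datetime import date, timedelta
from typing import Optional

# Subset of the service's SHELF_LIFE table (days by category and location)
SHELF_LIFE = {
    "milk": {"fridge": 7, "freezer": 90},
    "bread": {"pantry": 5, "freezer": 90},
}


def predict(category: str, location: str, purchase: date) -> Optional[date]:
    """Exact-match lookup: purchase date plus the table's shelf life, or None."""
    days = SHELF_LIFE.get(category, {}).get(location)
    return purchase + timedelta(days=days) if days is not None else None


predict("milk", "fridge", date(2025, 3, 1))   # → date(2025, 3, 8)
predict("bread", "fridge", date(2025, 3, 1))  # → None (no fridge entry for bread)
```

In the full predictor, the second call would not immediately return None: `_lookup_days` falls back to the category's other locations, and only then to the LLM path.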
1  app/services/export/__init__.py  Normal file
@@ -0,0 +1 @@

# app/services/export/__init__.py
325  app/services/export/spreadsheet_export.py  Normal file
@@ -0,0 +1,325 @@

# app/services/export/spreadsheet_export.py
"""
Service for exporting receipt data to CSV and Excel formats.

This module provides functionality to convert receipt and quality assessment
data into spreadsheet formats for easy viewing and analysis.
"""

import pandas as pd
from datetime import datetime
from typing import List, Dict, Optional
from pathlib import Path

from app.models.schemas.receipt import ReceiptResponse
from app.models.schemas.quality import QualityAssessment


class SpreadsheetExporter:
    """
    Service for exporting receipt data to CSV/Excel formats.

    Provides methods to convert receipt and quality assessment data into
    spreadsheet formats that can be opened in Excel, Google Sheets, or
    LibreOffice Calc.
    """

    def export_to_csv(
        self,
        receipts: List[ReceiptResponse],
        quality_data: Dict[int, QualityAssessment],
        ocr_data: Optional[Dict[int, Dict]] = None
    ) -> str:
        """
        Export receipts to CSV format.

        Args:
            receipts: List of receipt responses
            quality_data: Dict mapping receipt_id to quality assessment
            ocr_data: Optional dict mapping receipt_id to OCR extracted data

        Returns:
            CSV string ready for download
        """
        df = self._receipts_to_dataframe(receipts, quality_data, ocr_data)
        return df.to_csv(index=False)

    def export_to_excel(
        self,
        receipts: List[ReceiptResponse],
        quality_data: Dict[int, QualityAssessment],
        output_path: str,
        ocr_data: Optional[Dict[int, Dict]] = None
    ) -> None:
        """
        Export receipts to Excel format with multiple sheets.

        Creates an Excel file with sheets:
        - Receipts: Main receipt data with OCR results
        - Line Items: Detailed items from all receipts (if OCR available)
        - Quality Details: Detailed quality metrics
        - Summary: Aggregated statistics

        Args:
            receipts: List of receipt responses
            quality_data: Dict mapping receipt_id to quality assessment
            output_path: Path to save Excel file
            ocr_data: Optional dict mapping receipt_id to OCR extracted data
        """
        with pd.ExcelWriter(output_path, engine='openpyxl') as writer:
            # Sheet 1: Receipts with OCR data
            receipts_df = self._receipts_to_dataframe(receipts, quality_data, ocr_data)
            receipts_df.to_excel(writer, sheet_name='Receipts', index=False)

            # Sheet 2: Line Items (if OCR data available)
            if ocr_data:
                items_df = self._items_to_dataframe(receipts, ocr_data)
                if not items_df.empty:
                    items_df.to_excel(writer, sheet_name='Line Items', index=False)

            # Sheet 3: Quality Details
            if quality_data:
                quality_df = self._quality_to_dataframe(quality_data)
                quality_df.to_excel(writer, sheet_name='Quality Details', index=False)

            # Sheet 4: Summary
            summary_df = self._create_summary(receipts, quality_data, ocr_data)
            summary_df.to_excel(writer, sheet_name='Summary', index=False)

    def _receipts_to_dataframe(
        self,
        receipts: List[ReceiptResponse],
        quality_data: Dict[int, QualityAssessment],
        ocr_data: Optional[Dict[int, Dict]] = None
    ) -> pd.DataFrame:
        """
        Convert receipts to pandas DataFrame.

        Args:
            receipts: List of receipt responses
            quality_data: Dict mapping receipt_id to quality assessment
            ocr_data: Optional dict mapping receipt_id to OCR extracted data

        Returns:
            DataFrame with receipt data
        """
        data = []
        for receipt in receipts:
            quality = quality_data.get(receipt.id)
            ocr = ocr_data.get(receipt.id) if ocr_data else None

            # Base columns
            row = {
                'ID': receipt.id,
                'Filename': receipt.filename,
                'Status': receipt.status,
                'Quality Score': quality.overall_score if quality else None,
            }

            # Add OCR data if available
            if ocr:
merchant = ocr.get('merchant', {})
|
||||
transaction = ocr.get('transaction', {})
|
||||
totals = ocr.get('totals', {})
|
||||
items = ocr.get('items', [])
|
||||
|
||||
row.update({
|
||||
'Merchant': merchant.get('name', ''),
|
||||
'Store Address': merchant.get('address', ''),
|
||||
'Store Phone': merchant.get('phone', ''),
|
||||
'Date': transaction.get('date', ''),
|
||||
'Time': transaction.get('time', ''),
|
||||
'Receipt Number': transaction.get('receipt_number', ''),
|
||||
'Item Count': len(items),
|
||||
'Subtotal': totals.get('subtotal', ''),
|
||||
'Tax': totals.get('tax', ''),
|
||||
'Total': totals.get('total', ''),
|
||||
'Payment Method': totals.get('payment_method', ''),
|
||||
'OCR Confidence': ocr.get('confidence', {}).get('overall', ''),
|
||||
})
|
||||
|
||||
# Add items as text
|
||||
items_text = '; '.join([
|
||||
f"{item.get('name', 'Unknown')} (${item.get('total_price', 0):.2f})"
|
||||
for item in items[:10] # Limit to first 10 items for CSV
|
||||
])
|
||||
if len(items) > 10:
|
||||
items_text += f'; ... and {len(items) - 10} more items'
|
||||
row['Items'] = items_text
|
||||
else:
|
||||
# No OCR data - show image metadata instead
|
||||
row.update({
|
||||
'Merchant': 'N/A - No OCR',
|
||||
'Date': '',
|
||||
'Total': '',
|
||||
'Item Count': 0,
|
||||
'Width': receipt.metadata.get('width'),
|
||||
'Height': receipt.metadata.get('height'),
|
||||
'File Size (KB)': round(receipt.metadata.get('file_size_bytes', 0) / 1024, 2),
|
||||
})
|
||||
|
||||
data.append(row)
|
||||
|
||||
return pd.DataFrame(data)
|
||||
|
||||
def _items_to_dataframe(
|
||||
self,
|
||||
receipts: List[ReceiptResponse],
|
||||
ocr_data: Dict[str, Dict]
|
||||
) -> pd.DataFrame:
|
||||
"""
|
||||
Convert line items from all receipts to DataFrame.
|
||||
|
||||
Args:
|
||||
receipts: List of receipt responses
|
||||
ocr_data: Dict mapping receipt_id to OCR extracted data
|
||||
|
||||
Returns:
|
||||
DataFrame with all line items from all receipts
|
||||
"""
|
||||
data = []
|
||||
for receipt in receipts:
|
||||
ocr = ocr_data.get(receipt.id)
|
||||
if not ocr:
|
||||
continue
|
||||
|
||||
merchant = ocr.get('merchant', {}).get('name', 'Unknown')
|
||||
date = ocr.get('transaction', {}).get('date', '')
|
||||
items = ocr.get('items', [])
|
||||
|
||||
for item in items:
|
||||
data.append({
|
||||
'Receipt ID': receipt.id,
|
||||
'Receipt File': receipt.filename,
|
||||
'Merchant': merchant,
|
||||
'Date': date,
|
||||
'Item Name': item.get('name', 'Unknown'),
|
||||
'Quantity': item.get('quantity', 1),
|
||||
'Unit Price': item.get('unit_price', ''),
|
||||
'Total Price': item.get('total_price', 0),
|
||||
'Category': item.get('category', ''),
|
||||
'Tax Code': item.get('tax_code', ''),
|
||||
'Discount': item.get('discount', 0),
|
||||
})
|
||||
|
||||
return pd.DataFrame(data)
|
||||
|
||||
def _quality_to_dataframe(
|
||||
self,
|
||||
quality_data: Dict[str, QualityAssessment]
|
||||
) -> pd.DataFrame:
|
||||
"""
|
||||
Convert quality assessments to DataFrame.
|
||||
|
||||
Args:
|
||||
quality_data: Dict mapping receipt_id to quality assessment
|
||||
|
||||
Returns:
|
||||
DataFrame with quality metrics
|
||||
"""
|
||||
data = []
|
||||
for receipt_id, quality in quality_data.items():
|
||||
metrics = quality.metrics
|
||||
row = {
|
||||
'Receipt ID': receipt_id,
|
||||
'Overall Score': round(quality.overall_score, 2),
|
||||
'Acceptable': quality.is_acceptable,
|
||||
'Blur Score': round(metrics.get('blur_score', 0), 2),
|
||||
'Lighting Score': round(metrics.get('lighting_score', 0), 2),
|
||||
'Contrast Score': round(metrics.get('contrast_score', 0), 2),
|
||||
'Size Score': round(metrics.get('size_score', 0), 2),
|
||||
'Fold Detected': metrics.get('fold_detected', False),
|
||||
'Fold Severity': round(metrics.get('fold_severity', 0), 2),
|
||||
'Suggestions': '; '.join(quality.suggestions) if quality.suggestions else 'None',
|
||||
}
|
||||
data.append(row)
|
||||
|
||||
return pd.DataFrame(data)
|
||||
|
||||
def _create_summary(
|
||||
self,
|
||||
receipts: List[ReceiptResponse],
|
||||
quality_data: Dict[str, QualityAssessment],
|
||||
ocr_data: Optional[Dict[str, Dict]] = None
|
||||
) -> pd.DataFrame:
|
||||
"""
|
||||
Create summary statistics DataFrame.
|
||||
|
||||
Args:
|
||||
receipts: List of receipt responses
|
||||
quality_data: Dict mapping receipt_id to quality assessment
|
||||
ocr_data: Optional dict mapping receipt_id to OCR extracted data
|
||||
|
||||
Returns:
|
||||
DataFrame with summary statistics
|
||||
"""
|
||||
quality_scores = [q.overall_score for q in quality_data.values() if q]
|
||||
|
||||
# Count statuses
|
||||
status_counts = {}
|
||||
for receipt in receipts:
|
||||
status_counts[receipt.status] = status_counts.get(receipt.status, 0) + 1
|
||||
|
||||
metrics = [
|
||||
'Total Receipts',
|
||||
'Processed',
|
||||
'Processing',
|
||||
'Uploaded',
|
||||
'Failed',
|
||||
'Average Quality Score',
|
||||
'Best Quality Score',
|
||||
'Worst Quality Score',
|
||||
'Acceptable Quality Count',
|
||||
'Unacceptable Quality Count',
|
||||
]
|
||||
|
||||
values = [
|
||||
len(receipts),
|
||||
status_counts.get('processed', 0),
|
||||
status_counts.get('processing', 0),
|
||||
status_counts.get('uploaded', 0),
|
||||
status_counts.get('error', 0),
|
||||
f"{sum(quality_scores) / len(quality_scores):.2f}" if quality_scores else 'N/A',
|
||||
f"{max(quality_scores):.2f}" if quality_scores else 'N/A',
|
||||
f"{min(quality_scores):.2f}" if quality_scores else 'N/A',
|
||||
len([q for q in quality_data.values() if q and q.is_acceptable]),
|
||||
len([q for q in quality_data.values() if q and not q.is_acceptable]),
|
||||
]
|
||||
|
||||
# Add OCR statistics if available
|
||||
if ocr_data:
|
||||
receipts_with_ocr = len([r for r in receipts if r.id in ocr_data])
|
||||
total_items = sum(len(ocr.get('items', [])) for ocr in ocr_data.values())
|
||||
total_spent = sum(
|
||||
ocr.get('totals', {}).get('total', 0) or 0
|
||||
for ocr in ocr_data.values()
|
||||
)
|
||||
avg_confidence = sum(
|
||||
ocr.get('confidence', {}).get('overall', 0) or 0
|
||||
for ocr in ocr_data.values()
|
||||
) / len(ocr_data) if ocr_data else 0
|
||||
|
||||
metrics.extend([
|
||||
'', # Blank row
|
||||
'OCR Statistics',
|
||||
'Receipts with OCR Data',
|
||||
'Total Line Items Extracted',
|
||||
'Total Amount Spent',
|
||||
'Average OCR Confidence',
|
||||
])
|
||||
|
||||
values.extend([
|
||||
'',
|
||||
'',
|
||||
receipts_with_ocr,
|
||||
total_items,
|
||||
f"${total_spent:.2f}" if total_spent > 0 else 'N/A',
|
||||
f"{avg_confidence:.2%}" if avg_confidence > 0 else 'N/A',
|
||||
])
|
||||
|
||||
summary = {
|
||||
'Metric': metrics,
|
||||
'Value': values
|
||||
}
|
||||
|
||||
return pd.DataFrame(summary)
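The Metric/Value summary layout produced by `_create_summary` can be exercised standalone with plain pandas; the receipt ids and quality scores below are invented for illustration:

```python
import pandas as pd

# Hypothetical quality scores keyed by receipt id (illustrative values only)
quality_scores = {"r1": 87.5, "r2": 42.0, "r3": 91.25}
scores = list(quality_scores.values())

# Same two-column "Metric"/"Value" shape as the Summary sheet
summary = pd.DataFrame({
    "Metric": [
        "Total Receipts",
        "Average Quality Score",
        "Best Quality Score",
        "Worst Quality Score",
    ],
    "Value": [
        len(scores),
        f"{sum(scores) / len(scores):.2f}",
        f"{max(scores):.2f}",
        f"{min(scores):.2f}",
    ],
})

csv_text = summary.to_csv(index=False)
print(csv_text)
```

The same DataFrame can be written to an Excel sheet via `to_excel(writer, sheet_name='Summary', index=False)` exactly as the service does; CSV is shown here only because it needs no `openpyxl` engine.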
10
app/services/image_preprocessing/__init__.py
Normal file
@ -0,0 +1,10 @@
# app/services/image_preprocessing/__init__.py
"""
Image preprocessing services for Kiwi.
Contains functions for image enhancement, format conversion, and perspective correction.
"""

from app.services.image_preprocessing.format_conversion import convert_to_standard_format, extract_metadata
from app.services.image_preprocessing.enhancement import enhance_image, correct_perspective

__all__ = ["convert_to_standard_format", "extract_metadata", "enhance_image", "correct_perspective"]
172
app/services/image_preprocessing/enhancement.py
Normal file
@ -0,0 +1,172 @@
#!/usr/bin/env python
# app/services/image_preprocessing/enhancement.py
import cv2
import numpy as np
import logging
from pathlib import Path
from typing import Tuple, Optional

logger = logging.getLogger(__name__)

def enhance_image(
    image_path: Path,
    output_path: Optional[Path] = None,
    adaptive_threshold: bool = True,
    denoise: bool = True,
) -> Tuple[bool, str, Optional[Path]]:
    """
    Enhance receipt image for better OCR.

    Args:
        image_path: Path to input image
        output_path: Optional path to save enhanced image
        adaptive_threshold: Whether to apply adaptive thresholding
        denoise: Whether to apply denoising

    Returns:
        Tuple containing (success, message, output_path)
    """
    try:
        # Check if CUDA is available
        use_cuda = cv2.cuda.getCudaEnabledDeviceCount() > 0

        # Set output path if not provided
        if output_path is None:
            output_path = image_path.with_stem(f"{image_path.stem}_enhanced")

        # Read image
        img = cv2.imread(str(image_path))
        if img is None:
            return False, f"Failed to read image: {image_path}", None

        # Convert to grayscale
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

        # Apply denoising if requested
        if denoise:
            if use_cuda:
                # GPU-accelerated denoising; requires OpenCV built with the
                # CUDA photo module, so fall back to CPU if it is unavailable
                try:
                    gpu_img = cv2.cuda_GpuMat()
                    gpu_img.upload(gray)
                    gpu_result = cv2.cuda.fastNlMeansDenoising(gpu_img, 10.0)  # filter strength h=10
                    denoised = gpu_result.download()
                except (cv2.error, AttributeError):
                    denoised = cv2.fastNlMeansDenoising(gray, None, 10, 7, 21)
            else:
                # CPU denoising
                denoised = cv2.fastNlMeansDenoising(gray, None, 10, 7, 21)
        else:
            denoised = gray

        # Apply adaptive thresholding if requested
        if adaptive_threshold:
            # Adaptive thresholding works well for receipts with varying backgrounds
            binary = cv2.adaptiveThreshold(
                denoised,
                255,
                cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                cv2.THRESH_BINARY,
                11,
                2
            )
            processed = binary
        else:
            processed = denoised

        # Write enhanced image
        success = cv2.imwrite(str(output_path), processed)
        if not success:
            return False, f"Failed to write enhanced image to {output_path}", None

        return True, "Image enhanced successfully", output_path

    except Exception as e:
        logger.exception(f"Error enhancing image: {e}")
        return False, f"Error enhancing image: {str(e)}", None


def correct_perspective(
    image_path: Path,
    output_path: Optional[Path] = None,
) -> Tuple[bool, str, Optional[Path]]:
    """
    Correct perspective distortion in receipt image.

    Args:
        image_path: Path to input image
        output_path: Optional path to save corrected image

    Returns:
        Tuple containing (success, message, output_path)
    """
    try:
        # Set output path if not provided
        if output_path is None:
            output_path = image_path.with_stem(f"{image_path.stem}_perspective")

        # Read image
        img = cv2.imread(str(image_path))
        if img is None:
            return False, f"Failed to read image: {image_path}", None

        # Convert to grayscale
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

        # Apply Gaussian blur to reduce noise
        blur = cv2.GaussianBlur(gray, (5, 5), 0)

        # Apply edge detection
        edges = cv2.Canny(blur, 50, 150, apertureSize=3)

        # Find contours
        contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)

        # The largest contour by area is most likely the receipt
        if not contours:
            return False, "No contours found in image", None

        largest_contour = max(contours, key=cv2.contourArea)

        # Approximate the contour to get the corners
        epsilon = 0.02 * cv2.arcLength(largest_contour, True)
        approx = cv2.approxPolyDP(largest_contour, epsilon, True)

        # If we have a quadrilateral, we can apply a perspective transform
        if len(approx) == 4:
            # Sort the points for the perspective transform
            # (simplified: assumes approxPolyDP already returns a usable order)
            src_pts = approx.reshape(4, 2).astype(np.float32)

            # Get width and height for the destination image
            width = int(max(
                np.linalg.norm(src_pts[0] - src_pts[1]),
                np.linalg.norm(src_pts[2] - src_pts[3])
            ))
            height = int(max(
                np.linalg.norm(src_pts[0] - src_pts[3]),
                np.linalg.norm(src_pts[1] - src_pts[2])
            ))

            # Define destination points
            dst_pts = np.array([
                [0, 0],
                [width - 1, 0],
                [width - 1, height - 1],
                [0, height - 1]
            ], dtype=np.float32)

            # Get perspective transform matrix
            M = cv2.getPerspectiveTransform(src_pts, dst_pts)

            # Apply perspective transform
            warped = cv2.warpPerspective(img, M, (width, height))

            # Write corrected image
            success = cv2.imwrite(str(output_path), warped)
            if not success:
                return False, f"Failed to write perspective-corrected image to {output_path}", None

            return True, "Perspective corrected successfully", output_path
        else:
            return False, "Receipt corners not clearly detected", None

    except Exception as e:
        logger.exception(f"Error correcting perspective: {e}")
        return False, f"Error correcting perspective: {str(e)}", None
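The destination-size computation inside `correct_perspective` (the longest of each pair of opposite edges becomes the warped width/height) can be checked in isolation with NumPy; the corner coordinates below are invented:

```python
import numpy as np

# Four corners of a hypothetical detected receipt, ordered
# top-left, top-right, bottom-right, bottom-left
src_pts = np.array([[10, 20], [310, 20], [310, 420], [10, 420]], dtype=np.float32)

# Same formulas as correct_perspective: width from the top/bottom edges,
# height from the left/right edges
width = int(max(
    np.linalg.norm(src_pts[0] - src_pts[1]),
    np.linalg.norm(src_pts[2] - src_pts[3]),
))
height = int(max(
    np.linalg.norm(src_pts[0] - src_pts[3]),
    np.linalg.norm(src_pts[1] - src_pts[2]),
))
print(width, height)  # 300 400
```

Taking the max of the opposite edge lengths keeps the warp from shrinking a tilted receipt; note the real function depends on `approxPolyDP` returning the corners in a consistent order, which the code itself flags as simplified.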
89
app/services/image_preprocessing/format_conversion.py
Normal file
@ -0,0 +1,89 @@
#!/usr/bin/env python
# app/services/image_preprocessing/format_conversion.py
import cv2
import logging
from pathlib import Path
from typing import Tuple, Optional

logger = logging.getLogger(__name__)


def convert_to_standard_format(
    image_path: Path,
    output_path: Optional[Path] = None,
    target_format: str = "png"
) -> Tuple[bool, str, Optional[Path]]:
    """
    Convert image to standard internal format.

    Args:
        image_path: Path to input image
        output_path: Optional path to save converted image
        target_format: Target format (png, jpg)

    Returns:
        Tuple containing (success, message, output_path)
    """
    try:
        # If PDF, bail out before attempting to read it as an image
        # (cv2.imread cannot decode PDFs, so this check must come first)
        if image_path.suffix.lower() == '.pdf':
            # Placeholder: a real implementation would use a PDF processing library
            return False, "PDF processing not implemented in Phase 1", None

        # Log whether CUDA is available (the conversion itself runs on CPU)
        if cv2.cuda.getCudaEnabledDeviceCount() > 0:
            logger.info("CUDA available, using GPU acceleration")
        else:
            logger.info("CUDA not available, using CPU processing")

        # Read image
        img = cv2.imread(str(image_path))
        if img is None:
            return False, f"Failed to read image: {image_path}", None

        # Set output path if not provided
        if output_path is None:
            output_path = image_path.with_suffix(f".{target_format}")

        # Write converted image
        success = cv2.imwrite(str(output_path), img)
        if not success:
            return False, f"Failed to write converted image to {output_path}", None

        return True, "Image converted successfully", output_path

    except Exception as e:
        logger.exception(f"Error converting image: {e}")
        return False, f"Error converting image: {str(e)}", None


def extract_metadata(image_path: Path) -> dict:
    """
    Extract metadata from image file.

    Args:
        image_path: Path to input image

    Returns:
        Dictionary containing metadata
    """
    metadata = {
        "filename": image_path.name,
        "original_format": image_path.suffix.lstrip(".").lower(),
        "file_size_bytes": image_path.stat().st_size,
    }

    try:
        img = cv2.imread(str(image_path))
        if img is not None:
            metadata.update({
                "width": img.shape[1],
                "height": img.shape[0],
                "channels": img.shape[2] if len(img.shape) > 2 else 1,
            })
    except Exception as e:
        logger.exception(f"Error extracting image metadata: {e}")

    return metadata
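`extract_metadata` always returns the filename/format/size fields and only adds pixel dimensions when the image decodes; that stdlib-only part can be sketched and verified without OpenCV (the file contents below are a stand-in, not a real image):

```python
import tempfile
from pathlib import Path

def basic_metadata(image_path: Path) -> dict:
    # Mirrors the always-available fields from extract_metadata;
    # width/height/channels would additionally require an image decoder
    return {
        "filename": image_path.name,
        "original_format": image_path.suffix.lstrip(".").lower(),
        "file_size_bytes": image_path.stat().st_size,
    }

with tempfile.TemporaryDirectory() as tmp:
    p = Path(tmp) / "receipt.PNG"
    p.write_bytes(b"not really an image")
    meta = basic_metadata(p)
    print(meta)
```

Note the suffix is lower-cased, so `.PNG` and `.png` report the same `original_format`, which keeps downstream format checks case-insensitive.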
539
app/services/inventory_service.py
Normal file
@ -0,0 +1,539 @@
"""
Inventory management service.

This service orchestrates:
- Barcode scanning
- Product lookups (OpenFoodFacts)
- Inventory CRUD operations
- Tag management
"""

from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, func, and_, or_
from sqlalchemy.orm import selectinload
from typing import List, Optional, Dict, Any
from datetime import date, datetime, timedelta
from pathlib import Path
from uuid import UUID
import uuid
import logging

from app.db.models import Product, InventoryItem, Tag, product_tags
from app.models.schemas.inventory import (
    ProductCreate,
    ProductUpdate,
    ProductResponse,
    InventoryItemCreate,
    InventoryItemUpdate,
    InventoryItemResponse,
    TagCreate,
    TagResponse,
    InventoryStats,
)
from app.services.barcode_scanner import BarcodeScanner
from app.services.openfoodfacts import OpenFoodFactsService
from app.services.expiration_predictor import ExpirationPredictor

logger = logging.getLogger(__name__)


class InventoryService:
    """Service for managing inventory and products."""

    def __init__(self):
        self.barcode_scanner = BarcodeScanner()
        self.openfoodfacts = OpenFoodFactsService()
        self.expiration_predictor = ExpirationPredictor()

    # ========== Barcode Scanning ==========

    async def scan_barcode_image(
        self,
        image_path: Path,
        db: AsyncSession,
        auto_add: bool = True,
        location: str = "pantry",
        quantity: float = 1.0,
    ) -> Dict[str, Any]:
        """
        Scan an image for barcodes and optionally add to inventory.

        Args:
            image_path: Path to image file
            db: Database session
            auto_add: Whether to auto-add to inventory
            location: Default storage location
            quantity: Default quantity

        Returns:
            Dictionary with scan results
        """
        # Scan for barcodes
        barcodes = self.barcode_scanner.scan_image(image_path)

        if not barcodes:
            return {
                "success": False,
                "barcodes_found": 0,
                "results": [],
                "message": "No barcodes detected in image",
            }

        results = []
        for barcode_data in barcodes:
            result = await self._process_barcode(
                barcode_data, db, auto_add, location, quantity
            )
            results.append(result)

        return {
            "success": True,
            "barcodes_found": len(barcodes),
            "results": results,
            "message": f"Found {len(barcodes)} barcode(s)",
        }

    async def _process_barcode(
        self,
        barcode_data: Dict[str, Any],
        db: AsyncSession,
        auto_add: bool,
        location: str,
        quantity: float,
    ) -> Dict[str, Any]:
        """Process a single barcode detection."""
        barcode = barcode_data["data"]
        barcode_type = barcode_data["type"]

        # Check if product already exists
        product = await self.get_product_by_barcode(db, barcode)

        # If not found, look it up in OpenFoodFacts
        if not product:
            off_data = await self.openfoodfacts.lookup_product(barcode)

            if off_data:
                # Create product from OpenFoodFacts data
                product_create = ProductCreate(
                    barcode=barcode,
                    name=off_data["name"],
                    brand=off_data.get("brand"),
                    category=off_data.get("category"),
                    description=off_data.get("description"),
                    image_url=off_data.get("image_url"),
                    nutrition_data=off_data.get("nutrition_data", {}),
                    source="openfoodfacts",
                    source_data=off_data.get("raw_data", {}),
                )
                product = await self.create_product(db, product_create)
                source = "openfoodfacts"
            else:
                # Product not found in OpenFoodFacts; create a placeholder product
                product_create = ProductCreate(
                    barcode=barcode,
                    name=f"Unknown Product ({barcode})",
                    source="manual",
                )
                product = await self.create_product(db, product_create)
                source = "manual"
        else:
            source = product.source

        # Auto-add to inventory if requested
        inventory_item = None
        predicted_expiration = None
        category = None  # only set when auto_add is enabled
        if auto_add:
            # Predict expiration date based on product category and location
            category = self.expiration_predictor.get_category_from_product(
                product.name,
                product.category,
                [tag.name for tag in product.tags] if product.tags else None
            )
            if category:
                predicted_expiration = self.expiration_predictor.predict_expiration(
                    category,
                    location,
                    date.today()
                )

            item_create = InventoryItemCreate(
                product_id=product.id,
                quantity=quantity,
                location=location,
                purchase_date=date.today(),
                expiration_date=predicted_expiration,
                source="barcode_scan",
            )
            inventory_item = await self.create_inventory_item(db, item_create)

        return {
            "barcode": barcode,
            "barcode_type": barcode_type,
            "quality": barcode_data["quality"],
            "product": ProductResponse.from_orm(product),
            "inventory_item": (
                InventoryItemResponse.from_orm(inventory_item) if inventory_item else None
            ),
            "source": source,
            "predicted_expiration": predicted_expiration.isoformat() if predicted_expiration else None,
            "predicted_category": category,
        }

    # ========== Product Management ==========

    async def create_product(
        self,
        db: AsyncSession,
        product: ProductCreate,
    ) -> Product:
        """Create a new product."""
        # Create product
        db_product = Product(
            id=uuid.uuid4(),
            barcode=product.barcode,
            name=product.name,
            brand=product.brand,
            category=product.category,
            description=product.description,
            image_url=product.image_url,
            nutrition_data=product.nutrition_data,
            source=product.source,
            source_data=product.source_data,
        )

        db.add(db_product)
        await db.flush()

        # Add tags if specified
        if product.tag_ids:
            for tag_id in product.tag_ids:
                tag = await db.get(Tag, tag_id)
                if tag:
                    db_product.tags.append(tag)

        await db.commit()
        await db.refresh(db_product, ["tags"])

        return db_product

    async def get_product(self, db: AsyncSession, product_id: UUID) -> Optional[Product]:
        """Get a product by ID."""
        result = await db.execute(
            select(Product).where(Product.id == product_id).options(selectinload(Product.tags))
        )
        return result.scalar_one_or_none()

    async def get_product_by_barcode(
        self, db: AsyncSession, barcode: str
    ) -> Optional[Product]:
        """Get a product by barcode."""
        result = await db.execute(
            select(Product).where(Product.barcode == barcode).options(selectinload(Product.tags))
        )
        return result.scalar_one_or_none()

    async def list_products(
        self,
        db: AsyncSession,
        skip: int = 0,
        limit: int = 100,
        category: Optional[str] = None,
    ) -> List[Product]:
        """List products with optional filtering."""
        query = select(Product).options(selectinload(Product.tags))

        if category:
            query = query.where(Product.category == category)

        query = query.offset(skip).limit(limit).order_by(Product.name)

        result = await db.execute(query)
        return list(result.scalars().all())

    async def update_product(
        self,
        db: AsyncSession,
        product_id: UUID,
        product_update: ProductUpdate,
    ) -> Optional[Product]:
        """Update a product."""
        product = await self.get_product(db, product_id)
        if not product:
            return None

        # Update fields
        for field, value in product_update.dict(exclude_unset=True).items():
            if field == "tag_ids":
                # Update tags
                product.tags = []
                for tag_id in value:
                    tag = await db.get(Tag, tag_id)
                    if tag:
                        product.tags.append(tag)
            else:
                setattr(product, field, value)

        product.updated_at = datetime.utcnow()
        await db.commit()
        await db.refresh(product, ["tags"])

        return product

    async def delete_product(self, db: AsyncSession, product_id: UUID) -> bool:
        """Delete a product (will fail if inventory items exist)."""
        product = await self.get_product(db, product_id)
        if not product:
            return False

        await db.delete(product)
        await db.commit()
        return True

    # ========== Inventory Item Management ==========

    async def create_inventory_item(
        self,
        db: AsyncSession,
        item: InventoryItemCreate,
    ) -> InventoryItem:
        """Create a new inventory item."""
        db_item = InventoryItem(
            id=uuid.uuid4(),
            product_id=item.product_id,
            quantity=item.quantity,
            unit=item.unit,
            location=item.location,
            sublocation=item.sublocation,
            purchase_date=item.purchase_date,
            expiration_date=item.expiration_date,
            notes=item.notes,
            source=item.source,
            status="available",
        )

        db.add(db_item)
        await db.commit()
        await db.refresh(db_item, ["product"])

        return db_item

    async def get_inventory_item(
        self, db: AsyncSession, item_id: UUID
    ) -> Optional[InventoryItem]:
        """Get an inventory item by ID."""
        result = await db.execute(
            select(InventoryItem)
            .where(InventoryItem.id == item_id)
            .options(selectinload(InventoryItem.product).selectinload(Product.tags))
        )
        return result.scalar_one_or_none()

    async def list_inventory_items(
        self,
        db: AsyncSession,
        skip: int = 0,
        limit: int = 100,
        location: Optional[str] = None,
        status: str = "available",
    ) -> List[InventoryItem]:
        """List inventory items with filtering."""
        query = select(InventoryItem).options(
            selectinload(InventoryItem.product).selectinload(Product.tags)
        )

        query = query.where(InventoryItem.status == status)

        if location:
            query = query.where(InventoryItem.location == location)

        query = (
            query.offset(skip)
            .limit(limit)
            .order_by(InventoryItem.expiration_date.asc().nullsfirst())
        )

        result = await db.execute(query)
        return list(result.scalars().all())

    async def update_inventory_item(
        self,
        db: AsyncSession,
        item_id: UUID,
        item_update: InventoryItemUpdate,
    ) -> Optional[InventoryItem]:
        """Update an inventory item."""
        item = await self.get_inventory_item(db, item_id)
        if not item:
            return None

        for field, value in item_update.dict(exclude_unset=True).items():
            setattr(item, field, value)

        item.updated_at = datetime.utcnow()

        if item_update.status == "consumed" and not item.consumed_at:
            item.consumed_at = datetime.utcnow()

        await db.commit()
        await db.refresh(item, ["product"])

        return item

    async def delete_inventory_item(self, db: AsyncSession, item_id: UUID) -> bool:
        """Delete an inventory item."""
        item = await self.get_inventory_item(db, item_id)
        if not item:
            return False

        await db.delete(item)
        await db.commit()
        return True

    async def mark_as_consumed(
        self, db: AsyncSession, item_id: UUID
    ) -> Optional[InventoryItem]:
        """Mark an inventory item as consumed."""
        return await self.update_inventory_item(
            db, item_id, InventoryItemUpdate(status="consumed")
        )

    # ========== Tag Management ==========

    async def create_tag(self, db: AsyncSession, tag: TagCreate) -> Tag:
        """Create a new tag."""
        db_tag = Tag(
            id=uuid.uuid4(),
            name=tag.name,
            slug=tag.slug,
            description=tag.description,
            color=tag.color,
            category=tag.category,
        )

        db.add(db_tag)
        await db.commit()
        await db.refresh(db_tag)

        return db_tag

    async def get_tag(self, db: AsyncSession, tag_id: UUID) -> Optional[Tag]:
        """Get a tag by ID."""
        return await db.get(Tag, tag_id)

    async def list_tags(
        self, db: AsyncSession, category: Optional[str] = None
    ) -> List[Tag]:
        """List all tags, optionally filtered by category."""
        query = select(Tag).order_by(Tag.name)

        if category:
            query = query.where(Tag.category == category)

        result = await db.execute(query)
        return list(result.scalars().all())

    # ========== Statistics and Analytics ==========

    async def get_inventory_stats(self, db: AsyncSession) -> InventoryStats:
        """Get inventory statistics."""
        # Total items (available only)
        total_result = await db.execute(
            select(func.count(InventoryItem.id)).where(InventoryItem.status == "available")
        )
        total_items = total_result.scalar() or 0

        # Total unique products
        products_result = await db.execute(
            select(func.count(func.distinct(InventoryItem.product_id))).where(
                InventoryItem.status == "available"
            )
        )
        total_products = products_result.scalar() or 0

        # Items by location
        location_result = await db.execute(
            select(
                InventoryItem.location,
                func.count(InventoryItem.id).label("count"),
            )
            .where(InventoryItem.status == "available")
            .group_by(InventoryItem.location)
        )
        items_by_location = {row[0]: row[1] for row in location_result}

        # Items by status
        status_result = await db.execute(
            select(InventoryItem.status, func.count(InventoryItem.id).label("count")).group_by(
                InventoryItem.status
            )
        )
        items_by_status = {row[0]: row[1] for row in status_result}

        # Expiring soon (next 7 days)
        today = date.today()
        week_from_now = today + timedelta(days=7)

        expiring_result = await db.execute(
            select(func.count(InventoryItem.id)).where(
                and_(
                    InventoryItem.status == "available",
                    InventoryItem.expiration_date.isnot(None),
                    InventoryItem.expiration_date <= week_from_now,
                    InventoryItem.expiration_date >= today,
                )
            )
        )
        expiring_soon = expiring_result.scalar() or 0

        # Expired
        expired_result = await db.execute(
            select(func.count(InventoryItem.id)).where(
                and_(
                    InventoryItem.status == "available",
                    InventoryItem.expiration_date.isnot(None),
                    InventoryItem.expiration_date < today,
                )
            )
        )
        expired = expired_result.scalar() or 0
|
||||
|
||||
return InventoryStats(
|
||||
total_items=total_items,
|
||||
total_products=total_products,
|
||||
items_by_location=items_by_location,
|
||||
items_by_status=items_by_status,
|
||||
expiring_soon=expiring_soon,
|
||||
expired=expired,
|
||||
)
|
||||
|
||||
async def get_expiring_items(
|
||||
self, db: AsyncSession, days: int = 7
|
||||
) -> List[Dict[str, Any]]:
|
||||
"""Get items expiring within N days."""
|
||||
today = date.today()
|
||||
cutoff_date = today + timedelta(days=days)
|
||||
|
||||
result = await db.execute(
|
||||
select(InventoryItem)
|
||||
.where(
|
||||
and_(
|
||||
InventoryItem.status == "available",
|
||||
InventoryItem.expiration_date.isnot(None),
|
||||
InventoryItem.expiration_date <= cutoff_date,
|
||||
InventoryItem.expiration_date >= today,
|
||||
)
|
||||
)
|
||||
.options(selectinload(InventoryItem.product).selectinload(Product.tags))
|
||||
.order_by(InventoryItem.expiration_date.asc())
|
||||
)
|
||||
|
||||
items = result.scalars().all()
|
||||
|
||||
return [
|
||||
{
|
||||
"inventory_item": item,
|
||||
"days_until_expiry": (item.expiration_date - today).days,
|
||||
}
|
||||
for item in items
|
||||
]
|
||||
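The expiry-window queries above reduce to plain date arithmetic: an item counts as "expiring soon" when its expiration date falls between today and `today + days`. A minimal standalone sketch (function names are illustrative, not part of the service):

```python
from datetime import date, timedelta

def days_until_expiry(expiration_date: date, today: date) -> int:
    # Mirrors the (item.expiration_date - today).days computation in the service
    return (expiration_date - today).days

def in_expiry_window(expiration_date: date, today: date, days: int = 7) -> bool:
    # Matches the SQL filter: today <= expiration_date <= today + days
    return today <= expiration_date <= today + timedelta(days=days)

today = date(2025, 10, 30)
print(days_until_expiry(date(2025, 11, 2), today))   # → 3
print(in_expiry_window(date(2025, 11, 2), today))    # → True
print(in_expiry_window(date(2025, 11, 30), today))   # → False
```

Items already past their date fall outside the window and land in the separate `expired` count instead.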
5
app/services/ocr/__init__.py
Normal file
@ -0,0 +1,5 @@
"""OCR services for receipt text extraction."""

from .vl_model import VisionLanguageOCR

__all__ = ["VisionLanguageOCR"]
371
app/services/ocr/vl_model.py
Normal file
@ -0,0 +1,371 @@
#!/usr/bin/env python
"""
Vision-Language Model service for receipt OCR and structured data extraction.

Uses Qwen2-VL-2B-Instruct for intelligent receipt processing that combines
OCR with understanding of receipt structure to extract structured JSON data.
"""

import json
import logging
import re
from pathlib import Path
from typing import Dict, Any, Optional, List
from datetime import datetime

from PIL import Image
import torch
from transformers import (
    Qwen2VLForConditionalGeneration,
    AutoProcessor,
    BitsAndBytesConfig
)

from app.core.config import settings

logger = logging.getLogger(__name__)


class VisionLanguageOCR:
    """Vision-Language Model for receipt OCR and structured extraction."""

    def __init__(self, use_quantization: bool = False):
        """
        Initialize the VLM OCR service.

        Args:
            use_quantization: Use 8-bit quantization to reduce memory usage
        """
        self.model = None
        self.processor = None
        self.device = "cuda" if torch.cuda.is_available() and settings.USE_GPU else "cpu"
        self.use_quantization = use_quantization
        self.model_name = "Qwen/Qwen2-VL-2B-Instruct"

        logger.info(f"Initializing VisionLanguageOCR with device: {self.device}")

        # Lazy loading - model will be loaded on first use
        self._model_loaded = False

    def _load_model(self):
        """Load the VLM model (lazy loading)."""
        if self._model_loaded:
            return

        logger.info(f"Loading VLM model: {self.model_name}")

        try:
            if self.use_quantization and self.device == "cuda":
                # Use 8-bit quantization for lower memory usage
                quantization_config = BitsAndBytesConfig(
                    load_in_8bit=True,
                    llm_int8_threshold=6.0
                )

                self.model = Qwen2VLForConditionalGeneration.from_pretrained(
                    self.model_name,
                    quantization_config=quantization_config,
                    device_map="auto",
                    low_cpu_mem_usage=True
                )
                logger.info("Model loaded with 8-bit quantization")
            else:
                # Standard FP16 loading
                self.model = Qwen2VLForConditionalGeneration.from_pretrained(
                    self.model_name,
                    torch_dtype=torch.float16 if self.device == "cuda" else torch.float32,
                    device_map="auto" if self.device == "cuda" else None,
                    low_cpu_mem_usage=True
                )

                if self.device == "cpu":
                    self.model = self.model.to("cpu")

                logger.info(f"Model loaded in {'FP16' if self.device == 'cuda' else 'FP32'} mode")

            self.processor = AutoProcessor.from_pretrained(self.model_name)
            self.model.eval()  # Set to evaluation mode

            self._model_loaded = True
            logger.info("VLM model loaded successfully")

        except Exception as e:
            logger.error(f"Failed to load VLM model: {e}")
            raise RuntimeError(f"Could not load VLM model: {e}")

    def extract_receipt_data(self, image_path: str) -> Dict[str, Any]:
        """
        Extract structured data from receipt image.

        Args:
            image_path: Path to the receipt image

        Returns:
            Dictionary containing extracted receipt data with structure:
            {
                "merchant": {...},
                "transaction": {...},
                "items": [...],
                "totals": {...},
                "confidence": {...},
                "raw_text": "...",
                "warnings": [...]
            }
        """
        self._load_model()

        try:
            # Load image
            image = Image.open(image_path)

            # Convert to RGB if needed
            if image.mode != 'RGB':
                image = image.convert('RGB')

            # Build extraction prompt
            prompt = self._build_extraction_prompt()

            # Process image and text
            logger.info(f"Processing receipt image: {image_path}")
            inputs = self.processor(
                images=image,
                text=prompt,
                return_tensors="pt"
            )

            # Move to device
            if self.device == "cuda":
                inputs = {k: v.to("cuda", torch.float16) if isinstance(v, torch.Tensor) else v
                          for k, v in inputs.items()}

            # Generate
            with torch.no_grad():
                output_ids = self.model.generate(
                    **inputs,
                    max_new_tokens=2048,
                    do_sample=False,  # Deterministic for consistency
                    temperature=0.0,
                    pad_token_id=self.processor.tokenizer.pad_token_id,
                    eos_token_id=self.processor.tokenizer.eos_token_id,
                )

            # Decode output
            output_text = self.processor.decode(
                output_ids[0],
                skip_special_tokens=True
            )

            # Remove the prompt from output
            output_text = output_text.replace(prompt, "").strip()

            logger.info(f"VLM output length: {len(output_text)} characters")

            # Parse JSON from output
            result = self._parse_json_from_text(output_text)

            # Add raw text for reference
            result["raw_text"] = output_text

            # Validate and enhance result
            result = self._validate_result(result)

            return result

        except Exception as e:
            logger.error(f"Error extracting receipt data: {e}", exc_info=True)
            return {
                "error": str(e),
                "merchant": {},
                "transaction": {},
                "items": [],
                "totals": {},
                "confidence": {"overall": 0.0},
                "warnings": [f"Extraction failed: {str(e)}"]
            }

    def _build_extraction_prompt(self) -> str:
        """Build the prompt for receipt data extraction."""
        return """You are a receipt OCR specialist. Extract all information from this receipt image and return it in the exact JSON format specified below.

Return a JSON object with this exact structure:
{
  "merchant": {
    "name": "Store Name",
    "address": "123 Main St, City, State ZIP",
    "phone": "555-1234"
  },
  "transaction": {
    "date": "2025-10-30",
    "time": "14:30:00",
    "receipt_number": "12345",
    "register": "01",
    "cashier": "Jane"
  },
  "items": [
    {
      "name": "Product name",
      "quantity": 2,
      "unit_price": 10.99,
      "total_price": 21.98,
      "category": "grocery",
      "tax_code": "F",
      "discount": 0.00
    }
  ],
  "totals": {
    "subtotal": 21.98,
    "tax": 1.98,
    "discount": 0.00,
    "total": 23.96,
    "payment_method": "Credit Card",
    "amount_paid": 23.96,
    "change": 0.00
  },
  "confidence": {
    "overall": 0.95,
    "merchant": 0.98,
    "items": 0.92,
    "totals": 0.97
  }
}

Important instructions:
1. Extract ALL items from the receipt, no matter how many there are
2. Use null for fields you cannot find
3. For dates, use YYYY-MM-DD format
4. For times, use HH:MM:SS format
5. For prices, use numeric values (not strings)
6. Estimate confidence scores (0.0-1.0) based on image quality and text clarity
7. Return ONLY the JSON object, no other text or explanation"""

    def _parse_json_from_text(self, text: str) -> Dict[str, Any]:
        """
        Extract and parse JSON from model output text.

        Args:
            text: Raw text output from the model

        Returns:
            Parsed JSON dictionary
        """
        # Try to find JSON object in text
        # Look for content between first { and last }
        json_match = re.search(r'\{.*\}', text, re.DOTALL)

        if json_match:
            json_str = json_match.group(0)
            try:
                return json.loads(json_str)
            except json.JSONDecodeError as e:
                logger.warning(f"Failed to parse JSON: {e}")
                # Try to fix common issues
                json_str = self._fix_json(json_str)
                try:
                    return json.loads(json_str)
                except json.JSONDecodeError:
                    logger.error("Could not parse JSON even after fixes")

        # Return empty structure if parsing fails
        logger.warning("No valid JSON found in output, returning empty structure")
        return {
            "merchant": {},
            "transaction": {},
            "items": [],
            "totals": {},
            "confidence": {"overall": 0.1}
        }

    def _fix_json(self, json_str: str) -> str:
        """Attempt to fix common JSON formatting issues."""
        # Remove trailing commas
        json_str = re.sub(r',\s*}', '}', json_str)
        json_str = re.sub(r',\s*]', ']', json_str)

        # Fix single quotes to double quotes
        json_str = json_str.replace("'", '"')

        return json_str

    def _validate_result(self, result: Dict[str, Any]) -> Dict[str, Any]:
        """
        Validate and enhance extracted data.

        Args:
            result: Extracted receipt data

        Returns:
            Validated and enhanced result with warnings
        """
        warnings = []

        # Ensure required fields exist
        required_fields = ["merchant", "transaction", "items", "totals", "confidence"]
        for field in required_fields:
            if field not in result:
                result[field] = {} if field != "items" else []
                warnings.append(f"Missing required field: {field}")

        # Validate items
        if not result.get("items"):
            warnings.append("No items found on receipt")
        else:
            # Fill in missing line totals where unit price and quantity are known
            for item in result["items"]:
                if "total_price" not in item and "unit_price" in item and "quantity" in item:
                    item["total_price"] = item["unit_price"] * item["quantity"]

        # Validate totals
        if result.get("items") and result.get("totals"):
            calculated_subtotal = sum(
                item.get("total_price", 0)
                for item in result["items"]
            )
            reported_subtotal = result["totals"].get("subtotal", 0)

            # Allow small variance (rounding errors)
            if abs(calculated_subtotal - reported_subtotal) > 0.10:
                warnings.append(
                    f"Total mismatch: calculated ${calculated_subtotal:.2f}, "
                    f"reported ${reported_subtotal:.2f}"
                )
                result["totals"]["calculated_subtotal"] = calculated_subtotal

        # Validate date format
        if result.get("transaction", {}).get("date"):
            try:
                datetime.strptime(result["transaction"]["date"], "%Y-%m-%d")
            except ValueError:
                warnings.append(f"Invalid date format: {result['transaction']['date']}")

        # Add warnings to result
        if warnings:
            result["warnings"] = warnings

        # Ensure confidence exists
        if "confidence" not in result or not result["confidence"]:
            result["confidence"] = {
                "overall": 0.5,
                "merchant": 0.5,
                "items": 0.5,
                "totals": 0.5
            }

        return result

    def get_model_info(self) -> Dict[str, Any]:
        """Get information about the loaded model."""
        return {
            "model_name": self.model_name,
            "device": self.device,
            "quantization": self.use_quantization,
            "loaded": self._model_loaded,
            "gpu_available": torch.cuda.is_available(),
            "gpu_memory_allocated": torch.cuda.memory_allocated() if torch.cuda.is_available() else 0,
            "gpu_memory_reserved": torch.cuda.memory_reserved() if torch.cuda.is_available() else 0
        }

    def clear_cache(self):
        """Clear GPU memory cache."""
        if torch.cuda.is_available():
            torch.cuda.empty_cache()
            logger.info("GPU cache cleared")
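The `_fix_json` repair path above can be exercised in isolation. A minimal sketch of the same regex fixes (the `fix_json` name here is a standalone stand-in for the method):

```python
import json
import re

def fix_json(json_str: str) -> str:
    # Remove trailing commas before closing braces/brackets
    json_str = re.sub(r',\s*}', '}', json_str)
    json_str = re.sub(r',\s*]', ']', json_str)
    # Replace single quotes with double quotes
    return json_str.replace("'", '"')

# A typical malformed model output: single quotes and trailing commas
broken = "{'items': [{'name': 'Milk', 'quantity': 1,},], 'total': 3.49,}"
print(json.loads(fix_json(broken)))
# → {'items': [{'name': 'Milk', 'quantity': 1}], 'total': 3.49}
```

Note the quote swap is deliberately naive: a value containing an apostrophe (e.g. `"Trader Joe's"`) would be corrupted, which is why the service only falls back to it after a plain `json.loads` fails.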
234
app/services/openfoodfacts.py
Normal file
@ -0,0 +1,234 @@
"""
OpenFoodFacts API integration service.

This module provides functionality to look up product information
from the OpenFoodFacts database using barcodes (UPC/EAN).
"""

import httpx
from typing import Optional, Dict, Any
from app.core.config import settings
import logging

logger = logging.getLogger(__name__)


class OpenFoodFactsService:
    """
    Service for interacting with the OpenFoodFacts API.

    OpenFoodFacts is a free, open database of food products with
    ingredients, allergens, and nutrition facts.
    """

    BASE_URL = "https://world.openfoodfacts.org/api/v2"
    USER_AGENT = "Kiwi/0.1.0 (https://circuitforge.tech)"

    async def lookup_product(self, barcode: str) -> Optional[Dict[str, Any]]:
        """
        Look up a product by barcode in the OpenFoodFacts database.

        Args:
            barcode: UPC/EAN barcode (8-13 digits)

        Returns:
            Dictionary with product information, or None if not found

        Example response:
            {
                "name": "Organic Milk",
                "brand": "Horizon",
                "categories": ["Dairy", "Milk"],
                "image_url": "https://...",
                "nutrition_data": {...},
                "raw_data": {...}  # Full API response
            }
        """
        try:
            async with httpx.AsyncClient() as client:
                url = f"{self.BASE_URL}/product/{barcode}.json"

                response = await client.get(
                    url,
                    headers={"User-Agent": self.USER_AGENT},
                    timeout=10.0,
                )

                if response.status_code == 404:
                    logger.info(f"Product not found in OpenFoodFacts: {barcode}")
                    return None

                response.raise_for_status()
                data = response.json()

                if data.get("status") != 1:
                    logger.info(f"Product not found in OpenFoodFacts: {barcode}")
                    return None

                return self._parse_product_data(data, barcode)

        except httpx.HTTPError as e:
            logger.error(f"HTTP error looking up barcode {barcode}: {e}")
            return None
        except Exception as e:
            logger.error(f"Error looking up barcode {barcode}: {e}")
            return None

    def _parse_product_data(self, data: Dict[str, Any], barcode: str) -> Dict[str, Any]:
        """
        Parse OpenFoodFacts API response into our product format.

        Args:
            data: Raw API response
            barcode: Original barcode

        Returns:
            Parsed product dictionary
        """
        product = data.get("product", {})

        # Extract basic info
        name = (
            product.get("product_name")
            or product.get("product_name_en")
            or f"Unknown Product ({barcode})"
        )

        brand = product.get("brands", "").split(",")[0].strip() if product.get("brands") else None

        # Categories (comma-separated string to list)
        categories_str = product.get("categories", "")
        categories = [c.strip() for c in categories_str.split(",") if c.strip()]
        category = categories[0] if categories else None

        # Description
        description = product.get("generic_name") or product.get("generic_name_en")

        # Image
        image_url = product.get("image_url") or product.get("image_front_url")

        # Nutrition data
        nutrition_data = self._extract_nutrition_data(product)

        # Allergens and dietary info
        allergens = product.get("allergens_tags", [])
        labels = product.get("labels_tags", [])

        return {
            "name": name,
            "brand": brand,
            "category": category,
            "categories": categories,
            "description": description,
            "image_url": image_url,
            "nutrition_data": nutrition_data,
            "allergens": allergens,
            "labels": labels,
            "raw_data": product,  # Store full response for debugging
        }

    def _extract_nutrition_data(self, product: Dict[str, Any]) -> Dict[str, Any]:
        """
        Extract nutrition facts from product data.

        Args:
            product: Product data from OpenFoodFacts

        Returns:
            Dictionary of nutrition facts
        """
        nutriments = product.get("nutriments", {})

        # Extract common nutrients (per 100g)
        nutrition = {}

        # Energy
        if "energy-kcal_100g" in nutriments:
            nutrition["calories"] = nutriments["energy-kcal_100g"]
        elif "energy_100g" in nutriments:
            # Convert kJ to kcal (1 kcal = 4.184 kJ)
            nutrition["calories"] = round(nutriments["energy_100g"] / 4.184, 1)

        # Macronutrients
        if "fat_100g" in nutriments:
            nutrition["fat_g"] = nutriments["fat_100g"]
        if "saturated-fat_100g" in nutriments:
            nutrition["saturated_fat_g"] = nutriments["saturated-fat_100g"]
        if "carbohydrates_100g" in nutriments:
            nutrition["carbohydrates_g"] = nutriments["carbohydrates_100g"]
        if "sugars_100g" in nutriments:
            nutrition["sugars_g"] = nutriments["sugars_100g"]
        if "fiber_100g" in nutriments:
            nutrition["fiber_g"] = nutriments["fiber_100g"]
        if "proteins_100g" in nutriments:
            nutrition["protein_g"] = nutriments["proteins_100g"]

        # Minerals
        if "salt_100g" in nutriments:
            nutrition["salt_g"] = nutriments["salt_100g"]
        elif "sodium_100g" in nutriments:
            # Convert sodium to salt (1g sodium = 2.5g salt)
            nutrition["salt_g"] = round(nutriments["sodium_100g"] * 2.5, 2)

        # Serving size
        if "serving_size" in product:
            nutrition["serving_size"] = product["serving_size"]

        return nutrition

    async def search_products(
        self,
        query: str,
        page: int = 1,
        page_size: int = 20
    ) -> Dict[str, Any]:
        """
        Search for products by name in OpenFoodFacts.

        Args:
            query: Search query
            page: Page number (1-indexed)
            page_size: Number of results per page

        Returns:
            Dictionary with search results and metadata
        """
        try:
            async with httpx.AsyncClient() as client:
                url = f"{self.BASE_URL}/search"

                response = await client.get(
                    url,
                    params={
                        "search_terms": query,
                        "page": page,
                        "page_size": page_size,
                        "json": 1,
                    },
                    headers={"User-Agent": self.USER_AGENT},
                    timeout=10.0,
                )

                response.raise_for_status()
                data = response.json()

                products = [
                    self._parse_product_data({"product": p}, p.get("code", ""))
                    for p in data.get("products", [])
                ]

                return {
                    "products": products,
                    "count": data.get("count", 0),
                    "page": data.get("page", page),
                    "page_size": data.get("page_size", page_size),
                }

        except Exception as e:
            logger.error(f"Error searching OpenFoodFacts: {e}")
            return {
                "products": [],
                "count": 0,
                "page": page,
                "page_size": page_size,
            }
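The two unit conversions in `_extract_nutrition_data` are worth pinning down, since OpenFoodFacts reports energy in kJ and salt sometimes only as sodium. A standalone sketch of the same arithmetic (function names are illustrative):

```python
def kj_to_kcal(energy_kj: float) -> float:
    # 1 kcal = 4.184 kJ, rounded to one decimal as in the service
    return round(energy_kj / 4.184, 1)

def sodium_to_salt(sodium_g: float) -> float:
    # 1 g sodium ≈ 2.5 g salt (NaCl), rounded to two decimals
    return round(sodium_g * 2.5, 2)

print(kj_to_kcal(2092))     # → 500.0
print(sodium_to_salt(0.4))  # → 1.0
```

Both values are per 100 g, matching the `*_100g` keys the service reads from `nutriments`.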
9
app/services/quality/__init__.py
Normal file
@ -0,0 +1,9 @@
# app/services/quality/__init__.py
"""
Quality assessment services for Kiwi.
Contains functionality for evaluating receipt image quality.
"""

from app.services.quality.assessment import QualityAssessor

__all__ = ["QualityAssessor"]
332
app/services/quality/assessment.py
Normal file
@ -0,0 +1,332 @@
#!/usr/bin/env python
|
||||
# app/services/quality/assessment.py
|
||||
import cv2
|
||||
import numpy as np
|
||||
import logging
|
||||
from pathlib import Path
|
||||
from typing import Dict, Any, Optional, Tuple
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
class QualityAssessor:
|
||||
"""
|
||||
Assesses the quality of receipt images for processing suitability.
|
||||
"""
|
||||
|
||||
def __init__(self, min_quality_score: float = 50.0):
|
||||
"""
|
||||
Initialize the quality assessor.
|
||||
|
||||
Args:
|
||||
min_quality_score: Minimum acceptable quality score (0-100)
|
||||
"""
|
||||
self.min_quality_score = min_quality_score
|
||||
|
||||
def assess_image(self, image_path: Path) -> Dict[str, Any]:
|
||||
"""
|
||||
Assess the quality of an image.
|
||||
|
||||
Args:
|
||||
image_path: Path to the image
|
||||
|
||||
Returns:
|
||||
Dictionary containing quality metrics
|
||||
"""
|
||||
try:
|
||||
# Read image
|
||||
img = cv2.imread(str(image_path))
|
||||
if img is None:
|
||||
return {
|
||||
"success": False,
|
||||
"error": f"Failed to read image: {image_path}",
|
||||
"overall_score": 0.0,
|
||||
}
|
||||
|
||||
# Convert to grayscale for some metrics
|
||||
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
|
||||
|
||||
# Calculate various quality metrics
|
||||
blur_score = self._calculate_blur_score(gray)
|
||||
lighting_score = self._calculate_lighting_score(gray)
|
||||
contrast_score = self._calculate_contrast_score(gray)
|
||||
size_score = self._calculate_size_score(img.shape)
|
||||
|
||||
# Check for potential fold lines
|
||||
fold_detected, fold_severity = self._detect_folds(gray)
|
||||
|
||||
# Calculate overall quality score
|
||||
overall_score = self._calculate_overall_score({
|
||||
"blur": blur_score,
|
||||
"lighting": lighting_score,
|
||||
"contrast": contrast_score,
|
||||
"size": size_score,
|
||||
"fold": 100.0 if not fold_detected else (100.0 - fold_severity * 20.0)
|
||||
})
|
||||
|
||||
# Create assessment result
|
||||
result = {
|
||||
"success": True,
|
||||
"metrics": {
|
||||
"blur_score": blur_score,
|
||||
"lighting_score": lighting_score,
|
||||
"contrast_score": contrast_score,
|
||||
"size_score": size_score,
|
||||
"fold_detected": fold_detected,
|
||||
"fold_severity": fold_severity if fold_detected else 0.0,
|
||||
},
|
||||
"overall_score": overall_score,
|
||||
"is_acceptable": overall_score >= self.min_quality_score,
|
||||
"improvement_suggestions": self._generate_suggestions({
|
||||
"blur": blur_score,
|
||||
"lighting": lighting_score,
|
||||
"contrast": contrast_score,
|
||||
"size": size_score,
|
||||
"fold": fold_detected,
|
||||
"fold_severity": fold_severity if fold_detected else 0.0,
|
||||
}),
|
||||
}
|
||||
|
||||
return result
|
||||
|
||||
except Exception as e:
|
||||
logger.exception(f"Error assessing image quality: {e}")
|
||||
return {
|
||||
"success": False,
|
||||
"error": f"Error assessing image quality: {str(e)}",
|
||||
"overall_score": 0.0,
|
||||
}
|
||||
|
||||
def _calculate_blur_score(self, gray_img: np.ndarray) -> float:
|
||||
"""
|
||||
Calculate blur score using Laplacian variance.
|
||||
Higher variance = less blurry (higher score)
|
||||
|
||||
Args:
|
||||
gray_img: Grayscale image
|
||||
|
||||
Returns:
|
||||
Blur score (0-100)
|
||||
"""
|
||||
# Use Laplacian for edge detection
|
||||
laplacian = cv2.Laplacian(gray_img, cv2.CV_64F)
|
||||
|
||||
# Calculate variance of Laplacian
|
||||
variance = laplacian.var()
|
||||
|
||||
# Map variance to a 0-100 score
|
||||
# These thresholds might need adjustment based on your specific requirements
|
||||
if variance < 10:
|
||||
return 0.0 # Very blurry
|
||||
elif variance < 100:
|
||||
return (variance - 10) / 90 * 50 # Map 10-100 to 0-50
|
||||
elif variance < 1000:
|
||||
return 50 + (variance - 100) / 900 * 50 # Map 100-1000 to 50-100
|
||||
else:
|
||||
return 100.0 # Very sharp
|
||||
|
||||
def _calculate_lighting_score(self, gray_img: np.ndarray) -> float:
|
||||
"""
|
||||
Calculate lighting score based on average brightness and std dev.
|
||||
|
||||
Args:
|
||||
gray_img: Grayscale image
|
||||
|
||||
Returns:
|
||||
Lighting score (0-100)
|
||||
"""
|
||||
# Calculate mean brightness
|
||||
mean = gray_img.mean()
|
||||
|
||||
# Calculate standard deviation of brightness
|
||||
std = gray_img.std()
|
||||
|
||||
# Ideal mean would be around 127 (middle of 0-255)
|
||||
# Penalize if too dark or too bright
|
||||
mean_score = 100 - abs(mean - 127) / 127 * 100
|
||||
|
||||
# Higher std dev generally means better contrast
|
||||
# But we'll cap at 60 for reasonable balance
|
||||
std_score = min(std / 60 * 100, 100)
|
||||
|
||||
# Combine scores (weighted)
|
||||
return 0.6 * mean_score + 0.4 * std_score
|
||||
|
||||
def _calculate_contrast_score(self, gray_img: np.ndarray) -> float:
|
||||
"""
|
||||
Calculate contrast score.
|
||||
|
||||
Args:
|
||||
gray_img: Grayscale image
|
||||
|
||||
Returns:
|
||||
Contrast score (0-100)
|
||||
"""
|
||||
# Calculate histogram
|
||||
hist = cv2.calcHist([gray_img], [0], None, [256], [0, 256])
|
||||
|
||||
# Calculate percentage of pixels in each brightness range
|
||||
total_pixels = gray_img.shape[0] * gray_img.shape[1]
|
||||
dark_pixels = np.sum(hist[:50]) / total_pixels
|
||||
mid_pixels = np.sum(hist[50:200]) / total_pixels
|
||||
bright_pixels = np.sum(hist[200:]) / total_pixels
|
||||
|
||||
# Ideal: good distribution across ranges with emphasis on mid-range
|
||||
# This is a simplified model - real receipts may need different distributions
|
||||
score = (
|
||||
(0.2 * min(dark_pixels * 500, 100)) + # Want some dark pixels (text)
|
||||
(0.6 * min(mid_pixels * 200, 100)) + # Want many mid pixels
|
||||
(0.2 * min(bright_pixels * 500, 100)) # Want some bright pixels (background)
|
||||
)
|
||||
|
||||
return score
|
||||
|
||||
def _calculate_size_score(self, shape: Tuple[int, int, int]) -> float:
|
||||
"""
|
||||
Calculate score based on image dimensions.
|
||||
|
||||
Args:
|
||||
shape: Image shape (height, width, channels)
|
||||
|
||||
Returns:
|
||||
Size score (0-100)
|
||||
"""
|
||||
height, width = shape[0], shape[1]
|
||||
|
||||
# Minimum recommended dimensions for good OCR
|
||||
min_height, min_width = 800, 600
|
||||
|
||||
# Calculate size score
|
||||
if height < min_height or width < min_width:
|
||||
# Penalize if below minimum dimensions
|
||||
return min(height / min_height, width / min_width) * 100
|
||||
else:
|
||||
# Full score if dimensions are adequate
|
||||
return 100.0
|
||||
|
||||
def _detect_folds(self, gray_img: np.ndarray) -> Tuple[bool, float]:
|
||||
"""
|
||||
Detect potential fold lines in the image.
|
||||
|
||||
Args:
|
||||
gray_img: Grayscale image
|
||||
|
||||
Returns:
|
||||
            Tuple of (fold_detected, fold_severity)

            fold_severity is a value between 0 and 5, with 5 being the most severe
        """
        # Apply edge detection
        edges = cv2.Canny(gray_img, 50, 150, apertureSize=3)

        # Apply Hough Line Transform to detect straight lines
        lines = cv2.HoughLinesP(
            edges,
            rho=1,
            theta=np.pi / 180,
            threshold=100,
            minLineLength=gray_img.shape[1] // 3,  # Look for lines at least 1/3 of image width
            maxLineGap=10,
        )

        if lines is None:
            return False, 0.0

        # Check for horizontal or vertical lines that could be folds
        potential_folds = []
        height, width = gray_img.shape

        for line in lines:
            x1, y1, x2, y2 = line[0]
            length = np.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)
            angle = np.abs(np.arctan2(y2 - y1, x2 - x1) * 180 / np.pi)

            # Check if horizontal (0±10°) or vertical (90±10°)
            is_horizontal = angle < 10 or angle > 170
            is_vertical = abs(angle - 90) < 10

            # Check if length is significant
            is_significant = (is_horizontal and length > width * 0.5) or \
                             (is_vertical and length > height * 0.5)

            if (is_horizontal or is_vertical) and is_significant:
                # Calculate intensity variance along the line
                # This helps determine if it's a fold (sharp brightness change)
                # Simplified implementation for Phase 1
                potential_folds.append({
                    "length": length,
                    "is_horizontal": is_horizontal,
                })

        # Determine if folds are detected and their severity
        if not potential_folds:
            return False, 0.0

        # Severity based on number and length of potential folds.
        # This is a simplified metric for Phase 1. Scale each fold by the
        # image dimension matching its own orientation (using the bare
        # is_horizontal loop variable here would leak its last value).
        severity = 0.0
        for fold in potential_folds:
            dim = width if fold["is_horizontal"] else height
            severity += fold["length"] / dim * 2.5
        severity = min(5.0, severity)

        return True, severity

    def _calculate_overall_score(self, scores: Dict[str, float]) -> float:
        """
        Calculate overall quality score from individual metrics.

        Args:
            scores: Dictionary of individual quality scores

        Returns:
            Overall quality score (0-100)
        """
        # Weights for different factors
        weights = {
            "blur": 0.30,
            "lighting": 0.25,
            "contrast": 0.25,
            "size": 0.10,
            "fold": 0.10,
        }

        # Calculate weighted average (weights sum to 1.0)
        overall = sum(weights[key] * scores[key] for key in weights)

        return overall

    def _generate_suggestions(self, metrics: Dict[str, Any]) -> list:
        """
        Generate improvement suggestions based on metrics.

        Args:
            metrics: Dictionary of quality metrics

        Returns:
            List of improvement suggestions
        """
        suggestions = []

        # Blur suggestions
        if metrics["blur"] < 60:
            suggestions.append("Hold the camera steady and ensure the receipt is in focus.")

        # Lighting suggestions
        if metrics["lighting"] < 60:
            suggestions.append("Improve lighting conditions and avoid shadows on the receipt.")

        # Contrast suggestions
        if metrics["contrast"] < 60:
            suggestions.append("Ensure good contrast between text and background.")

        # Size suggestions
        if metrics["size"] < 60:
            suggestions.append("Move the camera closer to the receipt for better resolution.")

        # Fold suggestions
        if metrics["fold"]:
            if metrics["fold_severity"] > 3.0:
                suggestions.append("The receipt has severe folds. Try to flatten it before capturing.")
            else:
                suggestions.append("Flatten the receipt to remove fold lines for better processing.")

        return suggestions
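The weighted average used by `_calculate_overall_score` can be sanity-checked with a small standalone sketch. The metric scores below are made-up illustrative values, not output from the service; only the weights come from the code above.

```python
# Hypothetical per-metric scores (0-100 each) for one receipt image
scores = {"blur": 80.0, "lighting": 70.0, "contrast": 90.0, "size": 100.0, "fold": 60.0}

# Weights from _calculate_overall_score; they sum to 1.0
weights = {"blur": 0.30, "lighting": 0.25, "contrast": 0.25, "size": 0.10, "fold": 0.10}

# 0.30*80 + 0.25*70 + 0.25*90 + 0.10*100 + 0.10*60 = 80.0
overall = sum(weights[k] * scores[k] for k in weights)
```

Because the weights sum to 1.0, a receipt scoring 100 on every metric yields exactly 100 overall.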
app/services/receipt_service.py (new file, 126 lines)
@@ -0,0 +1,126 @@
"""
|
||||
Receipt processing service — orchestrates the OCR pipeline.
|
||||
|
||||
Pipeline stages:
|
||||
1. Preprocess — enhance image, convert to PNG
|
||||
2. Quality — score image; abort to 'low_quality' if below threshold
|
||||
3. OCR — VisionLanguageOCR extracts structured data
|
||||
4. Persist — flatten result into receipt_data table
|
||||
5. Stage — set status to 'staged'; items await human approval
|
||||
|
||||
Items are NOT added to inventory automatically. Use the
|
||||
POST /receipts/{id}/ocr/approve endpoint to commit approved items.
|
||||
"""
|
||||
from __future__ import annotations
|
||||
|
||||
import logging
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
from app.db.store import Store
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def _flatten_ocr_result(result: dict[str, Any]) -> dict[str, Any]:
|
||||
"""Map nested VisionLanguageOCR output to the flat receipt_data schema."""
|
||||
merchant = result.get("merchant") or {}
|
||||
transaction = result.get("transaction") or {}
|
||||
totals = result.get("totals") or {}
|
||||
return {
|
||||
"merchant_name": merchant.get("name"),
|
||||
"merchant_address": merchant.get("address"),
|
||||
"merchant_phone": merchant.get("phone"),
|
||||
"transaction_date": transaction.get("date"),
|
||||
"transaction_time": transaction.get("time"),
|
||||
"receipt_number": transaction.get("receipt_number"),
|
||||
"register_number": transaction.get("register"),
|
||||
"cashier_name": transaction.get("cashier"),
|
||||
"items": result.get("items") or [],
|
||||
"subtotal": totals.get("subtotal"),
|
||||
"tax": totals.get("tax"),
|
||||
"discount": totals.get("discount"),
|
||||
"total": totals.get("total"),
|
||||
"payment_method": totals.get("payment_method"),
|
||||
"amount_paid": totals.get("amount_paid"),
|
||||
"change_given": totals.get("change"),
|
||||
"raw_text": result.get("raw_text"),
|
||||
"confidence_scores": result.get("confidence") or {},
|
||||
"warnings": result.get("warnings") or [],
|
||||
}
|
||||
|
||||
|
||||
class ReceiptService:
|
||||
def __init__(self, store: Store) -> None:
|
||||
self.store = store
|
||||
|
||||
async def process(self, receipt_id: int, image_path: Path) -> None:
|
||||
"""Run the full OCR pipeline for a receipt image.
|
||||
|
||||
Stages run synchronously inside asyncio.to_thread so SQLite and the
|
||||
VLM (which uses torch) both stay off the async event loop.
|
||||
"""
|
||||
import asyncio
|
||||
await asyncio.to_thread(self._run_pipeline, receipt_id, image_path)
|
||||
|
||||
def _run_pipeline(self, receipt_id: int, image_path: Path) -> None:
|
||||
from app.core.config import settings
|
||||
from app.services.image_preprocessing.enhancement import ImageEnhancer
|
||||
from app.services.image_preprocessing.format_conversion import FormatConverter
|
||||
from app.services.quality.assessment import QualityAssessor
|
||||
|
||||
# ── Stage 1: Preprocess ───────────────────────────────────────────────
|
||||
enhancer = ImageEnhancer()
|
||||
converter = FormatConverter()
|
||||
enhanced = enhancer.enhance(image_path)
|
||||
processed_path = converter.to_png(enhanced)
|
||||
|
||||
# ── Stage 2: Quality assessment ───────────────────────────────────────
|
||||
assessor = QualityAssessor()
|
||||
assessment = assessor.assess(processed_path)
|
||||
self.store.upsert_quality_assessment(
|
||||
receipt_id,
|
||||
overall_score=assessment["overall_score"],
|
||||
is_acceptable=assessment["is_acceptable"],
|
||||
metrics=assessment.get("metrics", {}),
|
||||
suggestions=assessment.get("suggestions", []),
|
||||
)
|
||||
|
||||
if not assessment["is_acceptable"]:
|
||||
self.store.update_receipt_status(receipt_id, "low_quality")
|
||||
logger.warning(
|
||||
"Receipt %s: quality too low for OCR (score=%.1f) — %s",
|
||||
receipt_id, assessment["overall_score"],
|
||||
"; ".join(assessment.get("suggestions", [])),
|
||||
)
|
||||
return
|
||||
|
||||
if not settings.ENABLE_OCR:
|
||||
self.store.update_receipt_status(receipt_id, "processed")
|
||||
logger.info("Receipt %s: quality OK but ENABLE_OCR=false — skipping OCR", receipt_id)
|
||||
return
|
||||
|
||||
# ── Stage 3: OCR extraction ───────────────────────────────────────────
|
||||
from app.services.ocr.vl_model import VisionLanguageOCR
|
||||
ocr = VisionLanguageOCR()
|
||||
result = ocr.extract_receipt_data(str(processed_path))
|
||||
|
||||
if result.get("error"):
|
||||
self.store.update_receipt_status(receipt_id, "error", result["error"])
|
||||
logger.error("Receipt %s: OCR failed — %s", receipt_id, result["error"])
|
||||
return
|
||||
|
||||
# ── Stage 4: Persist extracted data ───────────────────────────────────
|
||||
flat = _flatten_ocr_result(result)
|
||||
self.store.upsert_receipt_data(receipt_id, flat)
|
||||
|
||||
item_count = len(flat.get("items") or [])
|
||||
|
||||
# ── Stage 5: Stage for human approval ────────────────────────────────
|
||||
self.store.update_receipt_status(receipt_id, "staged")
|
||||
logger.info(
|
||||
"Receipt %s: OCR complete — %d item(s) staged for review "
|
||||
"(confidence=%.2f)",
|
||||
receipt_id, item_count,
|
||||
(result.get("confidence") or {}).get("overall", 0.0),
|
||||
)
|
||||
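As a sketch of what `_flatten_ocr_result` does to the nested OCR output, here is a trimmed-down version covering only three of the mapped fields. The input dict is hypothetical and `flatten` is an illustration, not part of the service:

```python
def flatten(result):
    """Minimal restatement of the merchant/totals mapping used by the service."""
    merchant = result.get("merchant") or {}  # tolerate missing or None sections
    totals = result.get("totals") or {}
    return {
        "merchant_name": merchant.get("name"),
        "total": totals.get("total"),
        "items": result.get("items") or [],
    }

# Hypothetical OCR output: nested sections become flat receipt_data columns
flat = flatten({"merchant": {"name": "Acme Mart"}, "totals": {"total": 12.34}})
empty = flatten({})  # absent sections flatten to None / empty defaults
```

The `or {}` / `or []` guards mean a partially-populated OCR result never raises; missing fields simply land as `None` in the flat row.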
app/services/receipt_service_inmemory_backup.py (new file, 295 lines)
@@ -0,0 +1,295 @@
#!/usr/bin/env python
# app/services/receipt_service.py
import os
import uuid
import shutil
import aiofiles
from pathlib import Path
from typing import Optional, List, Dict, Any
from fastapi import UploadFile, BackgroundTasks, HTTPException
import asyncio
import logging
import sys
from app.utils.progress import ProgressIndicator

from app.services.image_preprocessing.format_conversion import convert_to_standard_format, extract_metadata
from app.services.image_preprocessing.enhancement import enhance_image, correct_perspective
from app.services.quality.assessment import QualityAssessor
from app.models.schemas.receipt import ReceiptCreate, ReceiptResponse
from app.models.schemas.quality import QualityAssessment
from app.core.config import settings

logger = logging.getLogger(__name__)


class ReceiptService:
    """
    Service for handling receipt processing.
    """

    def __init__(self):
        """
        Initialize the receipt service.
        """
        self.quality_assessor = QualityAssessor()
        self.upload_dir = Path(settings.UPLOAD_DIR)
        self.processing_dir = Path(settings.PROCESSING_DIR)

        # Create directories if they don't exist
        self.upload_dir.mkdir(parents=True, exist_ok=True)
        self.processing_dir.mkdir(parents=True, exist_ok=True)

        # In-memory storage for Phase 1 (would be replaced by DB in production)
        self.receipts = {}
        self.quality_assessments = {}

    async def process_receipt(
        self,
        file: UploadFile,
        background_tasks: BackgroundTasks
    ) -> ReceiptResponse:
        """
        Process a single receipt file.

        Args:
            file: Uploaded receipt file
            background_tasks: FastAPI background tasks

        Returns:
            ReceiptResponse object
        """
        # Generate unique ID for receipt
        receipt_id = str(uuid.uuid4())

        # Save uploaded file
        upload_path = self.upload_dir / f"{receipt_id}_{file.filename}"
        await self._save_upload_file(file, upload_path)

        # Create receipt entry
        receipt = {
            "id": receipt_id,
            "filename": file.filename,
            "status": "uploaded",
            "original_path": str(upload_path),
            "processed_path": None,
            "metadata": {},
        }

        self.receipts[receipt_id] = receipt

        # Add background task for processing
        background_tasks.add_task(
            self._process_receipt_background,
            receipt_id,
            upload_path
        )

        return ReceiptResponse(
            id=receipt_id,
            filename=file.filename,
            status="processing",
            metadata={},
            quality_score=None,
        )

    async def get_receipt(self, receipt_id: str) -> Optional[ReceiptResponse]:
        """
        Get receipt by ID.

        Args:
            receipt_id: Receipt ID

        Returns:
            ReceiptResponse object or None if not found
        """
        receipt = self.receipts.get(receipt_id)
        if not receipt:
            return None

        quality = self.quality_assessments.get(receipt_id)
        quality_score = quality.get("overall_score") if quality else None

        return ReceiptResponse(
            id=receipt["id"],
            filename=receipt["filename"],
            status=receipt["status"],
            metadata=receipt["metadata"],
            quality_score=quality_score,
        )

    async def get_receipt_quality(self, receipt_id: str) -> Optional[QualityAssessment]:
        """
        Get quality assessment for a receipt.

        Args:
            receipt_id: Receipt ID

        Returns:
            QualityAssessment object or None if not found
        """
        quality = self.quality_assessments.get(receipt_id)
        if not quality:
            return None

        return QualityAssessment(
            receipt_id=receipt_id,
            overall_score=quality["overall_score"],
            is_acceptable=quality["is_acceptable"],
            metrics=quality["metrics"],
            suggestions=quality["improvement_suggestions"],
        )

    def list_receipts(self) -> List[ReceiptResponse]:
        """
        List all receipts.

        Returns:
            List of ReceiptResponse objects
        """
        result = []
        for receipt_id, receipt in self.receipts.items():
            quality = self.quality_assessments.get(receipt_id)
            quality_score = quality.get("overall_score") if quality else None

            result.append(ReceiptResponse(
                id=receipt["id"],
                filename=receipt["filename"],
                status=receipt["status"],
                metadata=receipt["metadata"],
                quality_score=quality_score,
            ))

        return result

    def get_quality_assessments(self) -> Dict[str, QualityAssessment]:
        """
        Get all quality assessments.

        Returns:
            Dict mapping receipt_id to QualityAssessment object
        """
        result = {}
        for receipt_id, quality in self.quality_assessments.items():
            result[receipt_id] = QualityAssessment(
                receipt_id=receipt_id,
                overall_score=quality["overall_score"],
                is_acceptable=quality["is_acceptable"],
                metrics=quality["metrics"],
                suggestions=quality["improvement_suggestions"],
            )
        return result

    async def _save_upload_file(self, file: UploadFile, destination: Path) -> None:
        """
        Save an uploaded file to disk.

        Args:
            file: Uploaded file
            destination: Destination path
        """
        try:
            async with aiofiles.open(destination, 'wb') as out_file:
                # Read in chunks to handle large files
                content = await file.read(1024 * 1024)  # 1MB chunks
                while content:
                    await out_file.write(content)
                    content = await file.read(1024 * 1024)

        except Exception as e:
            logger.exception(f"Error saving upload file: {e}")
            raise HTTPException(status_code=500, detail=f"Error saving upload file: {str(e)}")

    async def _process_receipt_background(self, receipt_id: str, upload_path: Path) -> None:
        """
        Background task for processing a receipt with progress indicators.

        Args:
            receipt_id: Receipt ID
            upload_path: Path to uploaded file
        """
        try:
            # Print a message to indicate start of processing
            print(f"\nProcessing receipt {receipt_id}...")

            # Update status
            self.receipts[receipt_id]["status"] = "processing"

            # Create processing directory for this receipt
            receipt_dir = self.processing_dir / receipt_id
            receipt_dir.mkdir(parents=True, exist_ok=True)

            # Step 1: Convert to standard format
            print("  Step 1/4: Converting to standard format...")
            converted_path = receipt_dir / f"{receipt_id}_converted.png"
            success, message, actual_converted_path = convert_to_standard_format(
                upload_path,
                converted_path
            )

            if not success:
                print(f"  ✗ Format conversion failed: {message}")
                self.receipts[receipt_id]["status"] = "error"
                self.receipts[receipt_id]["error"] = message
                return
            print("  ✓ Format conversion complete")

            # Step 2: Correct perspective
            print("  Step 2/4: Correcting perspective...")
            perspective_path = receipt_dir / f"{receipt_id}_perspective.png"
            success, message, actual_perspective_path = correct_perspective(
                actual_converted_path,
                perspective_path
            )

            # Use corrected image if available, otherwise use converted image
            current_path = actual_perspective_path if success else actual_converted_path
            if success:
                print("  ✓ Perspective correction complete")
            else:
                print(f"  ⚠ Perspective correction skipped: {message}")

            # Step 3: Enhance image
            print("  Step 3/4: Enhancing image...")
            enhanced_path = receipt_dir / f"{receipt_id}_enhanced.png"
            success, message, actual_enhanced_path = enhance_image(
                current_path,
                enhanced_path
            )

            if not success:
                print(f"  ⚠ Enhancement warning: {message}")
                # Continue with current path
            else:
                current_path = actual_enhanced_path
                print("  ✓ Image enhancement complete")

            # Step 4: Assess quality
            print("  Step 4/4: Assessing quality...")
            quality_assessment = self.quality_assessor.assess_image(current_path)
            self.quality_assessments[receipt_id] = quality_assessment
            print(f"  ✓ Quality assessment complete: score {quality_assessment['overall_score']:.1f}/100")

            # Step 5: Extract metadata
            print("  Extracting metadata...")
            metadata = extract_metadata(upload_path)
            if current_path != upload_path:
                processed_metadata = extract_metadata(current_path)
                metadata["processed"] = {
                    "width": processed_metadata.get("width"),
                    "height": processed_metadata.get("height"),
                    "format": processed_metadata.get("original_format"),
                }
            print("  ✓ Metadata extraction complete")

            # Update receipt entry
            self.receipts[receipt_id].update({
                "status": "processed",
                "processed_path": str(current_path),
                "metadata": metadata,
            })

            print(f"✓ Receipt {receipt_id} processed successfully!")

        except Exception as e:
            print(f"✗ Error processing receipt {receipt_id}: {e}")
            self.receipts[receipt_id]["status"] = "error"
            self.receipts[receipt_id]["error"] = str(e)
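The chunked read-write loop in `_save_upload_file` can be sketched synchronously, without aiofiles or FastAPI, to show the pattern in isolation. `save_stream` is an illustrative stand-in, not part of the service:

```python
import io
import tempfile
from pathlib import Path

CHUNK = 1024 * 1024  # 1 MB, matching the service's chunk size


def save_stream(src, destination: Path) -> int:
    """Copy a binary stream to disk in fixed-size chunks; returns bytes written."""
    written = 0
    with open(destination, "wb") as out_file:
        while True:
            chunk = src.read(CHUNK)
            if not chunk:  # empty read signals end of stream
                break
            out_file.write(chunk)
            written += len(chunk)
    return written


# Usage with an in-memory stream just over two chunks long
data = b"x" * (2 * CHUNK + 3)
dest = Path(tempfile.mkdtemp()) / "out.bin"
n = save_stream(io.BytesIO(data), dest)
```

Reading until an empty chunk, rather than loading the whole upload into memory, keeps peak memory bounded at one chunk regardless of file size.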
app/static/index.html (new file, 926 lines)
@@ -0,0 +1,926 @@
<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>Project Thoth - Inventory & Receipt Manager</title>
|
||||
<style>
|
||||
* {
|
||||
margin: 0;
|
||||
padding: 0;
|
||||
box-sizing: border-box;
|
||||
}
|
||||
|
||||
body {
|
||||
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, sans-serif;
|
||||
background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
|
||||
min-height: 100vh;
|
||||
padding: 20px;
|
||||
}
|
||||
|
||||
.container {
|
||||
max-width: 1200px;
|
||||
margin: 0 auto;
|
||||
}
|
||||
|
||||
.header {
|
||||
text-align: center;
|
||||
color: white;
|
||||
margin-bottom: 40px;
|
||||
}
|
||||
|
||||
.header h1 {
|
||||
font-size: 2.5em;
|
||||
margin-bottom: 10px;
|
||||
}
|
||||
|
||||
.header p {
|
||||
font-size: 1.2em;
|
||||
opacity: 0.9;
|
||||
}
|
||||
|
||||
/* Tabs */
|
||||
.tabs {
|
||||
display: flex;
|
||||
gap: 10px;
|
||||
margin-bottom: 20px;
|
||||
}
|
||||
|
||||
.tab {
|
||||
background: rgba(255, 255, 255, 0.2);
|
||||
color: white;
|
||||
border: none;
|
||||
padding: 15px 30px;
|
||||
font-size: 16px;
|
||||
border-radius: 8px;
|
||||
cursor: pointer;
|
||||
transition: all 0.3s;
|
||||
}
|
||||
|
||||
.tab:hover {
|
||||
background: rgba(255, 255, 255, 0.3);
|
||||
}
|
||||
|
||||
.tab.active {
|
||||
background: white;
|
||||
color: #667eea;
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
.tab-content {
|
||||
display: none;
|
||||
}
|
||||
|
||||
.tab-content.active {
|
||||
display: block;
|
||||
}
|
||||
|
||||
.card {
|
||||
background: white;
|
||||
border-radius: 12px;
|
||||
padding: 30px;
|
||||
box-shadow: 0 10px 40px rgba(0,0,0,0.2);
|
||||
margin-bottom: 20px;
|
||||
}
|
||||
|
||||
.upload-area {
|
||||
border: 3px dashed #667eea;
|
||||
border-radius: 8px;
|
||||
padding: 40px;
|
||||
text-align: center;
|
||||
cursor: pointer;
|
||||
transition: all 0.3s;
|
||||
background: #f7f9fc;
|
||||
}
|
||||
|
||||
.upload-area:hover {
|
||||
border-color: #764ba2;
|
||||
background: #eef2f7;
|
||||
}
|
||||
|
||||
.upload-icon {
|
||||
font-size: 48px;
|
||||
margin-bottom: 20px;
|
||||
}
|
||||
|
||||
.upload-text {
|
||||
font-size: 18px;
|
||||
color: #333;
|
||||
margin-bottom: 10px;
|
||||
}
|
||||
|
||||
.upload-hint {
|
||||
font-size: 14px;
|
||||
color: #666;
|
||||
}
|
||||
|
||||
input[type="file"] {
|
||||
display: none;
|
||||
}
|
||||
|
||||
.button {
|
||||
background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
|
||||
color: white;
|
||||
border: none;
|
||||
padding: 12px 30px;
|
||||
font-size: 16px;
|
||||
border-radius: 6px;
|
||||
cursor: pointer;
|
||||
transition: transform 0.2s;
|
||||
margin-right: 10px;
|
||||
}
|
||||
|
||||
.button:hover {
|
||||
transform: translateY(-2px);
|
||||
}
|
||||
|
||||
.button:disabled {
|
||||
opacity: 0.5;
|
||||
cursor: not-allowed;
|
||||
transform: none;
|
||||
}
|
||||
|
||||
.button-secondary {
|
||||
background: #6c757d;
|
||||
}
|
||||
|
||||
.button-small {
|
||||
padding: 8px 16px;
|
||||
font-size: 14px;
|
||||
}
|
||||
|
||||
.loading {
|
||||
text-align: center;
|
||||
padding: 20px;
|
||||
display: none;
|
||||
}
|
||||
|
||||
.spinner {
|
||||
border: 4px solid #f3f3f3;
|
||||
border-top: 4px solid #667eea;
|
||||
border-radius: 50%;
|
||||
width: 40px;
|
||||
height: 40px;
|
||||
animation: spin 1s linear infinite;
|
||||
margin: 0 auto 10px;
|
||||
}
|
||||
|
||||
@keyframes spin {
|
||||
0% { transform: rotate(0deg); }
|
||||
100% { transform: rotate(360deg); }
|
||||
}
|
||||
|
||||
.results {
|
||||
margin-top: 20px;
|
||||
display: none;
|
||||
}
|
||||
|
||||
.result-item {
|
||||
padding: 15px;
|
||||
border-radius: 6px;
|
||||
margin-bottom: 10px;
|
||||
}
|
||||
|
||||
.result-success {
|
||||
background: #d4edda;
|
||||
color: #155724;
|
||||
border: 1px solid #c3e6cb;
|
||||
}
|
||||
|
||||
.result-error {
|
||||
background: #f8d7da;
|
||||
color: #721c24;
|
||||
border: 1px solid #f5c6cb;
|
||||
}
|
||||
|
||||
.result-info {
|
||||
background: #d1ecf1;
|
||||
color: #0c5460;
|
||||
border: 1px solid #bee5eb;
|
||||
}
|
||||
|
||||
/* Stats */
|
||||
.stats-grid {
|
||||
display: grid;
|
||||
grid-template-columns: repeat(auto-fit, minmax(200px, 1fr));
|
||||
gap: 15px;
|
||||
margin-bottom: 20px;
|
||||
}
|
||||
|
||||
.stat-card {
|
||||
background: #f7f9fc;
|
||||
padding: 20px;
|
||||
border-radius: 8px;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.stat-value {
|
||||
font-size: 32px;
|
||||
font-weight: bold;
|
||||
color: #667eea;
|
||||
margin-bottom: 5px;
|
||||
}
|
||||
|
||||
.stat-label {
|
||||
font-size: 14px;
|
||||
color: #666;
|
||||
}
|
||||
|
||||
/* Inventory List */
|
||||
.inventory-list {
|
||||
margin-top: 20px;
|
||||
}
|
||||
|
||||
.inventory-item {
|
||||
background: #f7f9fc;
|
||||
padding: 15px;
|
||||
border-radius: 6px;
|
||||
margin-bottom: 10px;
|
||||
display: flex;
|
||||
justify-content: space-between;
|
||||
align-items: center;
|
||||
}
|
||||
|
||||
.item-info {
|
||||
flex: 1;
|
||||
}
|
||||
|
||||
.item-name {
|
||||
font-weight: 600;
|
||||
font-size: 16px;
|
||||
margin-bottom: 5px;
|
||||
}
|
||||
|
||||
.item-details {
|
||||
font-size: 14px;
|
||||
color: #666;
|
||||
}
|
||||
|
||||
.item-tags {
|
||||
display: flex;
|
||||
gap: 5px;
|
||||
margin-top: 5px;
|
||||
flex-wrap: wrap;
|
||||
}
|
||||
|
||||
.tag {
|
||||
background: #667eea;
|
||||
color: white;
|
||||
padding: 2px 8px;
|
||||
border-radius: 12px;
|
||||
font-size: 12px;
|
||||
}
|
||||
|
||||
.expiry-warning {
|
||||
color: #ff6b6b;
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
.expiry-soon {
|
||||
color: #ffa500;
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
/* Form */
|
||||
.form-group {
|
||||
margin-bottom: 15px;
|
||||
}
|
||||
|
||||
.form-group label {
|
||||
display: block;
|
||||
margin-bottom: 5px;
|
||||
font-weight: 600;
|
||||
color: #333;
|
||||
}
|
||||
|
||||
.form-group input,
|
||||
.form-group select,
|
||||
.form-group textarea {
|
||||
width: 100%;
|
||||
padding: 10px;
|
||||
border: 1px solid #ddd;
|
||||
border-radius: 6px;
|
||||
font-size: 14px;
|
||||
}
|
||||
|
||||
.form-row {
|
||||
display: grid;
|
||||
grid-template-columns: 1fr 1fr;
|
||||
gap: 15px;
|
||||
}
|
||||
|
||||
.hidden {
|
||||
display: none !important;
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<div class="container">
|
||||
<div class="header">
|
||||
<h1>📦 Project Thoth</h1>
|
||||
<p>Smart Inventory & Receipt Management</p>
|
||||
</div>
|
||||
|
||||
<!-- Tabs -->
|
||||
<div class="tabs">
|
||||
<button class="tab active" onclick="switchTab('inventory')">🏪 Inventory</button>
|
||||
<button class="tab" onclick="switchTab('receipts')">🧾 Receipts</button>
|
||||
</div>
|
||||
|
||||
<!-- Inventory Tab -->
|
||||
<div id="inventoryTab" class="tab-content active">
|
||||
<!-- Stats -->
|
||||
<div class="card">
|
||||
<h2>📊 Inventory Overview</h2>
|
||||
<div class="stats-grid" id="inventoryStats">
|
||||
<div class="stat-card">
|
||||
<div class="stat-value" id="totalItems">0</div>
|
||||
<div class="stat-label">Total Items</div>
|
||||
</div>
|
||||
<div class="stat-card">
|
||||
<div class="stat-value" id="totalProducts">0</div>
|
||||
<div class="stat-label">Unique Products</div>
|
||||
</div>
|
||||
<div class="stat-card">
|
||||
<div class="stat-value expiry-soon" id="expiringSoon">0</div>
|
||||
<div class="stat-label">Expiring Soon</div>
|
||||
</div>
|
||||
<div class="stat-card">
|
||||
<div class="stat-value expiry-warning" id="expired">0</div>
|
||||
<div class="stat-label">Expired</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Barcode Scanner Gun -->
|
||||
<div class="card">
|
||||
<h2>🔫 Scanner Gun</h2>
|
||||
<p style="color: #666; margin-bottom: 15px;">Use your barcode scanner gun below. Scan will auto-submit when Enter is pressed.</p>
|
||||
|
||||
<div class="form-group">
|
||||
<label for="scannerGunInput">Scan barcode here:</label>
|
||||
<input
|
||||
type="text"
|
||||
id="scannerGunInput"
|
||||
placeholder="Focus here and scan with barcode gun..."
|
||||
style="font-size: 18px; font-family: monospace; background: #f0f8ff;"
|
||||
autocomplete="off"
|
||||
>
|
||||
</div>
|
||||
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label for="scannerLocation">Location</label>
|
||||
<select id="scannerLocation">
|
||||
<option value="fridge">Fridge</option>
|
||||
<option value="freezer">Freezer</option>
|
||||
<option value="pantry" selected>Pantry</option>
|
||||
<option value="cabinet">Cabinet</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label for="scannerQuantity">Quantity</label>
|
||||
<input type="number" id="scannerQuantity" value="1" min="0.1" step="0.1">
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="loading" id="scannerLoading">
|
||||
<div class="spinner"></div>
|
||||
<p>Processing barcode...</p>
|
||||
</div>
|
||||
|
||||
<div class="results" id="scannerResults"></div>
|
||||
</div>
|
||||
|
||||
<!-- Barcode Scan (Camera/Image) -->
|
||||
<div class="card">
|
||||
<h2>📷 Scan Barcode (Camera/Image)</h2>
|
||||
<div class="upload-area" id="barcodeUploadArea">
|
||||
<div class="upload-icon">📸</div>
|
||||
<div class="upload-text">Click to scan barcode or drag and drop</div>
|
||||
<div class="upload-hint">Take a photo of a product barcode (UPC/EAN)</div>
|
||||
</div>
|
||||
<input type="file" id="barcodeInput" accept="image/*" capture="environment">
|
||||
|
||||
<div class="form-row" style="margin-top: 20px;">
|
||||
<div class="form-group">
|
||||
<label for="barcodeLocation">Location</label>
|
||||
<select id="barcodeLocation">
|
||||
<option value="fridge">Fridge</option>
|
||||
<option value="freezer">Freezer</option>
|
||||
<option value="pantry" selected>Pantry</option>
|
||||
<option value="cabinet">Cabinet</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label for="barcodeQuantity">Quantity</label>
|
||||
<input type="number" id="barcodeQuantity" value="1" min="0.1" step="0.1">
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="loading" id="barcodeLoading">
|
||||
<div class="spinner"></div>
|
||||
<p>Scanning barcode...</p>
|
||||
</div>
|
||||
|
||||
<div class="results" id="barcodeResults"></div>
|
||||
</div>
|
||||
|
||||
<!-- Manual Add -->
|
||||
<div class="card">
|
||||
<h2>➕ Add Item Manually</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label for="itemName">Product Name*</label>
|
||||
<input type="text" id="itemName" placeholder="e.g., Organic Milk" required>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label for="itemBrand">Brand</label>
|
||||
<input type="text" id="itemBrand" placeholder="e.g., Horizon">
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label for="itemQuantity">Quantity*</label>
|
||||
<input type="number" id="itemQuantity" value="1" min="0.1" step="0.1" required>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label for="itemUnit">Unit</label>
|
||||
<select id="itemUnit">
|
||||
<option value="count">Count</option>
|
||||
<option value="kg">Kilograms</option>
|
||||
<option value="lbs">Pounds</option>
|
||||
<option value="oz">Ounces</option>
|
||||
<option value="liters">Liters</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label for="itemLocation">Location*</label>
|
||||
<select id="itemLocation" required>
|
||||
<option value="fridge">Fridge</option>
|
||||
<option value="freezer">Freezer</option>
|
||||
<option value="pantry" selected>Pantry</option>
|
||||
<option value="cabinet">Cabinet</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label for="itemExpiration">Expiration Date</label>
|
||||
<input type="date" id="itemExpiration">
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<button class="button" onclick="addManualItem()">Add to Inventory</button>
|
||||
</div>
|
||||
|
||||
<!-- Inventory List -->
|
||||
<div class="card">
|
||||
<h2>📋 Current Inventory</h2>
|
||||
<div class="inventory-list" id="inventoryList">
|
||||
<p style="text-align: center; color: #666;">No items yet. Scan a barcode or add manually!</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Export -->
|
||||
<div class="card">
|
||||
<h2>📥 Export</h2>
|
||||
<button class="button" onclick="exportInventoryCSV()">📊 Download CSV</button>
|
||||
<button class="button" onclick="exportInventoryExcel()">📈 Download Excel</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Receipts Tab -->
|
||||
<div id="receiptsTab" class="tab-content">
|
||||
<div class="card">
|
||||
<h2>📸 Upload Receipt</h2>
|
||||
<div class="upload-area" id="receiptUploadArea">
|
||||
<div class="upload-icon">🧾</div>
|
||||
<div class="upload-text">Click to upload or drag and drop</div>
|
||||
<div class="upload-hint">Supports JPG, PNG (max 10MB)</div>
|
||||
</div>
|
||||
<input type="file" id="receiptInput" accept="image/*">
|
||||
|
||||
<div class="loading" id="receiptLoading">
|
||||
<div class="spinner"></div>
|
||||
<p>Processing receipt...</p>
|
||||
</div>
|
||||
|
||||
<div class="results" id="receiptResults"></div>
|
||||
</div>
|
||||
|
||||
<div class="card">
|
||||
<h2>📋 Recent Receipts</h2>
|
||||
<div id="receiptStats">
|
||||
<p style="text-align: center; color: #666;">No receipts yet. Upload one above!</p>
|
||||
</div>
|
||||
|
||||
<div style="margin-top: 20px;">
|
||||
<button class="button" onclick="exportReceiptCSV()">📊 Download CSV</button>
|
||||
<button class="button" onclick="exportReceiptExcel()">📈 Download Excel</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script>
    const API_BASE = '/api/v1';
    let currentInventory = [];

    // Tab switching
    function switchTab(tab) {
        document.querySelectorAll('.tab').forEach(t => t.classList.remove('active'));
        document.querySelectorAll('.tab-content').forEach(c => c.classList.remove('active'));

        if (tab === 'inventory') {
            document.querySelector('.tab:nth-child(1)').classList.add('active');
            document.getElementById('inventoryTab').classList.add('active');
            loadInventoryData();

            // Auto-focus scanner gun input for quick scanning
            setTimeout(() => {
                document.getElementById('scannerGunInput').focus();
            }, 100);
        } else {
            document.querySelector('.tab:nth-child(2)').classList.add('active');
            document.getElementById('receiptsTab').classList.add('active');
            loadReceiptData();
        }
    }

    // Scanner gun (text input)
    const scannerGunInput = document.getElementById('scannerGunInput');

    // Submit the buffered barcode when the scanner gun sends Enter
    scannerGunInput.addEventListener('keypress', async (e) => {
        if (e.key === 'Enter') {
            e.preventDefault();
            const barcode = scannerGunInput.value.trim();

            if (!barcode) return;

            await handleScannerGunInput(barcode);
            scannerGunInput.value = ''; // Clear for next scan
            scannerGunInput.focus(); // Re-focus for next scan
        }
    });

    async function handleScannerGunInput(barcode) {
        const location = document.getElementById('scannerLocation').value;
        const quantity = parseFloat(document.getElementById('scannerQuantity').value);

        showLoading('scanner', true);
        showResults('scanner', false);

        try {
            const response = await fetch(`${API_BASE}/inventory/scan/text`, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify({
                    barcode,
                    location,
                    quantity,
                    auto_add_to_inventory: true
                })
            });

            const data = await response.json();

            if (data.success && data.barcodes_found > 0) {
                const result = data.results[0];
                showResult('scanner', 'success',
                    `✓ Added: ${result.product.name}${result.product.brand ? ' (' + result.product.brand + ')' : ''} to ${location}`
                );
                loadInventoryData();

                // Beep sound (optional - browser may block)
                try {
                    const audioContext = new (window.AudioContext || window.webkitAudioContext)();
                    const oscillator = audioContext.createOscillator();
                    const gainNode = audioContext.createGain();
                    oscillator.connect(gainNode);
                    gainNode.connect(audioContext.destination);
                    oscillator.frequency.value = 800;
                    oscillator.type = 'sine';
                    gainNode.gain.setValueAtTime(0.3, audioContext.currentTime);
                    oscillator.start(audioContext.currentTime);
                    oscillator.stop(audioContext.currentTime + 0.1);
                } catch (e) {
                    // Audio failed, ignore
                }
            } else {
                showResult('scanner', 'error', data.message || 'Barcode not found');
            }
        } catch (error) {
            showResult('scanner', 'error', `Error: ${error.message}`);
        } finally {
            showLoading('scanner', false);
        }
    }

    // Barcode scanning (image)
    const barcodeUploadArea = document.getElementById('barcodeUploadArea');
    const barcodeInput = document.getElementById('barcodeInput');

    barcodeUploadArea.addEventListener('click', () => barcodeInput.click());
    barcodeInput.addEventListener('change', handleBarcodeScan);

    async function handleBarcodeScan(e) {
        const file = e.target.files[0];
        if (!file) return;

        const location = document.getElementById('barcodeLocation').value;
        const quantity = parseFloat(document.getElementById('barcodeQuantity').value);

        showLoading('barcode', true);
        showResults('barcode', false);

        const formData = new FormData();
        formData.append('file', file);
        formData.append('location', location);
        formData.append('quantity', quantity);
        formData.append('auto_add_to_inventory', 'true');

        try {
            const response = await fetch(`${API_BASE}/inventory/scan`, {
                method: 'POST',
                body: formData
            });

            const data = await response.json();

            if (data.success && data.barcodes_found > 0) {
                const result = data.results[0];
                showResult('barcode', 'success',
                    `✓ Found: ${result.product.name}${result.product.brand ? ' (' + result.product.brand + ')' : ''}`
                );
                loadInventoryData();
            } else {
                showResult('barcode', 'error', 'No barcode found in image');
            }
        } catch (error) {
            showResult('barcode', 'error', `Error: ${error.message}`);
        } finally {
            showLoading('barcode', false);
            barcodeInput.value = '';
        }
    }

    // Manual add
    async function addManualItem() {
        const name = document.getElementById('itemName').value;
        const brand = document.getElementById('itemBrand').value;
        const quantity = parseFloat(document.getElementById('itemQuantity').value);
        const unit = document.getElementById('itemUnit').value;
        const location = document.getElementById('itemLocation').value;
        const expiration = document.getElementById('itemExpiration').value;

        if (!name || !quantity || !location) {
            alert('Please fill in required fields');
            return;
        }

        try {
            // First, create product
            const productResp = await fetch(`${API_BASE}/inventory/products`, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify({
                    name,
                    brand: brand || null,
                    source: 'manual'
                })
            });

            const product = await productResp.json();

            // Then, add to inventory
            const itemResp = await fetch(`${API_BASE}/inventory/items`, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify({
                    product_id: product.id,
                    quantity,
                    unit,
                    location,
                    expiration_date: expiration || null,
                    source: 'manual'
                })
            });

            if (itemResp.ok) {
                alert('✓ Item added to inventory!');
                // Clear form
                document.getElementById('itemName').value = '';
                document.getElementById('itemBrand').value = '';
                document.getElementById('itemQuantity').value = '1';
                document.getElementById('itemExpiration').value = '';
                loadInventoryData();
            } else {
                alert('Failed to add item');
            }
        } catch (error) {
            alert(`Error: ${error.message}`);
        }
    }

    // Load inventory data
    async function loadInventoryData() {
        try {
            // Load stats
            const statsResp = await fetch(`${API_BASE}/inventory/stats`);
            const stats = await statsResp.json();

            document.getElementById('totalItems').textContent = stats.total_items;
            document.getElementById('totalProducts').textContent = stats.total_products;
            document.getElementById('expiringSoon').textContent = stats.expiring_soon;
            document.getElementById('expired').textContent = stats.expired;

            // Load inventory items
            const itemsResp = await fetch(`${API_BASE}/inventory/items?limit=100`);
            const items = await itemsResp.json();
            currentInventory = items;

            displayInventory(items);
        } catch (error) {
            console.error('Failed to load inventory:', error);
        }
    }

    function displayInventory(items) {
        const list = document.getElementById('inventoryList');

        if (items.length === 0) {
            list.innerHTML = '<p style="text-align: center; color: #666;">No items yet. Scan a barcode or add manually!</p>';
            return;
        }

        list.innerHTML = items.map(item => {
            const product = item.product;
            let expiryInfo = '';

            if (item.expiration_date) {
                const expiry = new Date(item.expiration_date);
                const today = new Date();
                const daysUntil = Math.ceil((expiry - today) / (1000 * 60 * 60 * 24));

                if (daysUntil < 0) {
                    expiryInfo = `<span class="expiry-warning">Expired ${Math.abs(daysUntil)} days ago</span>`;
                } else if (daysUntil <= 7) {
                    expiryInfo = `<span class="expiry-soon">Expires in ${daysUntil} days</span>`;
                } else {
                    expiryInfo = `Expires ${expiry.toLocaleDateString()}`;
                }
            }

            const tags = product.tags ? product.tags.map(tag =>
                `<span class="tag" style="background: ${tag.color || '#667eea'}">${tag.name}</span>`
            ).join('') : '';

            return `
                <div class="inventory-item">
                    <div class="item-info">
                        <div class="item-name">${product.name}${product.brand ? ` - ${product.brand}` : ''}</div>
                        <div class="item-details">
                            ${item.quantity} ${item.unit} • ${item.location}${expiryInfo ? ' • ' + expiryInfo : ''}
                        </div>
                        ${tags ? `<div class="item-tags">${tags}</div>` : ''}
                    </div>
                    <button class="button button-small" onclick="markAsConsumed('${item.id}')">✓ Consumed</button>
                </div>
            `;
        }).join('');
    }

    async function markAsConsumed(itemId) {
        if (!confirm('Mark this item as consumed?')) return;

        try {
            await fetch(`${API_BASE}/inventory/items/${itemId}/consume`, { method: 'POST' });
            loadInventoryData();
        } catch (error) {
            alert(`Error: ${error.message}`);
        }
    }

    // Receipt handling
    const receiptUploadArea = document.getElementById('receiptUploadArea');
    const receiptInput = document.getElementById('receiptInput');

    receiptUploadArea.addEventListener('click', () => receiptInput.click());
    receiptInput.addEventListener('change', handleReceiptUpload);

    async function handleReceiptUpload(e) {
        const file = e.target.files[0];
        if (!file) return;

        showLoading('receipt', true);
        showResults('receipt', false);

        const formData = new FormData();
        formData.append('file', file);

        try {
            const response = await fetch(`${API_BASE}/receipts/`, {
                method: 'POST',
                body: formData
            });

            const data = await response.json();

            if (response.ok) {
                showResult('receipt', 'success', `Receipt uploaded! ID: ${data.id}`);
                showResult('receipt', 'info', 'Processing in background...');
                setTimeout(loadReceiptData, 3000);
            } else {
                showResult('receipt', 'error', `Upload failed: ${data.detail}`);
            }
        } catch (error) {
            showResult('receipt', 'error', `Error: ${error.message}`);
        } finally {
            showLoading('receipt', false);
            receiptInput.value = '';
        }
    }

    async function loadReceiptData() {
        try {
            const response = await fetch(`${API_BASE}/export/stats`);
            const stats = await response.json();

            const statsDiv = document.getElementById('receiptStats');
            if (stats.total_receipts === 0) {
                statsDiv.innerHTML = '<p style="text-align: center; color: #666;">No receipts yet. Upload one above!</p>';
            } else {
                statsDiv.innerHTML = `
                    <div class="stats-grid">
                        <div class="stat-card">
                            <div class="stat-value">${stats.total_receipts}</div>
                            <div class="stat-label">Total Receipts</div>
                        </div>
                        <div class="stat-card">
                            <div class="stat-value">${stats.average_quality_score.toFixed(1)}</div>
                            <div class="stat-label">Avg Quality Score</div>
                        </div>
                        <div class="stat-card">
                            <div class="stat-value">${stats.acceptable_quality_count}</div>
                            <div class="stat-label">Good Quality</div>
                        </div>
                    </div>
                `;
            }
        } catch (error) {
            console.error('Failed to load receipt data:', error);
        }
    }

    // Export functions
    function exportInventoryCSV() {
        window.open(`${API_BASE}/export/inventory/csv`, '_blank');
    }

    function exportInventoryExcel() {
        window.open(`${API_BASE}/export/inventory/excel`, '_blank');
    }

    function exportReceiptCSV() {
        window.open(`${API_BASE}/export/csv`, '_blank');
    }

    function exportReceiptExcel() {
        window.open(`${API_BASE}/export/excel`, '_blank');
    }

    // Utility functions
    function showLoading(type, show) {
        document.getElementById(`${type}Loading`).style.display = show ? 'block' : 'none';
    }

    function showResults(type, show) {
        const results = document.getElementById(`${type}Results`);
        if (!show) {
            results.innerHTML = '';
        }
        results.style.display = show ? 'block' : 'none';
    }

    function showResult(type, resultType, message) {
        const results = document.getElementById(`${type}Results`);
        results.style.display = 'block';

        const div = document.createElement('div');
        div.className = `result-item result-${resultType}`;
        div.textContent = message;
        results.appendChild(div);

        setTimeout(() => div.remove(), 5000);
    }

    // Load initial data
    loadInventoryData();
    setInterval(loadInventoryData, 30000); // Refresh every 30 seconds
</script>
</body>
</html>
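The scanner flow above assumes a JSON contract from `POST /api/v1/inventory/scan/text`: `handleScannerGunInput()` reads `success`, `barcodes_found`, `results[0].product.{name,brand}`, and `message` on failure. A minimal sketch of that contract as a pure function; the lookup table is hypothetical, and the real FastAPI backend resolves barcodes against its own database:

```javascript
// Hypothetical in-memory product lookup, standing in for the backend's
// barcode resolution. The returned shape matches exactly what the
// frontend's handleScannerGunInput() reads.
const PRODUCTS = { "0123456789012": { name: "Oat Milk", brand: "Acme" } };

function scanText(barcode, location, quantity) {
  const product = PRODUCTS[barcode];
  if (!product) {
    // Frontend falls back to data.message for the error toast
    return { success: false, barcodes_found: 0, message: "Barcode not found" };
  }
  return {
    success: true,
    barcodes_found: 1,
    results: [{ product, location, quantity }]
  };
}
```

The frontend only inspects `results[0]`, so a single-barcode response suffices for the scanner-gun path.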
459
app/static/upload.html
Normal file
@@ -0,0 +1,459 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Project Thoth - Receipt Upload</title>
    <style>
        * {
            margin: 0;
            padding: 0;
            box-sizing: border-box;
        }

        body {
            font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, sans-serif;
            background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
            min-height: 100vh;
            padding: 20px;
        }

        .container {
            max-width: 800px;
            margin: 0 auto;
        }

        .header {
            text-align: center;
            color: white;
            margin-bottom: 40px;
        }

        .header h1 {
            font-size: 2.5em;
            margin-bottom: 10px;
        }

        .header p {
            font-size: 1.2em;
            opacity: 0.9;
        }

        .card {
            background: white;
            border-radius: 12px;
            padding: 30px;
            box-shadow: 0 10px 40px rgba(0,0,0,0.2);
            margin-bottom: 20px;
        }

        .upload-area {
            border: 3px dashed #667eea;
            border-radius: 8px;
            padding: 40px;
            text-align: center;
            cursor: pointer;
            transition: all 0.3s;
            background: #f7f9fc;
        }

        .upload-area:hover {
            border-color: #764ba2;
            background: #eef2f7;
        }

        .upload-area.dragover {
            border-color: #764ba2;
            background: #e0e7ff;
        }

        .upload-icon {
            font-size: 48px;
            margin-bottom: 20px;
        }

        .upload-text {
            font-size: 18px;
            color: #333;
            margin-bottom: 10px;
        }

        .upload-hint {
            font-size: 14px;
            color: #666;
        }

        #fileInput {
            display: none;
        }

        .preview-area {
            margin-top: 20px;
            display: none;
        }

        .preview-image {
            max-width: 100%;
            border-radius: 8px;
            margin-bottom: 20px;
        }

        .button {
            background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
            color: white;
            border: none;
            padding: 12px 30px;
            font-size: 16px;
            border-radius: 6px;
            cursor: pointer;
            transition: transform 0.2s;
            margin-right: 10px;
        }

        .button:hover {
            transform: translateY(-2px);
        }

        .button:disabled {
            opacity: 0.5;
            cursor: not-allowed;
            transform: none;
        }

        .button-secondary {
            background: #6c757d;
        }

        .results {
            margin-top: 20px;
            display: none;
        }

        .result-item {
            padding: 15px;
            border-radius: 6px;
            margin-bottom: 10px;
        }

        .result-success {
            background: #d4edda;
            color: #155724;
            border: 1px solid #c3e6cb;
        }

        .result-error {
            background: #f8d7da;
            color: #721c24;
            border: 1px solid #f5c6cb;
        }

        .result-info {
            background: #d1ecf1;
            color: #0c5460;
            border: 1px solid #bee5eb;
        }

        .loading {
            text-align: center;
            padding: 20px;
            display: none;
        }

        .spinner {
            border: 4px solid #f3f3f3;
            border-top: 4px solid #667eea;
            border-radius: 50%;
            width: 40px;
            height: 40px;
            animation: spin 1s linear infinite;
            margin: 0 auto 10px;
        }

        @keyframes spin {
            0% { transform: rotate(0deg); }
            100% { transform: rotate(360deg); }
        }

        .receipt-list {
            margin-top: 20px;
        }

        .receipt-card {
            background: #f7f9fc;
            padding: 15px;
            border-radius: 6px;
            margin-bottom: 10px;
            display: flex;
            justify-content: space-between;
            align-items: center;
        }

        .receipt-info {
            flex: 1;
        }

        .receipt-id {
            font-family: monospace;
            font-size: 12px;
            color: #666;
        }

        .receipt-status {
            display: inline-block;
            padding: 4px 12px;
            border-radius: 12px;
            font-size: 12px;
            font-weight: 600;
            margin-left: 10px;
        }

        .status-processing {
            background: #fff3cd;
            color: #856404;
        }

        .status-processed {
            background: #d4edda;
            color: #155724;
        }

        .status-error {
            background: #f8d7da;
            color: #721c24;
        }

        .quality-score {
            font-size: 24px;
            font-weight: bold;
            color: #667eea;
        }

        .actions {
            margin-top: 20px;
            text-align: center;
        }

        .export-section {
            margin-top: 30px;
            text-align: center;
        }

        .export-section h3 {
            margin-bottom: 15px;
            color: #333;
        }
    </style>
</head>
<body>
    <div class="container">
        <div class="header">
            <h1>📄 Project Thoth</h1>
            <p>Receipt Processing System</p>
        </div>

        <div class="card">
            <h2>Upload Receipt</h2>
            <div class="upload-area" id="uploadArea">
                <div class="upload-icon">📸</div>
                <div class="upload-text">Click to upload or drag and drop</div>
                <div class="upload-hint">Supports JPG, PNG (max 10MB)</div>
            </div>
            <input type="file" id="fileInput" accept="image/*">

            <div class="preview-area" id="previewArea">
                <img id="previewImage" class="preview-image">
                <div>
                    <button class="button" id="uploadBtn">Upload Receipt</button>
                    <button class="button button-secondary" id="cancelBtn">Cancel</button>
                </div>
            </div>

            <div class="loading" id="loading">
                <div class="spinner"></div>
                <p>Processing receipt...</p>
            </div>

            <div class="results" id="results"></div>
        </div>

        <div class="card">
            <h2>Recent Receipts</h2>
            <div class="receipt-list" id="receiptList">
                <p style="text-align: center; color: #666;">No receipts yet. Upload one above!</p>
            </div>

            <div class="export-section">
                <h3>Export Data</h3>
                <button class="button" onclick="exportCSV()">📊 Download CSV</button>
                <button class="button" onclick="exportExcel()">📈 Download Excel</button>
            </div>
        </div>
    </div>

    <script>
        // Use relative URL so it works from any host (localhost or remote IP)
        const API_BASE = '/api/v1';
        const uploadArea = document.getElementById('uploadArea');
        const fileInput = document.getElementById('fileInput');
        const previewArea = document.getElementById('previewArea');
        const previewImage = document.getElementById('previewImage');
        const uploadBtn = document.getElementById('uploadBtn');
        const cancelBtn = document.getElementById('cancelBtn');
        const loading = document.getElementById('loading');
        const results = document.getElementById('results');
        const receiptList = document.getElementById('receiptList');

        let selectedFile = null;
        let receipts = [];

        // Click to upload
        uploadArea.addEventListener('click', () => fileInput.click());

        // Drag and drop
        uploadArea.addEventListener('dragover', (e) => {
            e.preventDefault();
            uploadArea.classList.add('dragover');
        });

        uploadArea.addEventListener('dragleave', () => {
            uploadArea.classList.remove('dragover');
        });

        uploadArea.addEventListener('drop', (e) => {
            e.preventDefault();
            uploadArea.classList.remove('dragover');
            const files = e.dataTransfer.files;
            if (files.length > 0) {
                handleFileSelect(files[0]);
            }
        });

        // File input change
        fileInput.addEventListener('change', (e) => {
            if (e.target.files.length > 0) {
                handleFileSelect(e.target.files[0]);
            }
        });

        function handleFileSelect(file) {
            if (!file.type.startsWith('image/')) {
                showResult('error', 'Please select an image file');
                return;
            }

            selectedFile = file;
            const reader = new FileReader();
            reader.onload = (e) => {
                previewImage.src = e.target.result;
                previewArea.style.display = 'block';
                uploadArea.style.display = 'none';
            };
            reader.readAsDataURL(file);
        }

        cancelBtn.addEventListener('click', () => {
            selectedFile = null;
            previewArea.style.display = 'none';
            uploadArea.style.display = 'block';
            fileInput.value = '';
        });

        uploadBtn.addEventListener('click', uploadReceipt);

        async function uploadReceipt() {
            if (!selectedFile) return;

            loading.style.display = 'block';
            results.style.display = 'none';
            uploadBtn.disabled = true;

            const formData = new FormData();
            formData.append('file', selectedFile);

            try {
                const response = await fetch(`${API_BASE}/receipts/`, {
                    method: 'POST',
                    body: formData
                });

                const data = await response.json();

                if (response.ok) {
                    showResult('success', `Receipt uploaded! ID: ${data.id}`);
                    showResult('info', 'Processing in background... Refresh in a few seconds to see results.');

                    // Reset form
                    selectedFile = null;
                    previewArea.style.display = 'none';
                    uploadArea.style.display = 'block';
                    fileInput.value = '';

                    // Refresh list after a delay
                    setTimeout(loadReceipts, 3000);
                } else {
                    showResult('error', `Upload failed: ${data.detail || 'Unknown error'}`);
                }
            } catch (error) {
                showResult('error', `Network error: ${error.message}`);
            } finally {
                loading.style.display = 'none';
                uploadBtn.disabled = false;
            }
        }

        function showResult(type, message) {
            results.style.display = 'block';
            const div = document.createElement('div');
            div.className = `result-item result-${type}`;
            div.textContent = message;
            results.appendChild(div);

            // Auto-hide after 5 seconds
            setTimeout(() => div.remove(), 5000);
        }

        async function loadReceipts() {
            try {
                const response = await fetch(`${API_BASE}/export/stats`);
                const stats = await response.json();

                if (stats.total_receipts === 0) {
                    receiptList.innerHTML = '<p style="text-align: center; color: #666;">No receipts yet. Upload one above!</p>';
                    return;
                }

                // For now, just show stats since we don't have a list endpoint
                // In Phase 2, we'll add a proper list endpoint
                receiptList.innerHTML = `
                    <div class="receipt-card">
                        <div class="receipt-info">
                            <strong>Total Receipts:</strong> ${stats.total_receipts}<br>
                            <strong>Average Quality:</strong> ${stats.average_quality_score}/100<br>
                            <strong>Acceptable Quality:</strong> ${stats.acceptable_quality_count}
                        </div>
                    </div>
                    <p style="text-align: center; color: #666; margin-top: 10px;">
                        Click "Download Excel" below to see all receipts with details!
                    </p>
                `;
            } catch (error) {
                console.error('Failed to load receipts:', error);
            }
        }

        async function exportCSV() {
            window.open(`${API_BASE}/export/csv`, '_blank');
        }

        async function exportExcel() {
            window.open(`${API_BASE}/export/excel`, '_blank');
        }

        // Load receipts on page load
        loadReceipts();

        // Auto-refresh every 10 seconds
        setInterval(loadReceipts, 10000);
    </script>
</body>
</html>
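Both pages poll `GET /api/v1/export/stats` and read the same three fields: `total_receipts`, `average_quality_score`, and `acceptable_quality_count`. A small sketch of that contract as a formatting helper (the function name and one-line summary format are illustrative, not part of the app):

```javascript
// Hypothetical helper summarizing the /export/stats payload in one line.
// Field names are exactly those the pages' loadReceipts()/loadReceiptData()
// read; everything else here is illustrative.
function renderStatsSummary(stats) {
  if (stats.total_receipts === 0) {
    return 'No receipts yet. Upload one above!';
  }
  return `${stats.total_receipts} receipts, ` +
         `avg quality ${stats.average_quality_score.toFixed(1)}/100, ` +
         `${stats.acceptable_quality_count} acceptable`;
}
```

Note the two pages format `average_quality_score` differently (one calls `toFixed(1)`, the other interpolates the raw number); normalizing in one helper like this keeps the display consistent.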
459
app/static/upload.html.backup
Normal file
459
app/static/upload.html.backup
Normal file
|
|
@ -0,0 +1,459 @@
|
|||
<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>Project Thoth - Receipt Upload</title>
|
||||
<style>
|
||||
* {
|
||||
margin: 0;
|
||||
padding: 0;
|
||||
box-sizing: border-box;
|
||||
}
|
||||
|
||||
body {
|
||||
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, sans-serif;
|
||||
background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
|
||||
min-height: 100vh;
|
||||
padding: 20px;
|
||||
}
|
||||
|
||||
.container {
|
||||
max-width: 800px;
|
||||
margin: 0 auto;
|
||||
}
|
||||
|
||||
.header {
|
||||
text-align: center;
|
||||
color: white;
|
||||
margin-bottom: 40px;
|
||||
}
|
||||
|
||||
.header h1 {
|
||||
font-size: 2.5em;
|
||||
margin-bottom: 10px;
|
||||
}
|
||||
|
||||
.header p {
|
||||
font-size: 1.2em;
|
||||
opacity: 0.9;
|
||||
}
|
||||
|
||||
.card {
|
||||
background: white;
|
||||
border-radius: 12px;
|
||||
padding: 30px;
|
||||
box-shadow: 0 10px 40px rgba(0,0,0,0.2);
|
||||
margin-bottom: 20px;
|
||||
}
|
||||
|
||||
.upload-area {
|
||||
border: 3px dashed #667eea;
|
||||
border-radius: 8px;
|
||||
padding: 40px;
|
||||
text-align: center;
|
||||
cursor: pointer;
|
||||
transition: all 0.3s;
|
||||
background: #f7f9fc;
|
||||
}
|
||||
|
||||
.upload-area:hover {
|
||||
border-color: #764ba2;
|
||||
background: #eef2f7;
|
||||
}
|
||||
|
||||
.upload-area.dragover {
|
||||
border-color: #764ba2;
|
||||
background: #e0e7ff;
|
||||
}
|
||||
|
||||
.upload-icon {
|
||||
font-size: 48px;
|
||||
margin-bottom: 20px;
|
||||
}
|
||||
|
||||
.upload-text {
|
||||
font-size: 18px;
|
||||
color: #333;
|
||||
margin-bottom: 10px;
|
||||
}
|
||||
|
||||
.upload-hint {
|
||||
font-size: 14px;
|
||||
color: #666;
|
||||
}
|
||||
|
||||
#fileInput {
|
||||
display: none;
|
||||
}
|
||||
|
||||
.preview-area {
|
||||
margin-top: 20px;
|
||||
display: none;
|
||||
}
|
||||
|
||||
.preview-image {
|
||||
max-width: 100%;
|
||||
border-radius: 8px;
|
||||
margin-bottom: 20px;
|
||||
}
|
||||
|
||||
.button {
|
||||
background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
|
||||
color: white;
|
||||
border: none;
|
||||
padding: 12px 30px;
|
||||
font-size: 16px;
|
||||
border-radius: 6px;
|
||||
cursor: pointer;
|
||||
transition: transform 0.2s;
|
||||
margin-right: 10px;
|
||||
}
|
||||
|
||||
.button:hover {
|
||||
transform: translateY(-2px);
|
||||
}
|
||||
|
||||
.button:disabled {
|
||||
opacity: 0.5;
|
||||
cursor: not-allowed;
|
||||
transform: none;
|
||||
}
|
||||
|
||||
.button-secondary {
|
||||
background: #6c757d;
|
||||
}
|
||||
|
||||
.results {
|
||||
margin-top: 20px;
|
||||
display: none;
|
||||
}
|
||||
|
||||
.result-item {
|
||||
padding: 15px;
|
||||
border-radius: 6px;
|
||||
margin-bottom: 10px;
|
||||
}
|
||||
|
||||
.result-success {
|
||||
background: #d4edda;
|
||||
color: #155724;
|
||||
border: 1px solid #c3e6cb;
|
||||
}
|
||||
|
||||
.result-error {
|
||||
background: #f8d7da;
|
||||
color: #721c24;
|
||||
border: 1px solid #f5c6cb;
|
||||
}
|
||||
|
||||
.result-info {
|
||||
background: #d1ecf1;
|
||||
color: #0c5460;
|
||||
border: 1px solid #bee5eb;
|
||||
}
|
||||
|
||||
.loading {
|
||||
text-align: center;
|
||||
padding: 20px;
|
||||
display: none;
|
||||
}
|
||||
|
||||
.spinner {
|
||||
border: 4px solid #f3f3f3;
|
||||
border-top: 4px solid #667eea;
|
||||
border-radius: 50%;
|
||||
width: 40px;
|
||||
height: 40px;
|
||||
animation: spin 1s linear infinite;
|
||||
margin: 0 auto 10px;
|
||||
}
|
||||
|
||||
@keyframes spin {
|
||||
0% { transform: rotate(0deg); }
|
||||
100% { transform: rotate(360deg); }
|
||||
}
|
||||
|
||||
.receipt-list {
|
||||
margin-top: 20px;
|
||||
}
|
||||
|
||||
.receipt-card {
|
||||
background: #f7f9fc;
|
||||
padding: 15px;
|
||||
border-radius: 6px;
|
||||
margin-bottom: 10px;
|
||||
display: flex;
|
||||
justify-content: space-between;
|
||||
align-items: center;
|
||||
}
|
||||
|
||||
.receipt-info {
|
||||
flex: 1;
|
||||
}
|
||||
|
||||
.receipt-id {
|
||||
font-family: monospace;
|
||||
font-size: 12px;
|
||||
color: #666;
|
||||
}
|
||||
|
||||
.receipt-status {
|
||||
display: inline-block;
|
||||
padding: 4px 12px;
|
||||
border-radius: 12px;
|
||||
font-size: 12px;
|
||||
font-weight: 600;
|
||||
margin-left: 10px;
|
||||
}
|
||||
|
||||
.status-processing {
|
||||
background: #fff3cd;
|
||||
color: #856404;
|
||||
}
|
||||
|
||||
.status-processed {
|
||||
background: #d4edda;
|
||||
color: #155724;
|
||||
}
|
||||
|
||||
.status-error {
|
||||
background: #f8d7da;
|
||||
color: #721c24;
|
||||
}
|
||||
|
||||
.quality-score {
|
||||
font-size: 24px;
|
||||
font-weight: bold;
|
||||
color: #667eea;
|
||||
}
|
||||
|
||||
.actions {
|
||||
margin-top: 20px;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.export-section {
|
||||
margin-top: 30px;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.export-section h3 {
|
||||
margin-bottom: 15px;
|
||||
color: #333;
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<div class="container">
|
||||
<div class="header">
|
||||
<h1>📄 Project Thoth</h1>
|
||||
<p>Receipt Processing System</p>
|
||||
</div>
|
||||
|
||||
<div class="card">
|
||||
<h2>Upload Receipt</h2>
|
||||
<div class="upload-area" id="uploadArea">
|
||||
<div class="upload-icon">📸</div>
|
||||
<div class="upload-text">Click to upload or drag and drop</div>
|
||||
<div class="upload-hint">Supports JPG, PNG (max 10MB)</div>
|
||||
</div>
|
||||
<input type="file" id="fileInput" accept="image/*">
|
||||
|
||||
<div class="preview-area" id="previewArea">
|
||||
<img id="previewImage" class="preview-image">
|
||||
<div>
|
||||
<button class="button" id="uploadBtn">Upload Receipt</button>
|
||||
<button class="button button-secondary" id="cancelBtn">Cancel</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="loading" id="loading">
|
||||
<div class="spinner"></div>
|
||||
<p>Processing receipt...</p>
|
||||
</div>
|
||||
|
||||
<div class="results" id="results"></div>
|
||||
</div>
|
||||
|
||||
<div class="card">
|
||||
<h2>Recent Receipts</h2>
|
||||
<div class="receipt-list" id="receiptList">
|
||||
<p style="text-align: center; color: #666;">No receipts yet. Upload one above!</p>
|
||||
</div>
|
||||
|
||||
<div class="export-section">
|
||||
<h3>Export Data</h3>
|
||||
<button class="button" onclick="exportCSV()">📊 Download CSV</button>
|
||||
<button class="button" onclick="exportExcel()">📈 Download Excel</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
    <script>
        // Use relative URL so it works from any host (localhost or remote IP)
        const API_BASE = '/api/v1';
        const uploadArea = document.getElementById('uploadArea');
        const fileInput = document.getElementById('fileInput');
        const previewArea = document.getElementById('previewArea');
        const previewImage = document.getElementById('previewImage');
        const uploadBtn = document.getElementById('uploadBtn');
        const cancelBtn = document.getElementById('cancelBtn');
        const loading = document.getElementById('loading');
        const results = document.getElementById('results');
        const receiptList = document.getElementById('receiptList');

        let selectedFile = null;
        let receipts = [];

        // Click to upload
        uploadArea.addEventListener('click', () => fileInput.click());

        // Drag and drop
        uploadArea.addEventListener('dragover', (e) => {
            e.preventDefault();
            uploadArea.classList.add('dragover');
        });

        uploadArea.addEventListener('dragleave', () => {
            uploadArea.classList.remove('dragover');
        });

        uploadArea.addEventListener('drop', (e) => {
            e.preventDefault();
            uploadArea.classList.remove('dragover');
            const files = e.dataTransfer.files;
            if (files.length > 0) {
                handleFileSelect(files[0]);
            }
        });

        // File input change
        fileInput.addEventListener('change', (e) => {
            if (e.target.files.length > 0) {
                handleFileSelect(e.target.files[0]);
            }
        });

        function handleFileSelect(file) {
            if (!file.type.startsWith('image/')) {
                showResult('error', 'Please select an image file');
                return;
            }

            selectedFile = file;
            const reader = new FileReader();
            reader.onload = (e) => {
                previewImage.src = e.target.result;
                previewArea.style.display = 'block';
                uploadArea.style.display = 'none';
            };
            reader.readAsDataURL(file);
        }

        cancelBtn.addEventListener('click', () => {
            selectedFile = null;
            previewArea.style.display = 'none';
            uploadArea.style.display = 'block';
            fileInput.value = '';
        });

        uploadBtn.addEventListener('click', uploadReceipt);

        async function uploadReceipt() {
            if (!selectedFile) return;

            loading.style.display = 'block';
            results.style.display = 'none';
            uploadBtn.disabled = true;

            const formData = new FormData();
            formData.append('file', selectedFile);

            try {
                const response = await fetch(`${API_BASE}/receipts/`, {
                    method: 'POST',
                    body: formData
                });

                const data = await response.json();

                if (response.ok) {
                    showResult('success', `Receipt uploaded! ID: ${data.id}`);
                    showResult('info', 'Processing in background... Refresh in a few seconds to see results.');

                    // Reset form
                    selectedFile = null;
                    previewArea.style.display = 'none';
                    uploadArea.style.display = 'block';
                    fileInput.value = '';

                    // Refresh list after a delay
                    setTimeout(loadReceipts, 3000);
                } else {
                    showResult('error', `Upload failed: ${data.detail || 'Unknown error'}`);
                }
            } catch (error) {
                showResult('error', `Network error: ${error.message}`);
            } finally {
                loading.style.display = 'none';
                uploadBtn.disabled = false;
            }
        }

        function showResult(type, message) {
            results.style.display = 'block';
            const div = document.createElement('div');
            div.className = `result-item result-${type}`;
            div.textContent = message;
            results.appendChild(div);

            // Auto-hide after 5 seconds
            setTimeout(() => div.remove(), 5000);
        }

        async function loadReceipts() {
            try {
                const response = await fetch(`${API_BASE}/export/stats`);
                const stats = await response.json();

                if (stats.total_receipts === 0) {
                    receiptList.innerHTML = '<p style="text-align: center; color: #666;">No receipts yet. Upload one above!</p>';
                    return;
                }

                // For now, just show stats since we don't have a list endpoint
                // In Phase 2, we'll add a proper list endpoint
                receiptList.innerHTML = `
                    <div class="receipt-card">
                        <div class="receipt-info">
                            <strong>Total Receipts:</strong> ${stats.total_receipts}<br>
                            <strong>Average Quality:</strong> ${stats.average_quality_score}/100<br>
                            <strong>Acceptable Quality:</strong> ${stats.acceptable_quality_count}
                        </div>
                    </div>
                    <p style="text-align: center; color: #666; margin-top: 10px;">
                        Click "Download Excel" below to see all receipts with details!
                    </p>
                `;
            } catch (error) {
                console.error('Failed to load receipts:', error);
            }
        }

        function exportCSV() {
            window.open(`${API_BASE}/export/csv`, '_blank');
        }

        function exportExcel() {
            window.open(`${API_BASE}/export/excel`, '_blank');
        }

        // Load receipts on page load
        loadReceipts();

        // Auto-refresh every 10 seconds
        setInterval(loadReceipts, 10000);
    </script>
</body>
</html>
61
app/tiers.py
Normal file
@@ -0,0 +1,61 @@
"""
Kiwi tier gates.

Tiers: free < paid < premium
(Ultra not used in Kiwi — no human-in-the-loop operations.)

Uses circuitforge-core can_use() with Kiwi's feature map.
"""
from __future__ import annotations

from circuitforge_core.tiers.tiers import can_use as _can_use, BYOK_UNLOCKABLE

# Features that unlock when the user supplies their own LLM backend.
KIWI_BYOK_UNLOCKABLE: frozenset[str] = frozenset({
    "recipe_suggestions",
    "expiry_llm_matching",
    "receipt_ocr",
})

# Feature → minimum tier required
KIWI_FEATURES: dict[str, str] = {
    # Free tier
    "inventory_crud": "free",
    "barcode_scan": "free",
    "receipt_upload": "free",
    "expiry_alerts": "free",
    "export_csv": "free",

    # Paid tier
    "receipt_ocr": "paid",          # BYOK-unlockable
    "recipe_suggestions": "paid",   # BYOK-unlockable
    "expiry_llm_matching": "paid",  # BYOK-unlockable
    "meal_planning": "paid",
    "dietary_profiles": "paid",

    # Premium tier
    "multi_household": "premium",
    "background_monitoring": "premium",
    "leftover_mode": "premium",
}


def can_use(feature: str, tier: str, has_byok: bool = False) -> bool:
    """Return True if the given tier can access the feature."""
    return _can_use(
        feature,
        tier,
        has_byok=has_byok,
        _features=KIWI_FEATURES,
    )


def require_feature(feature: str, tier: str, has_byok: bool = False) -> None:
    """Raise ValueError if the tier cannot access the feature."""
    if not can_use(feature, tier, has_byok):
        from circuitforge_core.tiers.tiers import tier_label
        needed = tier_label(feature, has_byok=has_byok, _features=KIWI_FEATURES)
        raise ValueError(
            f"Feature '{feature}' requires {needed} tier. "
            f"Current tier: {tier}."
        )
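The gate above delegates to circuitforge-core, whose implementation is not part of this commit. As a rough illustration of the documented semantics only — the explicit tier ordering and the default-deny for unknown features are assumptions, not cf-core's actual code — the check reduces to:

```python
# Illustrative sketch of the tier-gate rule: free < paid < premium, with
# BYOK unlocking the LLM-backed paid features regardless of tier.
TIER_ORDER = {"free": 0, "paid": 1, "premium": 2}

BYOK_UNLOCKABLE = {"recipe_suggestions", "expiry_llm_matching", "receipt_ocr"}

FEATURES = {
    "inventory_crud": "free",
    "receipt_ocr": "paid",
    "multi_household": "premium",
}

def can_use(feature: str, tier: str, has_byok: bool = False) -> bool:
    required = FEATURES.get(feature)
    if required is None:
        return False  # unknown features denied by default (assumption)
    if has_byok and feature in BYOK_UNLOCKABLE:
        return True
    return TIER_ORDER.get(tier, -1) >= TIER_ORDER[required]

print(can_use("inventory_crud", "free"))              # True
print(can_use("receipt_ocr", "free"))                 # False
print(can_use("receipt_ocr", "free", has_byok=True))  # True
```

Note how BYOK is checked before the tier comparison, matching the "BYOK-unlockable" annotations in the feature map above.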
5
app/utils/__init__.py
Normal file
@@ -0,0 +1,5 @@
# app/utils/__init__.py
"""
Utility functions for Kiwi.
Contains common helpers used throughout the application.
"""
248
app/utils/progress.py
Normal file
@@ -0,0 +1,248 @@
# app/utils/progress.py
import sys
import time
import asyncio
from typing import Optional, Callable, Any
import threading


class ProgressIndicator:
    """
    A simple progress indicator for long-running operations.

    This class provides different styles of progress indicators:
    - dots: Animated dots (. .. ... ....)
    - spinner: Spinning cursor (|/-\\)
    - percentage: Progress percentage [####      ] 40%
    """

    def __init__(self,
                 message: str = "Processing",
                 style: str = "dots",
                 total: Optional[int] = None):
        """
        Initialize the progress indicator.

        Args:
            message: The message to display before the indicator
            style: The indicator style ('dots', 'spinner', or 'percentage')
            total: Total items for percentage style (required for percentage)
        """
        self.message = message
        self.style = style
        self.total = total
        self.current = 0
        self.start_time = None
        self._running = False
        self._thread = None
        self._task = None

        # Validate style
        if style not in ["dots", "spinner", "percentage"]:
            raise ValueError("Style must be 'dots', 'spinner', or 'percentage'")

        # Validate total for percentage style
        if style == "percentage" and total is None:
            raise ValueError("Total must be specified for percentage style")

    def start(self):
        """Start the progress indicator in a separate thread."""
        if self._running:
            return

        self._running = True
        self.start_time = time.time()

        # Start the appropriate indicator
        if self.style == "dots":
            self._thread = threading.Thread(target=self._dots_indicator)
        elif self.style == "spinner":
            self._thread = threading.Thread(target=self._spinner_indicator)
        elif self.style == "percentage":
            self._thread = threading.Thread(target=self._percentage_indicator)

        self._thread.daemon = True
        self._thread.start()

    async def start_async(self):
        """Start the progress indicator as an asyncio task."""
        if self._running:
            return

        self._running = True
        self.start_time = time.time()

        # Start the appropriate indicator
        if self.style == "dots":
            self._task = asyncio.create_task(self._dots_indicator_async())
        elif self.style == "spinner":
            self._task = asyncio.create_task(self._spinner_indicator_async())
        elif self.style == "percentage":
            self._task = asyncio.create_task(self._percentage_indicator_async())

    def update(self, current: int):
        """Update the progress (for percentage style)."""
        self.current = current

    def stop(self):
        """Stop the progress indicator."""
        if not self._running:
            return

        self._running = False

        if self._thread:
            self._thread.join(timeout=1.0)

        # Clear the progress line
        sys.stdout.write("\r" + " " * 80 + "\r")
        sys.stdout.flush()

    async def stop_async(self):
        """Stop the progress indicator (async version)."""
        if not self._running:
            return

        self._running = False

        if self._task:
            self._task.cancel()
            try:
                await self._task
            except asyncio.CancelledError:
                pass

        # Clear the progress line
        sys.stdout.write("\r" + " " * 80 + "\r")
        sys.stdout.flush()

    def _dots_indicator(self):
        """Display an animated dots indicator."""
        i = 0
        while self._running:
            dots = "." * (i % 4 + 1)
            elapsed = time.time() - self.start_time
            sys.stdout.write(f"\r{self.message}{dots:<4} ({elapsed:.1f}s)")
            sys.stdout.flush()
            time.sleep(0.5)
            i += 1

    async def _dots_indicator_async(self):
        """Display an animated dots indicator (async version)."""
        i = 0
        while self._running:
            dots = "." * (i % 4 + 1)
            elapsed = time.time() - self.start_time
            sys.stdout.write(f"\r{self.message}{dots:<4} ({elapsed:.1f}s)")
            sys.stdout.flush()
            await asyncio.sleep(0.5)
            i += 1

    def _spinner_indicator(self):
        """Display a spinning cursor indicator."""
        chars = "|/-\\"
        i = 0
        while self._running:
            char = chars[i % len(chars)]
            elapsed = time.time() - self.start_time
            sys.stdout.write(f"\r{self.message} {char} ({elapsed:.1f}s)")
            sys.stdout.flush()
            time.sleep(0.1)
            i += 1

    async def _spinner_indicator_async(self):
        """Display a spinning cursor indicator (async version)."""
        chars = "|/-\\"
        i = 0
        while self._running:
            char = chars[i % len(chars)]
            elapsed = time.time() - self.start_time
            sys.stdout.write(f"\r{self.message} {char} ({elapsed:.1f}s)")
            sys.stdout.flush()
            await asyncio.sleep(0.1)
            i += 1

    def _percentage_indicator(self):
        """Display a percentage progress bar."""
        while self._running:
            percentage = min(100, int((self.current / self.total) * 100))
            bar_length = 20
            filled_length = int(bar_length * percentage // 100)
            bar = '#' * filled_length + ' ' * (bar_length - filled_length)
            elapsed = time.time() - self.start_time

            # Estimate time remaining if we have progress
            if percentage > 0:
                remaining = elapsed * (100 - percentage) / percentage
                sys.stdout.write(f"\r{self.message} [{bar}] {percentage}% ({elapsed:.1f}s elapsed, ~{remaining:.1f}s remaining)")
            else:
                sys.stdout.write(f"\r{self.message} [{bar}] {percentage}% ({elapsed:.1f}s elapsed)")

            sys.stdout.flush()
            time.sleep(0.2)

    async def _percentage_indicator_async(self):
        """Display a percentage progress bar (async version)."""
        while self._running:
            percentage = min(100, int((self.current / self.total) * 100))
            bar_length = 20
            filled_length = int(bar_length * percentage // 100)
            bar = '#' * filled_length + ' ' * (bar_length - filled_length)
            elapsed = time.time() - self.start_time

            # Estimate time remaining if we have progress
            if percentage > 0:
                remaining = elapsed * (100 - percentage) / percentage
                sys.stdout.write(f"\r{self.message} [{bar}] {percentage}% ({elapsed:.1f}s elapsed, ~{remaining:.1f}s remaining)")
            else:
                sys.stdout.write(f"\r{self.message} [{bar}] {percentage}% ({elapsed:.1f}s elapsed)")

            sys.stdout.flush()
            await asyncio.sleep(0.2)


# Convenience function for running a task with progress indicator
def with_progress(func: Callable, *args, message: str = "Processing", style: str = "dots", **kwargs) -> Any:
    """
    Run a function with a progress indicator.

    Args:
        func: Function to run
        *args: Arguments to pass to the function
        message: Message to display
        style: Progress indicator style
        **kwargs: Keyword arguments to pass to the function

    Returns:
        The result of the function
    """
    progress = ProgressIndicator(message=message, style=style)
    progress.start()

    try:
        result = func(*args, **kwargs)
        return result
    finally:
        progress.stop()


# Async version of with_progress
async def with_progress_async(func: Callable, *args, message: str = "Processing", style: str = "dots", **kwargs) -> Any:
    """
    Run an async function with a progress indicator.

    Args:
        func: Async function to run
        *args: Arguments to pass to the function
        message: Message to display
        style: Progress indicator style
        **kwargs: Keyword arguments to pass to the function

    Returns:
        The result of the function
    """
    progress = ProgressIndicator(message=message, style=style)
    await progress.start_async()

    try:
        result = await func(*args, **kwargs)
        return result
    finally:
        await progress.stop_async()
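The percentage style builds its bar from integer arithmetic on `current`/`total`. Decoupled from the threading and terminal I/O, that rendering step can be sketched standalone (`render_bar` is a hypothetical helper for illustration, not part of the module):

```python
def render_bar(current: int, total: int, bar_length: int = 20) -> str:
    # Mirrors the arithmetic in _percentage_indicator: clamp to 100%,
    # fill cells proportionally, pad the remainder with spaces.
    percentage = min(100, int((current / total) * 100))
    filled = int(bar_length * percentage // 100)
    return f"[{'#' * filled}{' ' * (bar_length - filled)}] {percentage}%"

print(render_bar(40, 100))   # 8 of 20 cells filled, "40%"
print(render_bar(150, 100))  # over-reporting is clamped to 100%
```

The clamp matters: callers may `update()` past `total` (e.g. when the job count is an estimate), and the bar should never overflow its 20 cells.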
185
app/utils/units.py
Normal file
@@ -0,0 +1,185 @@
"""
Unit normalization and conversion for Kiwi inventory.

Source of truth: metric.
- Mass → grams (g)
- Volume → milliliters (ml)
- Count → each (dimensionless)

All inventory quantities are stored in the base metric unit.
Conversion to display units happens at the API/frontend boundary.

Usage:
    from app.utils.units import normalize_to_metric, convert_from_metric

    # Normalise OCR input
    qty, unit = normalize_to_metric(2.0, "lb")    # → (907.184, "g")
    qty, unit = normalize_to_metric(1.0, "gal")   # → (3785.41, "ml")
    qty, unit = normalize_to_metric(3.0, "each")  # → (3.0, "each")

    # Convert for display
    display_qty, display_unit = convert_from_metric(907.184, "g", preferred="imperial")
    # → (2.0, "lb")
"""
from __future__ import annotations

# ── Unit categories ───────────────────────────────────────────────────────────

MASS_UNITS: frozenset[str] = frozenset({"g", "kg", "mg", "lb", "lbs", "oz"})
VOLUME_UNITS: frozenset[str] = frozenset({
    "ml", "l",
    "fl oz", "floz", "fluid oz", "fluid ounce", "fluid ounces",
    "cup", "cups", "pt", "pint", "pints",
    "qt", "quart", "quarts", "gal", "gallon", "gallons",
})
COUNT_UNITS: frozenset[str] = frozenset({
    "each", "ea", "pc", "pcs", "piece", "pieces",
    "ct", "count", "item", "items",
    "pk", "pack", "packs", "bag", "bags",
    "bunch", "bunches", "head", "heads",
    "can", "cans", "bottle", "bottles", "box", "boxes",
    "jar", "jars", "tube", "tubes", "roll", "rolls",
    "loaf", "loaves", "dozen",
})

# ── Conversion factors to base metric unit ────────────────────────────────────
# All values are: 1 <unit> = N <base_unit>

# Mass → grams
_TO_GRAMS: dict[str, float] = {
    "g": 1.0,
    "mg": 0.001,
    "kg": 1_000.0,
    "oz": 28.3495,
    "lb": 453.592,
    "lbs": 453.592,
}

# Volume → millilitres
_TO_ML: dict[str, float] = {
    "ml": 1.0,
    "l": 1_000.0,
    "fl oz": 29.5735,
    "floz": 29.5735,
    "fluid oz": 29.5735,
    "fluid ounce": 29.5735,
    "fluid ounces": 29.5735,
    "cup": 236.588,
    "cups": 236.588,
    "pt": 473.176,
    "pint": 473.176,
    "pints": 473.176,
    "qt": 946.353,
    "quart": 946.353,
    "quarts": 946.353,
    "gal": 3_785.41,
    "gallon": 3_785.41,
    "gallons": 3_785.41,
}

# ── Imperial display preferences ─────────────────────────────────────────────
# For convert_from_metric — which metric threshold triggers the next
# larger imperial unit. Keeps display numbers human-readable.

_IMPERIAL_MASS_THRESHOLDS: list[tuple[float, str, float]] = [
    # (min grams, display unit, grams-per-unit)
    (453.592, "lb", 453.592),  # ≥ 1 lb → show in lb
    (0.0, "oz", 28.3495),      # otherwise → oz
]

_METRIC_MASS_THRESHOLDS: list[tuple[float, str, float]] = [
    (1_000.0, "kg", 1_000.0),
    (0.0, "g", 1.0),
]

_IMPERIAL_VOLUME_THRESHOLDS: list[tuple[float, str, float]] = [
    (3_785.41, "gal", 3_785.41),
    (946.353, "qt", 946.353),
    (473.176, "pt", 473.176),
    (236.588, "cup", 236.588),
    (0.0, "fl oz", 29.5735),
]

_METRIC_VOLUME_THRESHOLDS: list[tuple[float, str, float]] = [
    (1_000.0, "l", 1_000.0),
    (0.0, "ml", 1.0),
]


# ── Public API ────────────────────────────────────────────────────────────────

def normalize_unit(raw: str) -> str:
    """Canonicalize a raw unit string (lowercase, stripped)."""
    return raw.strip().lower()


def classify_unit(unit: str) -> str:
    """Return 'mass', 'volume', or 'count' for a canonical unit string."""
    u = normalize_unit(unit)
    if u in MASS_UNITS:
        return "mass"
    if u in VOLUME_UNITS:
        return "volume"
    return "count"


def normalize_to_metric(quantity: float, unit: str) -> tuple[float, str]:
    """Convert quantity + unit to the canonical metric base unit.

    Returns (metric_quantity, base_unit) where base_unit is one of:
        'g'    — grams (for all mass units)
        'ml'   — millilitres (for all volume units)
        'each' — countable items (for everything else)

    Unknown or ambiguous units (e.g. 'bag', 'bunch') are treated as count.
    """
    u = normalize_unit(unit)

    if u in _TO_GRAMS:
        return round(quantity * _TO_GRAMS[u], 4), "g"

    if u in _TO_ML:
        return round(quantity * _TO_ML[u], 4), "ml"

    # Count / ambiguous — store as-is
    return quantity, "each"


def convert_from_metric(
    quantity: float,
    base_unit: str,
    preferred: str = "metric",
) -> tuple[float, str]:
    """Convert a stored metric quantity to a display unit.

    Args:
        quantity: stored metric quantity
        base_unit: 'g', 'ml', or 'each'
        preferred: 'metric' or 'imperial'

    Returns (display_quantity, display_unit).
    Rounds to 2 decimal places.
    """
    if base_unit == "each":
        return quantity, "each"

    thresholds: list[tuple[float, str, float]]

    if base_unit == "g":
        thresholds = (
            _IMPERIAL_MASS_THRESHOLDS if preferred == "imperial"
            else _METRIC_MASS_THRESHOLDS
        )
    elif base_unit == "ml":
        thresholds = (
            _IMPERIAL_VOLUME_THRESHOLDS if preferred == "imperial"
            else _METRIC_VOLUME_THRESHOLDS
        )
    else:
        return quantity, base_unit

    for min_qty, display_unit, factor in thresholds:
        if quantity >= min_qty:
            return round(quantity / factor, 2), display_unit

    return round(quantity, 2), base_unit
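The normalization rule is a plain dict lookup followed by a rounded multiply. A standalone round-trip using just two factors from the tables above (this re-implements the lookup for illustration; the real module covers many more aliases):

```python
# Minimal sketch of normalize_to_metric's rule with two factors from
# the module's tables (lb→g, gal→ml); unknown units fall through to count.
_TO_GRAMS = {"g": 1.0, "lb": 453.592}
_TO_ML = {"ml": 1.0, "gal": 3785.41}

def normalize_to_metric(quantity: float, unit: str) -> tuple[float, str]:
    u = unit.strip().lower()
    if u in _TO_GRAMS:
        return round(quantity * _TO_GRAMS[u], 4), "g"
    if u in _TO_ML:
        return round(quantity * _TO_ML[u], 4), "ml"
    return quantity, "each"  # count / ambiguous stored as-is

print(normalize_to_metric(2.0, "lb"))    # ≈ (907.184, 'g')
print(normalize_to_metric(1.0, "gal"))   # ≈ (3785.41, 'ml')
print(normalize_to_metric(3.0, "each"))  # (3.0, 'each')
```

Storing one canonical unit per dimension keeps inventory arithmetic (decrementing quantities, expiry thresholds) free of unit juggling; only display code ever converts back.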
43
compose.cloud.yml
Normal file
@@ -0,0 +1,43 @@
# Kiwi — cloud managed instance
# Project: kiwi-cloud (docker compose -f compose.cloud.yml -p kiwi-cloud ...)
# Web: http://127.0.0.1:8515 → menagerie.circuitforge.tech/kiwi (via Caddy + JWT auth)
# API: internal only on kiwi-cloud-net (nginx proxies /api/ → api:8512)

services:
  api:
    build:
      context: ..
      dockerfile: kiwi/Dockerfile
    restart: unless-stopped
    env_file: .env
    environment:
      CLOUD_MODE: "true"
      CLOUD_DATA_ROOT: /devl/kiwi-cloud-data
      # DIRECTUS_JWT_SECRET, HEIMDALL_URL, HEIMDALL_ADMIN_TOKEN — set in .env
    volumes:
      - /devl/kiwi-cloud-data:/devl/kiwi-cloud-data
      # LLM config — shared with other CF products; read-only in container
      - ${HOME}/.config/circuitforge:/root/.config/circuitforge:ro
    networks:
      - kiwi-cloud-net

  web:
    build:
      context: .
      dockerfile: docker/web/Dockerfile
      args:
        VITE_BASE_URL: /kiwi
        VITE_API_BASE: /kiwi
    restart: unless-stopped
    ports:
      - "8515:80"
    volumes:
      - ./docker/web/nginx.cloud.conf:/etc/nginx/conf.d/default.conf:ro
    networks:
      - kiwi-cloud-net
    depends_on:
      - api

networks:
  kiwi-cloud-net:
    driver: bridge
21
compose.yml
Normal file
@@ -0,0 +1,21 @@
services:
  api:
    build:
      context: ..
      dockerfile: kiwi/Dockerfile
    network_mode: host
    env_file: .env
    volumes:
      - ./data:/app/kiwi/data
      - ${HOME}/.config/circuitforge:/root/.config/circuitforge:ro
    restart: unless-stopped

  web:
    build:
      context: .
      dockerfile: docker/web/Dockerfile
    ports:
      - "8511:80"
    restart: unless-stopped
    depends_on:
      - api
22
docker/web/Dockerfile
Normal file
@@ -0,0 +1,22 @@
# Stage 1: build
FROM node:20-alpine AS build
WORKDIR /app
COPY frontend/package*.json ./
RUN npm ci --prefer-offline
COPY frontend/ ./

# Build-time env vars — Vite bakes these as static strings into the bundle.
# VITE_BASE_URL: URL prefix the app is served under (/ for dev, /kiwi for cloud)
# VITE_API_BASE: prefix for all /api/* fetch calls (empty for dev, /kiwi for cloud)
ARG VITE_BASE_URL=/
ARG VITE_API_BASE=
ENV VITE_BASE_URL=$VITE_BASE_URL
ENV VITE_API_BASE=$VITE_API_BASE

RUN npm run build

# Stage 2: serve
FROM nginx:alpine
COPY docker/web/nginx.conf /etc/nginx/conf.d/default.conf
COPY --from=build /app/dist /usr/share/nginx/html
EXPOSE 80
32
docker/web/nginx.cloud.conf
Normal file
@@ -0,0 +1,32 @@
server {
    listen 80;
    server_name _;

    root /usr/share/nginx/html;
    index index.html;

    # Proxy API requests to the FastAPI container via Docker bridge network.
    location /api/ {
        proxy_pass http://api:8512;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $http_x_forwarded_proto;
        # Forward the session header injected by Caddy from the cf_session cookie.
        proxy_set_header X-CF-Session $http_x_cf_session;
    }

    location = /index.html {
        add_header Cache-Control "no-cache, no-store, must-revalidate";
        try_files $uri /index.html;
    }

    location / {
        try_files $uri $uri/ /index.html;
    }

    location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff2?)$ {
        expires 1y;
        add_header Cache-Control "public, immutable";
    }
}
27
docker/web/nginx.conf
Normal file
@@ -0,0 +1,27 @@
server {
    listen 80;
    server_name _;

    root /usr/share/nginx/html;
    index index.html;

    location /api/ {
        proxy_pass http://172.17.0.1:8512;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }

    location = /index.html {
        add_header Cache-Control "no-cache, no-store, must-revalidate";
        try_files $uri /index.html;
    }

    location / {
        try_files $uri $uri/ /index.html;
    }

    location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff2?)$ {
        expires 1y;
        add_header Cache-Control "public, immutable";
    }
}
18
environment.yml
Normal file
@@ -0,0 +1,18 @@
name: kiwi
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.11
  - pip
  - pip:
    - fastapi>=0.110
    - uvicorn[standard]>=0.27
    - python-multipart>=0.0.9
    - aiofiles>=23.0
    - opencv-python>=4.8
    - numpy>=1.25
    - pyzbar>=0.1.9
    - httpx>=0.27
    - pydantic>=2.5
    - PyJWT>=2.8
3
frontend/.env
Normal file
@@ -0,0 +1,3 @@
# API Configuration
# Use the server's actual IP instead of localhost for remote access
VITE_API_URL=http://10.1.10.71:8000/api/v1
24
frontend/.gitignore
vendored
Normal file
@@ -0,0 +1,24 @@
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
lerna-debug.log*

node_modules
dist
dist-ssr
*.local

# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
3
frontend/.vscode/extensions.json
vendored
Normal file
@@ -0,0 +1,3 @@
{
  "recommendations": ["Vue.volar"]
}
458
frontend/THEMING_SYSTEM.md
Normal file
@@ -0,0 +1,458 @@
# Vue Frontend - Theming System Documentation

**Date**: 2025-10-31
**Status**: ✅ Fully Implemented - Light/Dark Mode Support

---

## Overview

The Vue frontend now uses a comprehensive CSS custom properties (variables) system that automatically adapts to the user's system color scheme preference. All components are theme-aware and will automatically switch between light and dark modes.

---

## How It Works

### Automatic Theme Detection

The theming system uses the CSS `prefers-color-scheme` media query to detect the user's system preference:

- **Dark Mode (Default)**: Used when the system is set to dark mode or no preference is detected
- **Light Mode**: Automatically activated when the system prefers light mode

### Color Scheme Declaration

All theme variables are defined in `/frontend/src/style.css`:

```css
:root {
  color-scheme: light dark; /* Declares support for both schemes */

  /* Dark mode variables (default) */
  --color-text-primary: rgba(255, 255, 255, 0.87);
  --color-bg-primary: #242424;
  /* ... */
}

@media (prefers-color-scheme: light) {
  :root {
    /* Light mode overrides */
    --color-text-primary: #213547;
    --color-bg-primary: #f5f5f5;
    /* ... */
  }
}
```

---

## Theme Variables Reference

### Text Colors

| Variable | Dark Mode | Light Mode | Usage |
|----------|-----------|------------|-------|
| `--color-text-primary` | `rgba(255, 255, 255, 0.87)` | `#213547` | Main text |
| `--color-text-secondary` | `rgba(255, 255, 255, 0.6)` | `#666` | Secondary text, labels |
| `--color-text-muted` | `rgba(255, 255, 255, 0.4)` | `#999` | Disabled, hints |

### Background Colors

| Variable | Dark Mode | Light Mode | Usage |
|----------|-----------|------------|-------|
| `--color-bg-primary` | `#242424` | `#f5f5f5` | Page background |
| `--color-bg-secondary` | `#1a1a1a` | `#ffffff` | Secondary surfaces |
| `--color-bg-elevated` | `#2d2d2d` | `#ffffff` | Elevated surfaces, dropdowns |
| `--color-bg-card` | `#2d2d2d` | `#ffffff` | Card backgrounds |
| `--color-bg-input` | `#1a1a1a` | `#ffffff` | Input fields |

### Border Colors

| Variable | Dark Mode | Light Mode | Usage |
|----------|-----------|------------|-------|
| `--color-border` | `rgba(255, 255, 255, 0.1)` | `#ddd` | Default borders |
| `--color-border-focus` | `rgba(255, 255, 255, 0.2)` | `#ccc` | Focus state borders |

### Brand Colors

These remain consistent across themes:

| Variable | Value | Usage |
|----------|-------|-------|
| `--color-primary` | `#667eea` | Primary brand color |
| `--color-primary-dark` | `#5568d3` | Darker variant (hover) |
| `--color-primary-light` | `#7d8ff0` | Lighter variant |
| `--color-secondary` | `#764ba2` | Secondary brand color |

### Status Colors

Base colors remain the same, but backgrounds adjust for contrast:

#### Success (Green)

| Variable | Value | Light Mode Bg | Usage |
|----------|-------|---------------|-------|
| `--color-success` | `#4CAF50` | Same | Success actions |
| `--color-success-dark` | `#45a049` | Same | Hover states |
| `--color-success-light` | `#66bb6a` | Same | Accents |
| `--color-success-bg` | `rgba(76, 175, 80, 0.1)` | `#d4edda` | Success backgrounds |
| `--color-success-border` | `rgba(76, 175, 80, 0.3)` | `#c3e6cb` | Success borders |

#### Warning (Orange)

| Variable | Value | Light Mode Bg | Usage |
|----------|-------|---------------|-------|
| `--color-warning` | `#ff9800` | Same | Warning states |
| `--color-warning-dark` | `#f57c00` | Same | Hover states |
| `--color-warning-light` | `#ffb74d` | Same | Accents |
| `--color-warning-bg` | `rgba(255, 152, 0, 0.1)` | `#fff3cd` | Warning backgrounds |
| `--color-warning-border` | `rgba(255, 152, 0, 0.3)` | `#ffeaa7` | Warning borders |

#### Error (Red)

| Variable | Value | Light Mode Bg | Usage |
|----------|-------|---------------|-------|
| `--color-error` | `#f44336` | Same | Error states |
| `--color-error-dark` | `#d32f2f` | Same | Hover states |
| `--color-error-light` | `#ff6b6b` | Same | Accents |
| `--color-error-bg` | `rgba(244, 67, 54, 0.1)` | `#f8d7da` | Error backgrounds |
| `--color-error-border` | `rgba(244, 67, 54, 0.3)` | `#f5c6cb` | Error borders |

#### Info (Blue)

| Variable | Value | Light Mode Bg | Usage |
|----------|-------|---------------|-------|
| `--color-info` | `#2196F3` | Same | Info states |
| `--color-info-dark` | `#1976D2` | Same | Hover states |
| `--color-info-light` | `#64b5f6` | Same | Accents |
|
||||
| `--color-info-bg` | `rgba(33, 150, 243, 0.1)` | `#d1ecf1` | Info backgrounds |
|
||||
| `--color-info-border` | `rgba(33, 150, 243, 0.3)` | `#bee5eb` | Info borders |
|
||||
|
||||
### Gradients

| Variable | Value | Usage |
|----------|-------|-------|
| `--gradient-primary` | `linear-gradient(135deg, var(--color-primary) 0%, var(--color-secondary) 100%)` | Headers, buttons |

### Shadows

Shadow opacity is reduced in light mode:

| Variable | Dark Mode | Light Mode | Usage |
|----------|-----------|------------|-------|
| `--shadow-sm` | `0 1px 3px rgba(0,0,0,0.3)` | `0 1px 3px rgba(0,0,0,0.1)` | Small shadows |
| `--shadow-md` | `0 4px 6px rgba(0,0,0,0.3)` | `0 4px 6px rgba(0,0,0,0.1)` | Medium shadows |
| `--shadow-lg` | `0 10px 20px rgba(0,0,0,0.4)` | `0 10px 20px rgba(0,0,0,0.15)` | Large shadows |
| `--shadow-xl` | `0 20px 40px rgba(0,0,0,0.5)` | `0 20px 40px rgba(0,0,0,0.2)` | Extra large shadows |

### Typography

| Variable | Value | Usage |
|----------|-------|-------|
| `--font-size-xs` | `12px` | Very small text |
| `--font-size-sm` | `14px` | Small text, labels |
| `--font-size-base` | `16px` | Body text |
| `--font-size-lg` | `18px` | Large text |
| `--font-size-xl` | `24px` | Headings |
| `--font-size-2xl` | `32px` | Large headings, stats |

### Spacing

| Variable | Value | Usage |
|----------|-------|-------|
| `--spacing-xs` | `4px` | Tiny gaps |
| `--spacing-sm` | `8px` | Small gaps |
| `--spacing-md` | `16px` | Medium gaps |
| `--spacing-lg` | `24px` | Large gaps |
| `--spacing-xl` | `32px` | Extra large gaps |

### Border Radius

| Variable | Value | Usage |
|----------|-------|-------|
| `--radius-sm` | `4px` | Small radius |
| `--radius-md` | `6px` | Medium radius |
| `--radius-lg` | `8px` | Large radius |
| `--radius-xl` | `12px` | Extra large radius (cards) |

---

## Usage Examples

### In Vue Components

```vue
<style scoped>
.my-component {
  background: var(--color-bg-card);
  color: var(--color-text-primary);
  border: 1px solid var(--color-border);
  border-radius: var(--radius-md);
  padding: var(--spacing-lg);
  box-shadow: var(--shadow-md);
}

.my-button {
  background: var(--gradient-primary);
  color: white;
  font-size: var(--font-size-base);
  padding: var(--spacing-sm) var(--spacing-lg);
}

.success-message {
  background: var(--color-success-bg);
  color: var(--color-success-dark);
  border: 1px solid var(--color-success-border);
}
</style>
```

### Status-Specific Styling

```vue
<style scoped>
.item-expiring-soon {
  border-left: 4px solid var(--color-warning);
  background: var(--color-warning-bg);
}

.item-expired {
  border-left: 4px solid var(--color-error);
  color: var(--color-error);
}

.item-success {
  background: var(--color-success-bg);
  border: 1px solid var(--color-success-border);
}
</style>
```

---

## Testing Theme Modes

### On macOS

1. **System Preferences** → **General** → **Appearance**
2. Select "Dark" or "Light"
3. The Vue app switches automatically

### On Windows

1. **Settings** → **Personalization** → **Colors**
2. Choose "Dark" or "Light" mode
3. The Vue app switches automatically

### On Linux (GNOME)

1. **Settings** → **Appearance**
2. Toggle **Dark Style**
3. The Vue app switches automatically

### Browser DevTools Testing

**Chrome/Edge**:
1. Open DevTools (F12)
2. Press Ctrl+Shift+P (Cmd+Shift+P on Mac)
3. Type "Rendering"
4. Select "Emulate CSS media feature prefers-color-scheme"
5. Choose "prefers-color-scheme: dark" or "light"

**Firefox**:
1. Open DevTools (F12)
2. Click the settings gear icon
3. Scroll to "Inspector"
4. Toggle "Disable prefers-color-scheme media queries"
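The DevTools steps above emulate the media query; at runtime the same signal is available to scripts through `matchMedia`. A minimal sketch (the function and type names here are illustrative, not existing app code — in the browser you would pass `window.matchMedia('(prefers-color-scheme: dark)')`):

```typescript
type Scheme = 'dark' | 'light'

// Structural type covering the small slice of MediaQueryList we use,
// so the helper can also be exercised with a fake object in tests.
interface SchemeQuery {
  matches: boolean
  addEventListener(type: 'change', listener: (e: { matches: boolean }) => void): void
}

function watchColorScheme(query: SchemeQuery, onChange: (scheme: Scheme) => void): void {
  onChange(query.matches ? 'dark' : 'light') // report the current scheme immediately
  // ...then report every subsequent flip (e.g. the user toggles the OS setting)
  query.addEventListener('change', (e) => onChange(e.matches ? 'dark' : 'light'))
}
```

Since the CSS variables switch via `@media`, nothing in this app strictly needs such a hook today; it becomes useful if a component must react in JavaScript (e.g. picking a chart palette).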
---

## Components Using Theme Variables

All components have been updated to use theme variables:

✅ **App.vue**
- Header gradient
- Footer styling
- Tab navigation

✅ **InventoryList.vue**
- All cards and backgrounds
- Status colors (success/warning/error)
- Form inputs and labels
- Buttons and actions
- Upload areas
- Loading spinners

✅ **ReceiptsView.vue**
- Upload area
- Receipt cards
- Status indicators
- Stats display

✅ **EditItemModal.vue**
- Modal background
- Form fields
- Expiration date color coding
- Buttons

---

## Best Practices

### DO ✅

1. **Always use theme variables** instead of hardcoded colors
   ```css
   /* Good */
   color: var(--color-text-primary);

   /* Bad */
   color: #333;
   ```

2. **Use semantic variable names**
   ```css
   /* Good */
   background: var(--color-bg-card);

   /* Bad */
   background: var(--color-bg-elevated); /* Wrong semantic meaning */
   ```

3. **Use spacing variables** for consistency
   ```css
   /* Good */
   padding: var(--spacing-lg);

   /* Bad */
   padding: 24px;
   ```

4. **Use status colors appropriately**
   ```css
   /* Good - Expiration warning */
   .expiring { color: var(--color-warning); }

   /* Bad - Using error for warning */
   .expiring { color: var(--color-error); }
   ```

### DON'T ❌

1. **Don't hardcode colors**
   ```css
   /* Bad */
   background: #ffffff;
   color: #333333;
   ```

2. **Don't use pixel values for spacing**
   ```css
   /* Bad */
   margin: 16px;

   /* Good */
   margin: var(--spacing-md);
   ```

3. **Don't mix theme and non-theme styles**
   ```css
   /* Bad */
   .card {
     background: var(--color-bg-card);
     border: 1px solid #ddd; /* Hardcoded! */
   }

   /* Good */
   .card {
     background: var(--color-bg-card);
     border: 1px solid var(--color-border);
   }
   ```

---

## Adding New Theme Variables

If you need to add new theme variables:

1. **Add the dark-mode default** in `:root`:
   ```css
   :root {
     --color-my-new-color: #value;
   }
   ```

2. **Add the light-mode override** in the media query:
   ```css
   @media (prefers-color-scheme: light) {
     :root {
       --color-my-new-color: #different-value;
     }
   }
   ```

3. **Use it in components**:
   ```css
   .my-element {
     color: var(--color-my-new-color);
   }
   ```

---

## Future Enhancements

Potential additions to the theming system:

1. **Manual Theme Toggle**
   - Add a theme switcher button
   - Store the preference in localStorage
   - Override the system preference

2. **Custom Color Schemes**
   - Allow users to choose accent colors
   - Save theme preferences per user

3. **High Contrast Mode**
   - Support `prefers-contrast: high`
   - Increase border widths and color differences

4. **Reduced Motion**
   - Support `prefers-reduced-motion`
   - Disable animations for accessibility
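The manual-toggle enhancement mostly reduces to deciding which theme wins. A minimal sketch under the assumptions above (the function name and `localStorage` key are hypothetical, not existing app code):

```typescript
type Theme = 'light' | 'dark'

// Resolve the effective theme: an explicit user choice (e.g. the value read
// from localStorage.getItem('theme')) overrides the system preference; any
// missing or invalid stored value falls back to the OS setting.
function resolveTheme(stored: string | null, systemPrefersDark: boolean): Theme {
  if (stored === 'light' || stored === 'dark') return stored // manual override wins
  return systemPrefersDark ? 'dark' : 'light' // otherwise follow the system
}
```

In the browser, `systemPrefersDark` would come from `window.matchMedia('(prefers-color-scheme: dark)').matches`, and the resolved theme could be applied as a `data-theme` attribute that the CSS variables key off instead of (or in addition to) the media query.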
---

## Browser Support

The theming system is supported in:

✅ **Chrome/Edge**: 76+
✅ **Firefox**: 67+
✅ **Safari**: 12.1+
✅ **Opera**: 62+

CSS Custom Properties (Variables) are supported in all modern browsers.

---

## Summary

**What We Have**:
- ✅ Automatic light/dark mode detection
- ✅ Comprehensive variable system (50+ variables)
- ✅ All components are theme-aware
- ✅ Semantic, maintainable color system
- ✅ Consistent spacing, typography, and shadows
- ✅ Status colors with proper contrast in both modes

**Benefits**:
- 🎨 Consistent design across the entire app
- 🌓 Automatic theme switching based on system preference
- 🔧 Easy to maintain and update colors globally
- ♿ Better accessibility with proper contrast ratios
- 🚀 Future-proof for theme customization

**The Vue frontend now fully supports light and dark modes! 🎉**
13 frontend/index.html Normal file
@@ -0,0 +1,13 @@
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>frontend</title>
  </head>
  <body>
    <div id="app"></div>
    <script type="module" src="/src/main.ts"></script>
  </body>
</html>
1884 frontend/package-lock.json generated Normal file
File diff suppressed because it is too large
25 frontend/package.json Normal file
@@ -0,0 +1,25 @@
{
  "name": "frontend",
  "private": true,
  "version": "0.0.0",
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "vue-tsc -b && vite build",
    "preview": "vite preview"
  },
  "dependencies": {
    "axios": "^1.13.1",
    "pinia": "^3.0.3",
    "vue": "^3.5.22",
    "vue-router": "^4.6.3"
  },
  "devDependencies": {
    "@types/node": "^24.6.0",
    "@vitejs/plugin-vue": "^6.0.1",
    "@vue/tsconfig": "^0.8.1",
    "typescript": "~5.9.3",
    "vite": "^7.1.7",
    "vue-tsc": "^3.1.0"
  }
}
1 frontend/public/vite.svg Normal file
@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="iconify iconify--logos" width="31.88" height="32" preserveAspectRatio="xMidYMid meet" viewBox="0 0 256 257"><defs><linearGradient id="IconifyId1813088fe1fbc01fb466" x1="-.828%" x2="57.636%" y1="7.652%" y2="78.411%"><stop offset="0%" stop-color="#41D1FF"></stop><stop offset="100%" stop-color="#BD34FE"></stop></linearGradient><linearGradient id="IconifyId1813088fe1fbc01fb467" x1="43.376%" x2="50.316%" y1="2.242%" y2="89.03%"><stop offset="0%" stop-color="#FFEA83"></stop><stop offset="8.333%" stop-color="#FFDD35"></stop><stop offset="100%" stop-color="#FFA800"></stop></linearGradient></defs><path fill="url(#IconifyId1813088fe1fbc01fb466)" d="M255.153 37.938L134.897 252.976c-2.483 4.44-8.862 4.466-11.382.048L.875 37.958c-2.746-4.814 1.371-10.646 6.827-9.67l120.385 21.517a6.537 6.537 0 0 0 2.322-.004l117.867-21.483c5.438-.991 9.574 4.796 6.877 9.62Z"></path><path fill="url(#IconifyId1813088fe1fbc01fb467)" d="M185.432.063L96.44 17.501a3.268 3.268 0 0 0-2.634 3.014l-5.474 92.456a3.268 3.268 0 0 0 3.997 3.378l24.777-5.718c2.318-.535 4.413 1.507 3.936 3.838l-7.361 36.047c-.495 2.426 1.782 4.5 4.151 3.78l15.304-4.649c2.372-.72 4.652 1.36 4.15 3.788l-11.698 56.621c-.732 3.542 3.979 5.473 5.943 2.437l1.313-2.028l72.516-144.72c1.215-2.423-.88-5.186-3.54-4.672l-25.505 4.922c-2.396.462-4.435-1.77-3.759-4.114l16.646-57.705c.677-2.35-1.37-4.583-3.769-4.113Z"></path></svg>
199 frontend/src/App.vue Normal file
@@ -0,0 +1,199 @@
<template>
  <div id="app">
    <header class="app-header">
      <div class="container">
        <h1>🥝 Kiwi</h1>
        <p class="tagline">Smart Pantry Tracking & Recipe Suggestions</p>
      </div>
    </header>

    <main class="app-main">
      <div class="container">
        <!-- Tabs -->
        <div class="tabs">
          <button
            :class="['tab', { active: currentTab === 'inventory' }]"
            @click="switchTab('inventory')"
          >
            🏪 Inventory
          </button>
          <button
            :class="['tab', { active: currentTab === 'receipts' }]"
            @click="switchTab('receipts')"
          >
            🧾 Receipts
          </button>
        </div>

        <!-- Tab Content -->
        <div v-show="currentTab === 'inventory'" class="tab-content">
          <InventoryList />
        </div>

        <div v-show="currentTab === 'receipts'" class="tab-content">
          <ReceiptsView />
        </div>
      </div>
    </main>

    <footer class="app-footer">
      <div class="container">
        <p>© 2026 CircuitForge LLC</p>
      </div>
    </footer>
  </div>
</template>

<script setup lang="ts">
import { ref } from 'vue'
import InventoryList from './components/InventoryList.vue'
import ReceiptsView from './components/ReceiptsView.vue'

const currentTab = ref<'inventory' | 'receipts'>('inventory')

function switchTab(tab: 'inventory' | 'receipts') {
  currentTab.value = tab
}
</script>

<style>
* {
  margin: 0;
  padding: 0;
  box-sizing: border-box;
}

body {
  font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
    'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
    sans-serif;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
  background: var(--color-bg-primary);
  color: var(--color-text-primary);
}

#app {
  min-height: 100vh;
  display: flex;
  flex-direction: column;
}

.container {
  max-width: 1400px;
  margin: 0 auto;
  padding: 0 20px;
}

.app-header {
  background: var(--gradient-primary);
  color: white;
  padding: var(--spacing-xl) 0;
  box-shadow: var(--shadow-md);
}

.app-header h1 {
  font-size: 32px;
  margin-bottom: 5px;
}

.app-header .tagline {
  font-size: 16px;
  opacity: 0.9;
}

.app-main {
  flex: 1;
  padding: 20px 0;
}

.app-footer {
  background: var(--color-bg-elevated);
  color: var(--color-text-secondary);
  padding: var(--spacing-lg) 0;
  text-align: center;
  margin-top: var(--spacing-xl);
  border-top: 1px solid var(--color-border);
}

.app-footer p {
  font-size: var(--font-size-sm);
  opacity: 0.8;
}

/* Tabs */
.tabs {
  display: flex;
  gap: 10px;
  margin-bottom: 20px;
}

.tab {
  background: rgba(255, 255, 255, 0.2);
  color: white;
  border: none;
  padding: 15px 30px;
  font-size: 16px;
  border-radius: 8px;
  cursor: pointer;
  transition: all 0.3s;
}

.tab:hover {
  background: rgba(255, 255, 255, 0.3);
}

.tab.active {
  background: var(--color-bg-card);
  color: var(--color-primary);
  font-weight: 600;
}

.tab-content {
  animation: fadeIn 0.3s;
}

@keyframes fadeIn {
  from { opacity: 0; }
  to { opacity: 1; }
}

/* Mobile Responsive Breakpoints */
@media (max-width: 480px) {
  .container {
    padding: 0 12px;
  }

  .app-header h1 {
    font-size: 24px;
  }

  .app-header .tagline {
    font-size: 14px;
  }

  .tabs {
    gap: 8px;
  }

  .tab {
    padding: 12px 20px;
    font-size: 14px;
    flex: 1;
  }
}

@media (min-width: 481px) and (max-width: 768px) {
  .container {
    padding: 0 16px;
  }

  .app-header h1 {
    font-size: 28px;
  }

  .tab {
    padding: 14px 25px;
  }
}
</style>
1 frontend/src/assets/vue.svg Normal file
@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="iconify iconify--logos" width="37.07" height="36" preserveAspectRatio="xMidYMid meet" viewBox="0 0 256 198"><path fill="#41B883" d="M204.8 0H256L128 220.8L0 0h97.92L128 51.2L157.44 0h47.36Z"></path><path fill="#41B883" d="m0 0l128 220.8L256 0h-51.2L128 132.48L50.56 0H0Z"></path><path fill="#35495E" d="M50.56 0L128 133.12L204.8 0h-47.36L128 51.2L97.92 0H50.56Z"></path></svg>
189 frontend/src/components/ConfirmDialog.vue Normal file
@@ -0,0 +1,189 @@
<template>
  <Transition name="modal">
    <div v-if="show" class="modal-overlay" @click="handleCancel">
      <div class="modal-container" @click.stop>
        <div class="modal-header">
          <h3>{{ title }}</h3>
        </div>

        <div class="modal-body">
          <p>{{ message }}</p>
        </div>

        <div class="modal-footer">
          <button class="btn btn-secondary" @click="handleCancel">
            {{ cancelText }}
          </button>
          <button :class="['btn', `btn-${type}`]" @click="handleConfirm">
            {{ confirmText }}
          </button>
        </div>
      </div>
    </div>
  </Transition>
</template>

<script setup lang="ts">
interface Props {
  show: boolean
  title?: string
  message: string
  confirmText?: string
  cancelText?: string
  type?: 'primary' | 'danger' | 'warning'
}

withDefaults(defineProps<Props>(), {
  title: 'Confirm',
  confirmText: 'Confirm',
  cancelText: 'Cancel',
  type: 'primary',
})

const emit = defineEmits<{
  confirm: []
  cancel: []
}>()

function handleConfirm() {
  emit('confirm')
}

function handleCancel() {
  emit('cancel')
}
</script>

<style scoped>
.modal-overlay {
  position: fixed;
  top: 0;
  left: 0;
  right: 0;
  bottom: 0;
  background: rgba(0, 0, 0, 0.5);
  display: flex;
  align-items: center;
  justify-content: center;
  z-index: 9999;
  padding: var(--spacing-lg);
}

.modal-container {
  background: var(--color-bg-elevated);
  border: 1px solid var(--color-border);
  border-radius: var(--radius-xl);
  box-shadow: var(--shadow-xl);
  max-width: 500px;
  width: 100%;
  overflow: hidden;
}

.modal-header {
  padding: var(--spacing-lg);
  border-bottom: 1px solid var(--color-border);
}

.modal-header h3 {
  margin: 0;
  color: var(--color-text-primary);
  font-size: var(--font-size-lg);
  font-weight: 600;
}

.modal-body {
  padding: var(--spacing-lg);
}

.modal-body p {
  margin: 0;
  color: var(--color-text-primary);
  font-size: var(--font-size-base);
  line-height: 1.5;
}

.modal-footer {
  padding: var(--spacing-lg);
  border-top: 1px solid var(--color-border);
  display: flex;
  justify-content: flex-end;
  gap: var(--spacing-md);
}

.btn {
  padding: var(--spacing-sm) var(--spacing-lg);
  border: none;
  border-radius: var(--radius-md);
  font-size: var(--font-size-base);
  font-weight: 500;
  cursor: pointer;
  transition: all 0.2s ease;
}

.btn-secondary {
  background: var(--color-bg-secondary);
  color: var(--color-text-primary);
  border: 1px solid var(--color-border);
}

.btn-secondary:hover {
  background: var(--color-bg-primary);
}

.btn-primary {
  background: var(--gradient-primary);
  color: white;
}

.btn-primary:hover {
  opacity: 0.9;
  transform: translateY(-1px);
  box-shadow: var(--shadow-md);
}

.btn-danger {
  background: var(--color-error);
  color: white;
}

.btn-danger:hover {
  background: var(--color-error-dark);
  transform: translateY(-1px);
  box-shadow: var(--shadow-md);
}

.btn-warning {
  background: var(--color-warning);
  color: white;
}

.btn-warning:hover {
  background: var(--color-warning-dark);
  transform: translateY(-1px);
  box-shadow: var(--shadow-md);
}

/* Animations */
.modal-enter-active,
.modal-leave-active {
  transition: opacity 0.3s ease;
}

.modal-enter-active .modal-container,
.modal-leave-active .modal-container {
  transition: transform 0.3s ease;
}

.modal-enter-from,
.modal-leave-to {
  opacity: 0;
}

.modal-enter-from .modal-container {
  transform: scale(0.9) translateY(-20px);
}

.modal-leave-to .modal-container {
  transform: scale(0.9) translateY(-20px);
}
</style>
452 frontend/src/components/EditItemModal.vue Normal file
@@ -0,0 +1,452 @@
<template>
  <div class="modal-overlay" @click.self="$emit('close')">
    <div class="modal-content">
      <div class="modal-header">
        <h2>Edit Inventory Item</h2>
        <button class="close-btn" @click="$emit('close')">×</button>
      </div>

      <form @submit.prevent="handleSubmit" class="edit-form">
        <div class="form-group">
          <label>Product</label>
          <div class="product-info">
            <strong>{{ item.product.name }}</strong>
            <span v-if="item.product.brand" class="brand">({{ item.product.brand }})</span>
          </div>
        </div>

        <div class="form-row">
          <div class="form-group">
            <label for="quantity">Quantity *</label>
            <input
              id="quantity"
              v-model.number="formData.quantity"
              type="number"
              step="0.1"
              min="0"
              required
              class="form-input"
            />
          </div>

          <div class="form-group">
            <label for="unit">Unit</label>
            <select id="unit" v-model="formData.unit" class="form-input">
              <option value="count">Count</option>
              <option value="kg">Kilograms</option>
              <option value="g">Grams</option>
              <option value="lb">Pounds</option>
              <option value="oz">Ounces</option>
              <option value="l">Liters</option>
              <option value="ml">Milliliters</option>
              <option value="gal">Gallons</option>
            </select>
          </div>
        </div>

        <div class="form-row">
          <div class="form-group">
            <label for="location">Location *</label>
            <select id="location" v-model="formData.location" required class="form-input">
              <option value="fridge">Fridge</option>
              <option value="freezer">Freezer</option>
              <option value="pantry">Pantry</option>
              <option value="cabinet">Cabinet</option>
            </select>
          </div>

          <div class="form-group">
            <label for="sublocation">Sublocation</label>
            <input
              id="sublocation"
              v-model="formData.sublocation"
              type="text"
              placeholder="e.g., Top Shelf"
              class="form-input"
            />
          </div>
        </div>

        <div class="form-row">
          <div class="form-group">
            <label for="purchase_date">Purchase Date</label>
            <input
              id="purchase_date"
              v-model="formData.purchase_date"
              type="date"
              class="form-input"
            />
          </div>

          <div class="form-group">
            <label for="expiration_date">Expiration Date</label>
            <input
              id="expiration_date"
              v-model="formData.expiration_date"
              type="date"
              class="form-input"
              :class="getExpiryInputClass()"
            />
            <small v-if="formData.expiration_date" class="expiry-hint">
              {{ getExpiryHint() }}
            </small>
          </div>
        </div>

        <div class="form-group">
          <label for="status">Status</label>
          <select id="status" v-model="formData.status" class="form-input">
            <option value="available">Available</option>
            <option value="consumed">Consumed</option>
            <option value="expired">Expired</option>
            <option value="discarded">Discarded</option>
          </select>
        </div>

        <div class="form-group">
          <label for="notes">Notes</label>
          <textarea
            id="notes"
            v-model="formData.notes"
            rows="3"
            placeholder="Add any notes about this item..."
            class="form-input"
          ></textarea>
        </div>

        <div v-if="error" class="error-message">
          {{ error }}
        </div>

        <div class="form-actions">
          <button type="button" @click="$emit('close')" class="btn-cancel">
            Cancel
          </button>
          <button type="submit" class="btn-save" :disabled="saving">
            {{ saving ? 'Saving...' : 'Save Changes' }}
          </button>
        </div>
      </form>
    </div>
  </div>
</template>

<script setup lang="ts">
import { ref, reactive } from 'vue'
import { useInventoryStore } from '../stores/inventory'
import type { InventoryItem } from '../services/api'

const props = defineProps<{
  item: InventoryItem
}>()

const emit = defineEmits<{
  close: []
  save: []
}>()

const store = useInventoryStore()

const saving = ref(false)
const error = ref<string | null>(null)

// Initialize form data
const formData = reactive({
  quantity: props.item.quantity,
  unit: props.item.unit,
  location: props.item.location,
  sublocation: props.item.sublocation || '',
  purchase_date: props.item.purchase_date || '',
  expiration_date: props.item.expiration_date || '',
  status: props.item.status,
  notes: props.item.notes || '',
})

async function handleSubmit() {
  saving.value = true
  error.value = null

  try {
    // Prepare update object (only include changed fields)
    const update: any = {}

    if (formData.quantity !== props.item.quantity) update.quantity = formData.quantity
    if (formData.unit !== props.item.unit) update.unit = formData.unit
    if (formData.location !== props.item.location) update.location = formData.location
    if (formData.sublocation !== props.item.sublocation) update.sublocation = formData.sublocation || null
    if (formData.purchase_date !== props.item.purchase_date) update.purchase_date = formData.purchase_date || null
    if (formData.expiration_date !== props.item.expiration_date) update.expiration_date = formData.expiration_date || null
    if (formData.status !== props.item.status) update.status = formData.status
    if (formData.notes !== props.item.notes) update.notes = formData.notes || null

    await store.updateItem(props.item.id, update)

    emit('save')
  } catch (err: any) {
    error.value = err.response?.data?.detail || 'Failed to update item'
  } finally {
    saving.value = false
  }
}

function getExpiryInputClass(): string {
  if (!formData.expiration_date) return ''

  // Normalize both dates to midnight so the day count matches getExpiryHint()
  const today = new Date()
  today.setHours(0, 0, 0, 0)
  const expiry = new Date(formData.expiration_date)
  expiry.setHours(0, 0, 0, 0)
  const diffDays = Math.ceil((expiry.getTime() - today.getTime()) / (1000 * 60 * 60 * 24))

  if (diffDays < 0) return 'expiry-expired'
  if (diffDays <= 3) return 'expiry-soon'
  if (diffDays <= 7) return 'expiry-warning'
  return 'expiry-good'
}

function getExpiryHint(): string {
  if (!formData.expiration_date) return ''

  const today = new Date()
  today.setHours(0, 0, 0, 0)
  const expiry = new Date(formData.expiration_date)
  expiry.setHours(0, 0, 0, 0)

  const diffDays = Math.ceil((expiry.getTime() - today.getTime()) / (1000 * 60 * 60 * 24))

  if (diffDays < 0) return `⚠️ Expired ${Math.abs(diffDays)} days ago`
  if (diffDays === 0) return '⚠️ Expires today!'
  if (diffDays === 1) return '⚠️ Expires tomorrow'
  if (diffDays <= 3) return `⚠️ Expires in ${diffDays} days (use soon!)`
  if (diffDays <= 7) return `Expires in ${diffDays} days`
  return `Expires in ${diffDays} days`
}
</script>

<style scoped>
.modal-overlay {
  position: fixed;
  top: 0;
  left: 0;
  right: 0;
  bottom: 0;
  background: rgba(0, 0, 0, 0.5);
  display: flex;
  align-items: center;
  justify-content: center;
  z-index: 1000;
}

.modal-content {
  background: var(--color-bg-card);
  border-radius: var(--radius-lg);
  width: 90%;
  max-width: 600px;
  max-height: 90vh;
  overflow-y: auto;
  box-shadow: 0 4px 16px rgba(0, 0, 0, 0.2);
}

.modal-header {
  display: flex;
  justify-content: space-between;
  align-items: center;
  padding: 20px;
  border-bottom: 1px solid #eee;
}

.modal-header h2 {
  margin: 0;
  font-size: var(--font-size-xl);
}

.close-btn {
  background: none;
  border: none;
  font-size: 32px;
  color: #999;
  cursor: pointer;
  padding: 0;
  width: 32px;
  height: 32px;
  line-height: 1;
}

.close-btn:hover {
  color: var(--color-text-primary);
}

.edit-form {
  padding: 20px;
}

.form-group {
  margin-bottom: 20px;
}

/* Using .form-row from theme.css */

.form-group label {
  display: block;
  margin-bottom: 8px;
  font-weight: 600;
  color: var(--color-text-primary);
  font-size: var(--font-size-sm);
}

.form-input {
  width: 100%;
  padding: 10px;
  border: 1px solid var(--color-border);
  border-radius: var(--radius-sm);
  font-size: var(--font-size-sm);
}

.form-input:focus {
  outline: none;
  border-color: #2196F3;
  box-shadow: 0 0 0 2px rgba(33, 150, 243, 0.1);
}

.form-input.expiry-expired {
  border-color: #f44336;
}

.form-input.expiry-soon {
  border-color: #ff5722;
}

.form-input.expiry-warning {
  border-color: #ff9800;
}

.form-input.expiry-good {
  border-color: #4CAF50;
}

textarea.form-input {
  resize: vertical;
|
||||
font-family: inherit;
|
||||
}
|
||||
|
||||
.product-info {
|
||||
padding: 10px;
|
||||
background: #f5f5f5;
|
||||
border-radius: var(--radius-sm);
|
||||
font-size: var(--font-size-sm);
|
||||
}
|
||||
|
||||
.product-info .brand {
|
||||
color: var(--color-text-secondary);
|
||||
margin-left: 8px;
|
||||
}
|
||||
|
||||
.expiry-hint {
|
||||
display: block;
|
||||
margin-top: 5px;
|
||||
font-size: var(--font-size-xs);
|
||||
color: var(--color-text-secondary);
|
||||
}
|
||||
|
||||
.error-message {
|
||||
background: #ffebee;
|
||||
color: #c62828;
|
||||
padding: 12px;
|
||||
border-radius: var(--radius-sm);
|
||||
margin-bottom: 15px;
|
||||
font-size: var(--font-size-sm);
|
||||
}
|
||||
|
||||
.form-actions {
|
||||
display: flex;
|
||||
gap: 10px;
|
||||
justify-content: flex-end;
|
||||
margin-top: 25px;
|
||||
padding-top: 20px;
|
||||
border-top: 1px solid #eee;
|
||||
}
|
||||
|
||||
.btn-cancel,
|
||||
.btn-save {
|
||||
padding: 10px 24px;
|
||||
border: none;
|
||||
border-radius: var(--radius-sm);
|
||||
font-size: var(--font-size-sm);
|
||||
font-weight: 600;
|
||||
cursor: pointer;
|
||||
transition: background 0.2s;
|
||||
}
|
||||
|
||||
.btn-cancel {
|
||||
background: #f5f5f5;
|
||||
color: var(--color-text-primary);
|
||||
}
|
||||
|
||||
.btn-cancel:hover {
|
||||
background: #e0e0e0;
|
||||
}
|
||||
|
||||
.btn-save {
|
||||
background: var(--color-success);
|
||||
color: white;
|
||||
}
|
||||
|
||||
.btn-save:hover:not(:disabled) {
|
||||
background: var(--color-success-dark);
|
||||
}
|
||||
|
||||
.btn-save:disabled {
|
||||
background: var(--color-text-muted);
|
||||
cursor: not-allowed;
|
||||
}
|
||||
|
||||
/* Mobile Responsive - Form row handled by theme.css
|
||||
Component-specific overrides only below */
|
||||
|
||||
@media (max-width: 480px) {
|
||||
.modal-content {
|
||||
width: 95%;
|
||||
max-height: 95vh;
|
||||
}
|
||||
|
||||
.modal-header {
|
||||
padding: 15px;
|
||||
}
|
||||
|
||||
.modal-header h2 {
|
||||
font-size: var(--font-size-lg);
|
||||
}
|
||||
|
||||
.edit-form {
|
||||
padding: 15px;
|
||||
}
|
||||
|
||||
.form-group {
|
||||
margin-bottom: 15px;
|
||||
}
|
||||
|
||||
/* Form actions stack on very small screens */
|
||||
.form-actions {
|
||||
flex-direction: column-reverse;
|
||||
gap: 10px;
|
||||
}
|
||||
|
||||
.btn-cancel,
|
||||
.btn-save {
|
||||
width: 100%;
|
||||
padding: 12px;
|
||||
}
|
||||
}
|
||||
|
||||
@media (min-width: 481px) and (max-width: 768px) {
|
||||
.modal-content {
|
||||
width: 92%;
|
||||
}
|
||||
|
||||
.modal-header {
|
||||
padding: 18px;
|
||||
}
|
||||
|
||||
.edit-form {
|
||||
padding: 18px;
|
||||
}
|
||||
}
|
||||
</style>
|
||||
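The expiry bucketing used by the edit modal above (day difference at midnight, then four CSS classes) can be exercised outside the component. A minimal sketch with hypothetical helper names `daysUntil` and `classFor`; the thresholds mirror `getExpiryInputClass`:

```typescript
// Days from `now` until an ISO date, both normalized to local midnight
// (the same normalization getExpiryHint applies).
function daysUntil(isoDate: string, now: Date = new Date()): number {
  const today = new Date(now)
  today.setHours(0, 0, 0, 0)
  const expiry = new Date(isoDate + 'T00:00:00')
  expiry.setHours(0, 0, 0, 0)
  return Math.ceil((expiry.getTime() - today.getTime()) / (1000 * 60 * 60 * 24))
}

// Same buckets as getExpiryInputClass: expired / <=3 / <=7 / good.
function classFor(diffDays: number): string {
  if (diffDays < 0) return 'expiry-expired'
  if (diffDays <= 3) return 'expiry-soon'
  if (diffDays <= 7) return 'expiry-warning'
  return 'expiry-good'
}
```

Normalizing both dates to midnight matters: without it, a `Math.ceil` over a partial day can put the same date in a different bucket depending on the time of day.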
41
frontend/src/components/HelloWorld.vue
Normal file
@@ -0,0 +1,41 @@
<script setup lang="ts">
import { ref } from 'vue'

defineProps<{ msg: string }>()

const count = ref(0)
</script>

<template>
  <h1>{{ msg }}</h1>

  <div class="card">
    <button type="button" @click="count++">count is {{ count }}</button>
    <p>
      Edit
      <code>components/HelloWorld.vue</code> to test HMR
    </p>
  </div>

  <p>
    Check out
    <a href="https://vuejs.org/guide/quick-start.html#local" target="_blank"
      >create-vue</a
    >, the official Vue + Vite starter
  </p>
  <p>
    Learn more about IDE Support for Vue in the
    <a
      href="https://vuejs.org/guide/scaling-up/tooling.html#ide-support"
      target="_blank"
      >Vue Docs Scaling up Guide</a
    >.
  </p>
  <p class="read-the-docs">Click on the Vite and Vue logos to learn more</p>
</template>

<style scoped>
.read-the-docs {
  color: #888;
}
</style>
1281
frontend/src/components/InventoryList.vue
Normal file
File diff suppressed because it is too large
454
frontend/src/components/ReceiptsView.vue
Normal file
@@ -0,0 +1,454 @@
<template>
  <div class="receipts-view">
    <!-- Upload Section -->
    <div class="card">
      <h2>📸 Upload Receipt</h2>
      <div
        class="upload-area"
        @click="triggerFileInput"
        @dragover.prevent
        @drop.prevent="handleDrop"
      >
        <div class="upload-icon">🧾</div>
        <div class="upload-text">Click to upload or drag and drop</div>
        <div class="upload-hint">Supports JPG, PNG (max 10MB)</div>
      </div>
      <input
        ref="fileInput"
        type="file"
        accept="image/*"
        style="display: none"
        @change="handleFileSelect"
      />

      <div v-if="uploading" class="loading">
        <div class="spinner"></div>
        <p>Processing receipt...</p>
      </div>

      <div v-if="uploadResults.length > 0" class="results">
        <div
          v-for="(result, index) in uploadResults"
          :key="index"
          :class="['result-item', `result-${result.type}`]"
        >
          {{ result.message }}
        </div>
      </div>
    </div>

    <!-- Receipts List Section -->
    <div class="card">
      <h2>📋 Recent Receipts</h2>
      <div v-if="receipts.length === 0" style="text-align: center; color: var(--color-text-secondary)">
        <p>No receipts yet. Upload one above!</p>
      </div>
      <div v-else>
        <!-- Stats Summary -->
        <div class="grid-stats">
          <div class="stat-card">
            <div class="stat-value">{{ receipts.length }}</div>
            <div class="stat-label">Total Receipts</div>
          </div>
          <div class="stat-card">
            <div class="stat-value">${{ totalSpent.toFixed(2) }}</div>
            <div class="stat-label">Total Spent</div>
          </div>
          <div class="stat-card">
            <div class="stat-value">{{ totalItems }}</div>
            <div class="stat-label">Total Items</div>
          </div>
        </div>

        <!-- Receipts List -->
        <!-- OCR fields follow the nested ReceiptOCRData shape from services/api.ts -->
        <div class="receipts-list">
          <div
            v-for="receipt in receipts"
            :key="receipt.id"
            class="receipt-item"
          >
            <div class="receipt-info">
              <div class="receipt-merchant">
                {{ receipt.ocr_data?.merchant?.name || 'Processing...' }}
              </div>
              <div class="receipt-details">
                <span v-if="receipt.ocr_data?.transaction?.date">
                  📅 {{ formatDate(receipt.ocr_data.transaction.date) }}
                </span>
                <span v-if="receipt.ocr_data?.totals?.total">
                  💵 ${{ receipt.ocr_data.totals.total }}
                </span>
                <span v-if="receipt.ocr_data?.items">
                  📦 {{ receipt.ocr_data.items.length }} items
                </span>
                <span :class="getStatusClass(receipt.status)">
                  {{ receipt.status }}
                </span>
              </div>
            </div>
          </div>
        </div>

        <div style="margin-top: 20px">
          <button class="button" @click="exportCSV">📊 Download CSV</button>
          <button class="button" @click="exportExcel">📈 Download Excel</button>
        </div>
      </div>
    </div>
  </div>
</template>

<script setup lang="ts">
import { ref, computed, onMounted } from 'vue'
import { receiptsAPI } from '../services/api'

const fileInput = ref<HTMLInputElement | null>(null)
const uploading = ref(false)
const uploadResults = ref<Array<{ type: string; message: string }>>([])
const receipts = ref<any[]>([])

const totalSpent = computed(() => {
  return receipts.value.reduce((sum, receipt) => {
    // ReceiptOCRData nests the receipt total under `totals`
    const total = parseFloat(receipt.ocr_data?.totals?.total || 0)
    return sum + total
  }, 0)
})

const totalItems = computed(() => {
  return receipts.value.reduce((sum, receipt) => {
    const items = receipt.ocr_data?.items?.length || 0
    return sum + items
  }, 0)
})

function triggerFileInput() {
  fileInput.value?.click()
}

function handleDrop(e: DragEvent) {
  const files = e.dataTransfer?.files
  if (files && files.length > 0) {
    uploadFile(files[0]!)
  }
}

function handleFileSelect(e: Event) {
  const target = e.target as HTMLInputElement
  const files = target.files
  if (files && files.length > 0) {
    uploadFile(files[0]!)
  }
}

async function uploadFile(file: File) {
  uploading.value = true
  uploadResults.value = []

  try {
    const result = await receiptsAPI.upload(file)

    uploadResults.value.push({
      type: 'success',
      message: `Receipt uploaded! ID: ${result.id}`,
    })
    uploadResults.value.push({
      type: 'info',
      message: 'Processing in background...',
    })

    // Refresh receipts after a delay to allow background processing
    setTimeout(() => {
      loadReceipts()
    }, 3000)
  } catch (error: any) {
    uploadResults.value.push({
      type: 'error',
      message: `Upload failed: ${error.message}`,
    })
  } finally {
    uploading.value = false
    if (fileInput.value) {
      fileInput.value.value = ''
    }
  }
}

async function loadReceipts() {
  try {
    const data = await receiptsAPI.listReceipts()
    // Fetch OCR data for each receipt
    receipts.value = await Promise.all(
      data.map(async (receipt: any) => {
        try {
          const ocrData = await receiptsAPI.getOCRData(receipt.id)
          return { ...receipt, ocr_data: ocrData }
        } catch {
          return { ...receipt, ocr_data: null }
        }
      })
    )
  } catch (error) {
    console.error('Failed to load receipts:', error)
  }
}

function formatDate(dateString: string): string {
  const date = new Date(dateString)
  return date.toLocaleDateString()
}

function getStatusClass(status: string): string {
  const statusMap: Record<string, string> = {
    completed: 'status-success',
    processing: 'status-processing',
    failed: 'status-error',
  }
  return statusMap[status] || 'status-default'
}

function exportCSV() {
  const apiUrl = import.meta.env.VITE_API_URL || '/api/v1'
  window.open(`${apiUrl}/export/csv`, '_blank')
}

function exportExcel() {
  const apiUrl = import.meta.env.VITE_API_URL || '/api/v1'
  window.open(`${apiUrl}/export/excel`, '_blank')
}

onMounted(() => {
  loadReceipts()
})
</script>

<style scoped>
.receipts-view {
  display: flex;
  flex-direction: column;
  gap: 20px;
}

.card {
  background: var(--color-bg-card);
  border-radius: var(--radius-xl);
  padding: 30px;
  box-shadow: 0 10px 40px rgba(0, 0, 0, 0.2);
}

.card h2 {
  margin-bottom: 20px;
  color: var(--color-text-primary);
}

.upload-area {
  border: 3px dashed var(--color-primary);
  border-radius: var(--radius-lg);
  padding: 40px;
  text-align: center;
  cursor: pointer;
  transition: all 0.3s;
  background: var(--color-bg-secondary);
}

.upload-area:hover {
  border-color: var(--color-secondary);
  background: var(--color-bg-elevated);
}

.upload-icon {
  font-size: 48px;
  margin-bottom: 20px;
}

.upload-text {
  font-size: var(--font-size-lg);
  color: var(--color-text-primary);
  margin-bottom: 10px;
}

.upload-hint {
  font-size: var(--font-size-sm);
  color: var(--color-text-secondary);
}

.loading {
  text-align: center;
  padding: 20px;
  margin-top: 20px;
}

.spinner {
  border: 4px solid #f3f3f3;
  border-top: 4px solid #667eea;
  border-radius: 50%;
  width: 40px;
  height: 40px;
  animation: spin 1s linear infinite;
  margin: 0 auto 10px;
}

@keyframes spin {
  0% {
    transform: rotate(0deg);
  }
  100% {
    transform: rotate(360deg);
  }
}

.results {
  margin-top: 20px;
}

.result-item {
  padding: 15px;
  border-radius: var(--radius-md);
  margin-bottom: 10px;
}

.result-success {
  background: var(--color-success-bg);
  color: var(--color-success-dark);
  border: 1px solid var(--color-success-border);
}

.result-error {
  background: var(--color-error-bg);
  color: var(--color-error-dark);
  border: 1px solid var(--color-error-border);
}

.result-info {
  background: var(--color-info-bg);
  color: var(--color-info-dark);
  border: 1px solid var(--color-info-border);
}

/* Using .grid-stats from theme.css */

.stat-card {
  background: var(--color-bg-secondary);
  padding: 20px;
  border-radius: var(--radius-lg);
  text-align: center;
}

.stat-value {
  font-size: var(--font-size-2xl);
  font-weight: bold;
  color: var(--color-primary);
  margin-bottom: 5px;
}

.stat-label {
  font-size: var(--font-size-sm);
  color: var(--color-text-secondary);
}

.button {
  background: var(--gradient-primary);
  color: white;
  border: none;
  padding: 12px 30px;
  font-size: var(--font-size-base);
  border-radius: var(--radius-md);
  cursor: pointer;
  transition: transform 0.2s;
  margin-right: 10px;
}

.button:hover {
  transform: translateY(-2px);
}

.button:disabled {
  opacity: 0.5;
  cursor: not-allowed;
  transform: none;
}

.receipts-list {
  margin-top: 20px;
}

.receipt-item {
  background: var(--color-bg-secondary);
  padding: 15px;
  border-radius: var(--radius-md);
  margin-bottom: 10px;
  display: flex;
  justify-content: space-between;
  align-items: center;
}

.receipt-info {
  flex: 1;
}

.receipt-merchant {
  font-weight: 600;
  font-size: var(--font-size-base);
  margin-bottom: 5px;
  color: var(--color-text-primary);
}

.receipt-details {
  font-size: var(--font-size-sm);
  color: var(--color-text-secondary);
  display: flex;
  gap: 15px;
  flex-wrap: wrap;
}

.status-success {
  color: var(--color-success);
  font-weight: 600;
}

.status-processing {
  color: var(--color-warning);
  font-weight: 600;
}

.status-error {
  color: var(--color-error);
  font-weight: 600;
}

.status-default {
  color: var(--color-text-secondary);
}

/* Mobile Responsive - Handled by theme.css
   Component-specific overrides only below */

@media (max-width: 480px) {
  .stat-card {
    padding: 15px;
  }

  /* Receipt items stack content vertically */
  .receipt-item {
    flex-direction: column;
    align-items: flex-start;
    gap: 12px;
    padding: 12px;
  }

  .receipt-info {
    width: 100%;
  }

  .receipt-details {
    gap: 10px;
    font-size: var(--font-size-xs);
  }

  /* Buttons full width on mobile */
  .button {
    width: 100%;
    margin-right: 0;
    margin-bottom: 10px;
  }
}
</style>
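ReceiptsView's `totalSpent` and `totalItems` computeds are plain reductions over the loaded receipts. A framework-free sketch, using a hypothetical flattened `ReceiptSummary` shape for illustration (the component itself reads these values out of each receipt's OCR payload):

```typescript
// Hypothetical flattened receipt shape for illustration only.
interface ReceiptSummary {
  total?: number | string // OCR totals may arrive as strings
  itemCount?: number
}

// Sum of receipt totals, tolerating missing or string-typed values.
function sumSpent(receipts: ReceiptSummary[]): number {
  return receipts.reduce((sum, r) => sum + (parseFloat(String(r.total ?? 0)) || 0), 0)
}

// Total line-item count across receipts.
function sumItems(receipts: ReceiptSummary[]): number {
  return receipts.reduce((sum, r) => sum + (r.itemCount ?? 0), 0)
}
```

Coercing through `parseFloat(String(...))` keeps the reduction safe whether the OCR pipeline emits `"12.50"` or `12.5`.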
252
frontend/src/components/ToastNotification.vue
Normal file
@@ -0,0 +1,252 @@
<template>
  <Transition name="toast">
    <div v-if="visible" :class="['toast', type]" @click="close">
      <div class="toast-icon">{{ icon }}</div>
      <div class="toast-content">
        <div class="toast-message">{{ message }}</div>
      </div>
      <button class="toast-close" @click.stop="close">×</button>
    </div>
  </Transition>
</template>

<script setup lang="ts">
import { ref, watch, onMounted } from 'vue'

interface Props {
  message: string
  type?: 'success' | 'error' | 'warning' | 'info'
  duration?: number
  show?: boolean
}

const props = withDefaults(defineProps<Props>(), {
  type: 'info',
  duration: 3000,
  show: false,
})

const emit = defineEmits<{
  close: []
}>()

const visible = ref(props.show)
let timeout: number | null = null

const icon = {
  success: '✓',
  error: '✗',
  warning: '⚠',
  info: 'ℹ',
}[props.type]

watch(() => props.show, (newVal) => {
  if (newVal) {
    visible.value = true
    if (props.duration > 0) {
      if (timeout) clearTimeout(timeout)
      timeout = window.setTimeout(() => {
        close()
      }, props.duration)
    }
  } else {
    visible.value = false
  }
})

onMounted(() => {
  if (props.show && props.duration > 0) {
    timeout = window.setTimeout(() => {
      close()
    }, props.duration)
  }
})

function close() {
  visible.value = false
  if (timeout) {
    clearTimeout(timeout)
    timeout = null
  }
  emit('close')
}
</script>

<style scoped>
.toast {
  position: fixed;
  top: 20px;
  right: 20px;
  min-width: 300px;
  max-width: 500px;
  padding: var(--spacing-md);
  background: var(--color-bg-elevated);
  border: 1px solid var(--color-border);
  border-radius: var(--radius-lg);
  box-shadow: var(--shadow-lg);
  display: flex;
  align-items: center;
  gap: var(--spacing-md);
  z-index: 10000;
  cursor: pointer;
  transition: transform 0.2s ease;
}

.toast:hover {
  transform: translateY(-2px);
  box-shadow: var(--shadow-xl);
}

.toast-icon {
  font-size: var(--font-size-xl);
  flex-shrink: 0;
  width: 32px;
  height: 32px;
  display: flex;
  align-items: center;
  justify-content: center;
  border-radius: 50%;
}

.toast-content {
  flex: 1;
  min-width: 0;
}

.toast-message {
  color: var(--color-text-primary);
  font-size: var(--font-size-base);
  word-wrap: break-word;
}

.toast-close {
  background: none;
  border: none;
  color: var(--color-text-secondary);
  font-size: var(--font-size-2xl);
  cursor: pointer;
  padding: 0;
  width: 24px;
  height: 24px;
  display: flex;
  align-items: center;
  justify-content: center;
  flex-shrink: 0;
  transition: color 0.2s ease;
}

.toast-close:hover {
  color: var(--color-text-primary);
}

/* Type-specific styles */
.toast.success {
  border-left: 4px solid var(--color-success);
}

.toast.success .toast-icon {
  background: var(--color-success-bg);
  color: var(--color-success);
}

.toast.error {
  border-left: 4px solid var(--color-error);
}

.toast.error .toast-icon {
  background: var(--color-error-bg);
  color: var(--color-error);
}

.toast.warning {
  border-left: 4px solid var(--color-warning);
}

.toast.warning .toast-icon {
  background: var(--color-warning-bg);
  color: var(--color-warning);
}

.toast.info {
  border-left: 4px solid var(--color-info);
}

.toast.info .toast-icon {
  background: var(--color-info-bg);
  color: var(--color-info);
}

/* Animations */
.toast-enter-active,
.toast-leave-active {
  transition: all 0.3s ease;
}

.toast-enter-from {
  opacity: 0;
  transform: translateX(100%);
}

.toast-leave-to {
  opacity: 0;
  transform: translateX(100%);
}

/* Mobile Responsive */

/* Small phones (320px - 480px) */
@media (max-width: 480px) {
  .toast {
    top: 10px;
    right: 10px;
    left: 10px;
    min-width: auto;
    max-width: none;
    padding: var(--spacing-sm) var(--spacing-md);
    gap: var(--spacing-sm);
  }

  .toast-icon {
    font-size: var(--font-size-lg);
    width: 28px;
    height: 28px;
  }

  .toast-message {
    font-size: var(--font-size-sm);
  }

  .toast-close {
    font-size: var(--font-size-xl);
    width: 20px;
    height: 20px;
  }

  /* Adjust animation for centered toast */
  .toast-enter-from {
    transform: translateY(-100%);
  }

  .toast-leave-to {
    transform: translateY(-100%);
  }
}

/* Large phones and small tablets (481px - 768px) */
@media (min-width: 481px) and (max-width: 768px) {
  .toast {
    top: 15px;
    right: 15px;
    min-width: 250px;
    max-width: 400px;
  }
}

/* Tablets (769px - 1024px) */
@media (min-width: 769px) and (max-width: 1024px) {
  .toast {
    min-width: 280px;
    max-width: 450px;
  }
}
</style>
|
||||
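ToastNotification's timer handling follows a simple rule: showing again restarts the auto-dismiss timeout, and closing cancels any pending one. A framework-free sketch of the same logic (hypothetical `AutoDismiss` class; the real component does this with a `watch` on `props.show`):

```typescript
// Sketch of the auto-dismiss behavior in ToastNotification:
// show() restarts the countdown, close() cancels it and notifies.
class AutoDismiss {
  private timer: ReturnType<typeof setTimeout> | null = null
  visible = false

  constructor(private duration: number, private onClose: () => void) {}

  show(): void {
    this.visible = true
    if (this.duration > 0) {
      if (this.timer) clearTimeout(this.timer) // restart on repeated show
      this.timer = setTimeout(() => this.close(), this.duration)
    }
  }

  close(): void {
    this.visible = false
    if (this.timer) {
      clearTimeout(this.timer)
      this.timer = null
    }
    this.onClose()
  }
}
```

A `duration` of `0` disables auto-dismiss entirely, matching the component's `duration > 0` guard.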
11
frontend/src/main.ts
Normal file
@@ -0,0 +1,11 @@
import { createApp } from 'vue'
import { createPinia } from 'pinia'
import './style.css'
import './theme.css'
import App from './App.vue'

const app = createApp(App)
const pinia = createPinia()

app.use(pinia)
app.mount('#app')
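api.ts (next file) derives its axios `baseURL` from the build-time `VITE_API_BASE` variable described in its header comment. The resolution reduces to a one-line concatenation; a sketch with a hypothetical `apiBase` helper:

```typescript
// Mirrors api.ts: empty base in dev (Vite proxy handles /api/),
// '/kiwi' prefix in cloud mode behind Caddy.
function apiBase(viteApiBase: string | undefined): string {
  return (viteApiBase ?? '') + '/api/v1'
}
```

So an unset variable yields `/api/v1` (dev, proxied), while `'/kiwi'` yields `/kiwi/api/v1` (cloud, path-prefixed).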
407
frontend/src/services/api.ts
Normal file
@@ -0,0 +1,407 @@
/**
 * API Service for Kiwi Backend
 *
 * VITE_API_BASE is baked in at build time:
 *   dev:   '' (empty — proxy in vite.config.ts handles /api/)
 *   cloud: '/kiwi' (Caddy strips /kiwi and forwards to nginx, which proxies /api/ → api container)
 */

import axios, { type AxiosInstance } from 'axios'

// API Configuration
const API_BASE_URL = (import.meta.env.VITE_API_BASE ?? '') + '/api/v1'

// Create axios instance
const api: AxiosInstance = axios.create({
  baseURL: API_BASE_URL,
  headers: {
    'Content-Type': 'application/json',
  },
  timeout: 30000, // 30 seconds
})

// Request interceptor for logging
api.interceptors.request.use(
  (config) => {
    console.log(`[API] ${config.method?.toUpperCase()} ${config.baseURL}${config.url}`, {
      params: config.params,
      data: config.data instanceof FormData ? '<FormData>' : config.data,
    })
    return config
  },
  (error) => {
    console.error('[API Request Error]', error)
    return Promise.reject(error)
  }
)

// Response interceptor for error handling
api.interceptors.response.use(
  (response) => {
    console.log(`[API] ✓ ${response.status} ${response.config.method?.toUpperCase()} ${response.config.url}`)
    return response
  },
  (error) => {
    console.error('[API Error]', {
      message: error.message,
      url: error.config?.url,
      method: error.config?.method?.toUpperCase(),
      status: error.response?.status,
      statusText: error.response?.statusText,
      data: error.response?.data,
      baseURL: error.config?.baseURL,
    })
    return Promise.reject(error)
  }
)

// ========== Types ==========

export interface Product {
  id: string
  barcode: string | null
  name: string
  brand: string | null
  category: string | null
  description: string | null
  image_url: string | null
  nutrition_data: Record<string, any>
  source: string
  tags: Tag[]
}

export interface Tag {
  id: string
  name: string
  slug: string
  description: string | null
  color: string | null
  category: string | null
}

export interface InventoryItem {
  id: string
  product_id: string
  product: Product
  quantity: number
  unit: string
  location: string
  sublocation: string | null
  purchase_date: string | null
  expiration_date: string | null
  status: string
  source: string
  notes: string | null
  created_at: string
  updated_at: string
}

export interface InventoryItemUpdate {
  quantity?: number
  unit?: string
  location?: string
  sublocation?: string | null
  purchase_date?: string | null
  expiration_date?: string | null
  status?: string
  notes?: string | null
}

export interface InventoryStats {
  total_items: number
  total_products: number
  expiring_soon: number
  expired: number
  items_by_location: Record<string, number>
  items_by_status: Record<string, number>
}

export interface Receipt {
  id: string
  filename: string
  status: string
  metadata: Record<string, any>
  quality_score: number | null
}

export interface ReceiptOCRData {
  id: string
  receipt_id: string
  merchant: {
    name: string | null
    address: string | null
    phone: string | null
  }
  transaction: {
    date: string | null
    time: string | null
    receipt_number: string | null
    register: string | null
    cashier: string | null
  }
  items: Array<{
    name: string
    quantity: number
    unit_price: number | null
    total_price: number
    category: string | null
  }>
  totals: {
    subtotal: number | null
    tax: number | null
    total: number | null
    payment_method: string | null
  }
  confidence: Record<string, number>
  warnings: string[]
  processing_time: number | null
  created_at: string
}

// ========== Inventory API ==========

export const inventoryAPI = {
  /**
   * List all inventory items
   */
  async listItems(params?: {
    location?: string
    status?: string
    limit?: number
    offset?: number
  }): Promise<InventoryItem[]> {
    const response = await api.get('/inventory/items', { params })
    return response.data
  },

  /**
   * Get a single inventory item
   */
  async getItem(itemId: string): Promise<InventoryItem> {
    const response = await api.get(`/inventory/items/${itemId}`)
    return response.data
  },

  /**
   * Update an inventory item
   */
  async updateItem(itemId: string, update: InventoryItemUpdate): Promise<InventoryItem> {
    const response = await api.patch(`/inventory/items/${itemId}`, update)
    return response.data
  },

  /**
   * Delete an inventory item
   */
  async deleteItem(itemId: string): Promise<void> {
    await api.delete(`/inventory/items/${itemId}`)
  },

  /**
   * Get inventory statistics
   */
  async getStats(): Promise<InventoryStats> {
    const response = await api.get('/inventory/stats')
    return response.data
  },

  /**
   * Get items expiring soon
   */
  async getExpiring(days: number = 7): Promise<any[]> {
    const response = await api.get(`/inventory/expiring?days=${days}`)
    return response.data
  },

  /**
   * Scan barcode from text
   */
  async scanBarcodeText(
    barcode: string,
    location: string = 'pantry',
    quantity: number = 1.0,
    autoAdd: boolean = true
  ): Promise<any> {
    const response = await api.post('/inventory/scan/text', {
      barcode,
      location,
      quantity,
      auto_add_to_inventory: autoAdd,
    })
    return response.data
  },

  /**
   * Mark item as consumed
   */
  async consumeItem(itemId: string): Promise<void> {
    await api.post(`/inventory/items/${itemId}/consume`)
  },

  /**
   * Create a new product
   */
  async createProduct(data: {
    name: string
    brand?: string
    source?: string
  }): Promise<Product> {
    const response = await api.post('/inventory/products', data)
    return response.data
  },

  /**
   * Create a new inventory item
   */
  async createItem(data: {
    product_id: string
    quantity: number
    unit?: string
    location: string
    expiration_date?: string
    source?: string
  }): Promise<InventoryItem> {
    const response = await api.post('/inventory/items', data)
    return response.data
  },

  /**
   * Scan barcode from image
   */
  async scanBarcodeImage(
    file: File,
    location: string = 'pantry',
    quantity: number = 1.0,
    autoAdd: boolean = true
  ): Promise<any> {
    const formData = new FormData()
    formData.append('file', file)
    formData.append('location', location)
    formData.append('quantity', quantity.toString())
    formData.append('auto_add_to_inventory', autoAdd.toString())

    const response = await api.post('/inventory/scan', formData, {
      headers: {
        'Content-Type': 'multipart/form-data',
      },
    })
    return response.data
  },
}

// ========== Receipts API ==========

export const receiptsAPI = {
  /**
   * List all receipts
   */
  async listReceipts(): Promise<Receipt[]> {
    const response = await api.get('/receipts/')
    return response.data
  },

  /**
   * Get a single receipt
   */
  async getReceipt(receiptId: string): Promise<Receipt> {
    const response = await api.get(`/receipts/${receiptId}`)
    return response.data
  },

  /**
   * Upload a receipt
   */
  async upload(file: File): Promise<Receipt> {
    const formData = new FormData()
    formData.append('file', file)

    const response = await api.post('/receipts/', formData, {
      headers: {
        'Content-Type': 'multipart/form-data',
      },
    })
    return response.data
  },

  /**
   * Get receipt statistics
   */
  async getStats(): Promise<any> {
    const response = await api.get('/export/stats')
    return response.data
  },

  /**
   * Get OCR data for a receipt
   */
  async getOCRData(receiptId: string): Promise<ReceiptOCRData> {
    const response = await api.get(`/receipts/${receiptId}/ocr/data`)
    return response.data
  },

  /**
   * Get OCR status for a receipt
   */
  async getOCRStatus(receiptId: string): Promise<any> {
|
||||
const response = await api.get(`/receipts/${receiptId}/ocr/status`)
|
||||
return response.data
|
||||
},
|
||||
|
||||
/**
|
||||
* Trigger OCR processing
|
||||
*/
|
||||
async triggerOCR(receiptId: string, forceReprocess: boolean = false): Promise<any> {
|
||||
const response = await api.post(`/receipts/${receiptId}/ocr/trigger`, {
|
||||
force_reprocess: forceReprocess,
|
||||
})
|
||||
return response.data
|
||||
},
|
||||
}
|
||||
|
||||
// ========== Export API ==========
|
||||
|
||||
export const exportAPI = {
|
||||
/**
|
||||
* Get export statistics
|
||||
*/
|
||||
async getStats(): Promise<any> {
|
||||
const response = await api.get('/export/stats')
|
||||
return response.data
|
||||
},
|
||||
|
||||
/**
|
||||
* Download inventory CSV
|
||||
*/
|
||||
getInventoryCSVUrl(location?: string, status: string = 'available'): string {
|
||||
const params = new URLSearchParams()
|
||||
if (location) params.append('location', location)
|
||||
params.append('status', status)
|
||||
return `${API_BASE_URL}/export/inventory/csv?${params.toString()}`
|
||||
},
|
||||
|
||||
/**
|
||||
* Download inventory Excel
|
||||
*/
|
||||
getInventoryExcelUrl(location?: string, status: string = 'available'): string {
|
||||
const params = new URLSearchParams()
|
||||
if (location) params.append('location', location)
|
||||
params.append('status', status)
|
||||
return `${API_BASE_URL}/export/inventory/excel?${params.toString()}`
|
||||
},
|
||||
|
||||
/**
|
||||
* Download receipts CSV
|
||||
*/
|
||||
getReceiptsCSVUrl(): string {
|
||||
return `${API_BASE_URL}/export/csv`
|
||||
},
|
||||
|
||||
/**
|
||||
* Download receipts Excel
|
||||
*/
|
||||
getReceiptsExcelUrl(): string {
|
||||
return `${API_BASE_URL}/export/excel`
|
||||
},
|
||||
}
|
||||
|
||||
export default api
|
||||
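The export URL helpers above only assemble query strings with `URLSearchParams`; the pattern can be sketched in isolation (the `API_BASE_URL` value here is a placeholder assumption, not the app's configured client):

```typescript
// Standalone sketch of the query-string assembly used by getInventoryCSVUrl.
// API_BASE_URL is a hypothetical value for illustration only.
const API_BASE_URL = '/api/v1'

function inventoryCSVUrl(location?: string, status: string = 'available'): string {
  const params = new URLSearchParams()
  if (location) params.append('location', location)
  params.append('status', status)
  return `${API_BASE_URL}/export/inventory/csv?${params.toString()}`
}
```

Because `location` is only appended when truthy, the default call yields just `?status=available`.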
173
frontend/src/stores/inventory.ts
Normal file
@@ -0,0 +1,173 @@
/**
 * Inventory Store
 *
 * Manages inventory items, products, and related state using Pinia.
 */

import { defineStore } from 'pinia'
import { ref, computed } from 'vue'
import { inventoryAPI, type InventoryItem, type InventoryStats, type InventoryItemUpdate } from '../services/api'

export const useInventoryStore = defineStore('inventory', () => {
  // State
  const items = ref<InventoryItem[]>([])
  const stats = ref<InventoryStats | null>(null)
  const loading = ref(false)
  const error = ref<string | null>(null)

  // Filter state
  const locationFilter = ref<string>('all')
  const statusFilter = ref<string>('available')

  // Computed
  const filteredItems = computed(() => {
    return items.value.filter((item) => {
      const matchesLocation = locationFilter.value === 'all' || item.location === locationFilter.value
      const matchesStatus = statusFilter.value === 'all' || item.status === statusFilter.value
      return matchesLocation && matchesStatus
    })
  })

  const expiringItems = computed(() => {
    const today = new Date()
    const weekFromNow = new Date(today.getTime() + 7 * 24 * 60 * 60 * 1000)

    return items.value.filter((item) => {
      if (!item.expiration_date || item.status !== 'available') return false
      const expiryDate = new Date(item.expiration_date)
      return expiryDate >= today && expiryDate <= weekFromNow
    })
  })

  const expiredItems = computed(() => {
    const today = new Date()

    return items.value.filter((item) => {
      if (!item.expiration_date || item.status !== 'available') return false
      const expiryDate = new Date(item.expiration_date)
      return expiryDate < today
    })
  })

  // Actions
  async function fetchItems() {
    loading.value = true
    error.value = null

    try {
      items.value = await inventoryAPI.listItems({
        status: statusFilter.value === 'all' ? undefined : statusFilter.value,
        location: locationFilter.value === 'all' ? undefined : locationFilter.value,
        limit: 1000,
      })
    } catch (err: any) {
      error.value = err.response?.data?.detail || 'Failed to fetch inventory items'
      console.error('Error fetching inventory:', err)
    } finally {
      loading.value = false
    }
  }

  async function fetchStats() {
    try {
      stats.value = await inventoryAPI.getStats()
    } catch (err: any) {
      console.error('Error fetching stats:', err)
    }
  }

  async function updateItem(itemId: string, update: InventoryItemUpdate) {
    loading.value = true
    error.value = null

    try {
      const updatedItem = await inventoryAPI.updateItem(itemId, update)

      // Update in local state
      const index = items.value.findIndex((item) => item.id === itemId)
      if (index !== -1) {
        items.value[index] = updatedItem
      }

      return updatedItem
    } catch (err: any) {
      error.value = err.response?.data?.detail || 'Failed to update item'
      console.error('Error updating item:', err)
      throw err
    } finally {
      loading.value = false
    }
  }

  async function deleteItem(itemId: string) {
    loading.value = true
    error.value = null

    try {
      await inventoryAPI.deleteItem(itemId)

      // Remove from local state
      items.value = items.value.filter((item) => item.id !== itemId)
    } catch (err: any) {
      error.value = err.response?.data?.detail || 'Failed to delete item'
      console.error('Error deleting item:', err)
      throw err
    } finally {
      loading.value = false
    }
  }

  async function scanBarcode(barcode: string, location: string = 'pantry', quantity: number = 1) {
    loading.value = true
    error.value = null

    try {
      const result = await inventoryAPI.scanBarcodeText(barcode, location, quantity, true)

      // Refresh items after successful scan
      if (result.success) {
        await fetchItems()
      }

      return result
    } catch (err: any) {
      error.value = err.response?.data?.detail || 'Failed to scan barcode'
      console.error('Error scanning barcode:', err)
      throw err
    } finally {
      loading.value = false
    }
  }

  function setLocationFilter(location: string) {
    locationFilter.value = location
  }

  function setStatusFilter(status: string) {
    statusFilter.value = status
  }

  return {
    // State
    items,
    stats,
    loading,
    error,
    locationFilter,
    statusFilter,

    // Computed
    filteredItems,
    expiringItems,
    expiredItems,

    // Actions
    fetchItems,
    fetchStats,
    updateItem,
    deleteItem,
    scanBarcode,
    setLocationFilter,
    setStatusFilter,
  }
})
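The `expiringItems` computed above keys off a fixed 7-day window. Stripped of the Pinia reactivity, the date check amounts to the following (a minimal sketch with a hypothetical `ItemLike` shape, not the store's real types):

```typescript
// Standalone sketch of the expiring-soon check: an available item whose
// expiry falls between `now` and `now + days` (inclusive) qualifies.
interface ItemLike {
  expiration_date?: string
  status: string
}

function isExpiringSoon(item: ItemLike, now: Date, days: number = 7): boolean {
  if (!item.expiration_date || item.status !== 'available') return false
  const expiry = new Date(item.expiration_date)
  const windowEnd = new Date(now.getTime() + days * 24 * 60 * 60 * 1000)
  return expiry >= now && expiry <= windowEnd
}
```

Note that, as in the store, items already past `now` are excluded here and land in `expiredItems` instead.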
255
frontend/src/style.css
Normal file
@@ -0,0 +1,255 @@
:root {
  font-family: system-ui, Avenir, Helvetica, Arial, sans-serif;
  line-height: 1.5;
  font-weight: 400;

  color-scheme: light dark;

  font-synthesis: none;
  text-rendering: optimizeLegibility;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;

  /* Theme Colors - Dark Mode (Default) */
  --color-text-primary: rgba(255, 255, 255, 0.87);
  --color-text-secondary: rgba(255, 255, 255, 0.6);
  --color-text-muted: rgba(255, 255, 255, 0.4);

  --color-bg-primary: #242424;
  --color-bg-secondary: #1a1a1a;
  --color-bg-elevated: #2d2d2d;
  --color-bg-card: #2d2d2d;
  --color-bg-input: #1a1a1a;

  --color-border: rgba(255, 255, 255, 0.1);
  --color-border-focus: rgba(255, 255, 255, 0.2);

  /* Brand Colors */
  --color-primary: #667eea;
  --color-primary-dark: #5568d3;
  --color-primary-light: #7d8ff0;
  --color-secondary: #764ba2;

  /* Status Colors */
  --color-success: #4CAF50;
  --color-success-dark: #45a049;
  --color-success-light: #66bb6a;
  --color-success-bg: rgba(76, 175, 80, 0.1);
  --color-success-border: rgba(76, 175, 80, 0.3);

  --color-warning: #ff9800;
  --color-warning-dark: #f57c00;
  --color-warning-light: #ffb74d;
  --color-warning-bg: rgba(255, 152, 0, 0.1);
  --color-warning-border: rgba(255, 152, 0, 0.3);

  --color-error: #f44336;
  --color-error-dark: #d32f2f;
  --color-error-light: #ff6b6b;
  --color-error-bg: rgba(244, 67, 54, 0.1);
  --color-error-border: rgba(244, 67, 54, 0.3);

  --color-info: #2196F3;
  --color-info-dark: #1976D2;
  --color-info-light: #64b5f6;
  --color-info-bg: rgba(33, 150, 243, 0.1);
  --color-info-border: rgba(33, 150, 243, 0.3);

  /* Gradient */
  --gradient-primary: linear-gradient(135deg, var(--color-primary) 0%, var(--color-secondary) 100%);

  /* Shadows */
  --shadow-sm: 0 1px 3px rgba(0, 0, 0, 0.3);
  --shadow-md: 0 4px 6px rgba(0, 0, 0, 0.3);
  --shadow-lg: 0 10px 20px rgba(0, 0, 0, 0.4);
  --shadow-xl: 0 20px 40px rgba(0, 0, 0, 0.5);

  /* Typography */
  --font-size-xs: 12px;
  --font-size-sm: 14px;
  --font-size-base: 16px;
  --font-size-lg: 18px;
  --font-size-xl: 24px;
  --font-size-2xl: 32px;

  /* Spacing */
  --spacing-xs: 4px;
  --spacing-sm: 8px;
  --spacing-md: 16px;
  --spacing-lg: 24px;
  --spacing-xl: 32px;

  /* Border Radius */
  --radius-sm: 4px;
  --radius-md: 6px;
  --radius-lg: 8px;
  --radius-xl: 12px;

  color: var(--color-text-primary);
  background-color: var(--color-bg-primary);
}

a {
  font-weight: 500;
  color: #646cff;
  text-decoration: inherit;
}
a:hover {
  color: #535bf2;
}

body {
  margin: 0;
  display: flex;
  place-items: center;
  min-width: 320px;
  min-height: 100vh;
}

h1 {
  font-size: 3.2em;
  line-height: 1.1;
}

button {
  border-radius: 8px;
  border: 1px solid transparent;
  padding: 0.6em 1.2em;
  font-size: 1em;
  font-weight: 500;
  font-family: inherit;
  background-color: #1a1a1a;
  cursor: pointer;
  transition: border-color 0.25s;
}
button:hover {
  border-color: #646cff;
}
button:focus,
button:focus-visible {
  outline: 4px auto -webkit-focus-ring-color;
}

.card {
  padding: 2em;
}

#app {
  max-width: 1280px;
  margin: 0 auto;
  padding: 2rem;
  text-align: center;
}

@media (prefers-color-scheme: light) {
  :root {
    /* Theme Colors - Light Mode */
    --color-text-primary: #213547;
    --color-text-secondary: #666;
    --color-text-muted: #999;

    --color-bg-primary: #f5f5f5;
    --color-bg-secondary: #ffffff;
    --color-bg-elevated: #ffffff;
    --color-bg-card: #ffffff;
    --color-bg-input: #ffffff;

    --color-border: #ddd;
    --color-border-focus: #ccc;

    /* Status colors stay the same in light mode */
    /* But we adjust backgrounds for better contrast */
    --color-success-bg: #d4edda;
    --color-success-border: #c3e6cb;

    --color-warning-bg: #fff3cd;
    --color-warning-border: #ffeaa7;

    --color-error-bg: #f8d7da;
    --color-error-border: #f5c6cb;

    --color-info-bg: #d1ecf1;
    --color-info-border: #bee5eb;

    /* Shadows for light mode (lighter) */
    --shadow-sm: 0 1px 3px rgba(0, 0, 0, 0.1);
    --shadow-md: 0 4px 6px rgba(0, 0, 0, 0.1);
    --shadow-lg: 0 10px 20px rgba(0, 0, 0, 0.15);
    --shadow-xl: 0 20px 40px rgba(0, 0, 0, 0.2);

    color: var(--color-text-primary);
    background-color: var(--color-bg-primary);
  }

  a:hover {
    color: #747bff;
  }

  button {
    background-color: #f9f9f9;
  }
}

/* Mobile Responsive Typography and Spacing */
@media (max-width: 480px) {
  :root {
    /* Reduce font sizes for mobile */
    --font-size-xs: 11px;
    --font-size-sm: 13px;
    --font-size-base: 14px;
    --font-size-lg: 16px;
    --font-size-xl: 20px;
    --font-size-2xl: 24px;

    /* Reduce spacing for mobile */
    --spacing-xs: 4px;
    --spacing-sm: 6px;
    --spacing-md: 12px;
    --spacing-lg: 16px;
    --spacing-xl: 20px;

    /* Reduce shadows for mobile */
    --shadow-sm: 0 1px 2px rgba(0, 0, 0, 0.2);
    --shadow-md: 0 2px 4px rgba(0, 0, 0, 0.2);
    --shadow-lg: 0 4px 8px rgba(0, 0, 0, 0.3);
    --shadow-xl: 0 8px 16px rgba(0, 0, 0, 0.4);
  }

  h1 {
    font-size: 2em;
  }

  .card {
    padding: 1em;
  }

  #app {
    padding: 1rem;
  }
}

@media (min-width: 481px) and (max-width: 768px) {
  :root {
    /* Slightly reduced sizes for tablets */
    --font-size-base: 15px;
    --font-size-lg: 17px;
    --font-size-xl: 22px;
    --font-size-2xl: 28px;

    --spacing-md: 14px;
    --spacing-lg: 20px;
    --spacing-xl: 28px;
  }

  h1 {
    font-size: 2.5em;
  }

  .card {
    padding: 1.5em;
  }

  #app {
    padding: 1.5rem;
  }
}
536
frontend/src/theme.css
Normal file
@@ -0,0 +1,536 @@
/**
 * Central Theme System for Project Thoth
 *
 * This file contains all reusable, theme-aware, responsive CSS classes.
 * Components should use these classes instead of custom styles where possible.
 */

/* ============================================
   LAYOUT UTILITIES - RESPONSIVE GRIDS
   ============================================ */

/* Responsive Grid - Automatically adjusts columns based on screen size */
.grid-responsive {
  display: grid;
  gap: var(--spacing-md);
}

/* Mobile: 1 column, Tablet: 2 columns, Desktop: 3+ columns */
.grid-auto {
  display: grid;
  gap: var(--spacing-md);
  grid-template-columns: 1fr; /* Default to single column */
}

/* Stats grid - always fills available space */
.grid-stats {
  display: grid;
  gap: var(--spacing-md);
  grid-template-columns: 1fr; /* Default to single column */
}

/* Force specific column counts */
.grid-1 { grid-template-columns: 1fr; }
.grid-2 { grid-template-columns: repeat(2, 1fr); }
.grid-3 { grid-template-columns: repeat(3, 1fr); }
.grid-4 { grid-template-columns: repeat(4, 1fr); }

/* ============================================
   FLEXBOX UTILITIES - RESPONSIVE
   ============================================ */

.flex { display: flex; }
.flex-col { display: flex; flex-direction: column; }
.flex-wrap { flex-wrap: wrap; }
.flex-center {
  display: flex;
  align-items: center;
  justify-content: center;
}
.flex-between {
  display: flex;
  justify-content: space-between;
  align-items: center;
}
.flex-start {
  display: flex;
  justify-content: flex-start;
  align-items: center;
}
.flex-end {
  display: flex;
  justify-content: flex-end;
  align-items: center;
}

/* Stack on mobile, horizontal on desktop */
.flex-responsive {
  display: flex;
  gap: var(--spacing-md);
  flex-wrap: wrap;
}

/* ============================================
   SPACING UTILITIES
   ============================================ */

/* Gaps */
.gap-xs { gap: var(--spacing-xs); }
.gap-sm { gap: var(--spacing-sm); }
.gap-md { gap: var(--spacing-md); }
.gap-lg { gap: var(--spacing-lg); }
.gap-xl { gap: var(--spacing-xl); }

/* Padding */
.p-0 { padding: 0; }
.p-xs { padding: var(--spacing-xs); }
.p-sm { padding: var(--spacing-sm); }
.p-md { padding: var(--spacing-md); }
.p-lg { padding: var(--spacing-lg); }
.p-xl { padding: var(--spacing-xl); }

/* Margin */
.m-0 { margin: 0; }
.m-xs { margin: var(--spacing-xs); }
.m-sm { margin: var(--spacing-sm); }
.m-md { margin: var(--spacing-md); }
.m-lg { margin: var(--spacing-lg); }
.m-xl { margin: var(--spacing-xl); }

/* Margin/Padding specific sides */
.mt-md { margin-top: var(--spacing-md); }
.mb-md { margin-bottom: var(--spacing-md); }
.ml-md { margin-left: var(--spacing-md); }
.mr-md { margin-right: var(--spacing-md); }

.pt-md { padding-top: var(--spacing-md); }
.pb-md { padding-bottom: var(--spacing-md); }
.pl-md { padding-left: var(--spacing-md); }
.pr-md { padding-right: var(--spacing-md); }

/* ============================================
   CARD COMPONENTS - THEME AWARE
   ============================================ */

.card {
  background: var(--color-bg-card);
  border-radius: var(--radius-xl);
  padding: var(--spacing-xl);
  box-shadow: var(--shadow-md);
  transition: box-shadow 0.2s ease;
}

.card:hover {
  box-shadow: var(--shadow-lg);
}

.card-sm {
  background: var(--color-bg-card);
  border-radius: var(--radius-lg);
  padding: var(--spacing-md);
  box-shadow: var(--shadow-sm);
}

.card-secondary {
  background: var(--color-bg-secondary);
  border-radius: var(--radius-lg);
  padding: var(--spacing-lg);
  box-shadow: var(--shadow-sm);
}

/* Status border variants */
.card-success { border-left: 4px solid var(--color-success); }
.card-warning { border-left: 4px solid var(--color-warning); }
.card-error { border-left: 4px solid var(--color-error); }
.card-info { border-left: 4px solid var(--color-info); }

/* ============================================
   BUTTON COMPONENTS - THEME AWARE
   ============================================ */

.btn {
  padding: var(--spacing-sm) var(--spacing-md);
  border: none;
  border-radius: var(--radius-md);
  font-size: var(--font-size-sm);
  font-weight: 600;
  cursor: pointer;
  transition: all 0.2s ease;
  white-space: nowrap;
}

.btn:hover {
  transform: translateY(-1px);
}

.btn:active {
  transform: translateY(0);
}

.btn:disabled {
  opacity: 0.5;
  cursor: not-allowed;
  transform: none;
}

/* Button variants */
.btn-primary {
  background: var(--gradient-primary);
  color: white;
  border: none;
}

.btn-success {
  background: var(--color-success);
  color: white;
}

.btn-success:hover:not(:disabled) {
  background: var(--color-success-dark);
}

.btn-error {
  background: var(--color-error);
  color: white;
}

.btn-error:hover:not(:disabled) {
  background: var(--color-error-dark);
}

.btn-info {
  background: var(--color-info);
  color: white;
}

.btn-info:hover:not(:disabled) {
  background: var(--color-info-dark);
}

.btn-secondary {
  background: var(--color-bg-secondary);
  color: var(--color-text-primary);
  border: 2px solid var(--color-border);
}

.btn-secondary:hover:not(:disabled) {
  background: var(--color-bg-primary);
  border-color: var(--color-primary);
}

.btn-secondary.active {
  background: var(--gradient-primary);
  color: white;
  border-color: var(--color-primary);
}

/* Button sizes */
.btn-sm {
  padding: var(--spacing-xs) var(--spacing-sm);
  font-size: var(--font-size-xs);
}

.btn-lg {
  padding: var(--spacing-md) var(--spacing-xl);
  font-size: var(--font-size-lg);
}

/* ============================================
   FORM COMPONENTS - THEME AWARE
   ============================================ */

.form-group {
  margin-bottom: var(--spacing-md);
}

.form-label {
  display: block;
  margin-bottom: var(--spacing-sm);
  font-weight: 600;
  color: var(--color-text-primary);
  font-size: var(--font-size-sm);
}

.form-input,
.form-select,
.form-textarea {
  width: 100%;
  padding: var(--spacing-sm) var(--spacing-md);
  border: 1px solid var(--color-border);
  border-radius: var(--radius-md);
  background: var(--color-bg-input);
  color: var(--color-text-primary);
  font-size: var(--font-size-sm);
  transition: border-color 0.2s ease, box-shadow 0.2s ease;
}

.form-input:focus,
.form-select:focus,
.form-textarea:focus {
  outline: none;
  border-color: var(--color-primary);
  box-shadow: 0 0 0 3px rgba(102, 126, 234, 0.1);
}

.form-textarea {
  resize: vertical;
  min-height: 80px;
  font-family: inherit;
}

/* Form layouts */
.form-row {
  display: grid;
  gap: var(--spacing-md);
  grid-template-columns: 1fr;
}

/* ============================================
   TEXT UTILITIES
   ============================================ */

.text-xs { font-size: var(--font-size-xs); }
.text-sm { font-size: var(--font-size-sm); }
.text-base { font-size: var(--font-size-base); }
.text-lg { font-size: var(--font-size-lg); }
.text-xl { font-size: var(--font-size-xl); }
.text-2xl { font-size: var(--font-size-2xl); }

.text-primary { color: var(--color-text-primary); }
.text-secondary { color: var(--color-text-secondary); }
.text-muted { color: var(--color-text-muted); }

.text-success { color: var(--color-success); }
.text-warning { color: var(--color-warning); }
.text-error { color: var(--color-error); }
.text-info { color: var(--color-info); }

.text-center { text-align: center; }
.text-left { text-align: left; }
.text-right { text-align: right; }

.font-bold { font-weight: 700; }
.font-semibold { font-weight: 600; }
.font-normal { font-weight: 400; }

/* ============================================
   RESPONSIVE UTILITIES
   ============================================ */

/* Show/Hide based on screen size */
.mobile-only { display: none; }
.desktop-only { display: block; }

/* Width utilities */
.w-full { width: 100%; }
.w-auto { width: auto; }

/* Height utilities */
.h-full { height: 100%; }
.h-auto { height: auto; }

/* ============================================
   MOBILE BREAKPOINTS (≤480px)
   ============================================ */

@media (max-width: 480px) {
  /* Show/Hide */
  .mobile-only { display: block; }
  .desktop-only { display: none; }

  /* Grids already default to 1fr, just ensure it stays that way */
  .grid-2,
  .grid-3,
  .grid-4 {
    grid-template-columns: 1fr !important;
  }

  /* Stack flex items vertically */
  .flex-responsive {
    flex-direction: column;
  }

  /* Buttons take full width */
  .btn-mobile-full {
    width: 100%;
    min-width: 100%;
  }

  /* Reduce card padding on mobile */
  .card {
    padding: var(--spacing-md);
  }

  .card-sm {
    padding: var(--spacing-sm);
  }

  /* Allow text wrapping on mobile */
  .btn {
    white-space: normal;
    text-align: center;
  }
}

/* ============================================
   TABLET BREAKPOINTS (481px - 768px)
   ============================================ */

@media (min-width: 481px) and (max-width: 768px) {
  /* 2-column layouts on tablets */
  .grid-3,
  .grid-4 {
    grid-template-columns: repeat(2, 1fr);
  }

  .grid-auto {
    grid-template-columns: repeat(2, 1fr);
  }

  .grid-stats {
    grid-template-columns: repeat(2, 1fr);
  }

  .form-row {
    grid-template-columns: 1fr 1fr;
  }
}

/* ============================================
   DESKTOP BREAKPOINTS (769px - 1024px)
   ============================================ */

@media (min-width: 769px) and (max-width: 1024px) {
  .grid-auto {
    grid-template-columns: repeat(3, 1fr);
  }

  .grid-stats {
    grid-template-columns: repeat(3, 1fr);
  }

  .grid-4 {
    grid-template-columns: repeat(3, 1fr);
  }
}

/* ============================================
   LARGE DESKTOP (≥1025px)
   ============================================ */

@media (min-width: 1025px) {
  .grid-auto {
    grid-template-columns: repeat(auto-fill, minmax(300px, 1fr));
  }

  .grid-stats {
    grid-template-columns: repeat(auto-fit, minmax(200px, 1fr));
  }

  .form-row {
    grid-template-columns: 1fr 1fr;
  }
}

/* ============================================
   STATUS & STATE UTILITIES
   ============================================ */

.status-badge {
  display: inline-block;
  padding: var(--spacing-xs) var(--spacing-sm);
  border-radius: var(--radius-sm);
  font-size: var(--font-size-xs);
  font-weight: 600;
}

.status-success {
  background: var(--color-success-bg);
  color: var(--color-success-dark);
  border: 1px solid var(--color-success-border);
}

.status-warning {
  background: var(--color-warning-bg);
  color: var(--color-warning-dark);
  border: 1px solid var(--color-warning-border);
}

.status-error {
  background: var(--color-error-bg);
  color: var(--color-error-dark);
  border: 1px solid var(--color-error-border);
}

.status-info {
  background: var(--color-info-bg);
  color: var(--color-info-dark);
  border: 1px solid var(--color-info-border);
}

/* ============================================
   ANIMATION UTILITIES
   ============================================ */

.fade-in {
  animation: fadeIn 0.3s ease-in;
}

@keyframes fadeIn {
  from { opacity: 0; }
  to { opacity: 1; }
}

.slide-up {
  animation: slideUp 0.3s ease-out;
}

@keyframes slideUp {
  from {
    opacity: 0;
    transform: translateY(20px);
  }
  to {
    opacity: 1;
    transform: translateY(0);
  }
}

/* ============================================
   LOADING UTILITIES
   ============================================ */

.spinner {
  border: 3px solid var(--color-border);
  border-top: 3px solid var(--color-primary);
  border-radius: 50%;
  width: 40px;
  height: 40px;
  animation: spin 1s linear infinite;
  margin: 0 auto;
}

.spinner-sm {
  width: 20px;
  height: 20px;
  border-width: 2px;
}

@keyframes spin {
  0% { transform: rotate(0deg); }
  100% { transform: rotate(360deg); }
}

/* ============================================
   DIVIDER
   ============================================ */

.divider {
  height: 1px;
  background: var(--color-border);
  margin: var(--spacing-lg) 0;
}

.divider-md {
  margin: var(--spacing-md) 0;
}
16
frontend/tsconfig.app.json
Normal file
@@ -0,0 +1,16 @@
{
  "extends": "@vue/tsconfig/tsconfig.dom.json",
  "compilerOptions": {
    "tsBuildInfoFile": "./node_modules/.tmp/tsconfig.app.tsbuildinfo",
    "types": ["vite/client"],

    /* Linting */
    "strict": true,
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "erasableSyntaxOnly": true,
    "noFallthroughCasesInSwitch": true,
    "noUncheckedSideEffectImports": true
  },
  "include": ["src/**/*.ts", "src/**/*.tsx", "src/**/*.vue"]
}
7
frontend/tsconfig.json
Normal file
@@ -0,0 +1,7 @@
{
|
||||
"files": [],
|
||||
"references": [
|
||||
{ "path": "./tsconfig.app.json" },
|
||||
{ "path": "./tsconfig.node.json" }
|
||||
]
|
||||
}
|
||||
26
frontend/tsconfig.node.json
Normal file
@ -0,0 +1,26 @@
{
  "compilerOptions": {
    "tsBuildInfoFile": "./node_modules/.tmp/tsconfig.node.tsbuildinfo",
    "target": "ES2023",
    "lib": ["ES2023"],
    "module": "ESNext",
    "types": ["node"],
    "skipLibCheck": true,

    /* Bundler mode */
    "moduleResolution": "bundler",
    "allowImportingTsExtensions": true,
    "verbatimModuleSyntax": true,
    "moduleDetection": "force",
    "noEmit": true,

    /* Linting */
    "strict": true,
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "erasableSyntaxOnly": true,
    "noFallthroughCasesInSwitch": true,
    "noUncheckedSideEffectImports": true
  },
  "include": ["vite.config.ts"]
}
26
frontend/vite.config.ts
Normal file
@ -0,0 +1,26 @@
import { defineConfig } from 'vite'
import vue from '@vitejs/plugin-vue'

export default defineConfig({
  plugins: [vue()],
  base: process.env.VITE_BASE_URL ?? '/',
  build: {
    rollupOptions: {
      output: {
        entryFileNames: 'assets/[name]-[hash:16].js',
        chunkFileNames: 'assets/[name]-[hash:16].js',
        assetFileNames: 'assets/[name]-[hash:16].[ext]',
      },
    },
  },
  server: {
    host: '0.0.0.0',
    port: 5173,
    proxy: {
      '/api': {
        target: 'http://localhost:8512',
        changeOrigin: true,
      },
    },
  },
})
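A note on the `base: process.env.VITE_BASE_URL ?? '/'` line above: it lets the same config build the SPA for the root path in dev and for a subpath in the cloud deployment. A minimal shell sketch of the same defaulting, as a build wrapper might use it (`VITE_BASE_URL` is the variable vite.config.ts reads; the `/kiwi/` value is an assumption inferred from the menagerie.circuitforge.tech/kiwi cloud URL, not confirmed by this commit):

```shell
# Choosing the SPA base path before `npm run build`.
# "/kiwi/" below is an assumed cloud subpath, shown for illustration only.
base="${VITE_BASE_URL:-/}"     # same default as `?? '/'` in vite.config.ts
echo "vite base: $base"
```

With `VITE_BASE_URL` unset this prints `vite base: /`; exporting `VITE_BASE_URL=/kiwi/` before the build would switch the emitted asset URLs to that prefix.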
98
manage.sh
Executable file
@ -0,0 +1,98 @@
#!/usr/bin/env bash
set -euo pipefail

SERVICE=kiwi
WEB_PORT=8511           # Vue SPA (nginx) — dev
API_PORT=8512           # FastAPI — dev
CLOUD_WEB_PORT=8515     # Vue SPA (nginx) — cloud
COMPOSE_FILE="compose.yml"
CLOUD_COMPOSE_FILE="compose.cloud.yml"
CLOUD_PROJECT="kiwi-cloud"

usage() {
  echo "Usage: $0 {start|stop|restart|status|logs|open|build|test"
  echo "          |cloud-start|cloud-stop|cloud-restart|cloud-status|cloud-logs|cloud-build}"
  echo ""
  echo "Dev:"
  echo "  start          Build (if needed) and start all services"
  echo "  stop           Stop and remove containers"
  echo "  restart        Stop then start"
  echo "  status         Show running containers"
  echo "  logs [svc]     Follow logs (api | web — defaults to all)"
  echo "  open           Open web UI in browser"
  echo "  build          Rebuild Docker images without cache"
  echo "  test           Run pytest test suite"
  echo ""
  echo "Cloud (menagerie.circuitforge.tech/kiwi):"
  echo "  cloud-start    Build cloud images and start kiwi-cloud project"
  echo "  cloud-stop     Stop cloud instance"
  echo "  cloud-restart  Stop then start cloud instance"
  echo "  cloud-status   Show cloud containers"
  echo "  cloud-logs     Follow cloud logs [api|web — defaults to all]"
  echo "  cloud-build    Rebuild cloud images without cache"
  exit 1
}

cmd="${1:-help}"
shift || true

case "$cmd" in
  start)
    docker compose -f "$COMPOSE_FILE" up -d --build
    echo "Kiwi running → http://localhost:${WEB_PORT}"
    ;;
  stop)
    docker compose -f "$COMPOSE_FILE" down
    ;;
  restart)
    docker compose -f "$COMPOSE_FILE" down
    docker compose -f "$COMPOSE_FILE" up -d --build
    echo "Kiwi running → http://localhost:${WEB_PORT}"
    ;;
  status)
    docker compose -f "$COMPOSE_FILE" ps
    ;;
  logs)
    svc="${1:-}"
    docker compose -f "$COMPOSE_FILE" logs -f ${svc}
    ;;
  open)
    xdg-open "http://localhost:${WEB_PORT}" 2>/dev/null \
      || open "http://localhost:${WEB_PORT}" 2>/dev/null \
      || echo "Open http://localhost:${WEB_PORT} in your browser"
    ;;
  build)
    docker compose -f "$COMPOSE_FILE" build --no-cache
    ;;
  test)
    docker compose -f "$COMPOSE_FILE" run --rm api \
      conda run -n job-seeker pytest tests/ -v
    ;;

  cloud-start)
    docker compose -f "$CLOUD_COMPOSE_FILE" -p "$CLOUD_PROJECT" up -d --build
    echo "Kiwi cloud running → https://menagerie.circuitforge.tech/kiwi"
    ;;
  cloud-stop)
    docker compose -f "$CLOUD_COMPOSE_FILE" -p "$CLOUD_PROJECT" down
    ;;
  cloud-restart)
    docker compose -f "$CLOUD_COMPOSE_FILE" -p "$CLOUD_PROJECT" down
    docker compose -f "$CLOUD_COMPOSE_FILE" -p "$CLOUD_PROJECT" up -d --build
    echo "Kiwi cloud running → https://menagerie.circuitforge.tech/kiwi"
    ;;
  cloud-status)
    docker compose -f "$CLOUD_COMPOSE_FILE" -p "$CLOUD_PROJECT" ps
    ;;
  cloud-logs)
    svc="${1:-}"
    docker compose -f "$CLOUD_COMPOSE_FILE" -p "$CLOUD_PROJECT" logs -f ${svc}
    ;;
  cloud-build)
    docker compose -f "$CLOUD_COMPOSE_FILE" -p "$CLOUD_PROJECT" build --no-cache
    ;;

  *)
    usage
    ;;
esac
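The `cmd="${1:-help}"` / `shift || true` / `case` structure in manage.sh is a reusable dispatch pattern for wrapper scripts running under `set -euo pipefail`. A self-contained sketch of the same pattern (the `dispatch` function and its `greet` subcommand are illustrative, not part of kiwi):

```shell
#!/usr/bin/env bash
# Sketch of the dispatch pattern used by manage.sh. Under `set -euo pipefail`,
# a bare `shift` fails when there are no arguments left, so the `|| true`
# guard matters; `${1:-help}` supplies a default command the same way.
set -euo pipefail

dispatch() {
  local cmd="${1:-help}"
  shift || true                          # tolerate zero remaining args under set -e
  case "$cmd" in
    greet) echo "hello ${1:-world}" ;;   # hypothetical subcommand
    *)     echo "usage: dispatch {greet}" ;;
  esac
}

dispatch greet kiwi    # prints "hello kiwi"
dispatch               # no command → falls through to usage
```

The unquoted `${svc}` in the `logs` branches of manage.sh follows the same idea: when the variable is empty it expands to nothing, so `docker compose logs -f` follows all services rather than receiving an empty string as a service name.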
33
pyproject.toml
Normal file
@ -0,0 +1,33 @@
[build-system]
requires = ["setuptools>=68", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "kiwi"
version = "0.1.0"
description = "Pantry tracking + leftover recipe suggestions"
readme = "README.md"
requires-python = ">=3.11"
dependencies = [
    # API
    "fastapi>=0.110",
    "uvicorn[standard]>=0.27",
    "python-multipart>=0.0.9",
    "aiofiles>=23.0",
    # Image processing + OCR
    "opencv-python>=4.8",
    "numpy>=1.25",
    "pyzbar>=0.1.9",
    # HTTP client
    "httpx>=0.27",
    # CircuitForge shared scaffold
    "circuitforge-core",
]

[tool.setuptools.packages.find]
where = ["."]
include = ["app*"]

[tool.pytest.ini_options]
testpaths = ["tests"]
asyncio_mode = "auto"