feat: add in-browser local model support for iPhone via WebLLM
Enable Timmy to run directly on iPhone by loading a small LLM into the browser via WebGPU (Safari 26+ / iOS 26+). No server connection required: fully sovereign, fully offline.

New files:
- static/local_llm.js: WebLLM wrapper with model catalogue, WebGPU detection, streaming chat, and progress callbacks
- templates/mobile_local.html: Mobile-optimized UI with model selector, download progress, LOCAL/SERVER badge, and chat
- tests/dashboard/test_local_models.py: 31 tests covering routes, config, template UX, JS asset, and XSS prevention

Changes:
- config.py: browser_model_enabled, browser_model_id, browser_model_fallback settings
- routes/mobile.py: /mobile/local page, /mobile/local-models API
- base.html: LOCAL AI nav link

Supported models: SmolLM2-360M (~200MB), Qwen2.5-0.5B (~350MB), SmolLM2-1.7B (~1GB), Llama-3.2-1B (~700MB). Falls back to server-side Ollama when the local model is unavailable.

https://claude.ai/code/session_01Cqkvr4sZbED7T3iDu1rwSD
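The fallback behaviour described above (prefer the in-browser model, drop back to server-side Ollama when WebGPU or the local model is unavailable) can be sketched as a small decision function. This is an illustrative sketch only: the function name, signature, and return values are assumptions, not code from this commit, which implements the equivalent logic in static/local_llm.js.

```python
def choose_backend(webgpu_available: bool,
                   browser_model_enabled: bool = True,
                   browser_model_fallback: bool = True) -> str:
    """Pick where inference runs, per the fallback rule in this commit.

    Returns "local" for in-browser WebLLM inference or "server" for
    server-side Ollama. Name and signature are illustrative, not the
    commit's actual API.
    """
    if browser_model_enabled and webgpu_available:
        # WebGPU is present and the feature is on: run fully offline.
        return "local"
    if browser_model_fallback:
        # Local model unavailable (or feature disabled): use Ollama.
        return "server"
    raise RuntimeError("No inference backend available")
```

With the default settings, an iPhone on Safari 26+ gets `"local"`, while an older browser without WebGPU gets `"server"`.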
@@ -90,6 +90,17 @@ class Settings(BaseSettings):
     work_orders_auto_execute: bool = False  # Master switch for auto-execution
     work_orders_auto_threshold: str = "low"  # Max priority that auto-executes: "low" | "medium" | "high" | "none"
+
+    # ── Browser Local Models (iPhone / WebGPU) ───────────────────────
+    # Enable in-browser LLM inference via WebLLM for offline iPhone use.
+    # When enabled, the mobile dashboard loads a small model directly
+    # in the browser — no server or Ollama required.
+    browser_model_enabled: bool = True
+    # WebLLM model ID — must be a pre-compiled MLC model.
+    # Recommended for iPhone: SmolLM2-360M (fast) or Qwen3-0.6B (smart).
+    browser_model_id: str = "SmolLM2-360M-Instruct-q4f16_1-MLC"
+    # Fallback to server when browser model is unavailable or too slow.
+    browser_model_fallback: bool = True

     # ── Scripture / Biblical Integration ──────────────────────────────
     # Enable the sovereign biblical text module. When enabled, Timmy
     # loads the local ESV text corpus and runs meditation workflows.
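The three settings added above feed the /mobile/local-models API mentioned in the commit message. A minimal sketch of the JSON payload such an endpoint might return, using a plain dataclass as a stand-in for the real pydantic Settings class (the payload shape, field names, and catalogue entries here are assumptions; the commit message only gives the short model names and approximate sizes, while real entries would use full MLC model IDs):

```python
from dataclasses import dataclass


@dataclass
class BrowserModelSettings:
    """Stand-in for the fields this commit adds to Settings."""
    browser_model_enabled: bool = True
    browser_model_id: str = "SmolLM2-360M-Instruct-q4f16_1-MLC"
    browser_model_fallback: bool = True


# Catalogue from the commit message (short names + approximate sizes).
MODEL_CATALOGUE = [
    {"name": "SmolLM2-360M", "size": "~200MB"},
    {"name": "Qwen2.5-0.5B", "size": "~350MB"},
    {"name": "SmolLM2-1.7B", "size": "~1GB"},
    {"name": "Llama-3.2-1B", "size": "~700MB"},
]


def local_models_payload(settings: BrowserModelSettings) -> dict:
    """One plausible shape for the /mobile/local-models response."""
    return {
        "enabled": settings.browser_model_enabled,
        "default_model": settings.browser_model_id,
        "fallback_to_server": settings.browser_model_fallback,
        "models": MODEL_CATALOGUE,
    }
```

The mobile page can then use `default_model` to start the WebLLM download and `fallback_to_server` to decide whether to route chat to Ollama when the browser model fails to load.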