feat: Mission Control v2 — swarm, L402, voice, marketplace, React dashboard
Major expansion of the Timmy Time Dashboard:
Backend modules:
- Swarm subsystem: registry, manager, bidder, coordinator, agent_runner, swarm_node, tasks, comms
- L402/Lightning: payment_handler, l402_proxy with HMAC macaroons
- Voice NLU: regex-based intent detection (chat, status, swarm, task, help, voice)
- Notifications: push notifier for swarm events
- Shortcuts: Siri Shortcuts iOS integration endpoints
- WebSocket: live dashboard event manager
- Inter-agent: agent-to-agent messaging layer
Dashboard routes:
- /swarm/* — swarm management and agent registry
- /marketplace — agent catalog with sat pricing
- /voice/* — voice command processing
- /mobile — mobile status endpoint
- /swarm/live — WebSocket live feed
React web dashboard (dashboard-web/):
- Sovereign Terminal design — dark theme with Bitcoin orange accents
- Three-column layout: status sidebar, workspace tabs, context panel
- Chat, Swarm, Tasks, Marketplace tab views
- JetBrains Mono typography, terminal aesthetic
- Framer Motion animations throughout
Tests: 228 passing (expanded from 93)
Includes Kimi's additional templates and QA work.
2026-02-21 12:57:38 -05:00
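The regex-based intent detection mentioned above can be sketched in a few lines; the patterns and intent names below are illustrative assumptions, not the project's actual voice NLU rules:

```python
import re

# Illustrative intent patterns — the real NLU module's rules are not shown
# in the commit message, so these regexes are assumptions for demonstration.
# First match wins, so pattern order matters.
INTENT_PATTERNS = [
    ("status", re.compile(r"\b(status|health|up|online)\b", re.I)),
    ("swarm", re.compile(r"\bswarm\b", re.I)),
    ("task", re.compile(r"\b(task|todo|assign)\b", re.I)),
    ("help", re.compile(r"\b(help|commands?)\b", re.I)),
]


def detect_intent(utterance: str) -> str:
    """Return the first matching intent, defaulting to free-form chat."""
    for intent, pattern in INTENT_PATTERNS:
        if pattern.search(utterance):
            return intent
    return "chat"
```

Because matching is first-hit, broad patterns like the `status` keywords should be ordered deliberately relative to the others.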

feat: add in-browser local model support for iPhone via WebLLM
Enable Timmy to run directly on iPhone by loading a small LLM into
the browser via WebGPU (Safari 26+ / iOS 26+). No server connection
required — fully sovereign, fully offline.
New files:
- static/local_llm.js: WebLLM wrapper with model catalogue, WebGPU
  detection, streaming chat, and progress callbacks
- templates/mobile_local.html: Mobile-optimized UI with model
  selector, download progress, LOCAL/SERVER badge, and chat
- tests/dashboard/test_local_models.py: 31 tests covering routes,
  config, template UX, JS asset, and XSS prevention
Changes:
- config.py: browser_model_enabled, browser_model_id,
  browser_model_fallback settings
- routes/mobile.py: /mobile/local page, /mobile/local-models API
- base.html: LOCAL AI nav link
Supported models: SmolLM2-360M (~200MB), Qwen2.5-0.5B (~350MB),
SmolLM2-1.7B (~1GB), Llama-3.2-1B (~700MB). Falls back to
server-side Ollama when local model is unavailable.
https://claude.ai/code/session_01Cqkvr4sZbED7T3iDu1rwSD
2026-02-27 00:03:05 +00:00

"""Mobile-optimized dashboard route — /mobile endpoint.

Provides a simplified, mobile-first view of the dashboard that
prioritizes the chat interface and essential status information.
Designed for quick access from a phone's home screen.

The /mobile/local endpoint loads a small LLM directly into the
browser via WebLLM so Timmy can run on an iPhone with no server.
"""
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse
from config import settings
from dashboard.templating import templates
router = APIRouter(tags=["mobile"])


@router.get("/mobile", response_class=HTMLResponse)
async def mobile_dashboard(request: Request):
    """Render the mobile-optimized dashboard.

    Falls back to the main index template, which is already responsive.
    A dedicated mobile template can be added later for a more
    streamlined experience.
    """
    return templates.TemplateResponse(request, "index.html")


@router.get("/mobile/local", response_class=HTMLResponse)
async def mobile_local_dashboard(request: Request):
    """Mobile dashboard with in-browser local model inference.

    Loads a small LLM (via WebLLM / WebGPU) directly into Safari
    so Timmy works on an iPhone without any server connection.
    Falls back to server-side Ollama when the local model is
    unavailable or the user prefers it.
    """
    return templates.TemplateResponse(
        request,
        "mobile_local.html",
        {
            "browser_model_enabled": settings.browser_model_enabled,
            "browser_model_id": settings.browser_model_id,
            "browser_model_fallback": settings.browser_model_fallback,
            "server_model": settings.ollama_model,
            "page_title": "Local AI",
        },
    )


@router.get("/mobile/local-models")
async def local_models_config():
    """Return browser model configuration for the JS client."""
    return {
        "enabled": settings.browser_model_enabled,
        "default_model": settings.browser_model_id,
        "fallback_to_server": settings.browser_model_fallback,
        "server_model": settings.ollama_model,
        "server_url": settings.ollama_url,
    }


@router.get("/mobile/status")
async def mobile_status():
    """Lightweight status endpoint optimized for mobile polling."""
    from dashboard.routes.health import check_ollama

    ollama_ok = await check_ollama()
    return {
        "ollama": "up" if ollama_ok else "down",
        "model": settings.ollama_model,
        "agent": "default",
        "ready": True,
        "browser_model_enabled": settings.browser_model_enabled,
        "browser_model_id": settings.browser_model_id,
    }
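A client polling /mobile/status only needs this compact payload to render a header badge. The following is an illustrative, stdlib-only sketch of that consumption; the field names match the handler, but the badge strings and precedence are assumptions, not the dashboard's actual UI code:

```python
def status_badge(payload: dict) -> str:
    """Summarize a /mobile/status payload as a short badge string.

    Illustrative precedence: server (Ollama) when reachable, then the
    in-browser local model, then offline.
    """
    if not payload.get("ready"):
        return "STARTING"
    if payload.get("ollama") == "up":
        return f"SERVER · {payload.get('model', '?')}"
    if payload.get("browser_model_enabled"):
        return f"LOCAL · {payload.get('browser_model_id', '?')}"
    return "OFFLINE"
```

Because the endpoint is designed for polling, keeping the payload flat and small like this lets the mobile page refresh it frequently without meaningful cost.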