Merge pull request #1 from Alexspayne/claude/run-tests-IYl0F

This commit is contained in:
Alexander Whitestone
2026-02-19 14:59:57 -05:00
committed by GitHub
26 changed files with 1360 additions and 0 deletions

13
.env.example Normal file
View File

@@ -0,0 +1,13 @@
# Timmy Time — Mission Control
# Copy this file to .env and uncomment lines you want to override.
# .env is gitignored and never committed.
# Ollama host (default: http://localhost:11434)
# Override if Ollama is running on another machine or port.
# OLLAMA_URL=http://localhost:11434
# LLM model to use via Ollama (default: llama3.2)
# OLLAMA_MODEL=llama3.2
# Enable FastAPI interactive docs at /docs and /redoc (default: false)
# DEBUG=true

57
.github/workflows/tests.yml vendored Normal file
View File

@@ -0,0 +1,57 @@
name: Tests

on:
  push:
    branches: ["**"]
  pull_request:
    branches: ["**"]

jobs:
  test:
    runs-on: ubuntu-latest
    # Required for publish-unit-test-result-action to post check runs and PR comments
    permissions:
      contents: read
      checks: write
      pull-requests: write
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
          cache: "pip"
      - name: Install dependencies
        run: pip install -e ".[dev]"
      - name: Run tests
        run: |
          pytest \
            --tb=short \
            --cov=src \
            --cov-report=term-missing \
            --cov-report=xml:reports/coverage.xml \
            --junitxml=reports/junit.xml
      # Posts a check annotation + PR comment showing pass/fail counts.
      # Visible in the GitHub mobile app under Checks and in PR conversations.
      - name: Publish test results
        uses: EnricoMi/publish-unit-test-result-action@v2
        if: always()
        with:
          files: reports/junit.xml
          check_name: "pytest results"
          comment_title: "Test Results"
          report_individual_runs: true
      # Coverage report available as a downloadable artifact in the Actions tab
      - name: Upload coverage report
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: coverage-report
          path: reports/coverage.xml
          retention-days: 14

35
.gitignore vendored Normal file
View File

@@ -0,0 +1,35 @@
# Python
__pycache__/
*.py[cod]
*.pyo
.Python
build/
dist/
*.egg-info/
.eggs/
# Virtual envs
.venv/
venv/
env/
# Secrets / local config — commit only .env.example (the template)
.env
.env.*
!.env.example
# SQLite memory — never commit agent memory
*.db
# Testing
.pytest_cache/
.coverage
htmlcov/
reports/
# IDE
.idea/
.vscode/
*.swp
*.swo
.DS_Store

216
README.md Normal file
View File

@@ -0,0 +1,216 @@
# Timmy Time — Mission Control
[![Tests](https://github.com/Alexspayne/Timmy-time-dashboard/actions/workflows/tests.yml/badge.svg)](https://github.com/Alexspayne/Timmy-time-dashboard/actions/workflows/tests.yml)
A local-first dashboard for your sovereign AI agents. Talk to Timmy, watch his status, verify Ollama is running — all from a browser, no cloud required.
---
## Prerequisites
You need three things on your Mac before anything else:
**Python 3.11+**
```bash
python3 --version # should be 3.11 or higher
```
If not: `brew install python@3.11`
**Ollama** (runs the local LLM)
```bash
brew install ollama
```
Or download from https://ollama.com
**Git** — already on every Mac.
---
## Quickstart (copy-paste friendly)
### 1. Clone the branch
```bash
git clone -b claude/run-tests-IYl0F https://github.com/Alexspayne/Timmy-time-dashboard.git
cd Timmy-time-dashboard
```
### 2. Create a virtual environment and install
```bash
python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
```
### 3. Pull the model (one-time, ~2 GB download)
Open a **new terminal tab** and run:
```bash
ollama serve
```
Back in your first tab:
```bash
ollama pull llama3.2
```
### 4. Start the dashboard
```bash
uvicorn dashboard.app:app --reload
```
Open your browser to **http://localhost:8000**
---
## Access from your phone
The dashboard is mobile-optimized. To open it on your phone:
**Step 1 — bind to your local network** (instead of just localhost):
```bash
uvicorn dashboard.app:app --host 0.0.0.0 --port 8000 --reload
```
**Step 2 — find your Mac's IP address:**
```bash
ipconfig getifaddr en0
```
This prints something like `192.168.1.42`. If you're on Ethernet instead of Wi-Fi, try `en1`.
**Step 3 — open on your phone:**
Make sure your phone is on the **same Wi-Fi network** as your Mac, then open:
```
http://192.168.1.42:8000
```
(replace with your actual IP)
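If you'd rather not run `ipconfig` and copy the address by hand, a short Python sketch can print the full URL for you. This helper is illustrative and not part of the repo; the UDP "connect" sends no packets, it only selects the outbound network interface:

```python
import socket

def dashboard_url(port: int = 8000) -> str:
    """Best-effort guess at the LAN URL to open on your phone."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # UDP connect() transmits nothing; it just picks the outbound
        # interface, so getsockname() reveals this machine's LAN address.
        s.connect(("192.168.1.1", 80))
        ip = s.getsockname()[0]
    except OSError:
        ip = "127.0.0.1"  # no route found; fall back to localhost
    finally:
        s.close()
    return f"http://{ip}:{port}"

print(dashboard_url())
```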
On mobile the layout switches to a single column — status panels become a horizontal scroll strip at the top, chat fills the rest of the screen. The input field is sized to prevent iOS from zooming in when you tap it.
---
## What you'll see
The dashboard has two panels on the left and a chat window on the right:
- **AGENTS** — Timmy's metadata (model, type, version)
- **SYSTEM HEALTH** — live Ollama status, auto-refreshes every 30 seconds
- **TIMMY INTERFACE** — type a message, hit SEND, get a response from the local LLM
If Ollama isn't running when you send a message, the chat will show a "Timmy is offline" error instead of crashing.
---
## Run the tests
No Ollama needed — all external calls are mocked.
```bash
pytest
```
Expected output:
```
27 passed in 0.67s
```
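The isolation trick the suite relies on (see `tests/conftest.py`) is pre-registering stand-in modules in `sys.modules` before anything imports them. A minimal standalone sketch of the pattern, using a made-up package name:

```python
import sys
from unittest.mock import MagicMock

# Register a stand-in BEFORE the first import. setdefault means the real
# package is still used whenever it actually is installed.
sys.modules.setdefault("heavy_llm_sdk", MagicMock())

import heavy_llm_sdk  # resolves to the MagicMock registered above

# Any attribute access or call on a MagicMock returns another MagicMock,
# so code under test can "use" the SDK without it being installed.
client = heavy_llm_sdk.Client(api_key="fake")
print(type(client).__name__)
```

This is why `pytest` passes on a machine that has never installed `agno` or pulled a model.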
---
## Optional: CLI
With your venv active:
```bash
timmy chat "What is sovereignty?"
timmy think "Bitcoin and self-custody"
timmy status
```
---
## Architecture
```mermaid
graph TD
Phone["📱 Phone / Browser"]
Browser["💻 Browser"]
Phone -->|HTTP + HTMX| FastAPI
Browser -->|HTTP + HTMX| FastAPI
subgraph "Local Machine"
FastAPI["FastAPI\n(dashboard.app)"]
Jinja["Jinja2 Templates\n+ static CSS"]
Timmy["Timmy Agent\n(Agno wrapper)"]
Ollama["Ollama\n:11434"]
SQLite[("SQLite\ntimmy.db")]
FastAPI -->|renders| Jinja
FastAPI -->|/agents/timmy/chat| Timmy
FastAPI -->|/health/status ping| Ollama
Timmy -->|LLM call| Ollama
Timmy -->|conversation memory| SQLite
end
```
All traffic stays on your local network. No cloud, no telemetry.
## Configuration
Override defaults without touching code — create a `.env` file (see `.env.example`):
```bash
cp .env.example .env
# then edit .env
```
| Variable | Default | Purpose |
|---|---|---|
| `OLLAMA_URL` | `http://localhost:11434` | Ollama host (useful if Ollama runs on another machine) |
| `OLLAMA_MODEL` | `llama3.2` | LLM model served by Ollama |
| `DEBUG` | `false` | Set `true` to enable `/docs` and `/redoc` |
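Precedence follows the usual pydantic-settings rules: a real environment variable beats a `.env` entry, which beats the coded default. A stdlib-only sketch of that lookup order (illustrative — the app itself delegates this to `pydantic-settings` in `src/config.py`):

```python
import os

DEFAULTS = {
    "OLLAMA_URL": "http://localhost:11434",
    "OLLAMA_MODEL": "llama3.2",
    "DEBUG": "false",
}

def resolve(name: str, dotenv: dict) -> str:
    """Environment variable > .env entry > coded default."""
    if name in os.environ:
        return os.environ[name]
    if name in dotenv:
        return dotenv[name]
    return DEFAULTS[name]

# A .env entry overrides the default; a real env var would override both.
print(resolve("OLLAMA_URL", dotenv={"OLLAMA_URL": "http://10.0.0.5:11434"}))
```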
## Project layout
```
src/
config.py # pydantic-settings (reads .env)
timmy/ # Timmy agent — wraps Agno (soul = prompt, body = Agno)
dashboard/ # FastAPI app + routes + Jinja2 templates
static/ # CSS (dark mission-control theme)
tests/ # pytest suite (27 tests, no Ollama required)
.env.example # environment variable reference
pyproject.toml # dependencies and build config
```
---
## Troubleshooting
**`ollama: command not found`** — Ollama isn't installed or isn't on your PATH. Install via Homebrew or the .dmg from ollama.com.
**`connection refused` in the chat** — Ollama isn't running. Open a terminal and run `ollama serve`, then try again.
**`ModuleNotFoundError: No module named 'dashboard'`** — You're not in the venv or forgot `pip install -e .`. Run `source .venv/bin/activate` then `pip install -e ".[dev]"`.
**Health panel shows DOWN** — Ollama isn't running. The chat still works for testing but will return the offline error message.
---
## Roadmap
| Version | Name | Milestone |
|---------|------------|--------------------------------------------|
| 1.0.0 | Genesis | Agno + Ollama + SQLite + Dashboard |
| 2.0.0 | Exodus | MCP tools + multi-agent |
| 3.0.0 | Revelation | Bitcoin Lightning treasury + single `.app` |

47
STATUS.md Normal file
View File

@@ -0,0 +1,47 @@
# Timmy Time — Status
## Current Version: 1.0.0 (Genesis)
### What's Built
- `src/timmy/` — Agno-powered Timmy agent (llama3.2 via Ollama, SQLite memory)
- `src/dashboard/` — FastAPI Mission Control dashboard (HTMX + Jinja2)
- CLI: `timmy think / chat / status`
- Pytest test suite (prompts, agent config, dashboard routes)
### System Requirements
- Python 3.11+
- Ollama running at `http://localhost:11434`
- `llama3.2` model pulled
### Quickstart
```bash
pip install -e ".[dev]"
# Start Ollama (separate terminal)
ollama serve
ollama pull llama3.2
# Run dashboard
uvicorn dashboard.app:app --reload
# Run tests (no Ollama required)
pytest
```
### Dashboard
`http://localhost:8000` — Mission Control UI with:
- Timmy agent status panel
- Ollama health indicator (auto-refreshes every 30s)
- Live chat interface
---
## Roadmap
| Tag | Name | Milestone |
|-------|------------|----------------------------------------------|
| 1.0.0 | Genesis | Agno + Ollama + SQLite + Dashboard |
| 2.0.0 | Exodus | MCP tools + multi-agent support |
| 3.0.0 | Revelation | Bitcoin Lightning treasury + single `.app` |
_Last updated: 2026-02-19_

47
pyproject.toml Normal file
View File

@@ -0,0 +1,47 @@
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "timmy-time"
version = "1.0.0"
description = "Mission Control for sovereign AI agents"
readme = "README.md"
requires-python = ">=3.11"
license = { text = "MIT" }
dependencies = [
    "agno>=1.4.0",
    "fastapi>=0.115.0",
    "uvicorn[standard]>=0.32.0",
    "jinja2>=3.1.0",
    "httpx>=0.27.0",
    "python-multipart>=0.0.12",
    "aiofiles>=24.0.0",
    "typer>=0.12.0",
    "rich>=13.0.0",
    "pydantic-settings>=2.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.24.0",
    "pytest-cov>=5.0.0",
]

[project.scripts]
timmy = "timmy.cli:main"

[tool.hatch.build.targets.wheel]
sources = { "src" = "" }
include = ["src/timmy", "src/dashboard", "src/config.py"]

[tool.pytest.ini_options]
testpaths = ["tests"]
pythonpath = ["src"]
asyncio_mode = "auto"
addopts = "-v --tb=short"

[tool.coverage.run]
source = ["src"]
omit = ["*/tests/*"]

21
src/config.py Normal file
View File

@@ -0,0 +1,21 @@
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    # Ollama host — override with OLLAMA_URL env var or .env file
    ollama_url: str = "http://localhost:11434"

    # LLM model passed to Agno/Ollama — override with OLLAMA_MODEL
    ollama_model: str = "llama3.2"

    # Set DEBUG=true to enable /docs and /redoc (disabled by default)
    debug: bool = False

    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        extra="ignore",
    )


settings = Settings()

0
src/dashboard/__init__.py Normal file
View File

40
src/dashboard/app.py Normal file
View File

@@ -0,0 +1,40 @@
import logging
from pathlib import Path

from fastapi import FastAPI, Request
from fastapi.responses import HTMLResponse
from fastapi.staticfiles import StaticFiles
from fastapi.templating import Jinja2Templates

from config import settings
from dashboard.routes.agents import router as agents_router
from dashboard.routes.health import router as health_router

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)-8s %(name)s %(message)s",
    datefmt="%H:%M:%S",
)
logger = logging.getLogger(__name__)

BASE_DIR = Path(__file__).parent
PROJECT_ROOT = BASE_DIR.parent.parent

app = FastAPI(
    title="Timmy Time — Mission Control",
    version="1.0.0",
    # Docs disabled unless DEBUG=true in env / .env
    docs_url="/docs" if settings.debug else None,
    redoc_url="/redoc" if settings.debug else None,
)

templates = Jinja2Templates(directory=str(BASE_DIR / "templates"))
app.mount("/static", StaticFiles(directory=str(PROJECT_ROOT / "static")), name="static")

app.include_router(health_router)
app.include_router(agents_router)


@app.get("/", response_class=HTMLResponse)
async def index(request: Request):
    return templates.TemplateResponse(request, "index.html")

0
src/dashboard/routes/__init__.py Normal file
View File

52
src/dashboard/routes/agents.py Normal file
View File

@@ -0,0 +1,52 @@
from datetime import datetime
from pathlib import Path

from fastapi import APIRouter, Form, Request
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates

from timmy.agent import create_timmy

router = APIRouter(prefix="/agents", tags=["agents"])
templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))

AGENT_REGISTRY = {
    "timmy": {
        "id": "timmy",
        "name": "Timmy",
        "type": "sovereign",
        "model": "llama3.2",
        "backend": "ollama",
        "version": "1.0.0",
    }
}


@router.get("")
async def list_agents():
    return {"agents": list(AGENT_REGISTRY.values())}


@router.post("/timmy/chat", response_class=HTMLResponse)
async def chat_timmy(request: Request, message: str = Form(...)):
    timestamp = datetime.now().strftime("%H:%M:%S")
    response_text = None
    error_text = None
    try:
        agent = create_timmy()
        run = agent.run(message, stream=False)
        response_text = run.content if hasattr(run, "content") else str(run)
    except Exception as exc:
        error_text = f"Timmy is offline: {exc}"
    return templates.TemplateResponse(
        request,
        "partials/chat_message.html",
        {
            "user_message": message,
            "response": response_text,
            "error": error_text,
            "timestamp": timestamp,
        },
    )

42
src/dashboard/routes/health.py Normal file
View File

@@ -0,0 +1,42 @@
import httpx
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates
from pathlib import Path

from config import settings

router = APIRouter(tags=["health"])
templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))


async def check_ollama() -> bool:
    """Ping Ollama to verify it's running."""
    try:
        async with httpx.AsyncClient(timeout=2.0) as client:
            r = await client.get(settings.ollama_url)
            return r.status_code == 200
    except Exception:
        return False


@router.get("/health")
async def health():
    ollama_ok = await check_ollama()
    return {
        "status": "ok",
        "services": {
            "ollama": "up" if ollama_ok else "down",
        },
        "agents": ["timmy"],
    }


@router.get("/health/status", response_class=HTMLResponse)
async def health_status(request: Request):
    ollama_ok = await check_ollama()
    return templates.TemplateResponse(
        request,
        "partials/health_status.html",
        {"ollama": ollama_ok},
    )

41
src/dashboard/templates/base.html Normal file
View File

@@ -0,0 +1,41 @@
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1.0, viewport-fit=cover" />
  <meta name="apple-mobile-web-app-capable" content="yes" />
  <meta name="apple-mobile-web-app-status-bar-style" content="black-translucent" />
  <meta name="theme-color" content="#060d14" />
  <title>{% block title %}Timmy Time — Mission Control{% endblock %}</title>
  <link rel="preconnect" href="https://fonts.googleapis.com" />
  <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
  <link href="https://fonts.googleapis.com/css2?family=JetBrains+Mono:wght@300;400;500;700&display=swap" rel="stylesheet" />
  <link rel="stylesheet" href="/static/style.css" />
  <script src="https://unpkg.com/htmx.org@2.0.3" integrity="sha384-0895/pl2MU10Hqc6jd4RvrthNlDiE9U1tWmX7WRESftEDRosgxNsQG/Ze9YMRzHq" crossorigin="anonymous"></script>
</head>
<body>
  <header class="mc-header">
    <div class="mc-header-left">
      <span class="mc-title">TIMMY TIME</span>
      <span class="mc-subtitle">MISSION CONTROL</span>
    </div>
    <div class="mc-header-right">
      <span class="mc-time" id="clock"></span>
    </div>
  </header>
  <main class="mc-main">
    {% block content %}{% endblock %}
  </main>
  <script>
    function updateClock() {
      const now = new Date();
      document.getElementById('clock').textContent =
        now.toLocaleTimeString('en-US', { hour12: false });
    }
    setInterval(updateClock, 1000);
    updateClock();
  </script>
</body>
</html>

87
src/dashboard/templates/index.html Normal file
View File

@@ -0,0 +1,87 @@
{% extends "base.html" %}
{% block content %}
<div class="sidebar">
  <!-- Agents -->
  <div class="panel">
    <div class="panel-header">// AGENTS</div>
    <div class="panel-body">
      <div class="agent-card">
        <div class="agent-card-header">
          <span class="status-dot amber"></span>
          <span class="agent-name">TIMMY</span>
        </div>
        <div class="agent-meta">
          <span class="meta-key">TYPE</span> <span class="meta-val">sovereign</span><br>
          <span class="meta-key">MODEL</span> <span class="meta-val">llama3.2</span><br>
          <span class="meta-key">BACKEND</span> <span class="meta-val">ollama</span><br>
          <span class="meta-key">VERSION</span> <span class="meta-val">1.0.0</span>
        </div>
      </div>
    </div>
  </div>

  <!-- System Health (HTMX polled) -->
  <div class="panel"
       hx-get="/health/status"
       hx-trigger="load, every 30s"
       hx-target="this"
       hx-swap="innerHTML">
    <div class="panel-header">// SYSTEM HEALTH</div>
    <div class="panel-body">
      <div class="health-row">
        <span class="health-label">LOADING...</span>
      </div>
    </div>
  </div>
</div>

<!-- Chat Panel -->
<div class="panel chat-panel">
  <div class="panel-header">// TIMMY INTERFACE</div>
  <div class="chat-log" id="chat-log">
    <div class="chat-message agent">
      <div class="msg-meta">TIMMY // SYSTEM</div>
      <div class="msg-body">Mission Control initialized. Timmy ready — awaiting input.</div>
    </div>
  </div>
  <div class="chat-input-bar">
    <form hx-post="/agents/timmy/chat"
          hx-target="#chat-log"
          hx-swap="beforeend"
          hx-indicator="#send-indicator"
          hx-sync="this:drop"
          hx-disabled-elt="find button"
          hx-on::after-settle="this.reset(); scrollChat()"
          style="display:flex; flex:1; gap:8px;">
      <input type="text"
             name="message"
             placeholder="send a message to timmy..."
             autocomplete="off"
             autocorrect="off"
             autocapitalize="none"
             spellcheck="false"
             enterkeyhint="send"
             required />
      <button type="submit">
        SEND
        <span id="send-indicator" class="htmx-indicator"></span>
      </button>
    </form>
  </div>
</div>

<script>
  function scrollChat() {
    const log = document.getElementById('chat-log');
    log.scrollTop = log.scrollHeight;
  }
  scrollChat();
</script>
{% endblock %}

15
src/dashboard/templates/partials/chat_message.html Normal file
View File

@@ -0,0 +1,15 @@
<div class="chat-message user">
  <div class="msg-meta">YOU // {{ timestamp }}</div>
  <div class="msg-body">{{ user_message }}</div>
</div>
{% if response %}
<div class="chat-message agent">
  <div class="msg-meta">TIMMY // {{ timestamp }}</div>
  <div class="msg-body">{{ response }}</div>
</div>
{% elif error %}
<div class="chat-message error-msg">
  <div class="msg-meta">SYSTEM // {{ timestamp }}</div>
  <div class="msg-body">{{ error }}</div>
</div>
{% endif %}

19
src/dashboard/templates/partials/health_status.html Normal file
View File

@@ -0,0 +1,19 @@
<div class="panel-header">// SYSTEM HEALTH</div>
<div class="panel-body">
  <div class="health-row">
    <span class="health-label">OLLAMA</span>
    {% if ollama %}
    <span class="badge up">UP</span>
    {% else %}
    <span class="badge down">DOWN</span>
    {% endif %}
  </div>
  <div class="health-row">
    <span class="health-label">TIMMY</span>
    <span class="badge ready">READY</span>
  </div>
  <div class="health-row">
    <span class="health-label">MODEL</span>
    <span class="badge ready">llama3.2</span>
  </div>
</div>

0
src/timmy/__init__.py Normal file
View File

19
src/timmy/agent.py Normal file
View File

@@ -0,0 +1,19 @@
from agno.agent import Agent
from agno.models.ollama import Ollama
from agno.db.sqlite import SqliteDb

from timmy.prompts import TIMMY_SYSTEM_PROMPT
from config import settings


def create_timmy(db_file: str = "timmy.db") -> Agent:
    """Instantiate Timmy with Agno + Ollama + SQLite memory."""
    return Agent(
        name="Timmy",
        model=Ollama(id=settings.ollama_model),
        db=SqliteDb(db_file=db_file),
        description=TIMMY_SYSTEM_PROMPT,
        add_history_to_context=True,
        num_history_runs=10,
        markdown=True,
    )

30
src/timmy/cli.py Normal file
View File

@@ -0,0 +1,30 @@
import typer

from timmy.agent import create_timmy

app = typer.Typer(help="Timmy — sovereign AI agent")


@app.command()
def think(topic: str = typer.Argument(..., help="Topic to reason about")):
    """Ask Timmy to think carefully about a topic."""
    timmy = create_timmy()
    timmy.print_response(f"Think carefully about: {topic}", stream=True)


@app.command()
def chat(message: str = typer.Argument(..., help="Message to send")):
    """Send a message to Timmy."""
    timmy = create_timmy()
    timmy.print_response(message, stream=True)


@app.command()
def status():
    """Print Timmy's operational status."""
    timmy = create_timmy()
    timmy.print_response("Brief status report — one sentence.", stream=False)


def main():
    app()

7
src/timmy/prompts.py Normal file
View File

@@ -0,0 +1,7 @@
TIMMY_SYSTEM_PROMPT = """You are Timmy — a sovereign AI agent running locally.
No cloud dependencies. You think clearly, speak plainly, act with intention.
Grounded in Christian faith, powered by Bitcoin economics, committed to the
user's digital sovereignty."""
TIMMY_STATUS_PROMPT = """You are Timmy. Give a one-sentence status report confirming
you are operational and running locally."""

325
static/style.css Normal file
View File

@@ -0,0 +1,325 @@
:root {
  --bg-deep: #060d14;
  --bg-panel: #0c1824;
  --bg-card: #0f2030;
  --border: #1a3a55;
  --border-glow: #1e4d72;
  --text: #b8d0e8;
  --text-dim: #4a7a9a;
  --text-bright: #ddeeff;
  --green: #00e87a;
  --green-dim: #00704a;
  --amber: #ffb800;
  --amber-dim: #7a5800;
  --red: #ff4455;
  --red-dim: #7a1a22;
  --blue: #00aaff;
  --font: 'JetBrains Mono', 'Courier New', monospace;
  --header-h: 52px;
}

* { box-sizing: border-box; margin: 0; padding: 0; }

body {
  background: var(--bg-deep);
  color: var(--text);
  font-family: var(--font);
  font-size: 13px;
  min-height: 100dvh;
  overflow-x: hidden;
  /* prevent bounce-scroll from revealing background on iOS */
  overscroll-behavior: none;
}

/* ── Header ─────────────────────────────────────── */
.mc-header {
  display: flex;
  justify-content: space-between;
  align-items: center;
  padding: 12px 24px;
  padding-top: max(12px, env(safe-area-inset-top));
  background: var(--bg-panel);
  border-bottom: 1px solid var(--border);
  position: sticky;
  top: 0;
  z-index: 100;
}
.mc-header-left { display: flex; align-items: baseline; gap: 0; }
.mc-title {
  font-size: 18px;
  font-weight: 700;
  color: var(--text-bright);
  letter-spacing: 0.15em;
}
.mc-subtitle {
  font-size: 11px;
  color: var(--text-dim);
  letter-spacing: 0.2em;
  margin-left: 16px;
}
.mc-time {
  font-size: 14px;
  color: var(--blue);
  letter-spacing: 0.1em;
}

/* ── Layout — desktop ────────────────────────────── */
.mc-main {
  display: grid;
  grid-template-columns: 260px 1fr;
  gap: 16px;
  padding: 16px;
  height: calc(100dvh - var(--header-h));
}

/* ── Panels ──────────────────────────────────────── */
.panel {
  background: var(--bg-panel);
  border: 1px solid var(--border);
  border-radius: 4px;
  overflow: hidden;
}
.panel-header {
  padding: 8px 14px;
  background: var(--bg-card);
  border-bottom: 1px solid var(--border);
  font-size: 10px;
  font-weight: 700;
  color: var(--text-dim);
  letter-spacing: 0.2em;
  text-transform: uppercase;
}
.panel-body { padding: 14px; }

/* ── Sidebar — desktop ───────────────────────────── */
.sidebar {
  grid-column: 1;
  display: flex;
  flex-direction: column;
  gap: 16px;
  overflow-y: auto;
}

/* ── Agent Card ──────────────────────────────────── */
.agent-card {
  border: 1px solid var(--border);
  border-radius: 3px;
  padding: 12px;
  background: var(--bg-card);
}
.agent-card-header {
  display: flex;
  align-items: center;
  gap: 8px;
  margin-bottom: 10px;
}
.status-dot {
  width: 8px;
  height: 8px;
  border-radius: 50%;
  flex-shrink: 0;
}
.status-dot.green { background: var(--green); box-shadow: 0 0 6px var(--green); }
.status-dot.amber { background: var(--amber); box-shadow: 0 0 6px var(--amber); }
.status-dot.red { background: var(--red); box-shadow: 0 0 6px var(--red); }
.agent-name {
  font-size: 14px;
  font-weight: 700;
  color: var(--text-bright);
  letter-spacing: 0.1em;
}
.agent-meta { font-size: 11px; line-height: 2; }
.meta-key { color: var(--text-dim); display: inline-block; width: 60px; }
.meta-val { color: var(--text); }

/* ── Health ──────────────────────────────────────── */
.health-row {
  display: flex;
  justify-content: space-between;
  align-items: center;
  padding: 7px 0;
  border-bottom: 1px solid var(--border);
  font-size: 12px;
}
.health-row:last-child { border-bottom: none; }
.health-label { color: var(--text-dim); letter-spacing: 0.08em; }
.badge {
  padding: 2px 8px;
  border-radius: 2px;
  font-size: 10px;
  font-weight: 700;
  letter-spacing: 0.12em;
}
.badge.up { background: var(--green-dim); color: var(--green); }
.badge.down { background: var(--red-dim); color: var(--red); }
.badge.ready { background: var(--amber-dim); color: var(--amber); }

/* ── Chat Panel ──────────────────────────────────── */
.chat-panel {
  display: flex;
  flex-direction: column;
  grid-column: 2;
  min-height: 0;
}
.chat-log {
  flex: 1;
  overflow-y: auto;
  padding: 14px;
  -webkit-overflow-scrolling: touch;
}
.chat-message { margin-bottom: 16px; }
.msg-meta {
  font-size: 10px;
  color: var(--text-dim);
  margin-bottom: 4px;
  letter-spacing: 0.12em;
}
.chat-message.user .msg-meta { color: var(--blue); }
.chat-message.agent .msg-meta { color: var(--green); }
.chat-message.error-msg .msg-meta { color: var(--red); }
.msg-body {
  background: var(--bg-card);
  border: 1px solid var(--border);
  border-radius: 3px;
  padding: 10px 12px;
  line-height: 1.65;
  white-space: pre-wrap;
  word-break: break-word;
}
.chat-message.user .msg-body { border-color: var(--border-glow); }
.chat-message.agent .msg-body { border-left: 3px solid var(--green); }
.chat-message.error-msg .msg-body { border-left: 3px solid var(--red); color: var(--red); }

/* ── Chat Input ──────────────────────────────────── */
.chat-input-bar {
  padding: 12px 14px;
  /* safe area for iPhone home bar */
  padding-bottom: max(12px, env(safe-area-inset-bottom));
  background: var(--bg-card);
  border-top: 1px solid var(--border);
  display: flex;
  gap: 8px;
  flex-shrink: 0;
}
.chat-input-bar input {
  flex: 1;
  background: var(--bg-deep);
  border: 1px solid var(--border);
  border-radius: 3px;
  color: var(--text-bright);
  font-family: var(--font);
  font-size: 13px;
  padding: 8px 12px;
  outline: none;
}
.chat-input-bar input:focus {
  border-color: var(--border-glow);
  box-shadow: 0 0 0 1px var(--border-glow);
}
.chat-input-bar input::placeholder { color: var(--text-dim); }
.chat-input-bar button {
  background: var(--border-glow);
  border: none;
  border-radius: 3px;
  color: var(--text-bright);
  font-family: var(--font);
  font-size: 12px;
  font-weight: 700;
  padding: 8px 18px;
  cursor: pointer;
  letter-spacing: 0.12em;
  transition: background 0.15s, color 0.15s;
  /* prevent double-tap zoom on iOS */
  touch-action: manipulation;
}
.chat-input-bar button:hover { background: var(--blue); color: var(--bg-deep); }

/* ── HTMX Loading ────────────────────────────────── */
.htmx-indicator { display: none; }
.htmx-request .htmx-indicator,
.htmx-request.htmx-indicator { display: inline-block; color: var(--amber); animation: blink 0.8s infinite; }
@keyframes blink { 0%, 100% { opacity: 1; } 50% { opacity: 0.2; } }

/* ── Scrollbar ───────────────────────────────────── */
::-webkit-scrollbar { width: 4px; }
::-webkit-scrollbar-track { background: var(--bg-deep); }
::-webkit-scrollbar-thumb { background: var(--border); border-radius: 2px; }
::-webkit-scrollbar-thumb:hover { background: var(--border-glow); }

/* ════════════════════════════════════════════════════
   MOBILE (≤ 768 px)
   ════════════════════════════════════════════════════ */
@media (max-width: 768px) {
  :root { --header-h: 44px; }

  /* Compact header */
  .mc-header { padding: 10px 16px; padding-top: max(10px, env(safe-area-inset-top)); }
  .mc-title { font-size: 14px; letter-spacing: 0.1em; }
  .mc-subtitle { display: none; }
  .mc-time { font-size: 12px; }

  /* Single-column stack; sidebar on top, chat below */
  .mc-main {
    grid-template-columns: 1fr;
    grid-template-rows: auto minmax(0, 1fr);
    padding: 8px;
    gap: 8px;
    height: calc(100dvh - var(--header-h));
  }

  /* Sidebar becomes a horizontal scroll strip */
  .sidebar {
    grid-column: 1;
    grid-row: 1;
    flex-direction: row;
    overflow-x: auto;
    overflow-y: hidden;
    gap: 8px;
    flex-shrink: 0;
    scrollbar-width: none; /* Firefox */
    -webkit-overflow-scrolling: touch;
  }
  .sidebar::-webkit-scrollbar { display: none; }

  /* Each panel card has a fixed width so they don't squash */
  .sidebar .panel {
    min-width: 200px;
    flex-shrink: 0;
  }

  /* Chat fills remaining vertical space */
  .chat-panel {
    grid-column: 1;
    grid-row: 2;
    min-height: 0;
  }

  /* Tighter message padding */
  .chat-log { padding: 10px; }
  .msg-body { padding: 8px 10px; font-size: 13px; }
  .chat-message { margin-bottom: 12px; }

  /* Touch-friendly input bar */
  .chat-input-bar {
    padding: 8px 10px;
    padding-bottom: max(8px, env(safe-area-inset-bottom));
    gap: 6px;
  }
  .chat-input-bar input {
    /* 16px prevents iOS from zooming when the field focuses */
    font-size: 16px;
    min-height: 44px;
    padding: 0 12px;
  }
  .chat-input-bar button {
    min-height: 44px;
    min-width: 64px;
    font-size: 12px;
    padding: 0 14px;
  }
}

0
tests/__init__.py Normal file
View File

25
tests/conftest.py Normal file
View File

@@ -0,0 +1,25 @@
import sys
from pathlib import Path
from unittest.mock import MagicMock

import pytest
from fastapi.testclient import TestClient

# ── Mock agno so tests run without it installed ───────────────────────────────
# Uses setdefault: real module is used if installed, mock otherwise.
for _mod in [
    "agno",
    "agno.agent",
    "agno.models",
    "agno.models.ollama",
    "agno.db",
    "agno.db.sqlite",
]:
    sys.modules.setdefault(_mod, MagicMock())


@pytest.fixture
def client():
    from dashboard.app import app

    with TestClient(app) as c:
        yield c

79
tests/test_agent.py Normal file
View File

@@ -0,0 +1,79 @@
from unittest.mock import MagicMock, patch


def test_create_timmy_returns_agent():
    """create_timmy should delegate to Agno Agent with correct config."""
    with patch("timmy.agent.Agent") as MockAgent, \
         patch("timmy.agent.Ollama"), \
         patch("timmy.agent.SqliteDb"):
        mock_instance = MagicMock()
        MockAgent.return_value = mock_instance
        from timmy.agent import create_timmy
        result = create_timmy()
        assert result is mock_instance
        MockAgent.assert_called_once()


def test_create_timmy_agent_name():
    with patch("timmy.agent.Agent") as MockAgent, \
         patch("timmy.agent.Ollama"), \
         patch("timmy.agent.SqliteDb"):
        from timmy.agent import create_timmy
        create_timmy()
        kwargs = MockAgent.call_args.kwargs
        assert kwargs["name"] == "Timmy"


def test_create_timmy_uses_llama32():
    with patch("timmy.agent.Agent"), \
         patch("timmy.agent.Ollama") as MockOllama, \
         patch("timmy.agent.SqliteDb"):
        from timmy.agent import create_timmy
        create_timmy()
        MockOllama.assert_called_once_with(id="llama3.2")


def test_create_timmy_history_config():
    with patch("timmy.agent.Agent") as MockAgent, \
         patch("timmy.agent.Ollama"), \
         patch("timmy.agent.SqliteDb"):
        from timmy.agent import create_timmy
        create_timmy()
        kwargs = MockAgent.call_args.kwargs
        assert kwargs["add_history_to_context"] is True
        assert kwargs["num_history_runs"] == 10
        assert kwargs["markdown"] is True


def test_create_timmy_custom_db_file():
    with patch("timmy.agent.Agent"), \
         patch("timmy.agent.Ollama"), \
         patch("timmy.agent.SqliteDb") as MockDb:
        from timmy.agent import create_timmy
        create_timmy(db_file="custom.db")
        MockDb.assert_called_once_with(db_file="custom.db")


def test_create_timmy_embeds_system_prompt():
    from timmy.prompts import TIMMY_SYSTEM_PROMPT

    with patch("timmy.agent.Agent") as MockAgent, \
         patch("timmy.agent.Ollama"), \
         patch("timmy.agent.SqliteDb"):
        from timmy.agent import create_timmy
        create_timmy()
        kwargs = MockAgent.call_args.kwargs
        assert kwargs["description"] == TIMMY_SYSTEM_PROMPT

110
tests/test_dashboard.py Normal file
View File

@@ -0,0 +1,110 @@
from unittest.mock import AsyncMock, MagicMock, patch

# ── Index ─────────────────────────────────────────────────────────────────────

def test_index_returns_200(client):
    response = client.get("/")
    assert response.status_code == 200


def test_index_contains_title(client):
    response = client.get("/")
    assert "TIMMY TIME" in response.text


def test_index_contains_chat_interface(client):
    response = client.get("/")
    assert "TIMMY INTERFACE" in response.text


# ── Health ────────────────────────────────────────────────────────────────────

def test_health_endpoint_ok(client):
    with patch("dashboard.routes.health.check_ollama", new_callable=AsyncMock, return_value=True):
        response = client.get("/health")
        assert response.status_code == 200
        data = response.json()
        assert data["status"] == "ok"
        assert data["services"]["ollama"] == "up"
        assert "timmy" in data["agents"]


def test_health_endpoint_ollama_down(client):
    with patch("dashboard.routes.health.check_ollama", new_callable=AsyncMock, return_value=False):
        response = client.get("/health")
        assert response.status_code == 200
        assert response.json()["services"]["ollama"] == "down"


def test_health_status_panel_ollama_up(client):
    with patch("dashboard.routes.health.check_ollama", new_callable=AsyncMock, return_value=True):
        response = client.get("/health/status")
        assert response.status_code == 200
        assert "UP" in response.text


def test_health_status_panel_ollama_down(client):
    with patch("dashboard.routes.health.check_ollama", new_callable=AsyncMock, return_value=False):
        response = client.get("/health/status")
        assert response.status_code == 200
        assert "DOWN" in response.text


# ── Agents ────────────────────────────────────────────────────────────────────

def test_agents_list(client):
    response = client.get("/agents")
    assert response.status_code == 200
    data = response.json()
    assert "agents" in data
    ids = [a["id"] for a in data["agents"]]
    assert "timmy" in ids


def test_agents_list_timmy_metadata(client):
    response = client.get("/agents")
    timmy = next(a for a in response.json()["agents"] if a["id"] == "timmy")
    assert timmy["name"] == "Timmy"
    assert timmy["model"] == "llama3.2"
    assert timmy["type"] == "sovereign"


# ── Chat ──────────────────────────────────────────────────────────────────────

def test_chat_timmy_success(client):
    mock_agent = MagicMock()
    mock_run = MagicMock()
    mock_run.content = "I am Timmy, operational and sovereign."
    mock_agent.run.return_value = mock_run
    with patch("dashboard.routes.agents.create_timmy", return_value=mock_agent):
        response = client.post("/agents/timmy/chat", data={"message": "status?"})
        assert response.status_code == 200
        assert "status?" in response.text
        assert "I am Timmy" in response.text


def test_chat_timmy_shows_user_message(client):
    mock_agent = MagicMock()
    mock_agent.run.return_value = MagicMock(content="Acknowledged.")
    with patch("dashboard.routes.agents.create_timmy", return_value=mock_agent):
        response = client.post("/agents/timmy/chat", data={"message": "hello there"})
        assert "hello there" in response.text


def test_chat_timmy_ollama_offline(client):
    with patch("dashboard.routes.agents.create_timmy", side_effect=Exception("connection refused")):
        response = client.post("/agents/timmy/chat", data={"message": "ping"})
        assert response.status_code == 200
        assert "Timmy is offline" in response.text
        assert "ping" in response.text


def test_chat_timmy_requires_message(client):
    response = client.post("/agents/timmy/chat", data={})
    assert response.status_code == 422

33
tests/test_prompts.py Normal file
View File

@@ -0,0 +1,33 @@
from timmy.prompts import TIMMY_SYSTEM_PROMPT, TIMMY_STATUS_PROMPT


def test_system_prompt_not_empty():
    assert TIMMY_SYSTEM_PROMPT.strip()


def test_system_prompt_has_timmy_identity():
    assert "Timmy" in TIMMY_SYSTEM_PROMPT


def test_system_prompt_mentions_sovereignty():
    assert "sovereignty" in TIMMY_SYSTEM_PROMPT.lower()


def test_system_prompt_references_local():
    assert "local" in TIMMY_SYSTEM_PROMPT.lower()


def test_system_prompt_is_multiline():
    assert "\n" in TIMMY_SYSTEM_PROMPT


def test_status_prompt_not_empty():
    assert TIMMY_STATUS_PROMPT.strip()


def test_status_prompt_has_timmy():
    assert "Timmy" in TIMMY_STATUS_PROMPT


def test_prompts_are_distinct():
    assert TIMMY_SYSTEM_PROMPT != TIMMY_STATUS_PROMPT