Claude 46b848a2d7 fix: chat input correctness and mobile UX
- after-request → after-settle: scrollChat() was firing before HTMX
  swapped the new message into the DOM, so the chat log didn't scroll
  to the new message. after-settle fires post-swap, post-settle.

- hx-sync="this:drop": prevents duplicate submissions if the user taps
  SEND a second time while a slow Ollama response is in flight.

- hx-disabled-elt="find button": disables SEND button visually during
  a pending request; paired with hx-sync for belt-and-suspenders.

- autocorrect="off" autocapitalize="none" spellcheck="false": iOS
  autocorrect mangles model names (llama3.2 etc.) and autocapitalize
  uppercases every message's first word. Both are wrong for a terminal-
  style chat interface.

- enterkeyhint="send": tells the iOS/Android soft keyboard to label
  the Return key "Send" instead of the generic return arrow.
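Taken together, these attributes land on the chat form roughly like this (a sketch, not the shipped template; the /chat endpoint, target id, and scrollChat wiring are assumptions):

```html
<form hx-post="/chat"
      hx-target="#chat-log"
      hx-swap="beforeend"
      hx-on::after-settle="scrollChat()"
      hx-sync="this:drop"
      hx-disabled-elt="find button">
  <input name="message"
         autocorrect="off" autocapitalize="none" spellcheck="false"
         enterkeyhint="send">
  <button type="submit">SEND</button>
</form>
```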

https://claude.ai/code/session_01M4L3R98N5fgXFZRvV8X9b6

Timmy Time — Mission Control

A local-first dashboard for your sovereign AI agents. Talk to Timmy, watch his status, verify Ollama is running — all from a browser, no cloud required.


Prerequisites

You need three things on your Mac before anything else:

Python 3.11+

python3 --version   # should be 3.11 or higher

If not: brew install python@3.11

Ollama (runs the local LLM)

brew install ollama

Or download from https://ollama.com

Git — present on every Mac once the Xcode Command Line Tools are installed; running git for the first time prompts you to install them.


Quickstart (copy-paste friendly)

1. Clone the branch

git clone -b claude/run-tests-IYl0F https://github.com/Alexspayne/Timmy-time-dashboard.git
cd Timmy-time-dashboard

2. Create a virtual environment and install

python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"

3. Pull the model (one-time, ~2 GB download)

Open a new terminal tab and run:

ollama serve

Back in your first tab:

ollama pull llama3.2

4. Start the dashboard

uvicorn dashboard.app:app --reload

Open your browser to http://localhost:8000


Access from your phone

The dashboard is mobile-optimized. To open it on your phone:

Step 1 — bind to your local network (instead of just localhost):

uvicorn dashboard.app:app --host 0.0.0.0 --port 8000 --reload

Step 2 — find your Mac's IP address:

ipconfig getifaddr en0

This prints something like 192.168.1.42. If you're on Ethernet instead of Wi-Fi, try en1 instead of en0.

Step 3 — open on your phone:

Make sure your phone is on the same Wi-Fi network as your Mac, then open:

http://192.168.1.42:8000

(replace with your actual IP)
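If you'd rather not hunt for the right interface, a short Python sketch (a hypothetical helper, not part of the project) prints the same URL by asking the OS which interface routes outbound traffic:

```python
import socket

def lan_url(port: int = 8000) -> str:
    """Best-effort LAN URL; same idea as `ipconfig getifaddr en0`."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # connect() on a UDP socket sends no packets; it just
        # selects the interface the OS would use for outbound traffic
        s.connect(("8.8.8.8", 80))
        ip = s.getsockname()[0]
    except OSError:
        ip = "127.0.0.1"  # offline fallback
    finally:
        s.close()
    return f"http://{ip}:{port}"

print(lan_url())
```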

On mobile the layout switches to a single column — status panels become a horizontal scroll strip at the top, chat fills the rest of the screen. The input field uses a 16 px font size, which keeps iOS Safari from auto-zooming when you tap it.


What you'll see

The dashboard has two panels on the left and a chat window on the right:

  • AGENTS — Timmy's metadata (model, type, version)
  • SYSTEM HEALTH — live Ollama status, auto-refreshes every 30 seconds
  • TIMMY INTERFACE — type a message, hit SEND, get a response from the local LLM

If Ollama isn't running when you send a message, the chat will show a "Timmy is offline" error instead of crashing.
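The real handler lives in the dashboard app, but the fallback can be sketched like this (function names are hypothetical; only Ollama's default port, 11434, is a given):

```python
import urllib.request
import urllib.error

OLLAMA_URL = "http://localhost:11434"  # Ollama's default port

def ollama_up(url: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Any HTTP 200 from Ollama's root endpoint counts as 'up'."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

def chat_reply(message: str, send) -> str:
    """Return the model's reply, or the offline notice instead of crashing."""
    if not ollama_up():
        return "Timmy is offline"
    return send(message)
```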


Run the tests

No Ollama needed — all external calls are mocked.

pytest

Expected output (the timing will vary):

27 passed in 0.67s

Optional: CLI

With your venv active:

timmy chat "What is sovereignty?"
timmy think "Bitcoin and self-custody"
timmy status

Project layout

src/
  timmy/          # Timmy agent — wraps Agno (soul = prompt, body = Agno)
  dashboard/      # FastAPI app + routes + Jinja2 templates
static/           # CSS (dark mission-control theme)
tests/            # 27 pytest tests
pyproject.toml    # dependencies and build config

Troubleshooting

ollama: command not found — Ollama isn't installed or isn't on your PATH. Install via Homebrew or the .dmg from ollama.com.

connection refused in the chat — Ollama isn't running. Open a terminal and run ollama serve, then try again.

ModuleNotFoundError: No module named 'dashboard' — You're not in the venv or skipped the editable install. Run source .venv/bin/activate, then pip install -e ".[dev]".

Health panel shows DOWN — Ollama isn't running. The chat still works for testing but will return the offline error message.
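Most of the failures above are environmental, so a small preflight script (hypothetical, not shipped with the repo) can check them in one pass:

```python
import importlib.util
import socket
import sys

def check(name: str, ok: bool, hint: str) -> bool:
    """Print one OK/FAIL line, with a fix hint on failure."""
    print(f"[{'OK' if ok else 'FAIL'}] {name}" + ("" if ok else f" -> {hint}"))
    return ok

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    with socket.socket() as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

def preflight() -> bool:
    return all([
        check("Python 3.11+", sys.version_info >= (3, 11),
              "brew install python@3.11"),
        check("dashboard package importable",
              importlib.util.find_spec("dashboard") is not None,
              'source .venv/bin/activate && pip install -e ".[dev]"'),
        check("Ollama listening on 11434", port_open("127.0.0.1", 11434),
              "run `ollama serve` in another terminal"),
    ])

if __name__ == "__main__":
    sys.exit(0 if preflight() else 1)
```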


Roadmap

Version   Name         Milestone
1.0.0     Genesis      Agno + Ollama + SQLite + Dashboard
2.0.0     Exodus       MCP tools + multi-agent
3.0.0     Revelation   Bitcoin Lightning treasury + single .app
Note: [LEGACY - FROZEN] Original Timmy prototype, superseded by hermes-agent + the-nexus. Do not create new issues or PRs.