A local-first dashboard for your sovereign AI agents. Talk to Timmy, watch his status, verify Ollama is running — all from a browser, no cloud required.
This prints something like `192.168.1.42`. If your Mac is on Ethernet instead of Wi-Fi, try `en1`.
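If you'd rather find the address programmatically (say, to print the dashboard URL from a script), the same lookup can be sketched in Python. This is a best-effort illustration, not part of the project's code — it asks the OS which interface it would route LAN traffic through:

```python
import socket

def local_ip() -> str:
    """Best-effort guess at this machine's LAN address."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # Connecting a UDP socket sends no packets; it just makes the OS
        # pick an outbound interface, whose address we can read back.
        s.connect(("192.168.1.1", 80))
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # no network: fall back to loopback
    finally:
        s.close()

print(f"http://{local_ip()}:8000")
```

Unlike `ipconfig getifaddr`, this works the same whether you're on Wi-Fi or Ethernet, since the OS chooses the active interface for you.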
**Step 3 — open on your phone:**
Make sure your phone is on the **same Wi-Fi network** as your Mac, then open:
```
http://192.168.1.42:8000
```
(replacing `192.168.1.42` with the IP address from the previous step)
On mobile the layout switches to a single column — status panels become a horizontal scroll strip at the top, and chat fills the rest of the screen. The input field is sized to keep iOS from auto-zooming when you tap it (Safari zooms in on any input whose font size is below 16px).
**`ollama: command not found`** — Ollama isn't installed or isn't on your PATH. Install it via Homebrew (`brew install ollama`) or the .dmg from ollama.com.
**`connection refused` in the chat** — Ollama isn't running. Open a terminal and run `ollama serve`, then try again.
**`ModuleNotFoundError: No module named 'dashboard'`** — You're not in the venv or forgot `pip install -e .`. Run `source .venv/bin/activate` then `pip install -e ".[dev]"`.
**Health panel shows DOWN** — Ollama isn't running. The chat still works for testing but will return the offline error message.
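For the last two symptoms, you can confirm whether Ollama is reachable by poking its HTTP API directly — by default it listens on port 11434, and `/api/tags` returns the installed models. A minimal standalone check (not part of the dashboard; the URL assumes the default port):

```python
import json
import urllib.request
from urllib.error import URLError

# Default Ollama endpoint; adjust if you run it on another port.
OLLAMA_URL = "http://localhost:11434/api/tags"

def ollama_status() -> str:
    """Return 'up (N models)' if Ollama responds, else 'down'."""
    try:
        with urllib.request.urlopen(OLLAMA_URL, timeout=2) as resp:
            models = json.load(resp).get("models", [])
            return f"up ({len(models)} models)"
    except (URLError, OSError):
        return "down"

print(ollama_status())
```

If this prints `down`, run `ollama serve` and re-check before blaming the dashboard.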