forked from Rockachopa/Timmy-time-dashboard
docs: rewrite README as human-friendly Mac quickstart
Step-by-step clone → venv → install → Ollama → run sequence, what to expect in the browser, troubleshooting section for common Mac failure modes. https://claude.ai/code/session_01M4L3R98N5fgXFZRvV8X9b6
# Timmy Time — Mission Control
A local-first dashboard for your sovereign AI agents. Talk to Timmy, watch his status, verify Ollama is running — all from a browser, no cloud required.

---

## Stack

| Layer    | Tech                     |
|----------|--------------------------|
| Agent    | Agno + Ollama (llama3.2) |
| Memory   | SQLite via Agno SqliteDb |
| Backend  | FastAPI                  |
| Frontend | HTMX + Jinja2            |
| Tests    | Pytest                   |

## Prerequisites
You need three things on your Mac before anything else:

**Python 3.11+**

```bash
python3 --version   # should be 3.11 or higher
```

If not: `brew install python@3.11`
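
If you'd rather have a pass/fail answer than read a version string yourself, this optional one-liner (a convenience sketch, not part of the project) checks the same requirement:

```bash
# Exit 0 only when the interpreter is 3.11 or newer; print a hint otherwise
if python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 11) else 1)'; then
  echo "Python version OK"
else
  echo "Python too old - run: brew install python@3.11"
fi
```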

**Ollama** (runs the local LLM)

```bash
brew install ollama
```

Or download from https://ollama.com

**Git** — already on every Mac.

---

## Quickstart (copy-paste friendly)
### 1. Clone the branch

```bash
git clone -b claude/run-tests-IYl0F https://github.com/Alexspayne/Timmy-time-dashboard.git
cd Timmy-time-dashboard
```

### 2. Create a virtual environment and install

```bash
python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
```

### 3. Pull the model (one-time, ~2 GB download)
Open a **new terminal tab** and run:

```bash
ollama serve
```

Back in your first tab:

```bash
ollama pull llama3.2
```
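
Before moving on, you can confirm the server is actually answering. This optional probe (a sketch, not part of the project) hits `/api/tags`, Ollama's model-listing endpoint on its default port, and prints a hint if nothing responds:

```bash
# Ollama's HTTP API listens on port 11434 by default
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "Ollama is up"
else
  echo "Ollama is not reachable - run: ollama serve"
fi
```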
### 4. Start the dashboard

```bash
uvicorn dashboard.app:app --reload
```

Open your browser to **http://localhost:8000**
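
Prefer a terminal check? This optional probe (assuming the dashboard serves its UI at the root path `/`) tells you whether uvicorn is listening:

```bash
# Check that something is answering on port 8000
if curl -fsS http://localhost:8000/ >/dev/null 2>&1; then
  echo "dashboard is up"
else
  echo "dashboard not reachable - is uvicorn running?"
fi
```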

---

## What you'll see
The dashboard has two panels on the left and a chat window on the right:

- **AGENTS** — Timmy's metadata (model, type, version)
- **SYSTEM HEALTH** — live Ollama status, auto-refreshes every 30 seconds
- **TIMMY INTERFACE** — type a message, hit SEND, get a response from the local LLM

If Ollama isn't running when you send a message, the chat will show a "Timmy is offline" error instead of crashing.

---

## Run the tests
No Ollama needed — all external calls are mocked.

```bash
pytest
```

Expected output:

```
27 passed in 0.67s
```

---
## Optional: CLI
With your venv active:

```bash
timmy chat "What is sovereignty?"
timmy think "Bitcoin and self-custody"
timmy status
```


---

## Project layout

```
timmy/           # Timmy agent — wraps Agno (soul = prompt, body = Agno)
dashboard/       # FastAPI app + routes + Jinja2 templates
static/          # CSS (dark mission-control theme)
tests/           # 27 pytest tests
pyproject.toml   # dependencies and build config
```

---

## Troubleshooting
**`ollama: command not found`** — Ollama isn't installed or isn't on your PATH. Install via Homebrew or the .dmg from ollama.com.
**`connection refused` in the chat** — Ollama isn't running. Open a terminal and run `ollama serve`, then try again.
**`ModuleNotFoundError: No module named 'dashboard'`** — You're not in the venv or forgot `pip install -e .`. Run `source .venv/bin/activate` then `pip install -e ".[dev]"`.
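
One quick way to check the venv half of this: the `activate` script exports `VIRTUAL_ENV`, so this optional snippet (not part of the project) reports whether any venv is active:

```bash
# activate exports VIRTUAL_ENV; an empty value means no venv is active
if [ -n "${VIRTUAL_ENV:-}" ]; then
  echo "venv active: $VIRTUAL_ENV"
else
  echo "no venv active - run: source .venv/bin/activate"
fi
```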
**Health panel shows DOWN** — Ollama isn't running. The chat still works for testing but will return the offline error message.

---

## Roadmap
| Version | Name | Milestone |
|---------|------|-----------|