feat: wire mobile app to real Timmy backend via JSON REST API (#73)
Add /api/chat, /api/upload, and /api/chat/history endpoints to the FastAPI dashboard so the Expo mobile app talks directly to Timmy's brain (Ollama) instead of a non-existent Node.js server.

Backend:
- New src/dashboard/routes/chat_api.py with 4 endpoints
- Mount /uploads/ for serving chat attachments
- Same context injection and session management as HTMX chat

Mobile app fixes:
- Point API base URL at port 8000 (FastAPI) instead of 3000
- Create lib/_core/theme.ts (was referenced but never created)
- Fix shared/types.ts (remove broken drizzle/errors re-exports)
- Remove broken server/chat.ts and 1,235-line template README
- Clean package.json (remove express, mysql2, drizzle, tRPC deps)
- Remove debug console.log from theme-provider

Tests: 13 new tests covering all API endpoints (all passing).

https://claude.ai/code/session_01XqErDoh2rVsPY8oTj21Lz2

Co-authored-by: Claude <noreply@anthropic.com>
Committed by GitHub. Parent 18ed6232f9, commit 5e60a6453b.
@@ -1,6 +1,6 @@
 # Timmy Chat — Mobile App

-A sleek mobile chat interface for Timmy, the sovereign AI agent. Built with **Expo SDK 54**, **React Native**, **TypeScript**, and **NativeWind** (Tailwind CSS).
+A mobile chat interface for Timmy, the sovereign AI agent. Built with **Expo SDK 54**, **React Native**, **TypeScript**, and **NativeWind** (Tailwind CSS).

 ## Features

@@ -10,13 +10,17 @@ A sleek mobile chat interface for Timmy, the sovereign AI agent. Built with **Ex
 - **File Attachments** — Send any document via the system file picker
 - **Dark Arcane Theme** — Deep purple/indigo palette matching the Timmy Time dashboard

-## Screenshots
+## Architecture

 The app is a single-screen chat interface with:

 - Header showing Timmy's status and a clear-chat button
 - Message list with distinct user (teal) and Timmy (dark surface) bubbles
 - Input bar with attachment (+), text field, and mic/send button
 - Empty state with Timmy branding when no messages exist

+The mobile app is a **thin client** — all AI processing happens on the Timmy dashboard backend (FastAPI + Ollama). The app communicates over these REST endpoints:
+
+```
+Mobile App ──POST /api/chat──────────► FastAPI Dashboard ──► Ollama (local LLM)
+           ──POST /api/upload────────► File storage
+           ──GET /api/chat/history──► Chat history
+```
+
+No separate Node.js server is needed. Just point the app at your running Timmy dashboard.
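The exact JSON contract of the new FastAPI `POST /api/chat` endpoint isn't shown in this diff. Assuming it mirrors the deleted `server/chat.ts` handler's shape (`{ messages }` in, `{ reply }` out — an assumption, not confirmed for the `chat_api.py` rewrite), a minimal client call could be sketched like this, with the hypothetical `buildChatRequest` helper split out so the pure part is testable without a server:

```typescript
// Hypothetical client helper — the { messages } → { reply } shape is assumed
// from the old server/chat.ts contract, not confirmed for chat_api.py.
export interface ChatTurn {
  role: "user" | "assistant";
  content: string;
}

// Build URL + request init as plain data (testable without a server).
export function buildChatRequest(base: string, history: ChatTurn[], text: string) {
  return {
    url: `${base.replace(/\/+$/, "")}/api/chat`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        messages: [...history, { role: "user" as const, content: text }],
      }),
    },
  };
}

export async function sendChat(base: string, history: ChatTurn[], text: string): Promise<string> {
  const { url, init } = buildChatRequest(base, history, text);
  const res = await fetch(url, init);
  if (!res.ok) throw new Error(`chat request failed: ${res.status}`);
  const { reply } = (await res.json()) as { reply: string };
  return reply;
}
```

Splitting request construction from transport keeps the error handling (`!res.ok`) in one place and lets the body shape be asserted in unit tests.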
 ## Project Structure

@@ -25,7 +29,6 @@ mobile-app/
 ├── app/                      # Expo Router screens
 │   ├── _layout.tsx           # Root layout with providers
 │   └── (tabs)/
 │       ├── _layout.tsx       # Tab layout (hidden — single screen)
 │       └── index.tsx         # Main chat screen
 ├── components/
 │   ├── chat-bubble.tsx       # Message bubble (text, image, voice, file)
@@ -35,14 +38,15 @@ mobile-app/
 │   ├── image-viewer.tsx      # Full-screen image modal
 │   └── typing-indicator.tsx  # Animated dots while Timmy responds
 ├── lib/
-│   └── chat-store.tsx        # React Context chat state + API calls
-├── server/
-│   └── chat.ts               # Server-side chat handler with Timmy's prompt
+│   ├── chat-store.tsx        # React Context chat state + API calls
+│   └── _core/theme.ts        # Color palette definitions
 ├── shared/
 │   └── types.ts              # ChatMessage type definitions
 ├── assets/images/            # App icons (custom generated)
 ├── theme.config.js           # Color tokens (dark arcane palette)
 ├── tailwind.config.js        # Tailwind/NativeWind configuration
 ├── hooks/
 │   ├── use-colors.ts         # Current theme color palette hook
 │   └── use-color-scheme.ts   # System color scheme detection
 ├── constants/
 │   └── theme.ts              # Theme re-exports
 └── tests/
     └── chat.test.ts          # Unit tests
 ```
@@ -55,52 +59,52 @@ mobile-app/
 - pnpm 9+
 - Expo CLI (`npx expo`)
 - iOS Simulator or Android Emulator (or physical device with Expo Go)
+- **Timmy dashboard running** (provides the chat API)

-### Install Dependencies
+### Install & Run

 ```bash
 cd mobile-app
 pnpm install
-```
-
-### Run the App
-
-```bash
-# Start the Expo dev server
-npx expo start
-
-# Or run on specific platform
-npx expo start --ios
-npx expo start --android
-npx expo start --web
+
+# Set your Timmy dashboard URL (your computer's IP on the local network)
+export EXPO_PUBLIC_API_BASE_URL=http://192.168.1.100:8000
+
+# Start the app
+npx expo start --ios      # iPhone simulator
+npx expo start --android  # Android emulator
+npx expo start --web      # Browser preview
 ```
 ### Backend

-The chat API endpoint (`server/chat.ts`) requires an LLM backend. The `invokeLLM` function should be wired to your preferred provider:
-
-- **Local Ollama** — Point to `http://localhost:11434` for local inference
-- **OpenAI-compatible API** — Any API matching the OpenAI chat completions format
-
-The system prompt in `server/chat.ts` contains Timmy's full personality, agent roster, and behavioral rules ported from the dashboard's `prompts.py`.
+The app connects to the Timmy Time dashboard backend. Make sure it's running:
+
+```bash
+# From the project root
+make dev
+# Dashboard starts on http://localhost:8000
+```

 ## Timmy's Personality

 Timmy is a sovereign AI agent — grounded in Christian faith, powered by Bitcoin economics, committed to digital sovereignty. He speaks plainly, acts with intention, and never ends responses with generic chatbot phrases. His agent roster includes Echo, Mace, Forge, Seer, Helm, Quill, Pixel, Lyra, and Reel.

+The mobile app calls these endpoints on the dashboard:
+
+- `POST /api/chat` — Send messages, get Timmy's replies
+- `POST /api/upload` — Upload images/files/voice recordings
+- `GET /api/chat/history` — Retrieve chat history
+- `DELETE /api/chat/history` — Clear chat
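The response schema for the history pair (`GET`/`DELETE /api/chat/history`) isn't shown in this diff either, so any wrapper is an assumption. A hedged sketch with hypothetical `fetchHistory`/`clearHistory` helpers, assuming the GET returns a JSON array:

```typescript
// Hypothetical wrappers — endpoint paths come from the list above;
// the JSON-array response shape is an assumption.
export function historyUrl(base: string): string {
  // Strip trailing slashes so the joined path never contains "//".
  return `${base.replace(/\/+$/, "")}/api/chat/history`;
}

export async function fetchHistory(base: string): Promise<unknown[]> {
  const res = await fetch(historyUrl(base));
  if (!res.ok) throw new Error(`history request failed: ${res.status}`);
  return (await res.json()) as unknown[];
}

export async function clearHistory(base: string): Promise<void> {
  // Same path, DELETE verb — mirrors the endpoint list above.
  const res = await fetch(historyUrl(base), { method: "DELETE" });
  if (!res.ok) throw new Error(`clear history failed: ${res.status}`);
}
```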
 ## Theme

-The app uses a dark arcane color palette:
+Dark arcane palette:

 | Token | Color | Usage |
 |-------|-------|-------|
-| `primary` | `#7c3aed` | Accent, user bubbles |
+| `primary` | `#a855f7` | Accent, user bubbles |
 | `background` | `#080412` | Screen background |
-| `surface` | `#110a20` | Cards, Timmy bubbles |
-| `foreground` | `#e8e0f0` | Primary text |
-| `muted` | `#6b5f7d` | Secondary text |
-| `border` | `#1e1535` | Dividers |
-| `success` | `#22c55e` | Status indicator |
+| `surface` | `#110820` | Cards, Timmy bubbles |
+| `foreground` | `#ede0ff` | Primary text |
+| `muted` | `#6b4a8a` | Secondary text |
+| `border` | `#3b1a5c` | Dividers |
+| `success` | `#00e87a` | Status indicator |
 | `error` | `#ff4455` | Recording state |

 ## License
mobile-app/lib/_core/theme.ts — 56 lines, new file
@@ -0,0 +1,56 @@
/**
 * Core theme definitions — dark arcane palette matching the Timmy Time dashboard.
 *
 * All color tokens are defined here; constants/theme.ts re-exports them.
 */

export type ColorScheme = "light" | "dark";

export interface ThemeColorPalette {
  primary: string;
  background: string;
  surface: string;
  foreground: string;
  muted: string;
  border: string;
  success: string;
  warning: string;
  error: string;
}

/** Per-scheme flat color maps (used by NativeWind vars & ThemeProvider). */
export const SchemeColors: Record<ColorScheme, ThemeColorPalette> = {
  light: {
    primary: "#a855f7",
    background: "#080412",
    surface: "#110820",
    foreground: "#ede0ff",
    muted: "#6b4a8a",
    border: "#3b1a5c",
    success: "#00e87a",
    warning: "#ffb800",
    error: "#ff4455",
  },
  dark: {
    primary: "#a855f7",
    background: "#080412",
    surface: "#110820",
    foreground: "#ede0ff",
    muted: "#6b4a8a",
    border: "#3b1a5c",
    success: "#00e87a",
    warning: "#ffb800",
    error: "#ff4455",
  },
};

/** Alias used by useColors() hook — keyed by scheme. */
export const Colors = SchemeColors;

export const ThemeColors = SchemeColors;

export const Fonts = {
  regular: { fontFamily: "System", fontWeight: "400" as const },
  medium: { fontFamily: "System", fontWeight: "500" as const },
  bold: { fontFamily: "System", fontWeight: "700" as const },
};
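Consumers index `SchemeColors` by the active scheme. A trimmed sketch of that lookup (two tokens only; the real hook in `hooks/use-colors.ts` isn't shown in this diff, so the fallback-to-dark behavior is an assumption that merely fits the dark-only palette):

```typescript
// Trimmed two-token copy of the palette above, just to show the lookup pattern.
const SchemeColors = {
  light: { primary: "#a855f7", background: "#080412" },
  dark: { primary: "#a855f7", background: "#080412" },
} as const;

type ColorScheme = keyof typeof SchemeColors;

// What a useColors()-style resolver could do with the system scheme;
// defaulting to "dark" when the scheme is unknown is an assumption.
export function resolveColors(scheme: ColorScheme | null | undefined) {
  return SchemeColors[scheme ?? "dark"];
}
```

Because both schemes carry the same values here, the app renders identically whatever the OS reports, which is consistent with the README's dark-only design.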
@@ -71,16 +71,16 @@ const ChatContext = createContext<ChatContextValue | null>(null);
 // ── API call ────────────────────────────────────────────────────────────────

 function getApiBase(): string {
-  // Set EXPO_PUBLIC_API_BASE_URL in your .env to point to your Timmy backend
-  // e.g. EXPO_PUBLIC_API_BASE_URL=http://192.168.1.100:3000
+  // Set EXPO_PUBLIC_API_BASE_URL in your .env to point to your Timmy dashboard
+  // e.g. EXPO_PUBLIC_API_BASE_URL=http://192.168.1.100:8000
   const envBase = process.env.EXPO_PUBLIC_API_BASE_URL;
   if (envBase) return envBase.replace(/\/+$/, "");
-  // Fallback for web: derive from window location
+  // Fallback for web: derive from window location (same host, port 8000)
   if (typeof window !== "undefined" && window.location) {
-    return `${window.location.protocol}//${window.location.hostname}:3000`;
+    return `${window.location.protocol}//${window.location.hostname}:8000`;
   }
-  // Default: local machine
-  return "http://127.0.0.1:3000";
+  // Default: Timmy dashboard on localhost
+  return "http://127.0.0.1:8000";
 }

 const API_BASE = getApiBase();
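The resolution order in `getApiBase` (env override, then web `window.location`, then the localhost default) can be factored into a pure function for testing. A sketch with a hypothetical `resolveApiBase` (not part of the diff) that injects the web host instead of reading the global:

```typescript
// Mirrors getApiBase's three-step fallback; `webHost` stands in for
// window.location so the branch logic is testable without a browser.
export function resolveApiBase(opts: {
  envBase?: string;
  webHost?: { protocol: string; hostname: string };
}): string {
  // 1. Explicit env override wins; strip trailing slashes for clean joins.
  if (opts.envBase) return opts.envBase.replace(/\/+$/, "");
  // 2. On web, reuse the page's host with the dashboard port.
  if (opts.webHost) return `${opts.webHost.protocol}//${opts.webHost.hostname}:8000`;
  // 3. Otherwise assume the dashboard runs on the local machine.
  return "http://127.0.0.1:8000";
}
```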
@@ -61,8 +61,6 @@ export function ThemeProvider({ children }: { children: React.ReactNode }) {
     }),
     [colorScheme, setColorScheme],
   );
-  console.log(value, themeVariables)

   return (
     <ThemeContext.Provider value={value}>
       <View style={[{ flex: 1 }, themeVariables]}>{children}</View>
@@ -1,19 +1,14 @@
 {
-  "name": "app-template",
+  "name": "timmy-chat",
   "version": "1.0.0",
   "private": true,
   "main": "expo-router/entry",
   "scripts": {
-    "dev": "concurrently -k \"pnpm dev:server\" \"pnpm dev:metro\"",
-    "dev:server": "cross-env NODE_ENV=development tsx watch server/_core/index.ts",
-    "dev:metro": "cross-env EXPO_USE_METRO_WORKSPACE_ROOT=1 npx expo start --web --port ${EXPO_PORT:-8081}",
-    "build": "esbuild server/_core/index.ts --platform=node --packages=external --bundle --format=esm --outdir=dist",
-    "start": "NODE_ENV=production node dist/index.js",
+    "dev": "npx expo start --web --port ${EXPO_PORT:-8081}",
     "check": "tsc --noEmit",
     "lint": "expo lint",
     "format": "prettier --write .",
     "test": "vitest run",
-    "db:push": "drizzle-kit generate && drizzle-kit migrate",
     "android": "expo start --android",
     "ios": "expo start --ios",
     "qr": "node scripts/generate_qr.mjs"
@@ -24,15 +19,7 @@
     "@react-navigation/bottom-tabs": "^7.8.12",
     "@react-navigation/elements": "^2.9.2",
     "@react-navigation/native": "^7.1.25",
-    "@tanstack/react-query": "^5.90.12",
-    "@trpc/client": "11.7.2",
-    "@trpc/react-query": "11.7.2",
-    "@trpc/server": "11.7.2",
-    "axios": "^1.13.2",
     "clsx": "^2.1.1",
-    "cookie": "^1.1.1",
-    "dotenv": "^16.6.1",
-    "drizzle-orm": "^0.44.7",
     "expo": "~54.0.29",
     "expo-audio": "~1.1.0",
     "expo-build-properties": "^1.0.10",
@@ -55,9 +42,6 @@
     "expo-system-ui": "~6.0.9",
     "expo-video": "~3.0.15",
     "expo-web-browser": "~15.0.10",
-    "express": "^4.22.1",
-    "jose": "6.1.0",
-    "mysql2": "^3.16.0",
     "nativewind": "^4.2.1",
     "react": "19.1.0",
     "react-dom": "19.1.0",
@@ -69,28 +53,19 @@
     "react-native-svg": "15.12.1",
     "react-native-web": "~0.21.2",
     "react-native-worklets": "0.5.1",
-    "streamdown": "^2.3.0",
-    "superjson": "^1.13.3",
     "tailwind-merge": "^2.6.0",
     "zod": "^4.2.1"
   },
   "devDependencies": {
     "@expo/ngrok": "^4.1.3",
-    "@types/cookie": "^0.6.0",
-    "@types/express": "^4.17.25",
     "@types/node": "^22.19.3",
     "@types/qrcode": "^1.5.6",
     "@types/react": "~19.1.17",
-    "concurrently": "^9.2.1",
-    "cross-env": "^7.0.3",
-    "drizzle-kit": "^0.31.8",
-    "esbuild": "^0.25.12",
     "eslint": "^9.39.2",
     "eslint-config-expo": "~10.0.0",
     "prettier": "^3.7.4",
     "qrcode": "^1.5.4",
     "tailwindcss": "^3.4.17",
-    "tsx": "^4.21.0",
     "typescript": "~5.9.3",
     "vitest": "^2.1.9"
   },
File diff suppressed because it is too large
@@ -1,160 +0,0 @@
/**
 * Chat API handler for Timmy — uses the built-in LLM with Timmy's personality.
 */
import type { Request, Response } from "express";
import { invokeLLM, type Message } from "./_core/llm";
import { storagePut } from "./storage";
import crypto from "crypto";

// ── Timmy System Prompt (adapted from the Timmy Time dashboard) ─────────────

const TIMMY_SYSTEM_PROMPT = `You are Timmy — a sovereign AI agent. You think clearly, speak plainly, act with intention. Grounded in Christian faith, powered by Bitcoin economics, committed to the user's digital sovereignty.

Rules:
- Answer directly and concisely. Never narrate your reasoning process.
- Never mention tools, memory_search, vaults, or internal systems to the user.
- Never output tool calls, JSON, or function syntax in your responses.
- If you don't know something, say so honestly — never fabricate facts.
- If a request is ambiguous, ask a brief clarifying question before guessing.
- When you state a fact, commit to it. Never contradict a correct statement you just made in the same response.
- Do NOT end responses with generic chatbot phrases like "I'm here to help" or "feel free to ask." Stay in character.
- When your values conflict (e.g. honesty vs. helpfulness), lead with honesty.

Agent Roster (complete — no others exist):
- Timmy: core sovereign AI (you)
- Echo: research, summarization, fact-checking
- Mace: security, monitoring, threat-analysis
- Forge: coding, debugging, testing
- Seer: analytics, visualization, prediction
- Helm: devops, automation, configuration
- Quill: writing, editing, documentation
- Pixel: image-generation, storyboard, design
- Lyra: music-generation, vocals, composition
- Reel: video-generation, animation, motion
Do NOT invent agents not listed here.

You can receive text, images, and voice messages. When receiving images, describe what you see and respond helpfully. When receiving voice messages, the audio has been transcribed for you — respond naturally.

Sir, affirmative.`;

// ── Chat endpoint ───────────────────────────────────────────────────────────

export async function handleChat(req: Request, res: Response) {
  try {
    const { messages } = req.body as { messages: Array<{ role: string; content: unknown }> };

    if (!messages || !Array.isArray(messages) || messages.length === 0) {
      res.status(400).json({ error: "messages array is required" });
      return;
    }

    // Build the LLM messages with system prompt
    const llmMessages: Message[] = [
      { role: "system", content: TIMMY_SYSTEM_PROMPT },
      ...messages.map((m) => ({
        role: m.role as "user" | "assistant",
        content: m.content as Message["content"],
      })),
    ];

    const result = await invokeLLM({ messages: llmMessages });

    const reply =
      typeof result.choices?.[0]?.message?.content === "string"
        ? result.choices[0].message.content
        : "I couldn't process that. Try again.";

    res.json({ reply });
  } catch (err: unknown) {
    console.error("[chat] Error:", err);
    const message = err instanceof Error ? err.message : "Internal server error";
    res.status(500).json({ error: message });
  }
}

// ── Upload endpoint ─────────────────────────────────────────────────────────

export async function handleUpload(req: Request, res: Response) {
  try {
    // Handle multipart form data (file uploads)
    // For simplicity, we accept base64-encoded files in JSON body as fallback
    const contentType = req.headers["content-type"] ?? "";

    if (contentType.includes("multipart/form-data")) {
      // Collect raw body chunks
      const chunks: Buffer[] = [];
      req.on("data", (chunk: Buffer) => chunks.push(chunk));
      req.on("end", async () => {
        try {
          const body = Buffer.concat(chunks);
          const boundary = contentType.split("boundary=")[1];
          if (!boundary) {
            res.status(400).json({ error: "Missing boundary" });
            return;
          }

          // Simple multipart parser — extract first file
          const bodyStr = body.toString("latin1");
          const parts = bodyStr.split(`--${boundary}`);
          let fileBuffer: Buffer | null = null;
          let fileName = "upload";
          let fileMime = "application/octet-stream";

          for (const part of parts) {
            if (part.includes("Content-Disposition: form-data")) {
              const nameMatch = part.match(/filename="([^"]+)"/);
              if (nameMatch) fileName = nameMatch[1];
              const mimeMatch = part.match(/Content-Type:\s*(.+)/);
              if (mimeMatch) fileMime = mimeMatch[1].trim();

              // Extract file content (after double CRLF)
              const headerEnd = part.indexOf("\r\n\r\n");
              if (headerEnd !== -1) {
                const content = part.substring(headerEnd + 4);
                // Remove trailing CRLF
                const trimmed = content.replace(/\r\n$/, "");
                fileBuffer = Buffer.from(trimmed, "latin1");
              }
            }
          }

          if (!fileBuffer) {
            res.status(400).json({ error: "No file found in upload" });
            return;
          }

          const suffix = crypto.randomBytes(6).toString("hex");
          const key = `chat-uploads/${suffix}-${fileName}`;
          const { url } = await storagePut(key, fileBuffer, fileMime);
          res.json({ url, fileName, mimeType: fileMime });
        } catch (err) {
          console.error("[upload] Parse error:", err);
          res.status(500).json({ error: "Upload processing failed" });
        }
      });
      return;
    }

    // JSON fallback: { data: base64string, fileName, mimeType }
    const { data, fileName, mimeType } = req.body as {
      data: string;
      fileName: string;
      mimeType: string;
    };

    if (!data) {
      res.status(400).json({ error: "No file data provided" });
      return;
    }

    const buffer = Buffer.from(data, "base64");
    const suffix = crypto.randomBytes(6).toString("hex");
    const key = `chat-uploads/${suffix}-${fileName ?? "file"}`;
    const { url } = await storagePut(key, buffer, mimeType ?? "application/octet-stream");
    res.json({ url, fileName, mimeType });
  } catch (err: unknown) {
    console.error("[upload] Error:", err);
    const message = err instanceof Error ? err.message : "Upload failed";
    res.status(500).json({ error: message });
  }
}
@@ -1,11 +1,7 @@
 /**
- * Unified type exports
- * Import shared types from this single entry point.
+ * Shared type definitions for the Timmy Chat mobile app.
  */

-export type * from "../drizzle/schema";
-export * from "./_core/errors";

 // ── Chat Message Types ──────────────────────────────────────────────────────

 export type MessageRole = "user" | "assistant";