feat: add Timmy Chat mobile app (Expo/React Native)
- Single-screen chat interface with Timmy's sovereign AI personality
- Text messaging with real-time AI responses via server chat API
- Voice recording and playback with waveform visualization
- Image sharing (camera + photo library) with full-screen viewer
- File attachments via document picker
- Dark arcane theme matching the Timmy Time dashboard
- Custom app icon with glowing T circuit design
- Timmy system prompt ported from dashboard prompts.py
- Unit tests for chat utilities and message types
108
mobile-app/README.md
Normal file
@@ -0,0 +1,108 @@
# Timmy Chat — Mobile App

A sleek mobile chat interface for Timmy, the sovereign AI agent. Built with **Expo SDK 54**, **React Native**, **TypeScript**, and **NativeWind** (Tailwind CSS).

## Features

- **Text Chat** — Send and receive messages with Timmy's full personality
- **Voice Messages** — Record and send voice notes via the mic button; playback with waveform UI
- **Image Sharing** — Take photos or pick from the library; full-screen image viewer
- **File Attachments** — Send any document via the system file picker
- **Dark Arcane Theme** — Deep purple/indigo palette matching the Timmy Time dashboard

## Screenshots

The app is a single-screen chat interface with:

- Header showing Timmy's status and a clear-chat button
- Message list with distinct user (violet) and Timmy (dark surface) bubbles
- Input bar with attachment (+), text field, and mic/send button
- Empty state with Timmy branding when no messages exist

## Project Structure

```
mobile-app/
├── app/                     # Expo Router screens
│   ├── _layout.tsx          # Root layout with providers
│   └── (tabs)/
│       ├── _layout.tsx      # Tab layout (hidden — single screen)
│       └── index.tsx        # Main chat screen
├── components/
│   ├── chat-bubble.tsx      # Message bubble (text, image, voice, file)
│   ├── chat-header.tsx      # Header with Timmy status
│   ├── chat-input.tsx       # Input bar (text, mic, attachments)
│   ├── empty-chat.tsx       # Empty state welcome screen
│   ├── image-viewer.tsx     # Full-screen image modal
│   └── typing-indicator.tsx # Animated dots while Timmy responds
├── lib/
│   └── chat-store.tsx       # React Context chat state + API calls
├── server/
│   └── chat.ts              # Server-side chat handler with Timmy's prompt
├── shared/
│   └── types.ts             # ChatMessage type definitions
├── assets/images/           # App icons (custom generated)
├── theme.config.js          # Color tokens (dark arcane palette)
├── tailwind.config.js       # Tailwind/NativeWind configuration
└── tests/
    └── chat.test.ts         # Unit tests
```
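
The `shared/types.ts` entry above holds the `ChatMessage` definition. As a hedged sketch, inferred from the fields the components read (the actual file may differ), it could look like this; `makeTextMessage` is an illustrative helper, not from the repo:

```typescript
// Hypothetical shape of shared/types.ts, reconstructed from component usage.
export type ChatRole = "user" | "assistant";
export type ChatContentType = "text" | "image" | "voice" | "file";

export interface ChatMessage {
  id: string;
  role: ChatRole;
  contentType: ChatContentType;
  timestamp: number; // epoch millis, formatted via toLocaleTimeString
  text?: string;     // text body, or caption for an image
  uri?: string;      // local/remote URI for image, voice, or file messages
  fileName?: string;
  fileSize?: number; // bytes, rendered via formatBytes
  mimeType?: string;
  duration?: number; // seconds, rendered via formatDuration
}

// Illustrative helper: construct a plain text message.
export function makeTextMessage(text: string, role: ChatRole = "user"): ChatMessage {
  return {
    id: Math.random().toString(36).slice(2),
    role,
    contentType: "text",
    timestamp: Date.now(),
    text,
  };
}
```

All non-text fields are optional so one type covers every bubble variant.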

## Setup

### Prerequisites

- Node.js 18+
- pnpm 9+
- Expo CLI (`npx expo`)
- iOS Simulator or Android Emulator (or a physical device with Expo Go)

### Install Dependencies

```bash
cd mobile-app
pnpm install
```

### Run the App

```bash
# Start the Expo dev server
npx expo start

# Or run on a specific platform
npx expo start --ios
npx expo start --android
npx expo start --web
```

### Backend

The chat API endpoint (`server/chat.ts`) requires an LLM backend. The `invokeLLM` function should be wired to your preferred provider:

- **Local Ollama** — Point to `http://localhost:11434` for local inference
- **OpenAI-compatible API** — Any API matching the OpenAI chat completions format

The system prompt in `server/chat.ts` contains Timmy's full personality, agent roster, and behavioral rules, ported from the dashboard's `prompts.py`.
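
To make the wiring concrete, here is a minimal sketch of how `invokeLLM` could call an OpenAI-compatible chat completions endpoint (Ollama exposes one under `/v1`). The names `LLM_BASE_URL`, `LLM_MODEL`, and `buildChatRequest` are illustrative assumptions, not code from this commit:

```typescript
type LlmMessage = { role: "system" | "user" | "assistant"; content: string };

// Assumed defaults; point these at your provider of choice.
const LLM_BASE_URL = "http://localhost:11434/v1";
const LLM_MODEL = "llama3.1";

// Pure helper: prepend the system prompt to the conversation history.
export function buildChatRequest(systemPrompt: string, history: LlmMessage[]) {
  return {
    model: LLM_MODEL,
    messages: [{ role: "system", content: systemPrompt } as LlmMessage, ...history],
    stream: false,
  };
}

// Sketch of invokeLLM: POST to the completions endpoint and return the reply text.
export async function invokeLLM(systemPrompt: string, history: LlmMessage[]): Promise<string> {
  const res = await fetch(`${LLM_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(systemPrompt, history)),
  });
  if (!res.ok) throw new Error(`LLM request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Swapping providers then only means changing the base URL and model name.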

## Timmy's Personality

Timmy is a sovereign AI agent — grounded in Christian faith, powered by Bitcoin economics, and committed to digital sovereignty. He speaks plainly, acts with intention, and never ends responses with generic chatbot phrases. His agent roster includes Echo, Mace, Forge, Seer, Helm, Quill, Pixel, Lyra, and Reel.

## Theme

The app uses a dark arcane color palette:

| Token | Color | Usage |
|--------------|-----------|----------------------|
| `primary`    | `#7c3aed` | Accent, user bubbles |
| `background` | `#080412` | Screen background    |
| `surface`    | `#110a20` | Cards, Timmy bubbles |
| `foreground` | `#e8e0f0` | Primary text         |
| `muted`      | `#6b5f7d` | Secondary text       |
| `border`     | `#1e1535` | Dividers             |
| `success`    | `#22c55e` | Status indicator     |
| `error`      | `#ff4455` | Recording state      |
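
The same tokens presumably live in `theme.config.js` roughly as follows (shown in module form for illustration; the actual export shape is an assumption, only the hex values come from the table):

```typescript
// Illustrative sketch of the dark arcane token map in theme.config.js.
export const colors = {
  primary: "#7c3aed",    // Accent, user bubbles
  background: "#080412", // Screen background
  surface: "#110a20",    // Cards, Timmy bubbles
  foreground: "#e8e0f0", // Primary text
  muted: "#6b5f7d",      // Secondary text
  border: "#1e1535",     // Dividers
  success: "#22c55e",    // Status indicator
  error: "#ff4455",      // Recording state
};
```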

## License

Same as the parent Timmy Time Dashboard project.
130
mobile-app/app.config.ts
Normal file
@@ -0,0 +1,130 @@
// Load environment variables with proper priority (system > .env)
import "./scripts/load-env.js";
import type { ExpoConfig } from "expo/config";

// Bundle ID format: space.manus.<project_name_dots>.<timestamp>
// e.g., "my-app" created at 2024-01-15 10:30:45 -> "space.manus.my.app.t20240115103045"
// Bundle IDs can only contain letters, numbers, and dots;
// Android additionally requires each dot-separated segment to start with a letter.
const rawBundleId = "space.manus.timmy.chat.t20260226211148";
const bundleId =
  rawBundleId
    .replace(/[-_]/g, ".") // Replace hyphens/underscores with dots
    .replace(/[^a-zA-Z0-9.]/g, "") // Remove invalid chars
    .replace(/\.+/g, ".") // Collapse consecutive dots
    .replace(/^\.+|\.+$/g, "") // Trim leading/trailing dots
    .toLowerCase()
    .split(".")
    .map((segment) => {
      // Android requires each segment to start with a letter;
      // prefix with 'x' if a segment starts with a digit.
      return /^[a-zA-Z]/.test(segment) ? segment : "x" + segment;
    })
    .join(".") || "space.manus.app";

// Extract the timestamp from the bundle ID and prefix it with "manus" for the deep link scheme
// e.g., "space.manus.my.app.t20240115103045" -> "manus20240115103045"
const timestamp = bundleId.split(".").pop()?.replace(/^t/, "") ?? "";
const schemeFromBundleId = `manus${timestamp}`;

const env = {
  // App branding - update these values directly (do not use env vars)
  appName: "Timmy Chat",
  appSlug: "timmy-chat",
  // S3 URL of the app logo - set this to the URL returned by generate_image when creating a custom logo.
  // Leave empty to use the default icon from assets/images/icon.png
  logoUrl: "https://files.manuscdn.com/user_upload_by_module/session_file/310519663286296482/kuSmtQpNVBtvECMG.png",
  scheme: schemeFromBundleId,
  iosBundleId: bundleId,
  androidPackage: bundleId,
};

const config: ExpoConfig = {
  name: env.appName,
  slug: env.appSlug,
  version: "1.0.0",
  orientation: "portrait",
  icon: "./assets/images/icon.png",
  scheme: env.scheme,
  userInterfaceStyle: "automatic",
  newArchEnabled: true,
  ios: {
    supportsTablet: true,
    bundleIdentifier: env.iosBundleId,
    infoPlist: {
      ITSAppUsesNonExemptEncryption: false,
    },
  },
  android: {
    adaptiveIcon: {
      backgroundColor: "#080412",
      foregroundImage: "./assets/images/android-icon-foreground.png",
      backgroundImage: "./assets/images/android-icon-background.png",
      monochromeImage: "./assets/images/android-icon-monochrome.png",
    },
    edgeToEdgeEnabled: true,
    predictiveBackGestureEnabled: false,
    package: env.androidPackage,
    permissions: ["POST_NOTIFICATIONS"],
    intentFilters: [
      {
        action: "VIEW",
        autoVerify: true,
        data: [
          {
            scheme: env.scheme,
            host: "*",
          },
        ],
        category: ["BROWSABLE", "DEFAULT"],
      },
    ],
  },
  web: {
    bundler: "metro",
    output: "static",
    favicon: "./assets/images/favicon.png",
  },
  plugins: [
    "expo-router",
    [
      "expo-audio",
      {
        microphonePermission: "Allow $(PRODUCT_NAME) to access your microphone.",
      },
    ],
    [
      "expo-video",
      {
        supportsBackgroundPlayback: true,
        supportsPictureInPicture: true,
      },
    ],
    [
      "expo-splash-screen",
      {
        image: "./assets/images/splash-icon.png",
        imageWidth: 200,
        resizeMode: "contain",
        backgroundColor: "#080412",
        dark: {
          backgroundColor: "#080412",
        },
      },
    ],
    [
      "expo-build-properties",
      {
        android: {
          buildArchs: ["armeabi-v7a", "arm64-v8a"],
          minSdkVersion: 24,
        },
      },
    ],
  ],
  experiments: {
    typedRoutes: true,
    reactCompiler: true,
  },
};

export default config;
17
mobile-app/app/(tabs)/_layout.tsx
Normal file
@@ -0,0 +1,17 @@
import { Tabs } from "expo-router";

// Tab bar is hidden: the app is a single chat screen, but Expo Router's
// (tabs) group keeps the layout extensible.
export default function TabLayout() {
  return (
    <Tabs
      screenOptions={{
        headerShown: false,
        tabBarStyle: { display: "none" },
      }}
    >
      <Tabs.Screen name="index" options={{ title: "Chat" }} />
    </Tabs>
  );
}
96
mobile-app/app/(tabs)/index.tsx
Normal file
@@ -0,0 +1,96 @@
import { useCallback, useRef, useState } from "react";
import { FlatList, KeyboardAvoidingView, Platform, StyleSheet } from "react-native";
import { ScreenContainer } from "@/components/screen-container";
import { ChatHeader } from "@/components/chat-header";
import { ChatBubble } from "@/components/chat-bubble";
import { ChatInput } from "@/components/chat-input";
import { TypingIndicator } from "@/components/typing-indicator";
import { ImageViewer } from "@/components/image-viewer";
import { EmptyChat } from "@/components/empty-chat";
import { useChat } from "@/lib/chat-store";
import { useColors } from "@/hooks/use-colors";
import { createAudioPlayer, setAudioModeAsync } from "expo-audio";
import type { ChatMessage } from "@/shared/types";

export default function ChatScreen() {
  const { messages, isTyping } = useChat();
  const colors = useColors();
  const flatListRef = useRef<FlatList>(null);
  const [viewingImage, setViewingImage] = useState<string | null>(null);
  const [playingVoiceId, setPlayingVoiceId] = useState<string | null>(null);

  const handlePlayVoice = useCallback(async (msg: ChatMessage) => {
    if (!msg.uri) return;
    try {
      if (playingVoiceId === msg.id) {
        setPlayingVoiceId(null);
        return;
      }
      await setAudioModeAsync({ playsInSilentMode: true });
      const player = createAudioPlayer({ uri: msg.uri });
      player.play();
      setPlayingVoiceId(msg.id);
      // Auto-reset after the estimated duration
      const dur = (msg.duration ?? 5) * 1000;
      setTimeout(() => {
        setPlayingVoiceId(null);
        player.remove();
      }, dur + 500);
    } catch (err) {
      console.warn("Voice playback error:", err);
      setPlayingVoiceId(null);
    }
  }, [playingVoiceId]);

  const renderItem = useCallback(
    ({ item }: { item: ChatMessage }) => (
      <ChatBubble
        message={item}
        onImagePress={setViewingImage}
        onPlayVoice={handlePlayVoice}
        isPlayingVoice={playingVoiceId === item.id}
      />
    ),
    [playingVoiceId, handlePlayVoice],
  );

  const keyExtractor = useCallback((item: ChatMessage) => item.id, []);

  return (
    <ScreenContainer edges={["top", "left", "right"]} containerClassName="bg-background">
      <KeyboardAvoidingView
        style={styles.flex}
        behavior={Platform.OS === "ios" ? "padding" : undefined}
        keyboardVerticalOffset={0}
      >
        <ChatHeader />

        <FlatList
          ref={flatListRef}
          data={messages}
          renderItem={renderItem}
          keyExtractor={keyExtractor}
          contentContainerStyle={styles.listContent}
          style={{ flex: 1, backgroundColor: colors.background }}
          onContentSizeChange={() => {
            flatListRef.current?.scrollToEnd({ animated: true });
          }}
          ListFooterComponent={isTyping ? <TypingIndicator /> : null}
          ListEmptyComponent={!isTyping ? <EmptyChat /> : null}
          showsVerticalScrollIndicator={false}
        />

        <ChatInput />
      </KeyboardAvoidingView>

      <ImageViewer uri={viewingImage} onClose={() => setViewingImage(null)} />
    </ScreenContainer>
  );
}

const styles = StyleSheet.create({
  flex: { flex: 1 },
  listContent: {
    paddingVertical: 12,
  },
});
45
mobile-app/app/_layout.tsx
Normal file
@@ -0,0 +1,45 @@
import "@/global.css";
import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import { Stack } from "expo-router";
import { StatusBar } from "expo-status-bar";
import { useState } from "react";
import { GestureHandlerRootView } from "react-native-gesture-handler";
import "react-native-reanimated";
import { ThemeProvider } from "@/lib/theme-provider";
import { SafeAreaProvider, initialWindowMetrics } from "react-native-safe-area-context";
import { ChatProvider } from "@/lib/chat-store";

export const unstable_settings = {
  anchor: "(tabs)",
};

export default function RootLayout() {
  // Create the QueryClient once per app instance
  const [queryClient] = useState(
    () =>
      new QueryClient({
        defaultOptions: {
          queries: {
            refetchOnWindowFocus: false,
            retry: 1,
          },
        },
      }),
  );

  return (
    <ThemeProvider>
      <SafeAreaProvider initialMetrics={initialWindowMetrics}>
        <GestureHandlerRootView style={{ flex: 1 }}>
          <QueryClientProvider client={queryClient}>
            <ChatProvider>
              <Stack screenOptions={{ headerShown: false }}>
                <Stack.Screen name="(tabs)" />
              </Stack>
            </ChatProvider>
            <StatusBar style="light" />
          </QueryClientProvider>
        </GestureHandlerRootView>
      </SafeAreaProvider>
    </ThemeProvider>
  );
}
BIN  mobile-app/assets/images/android-icon-background.png  Normal file  (binary, 17 KiB; not shown)
BIN  mobile-app/assets/images/android-icon-foreground.png  Normal file  (binary, 939 KiB; not shown)
BIN  mobile-app/assets/images/android-icon-monochrome.png  Normal file  (binary, 4.0 KiB; not shown)
BIN  mobile-app/assets/images/favicon.png  Normal file  (binary, 53 KiB; not shown)
BIN  mobile-app/assets/images/icon.png  Normal file  (binary, 939 KiB; not shown)
BIN  mobile-app/assets/images/splash-icon.png  Normal file  (binary, 221 KiB; not shown)
214
mobile-app/components/chat-bubble.tsx
Normal file
@@ -0,0 +1,214 @@
import { useMemo } from "react";
import { Text, View, StyleSheet, Image } from "react-native";
import Pressable from "@/components/ui/pressable-fix";
import { useColors } from "@/hooks/use-colors";
import type { ChatMessage } from "@/shared/types";
import { formatBytes, formatDuration } from "@/lib/chat-store";
import MaterialIcons from "@expo/vector-icons/MaterialIcons";

interface ChatBubbleProps {
  message: ChatMessage;
  onImagePress?: (uri: string) => void;
  onPlayVoice?: (message: ChatMessage) => void;
  isPlayingVoice?: boolean;
}

export function ChatBubble({ message, onImagePress, onPlayVoice, isPlayingVoice }: ChatBubbleProps) {
  const colors = useColors();
  const isUser = message.role === "user";

  // Stable waveform bar heights derived from the message id
  const waveHeights = useMemo(() => {
    let seed = 0;
    for (let i = 0; i < message.id.length; i++) seed = (seed * 31 + message.id.charCodeAt(i)) | 0;
    seed = Math.abs(seed); // keep the seed non-negative so bar heights stay positive
    return Array.from({ length: 12 }, (_, i) => {
      seed = (seed * 16807 + i * 1013) % 2147483647;
      return 4 + (seed % 15);
    });
  }, [message.id]);

  const bubbleStyle = [
    styles.bubble,
    {
      backgroundColor: isUser ? colors.primary : colors.surface,
      borderColor: isUser ? colors.primary : colors.border,
      alignSelf: isUser ? ("flex-end" as const) : ("flex-start" as const),
    },
  ];

  const textColor = isUser ? "#fff" : colors.foreground;
  const mutedColor = isUser ? "rgba(255,255,255,0.6)" : colors.muted;

  const timeStr = new Date(message.timestamp).toLocaleTimeString([], {
    hour: "2-digit",
    minute: "2-digit",
  });

  return (
    <View style={[styles.row, isUser ? styles.rowUser : styles.rowAssistant]}>
      {!isUser && (
        <View style={[styles.avatar, { backgroundColor: colors.primary }]}>
          <Text style={styles.avatarText}>T</Text>
        </View>
      )}
      <View style={bubbleStyle}>
        {message.contentType === "text" && (
          <Text style={[styles.text, { color: textColor }]}>{message.text}</Text>
        )}

        {message.contentType === "image" && (
          <Pressable
            onPress={() => message.uri && onImagePress?.(message.uri)}
            style={({ pressed }) => [pressed && { opacity: 0.8 }]}
          >
            <Image
              source={{ uri: message.uri }}
              style={styles.image}
              resizeMode="cover"
            />
            {message.text ? (
              <Text style={[styles.text, { color: textColor, marginTop: 6 }]}>
                {message.text}
              </Text>
            ) : null}
          </Pressable>
        )}

        {message.contentType === "voice" && (
          <Pressable
            onPress={() => onPlayVoice?.(message)}
            style={({ pressed }) => [styles.voiceRow, pressed && { opacity: 0.7 }]}
          >
            <MaterialIcons
              name={isPlayingVoice ? "pause" : "play-arrow"}
              size={24}
              color={textColor}
            />
            <View style={[styles.waveform, { backgroundColor: isUser ? "rgba(255,255,255,0.3)" : colors.border }]}>
              {waveHeights.map((h, i) => (
                <View
                  key={i}
                  style={[
                    styles.waveBar,
                    {
                      height: h,
                      backgroundColor: textColor,
                      opacity: 0.6,
                    },
                  ]}
                />
              ))}
            </View>
            <Text style={[styles.duration, { color: mutedColor }]}>
              {formatDuration(message.duration ?? 0)}
            </Text>
          </Pressable>
        )}

        {message.contentType === "file" && (
          <View style={styles.fileRow}>
            <MaterialIcons name="insert-drive-file" size={28} color={textColor} />
            <View style={styles.fileInfo}>
              <Text style={[styles.fileName, { color: textColor }]} numberOfLines={1}>
                {message.fileName ?? "File"}
              </Text>
              <Text style={[styles.fileSize, { color: mutedColor }]}>
                {formatBytes(message.fileSize ?? 0)}
              </Text>
            </View>
          </View>
        )}

        <Text style={[styles.time, { color: mutedColor }]}>{timeStr}</Text>
      </View>
    </View>
  );
}

const styles = StyleSheet.create({
  row: {
    flexDirection: "row",
    marginBottom: 8,
    paddingHorizontal: 12,
    alignItems: "flex-end",
  },
  rowUser: {
    justifyContent: "flex-end",
  },
  rowAssistant: {
    justifyContent: "flex-start",
  },
  avatar: {
    width: 30,
    height: 30,
    borderRadius: 15,
    alignItems: "center",
    justifyContent: "center",
    marginRight: 8,
  },
  avatarText: {
    color: "#fff",
    fontWeight: "700",
    fontSize: 14,
  },
  bubble: {
    maxWidth: "78%",
    borderRadius: 16,
    borderWidth: 1,
    paddingHorizontal: 14,
    paddingVertical: 10,
  },
  text: {
    fontSize: 15,
    lineHeight: 21,
  },
  time: {
    fontSize: 10,
    marginTop: 4,
    textAlign: "right",
  },
  image: {
    width: 220,
    height: 180,
    borderRadius: 10,
  },
  voiceRow: {
    flexDirection: "row",
    alignItems: "center",
    gap: 8,
    minWidth: 160,
  },
  waveform: {
    flex: 1,
    flexDirection: "row",
    alignItems: "center",
    gap: 2,
    height: 24,
    borderRadius: 4,
    paddingHorizontal: 4,
  },
  waveBar: {
    width: 3,
    borderRadius: 1.5,
  },
  duration: {
    fontSize: 12,
    minWidth: 32,
  },
  fileRow: {
    flexDirection: "row",
    alignItems: "center",
    gap: 10,
  },
  fileInfo: {
    flex: 1,
  },
  fileName: {
    fontSize: 14,
    fontWeight: "600",
  },
  fileSize: {
    fontSize: 11,
    marginTop: 2,
  },
});
69
mobile-app/components/chat-header.tsx
Normal file
@@ -0,0 +1,69 @@
import { View, Text, StyleSheet } from "react-native";
import Pressable from "@/components/ui/pressable-fix";
import MaterialIcons from "@expo/vector-icons/MaterialIcons";
import { useColors } from "@/hooks/use-colors";
import { useChat } from "@/lib/chat-store";

export function ChatHeader() {
  const colors = useColors();
  const { clearChat } = useChat();

  return (
    <View style={[styles.header, { backgroundColor: colors.background, borderBottomColor: colors.border }]}>
      <View style={styles.left}>
        <View style={[styles.statusDot, { backgroundColor: colors.success }]} />
        <Text style={[styles.title, { color: colors.foreground }]}>TIMMY</Text>
        <Text style={[styles.subtitle, { color: colors.muted }]}>SOVEREIGN AI</Text>
      </View>
      <Pressable
        onPress={clearChat}
        style={({ pressed }: { pressed: boolean }) => [
          styles.clearBtn,
          { borderColor: colors.border },
          pressed && { opacity: 0.6 },
        ]}
      >
        <MaterialIcons name="delete-outline" size={16} color={colors.muted} />
      </Pressable>
    </View>
  );
}

const styles = StyleSheet.create({
  header: {
    flexDirection: "row",
    alignItems: "center",
    justifyContent: "space-between",
    paddingHorizontal: 16,
    paddingVertical: 10,
    borderBottomWidth: 1,
  },
  left: {
    flexDirection: "row",
    alignItems: "center",
    gap: 8,
  },
  statusDot: {
    width: 8,
    height: 8,
    borderRadius: 4,
  },
  title: {
    fontSize: 16,
    fontWeight: "700",
    letterSpacing: 2,
  },
  subtitle: {
    fontSize: 9,
    letterSpacing: 1.5,
    fontWeight: "600",
  },
  clearBtn: {
    width: 32,
    height: 32,
    borderRadius: 16,
    borderWidth: 1,
    alignItems: "center",
    justifyContent: "center",
  },
});
301
mobile-app/components/chat-input.tsx
Normal file
@@ -0,0 +1,301 @@
import { useCallback, useRef, useState } from "react";
import {
  View,
  TextInput,
  StyleSheet,
  Platform,
  ActionSheetIOS,
  Alert,
  Keyboard,
} from "react-native";
import Pressable from "@/components/ui/pressable-fix";
import MaterialIcons from "@expo/vector-icons/MaterialIcons";
import { useColors } from "@/hooks/use-colors";
import { useChat } from "@/lib/chat-store";
import * as ImagePicker from "expo-image-picker";
import * as DocumentPicker from "expo-document-picker";
import {
  useAudioRecorder,
  useAudioRecorderState,
  RecordingPresets,
  requestRecordingPermissionsAsync,
  setAudioModeAsync,
} from "expo-audio";
import * as Haptics from "expo-haptics";

export function ChatInput() {
  const colors = useColors();
  const { sendTextMessage, sendAttachment, isTyping } = useChat();
  const [text, setText] = useState("");
  const [isRecording, setIsRecording] = useState(false);
  const inputRef = useRef<TextInput>(null);

  const audioRecorder = useAudioRecorder(RecordingPresets.HIGH_QUALITY);
  const recorderState = useAudioRecorderState(audioRecorder);

  const handleSend = useCallback(() => {
    const trimmed = text.trim();
    if (!trimmed) return;
    setText("");
    Keyboard.dismiss();
    if (Platform.OS !== "web") {
      Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Light);
    }
    sendTextMessage(trimmed);
  }, [text, sendTextMessage]);

  // ── Attachment sheet ────────────────────────────────────────────────────

  const handleAttachment = useCallback(() => {
    if (Platform.OS !== "web") {
      Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Light);
    }

    const options = ["Take Photo", "Choose from Library", "Choose File", "Cancel"];
    const cancelIndex = 3;

    if (Platform.OS === "ios") {
      ActionSheetIOS.showActionSheetWithOptions(
        { options, cancelButtonIndex: cancelIndex },
        (idx) => {
          if (idx === 0) takePhoto();
          else if (idx === 1) pickImage();
          else if (idx === 2) pickFile();
        },
      );
    } else {
      // Android / Web fallback
      Alert.alert("Attach", "Choose an option", [
        { text: "Take Photo", onPress: takePhoto },
        { text: "Choose from Library", onPress: pickImage },
        { text: "Choose File", onPress: pickFile },
        { text: "Cancel", style: "cancel" },
      ]);
    }
  }, []);

  const takePhoto = async () => {
    const { status } = await ImagePicker.requestCameraPermissionsAsync();
    if (status !== "granted") {
      Alert.alert("Permission needed", "Camera access is required to take photos.");
      return;
    }
    const result = await ImagePicker.launchCameraAsync({
      quality: 0.8,
      allowsEditing: false,
    });
    if (!result.canceled && result.assets[0]) {
      const asset = result.assets[0];
      sendAttachment({
        contentType: "image",
        uri: asset.uri,
        fileName: asset.fileName ?? "photo.jpg",
        fileSize: asset.fileSize,
        mimeType: asset.mimeType ?? "image/jpeg",
      });
    }
  };

  const pickImage = async () => {
    const result = await ImagePicker.launchImageLibraryAsync({
      mediaTypes: ["images"],
      quality: 0.8,
      allowsEditing: false,
    });
    if (!result.canceled && result.assets[0]) {
      const asset = result.assets[0];
      sendAttachment({
        contentType: "image",
        uri: asset.uri,
        fileName: asset.fileName ?? "image.jpg",
        fileSize: asset.fileSize,
        mimeType: asset.mimeType ?? "image/jpeg",
      });
    }
  };

  const pickFile = async () => {
    try {
      const result = await DocumentPicker.getDocumentAsync({
        type: "*/*",
        copyToCacheDirectory: true,
      });
      if (!result.canceled && result.assets[0]) {
        const asset = result.assets[0];
        sendAttachment({
          contentType: "file",
          uri: asset.uri,
          fileName: asset.name,
          fileSize: asset.size,
          mimeType: asset.mimeType ?? "application/octet-stream",
        });
      }
    } catch (err) {
      console.warn("Document picker error:", err);
    }
  };

  // ── Voice recording ───────────────────────────────────────────────────

  const startRecording = async () => {
    try {
      const { granted } = await requestRecordingPermissionsAsync();
      if (!granted) {
        Alert.alert("Permission needed", "Microphone access is required for voice messages.");
        return;
      }
      await setAudioModeAsync({ playsInSilentMode: true, allowsRecording: true });
      await audioRecorder.prepareToRecordAsync();
      audioRecorder.record();
      setIsRecording(true);
      if (Platform.OS !== "web") {
        Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Medium);
      }
    } catch (err) {
      console.warn("Recording start error:", err);
    }
  };

  const stopRecording = async () => {
    try {
      await audioRecorder.stop();
      setIsRecording(false);
      if (Platform.OS !== "web") {
        Haptics.notificationAsync(Haptics.NotificationFeedbackType.Success);
      }
      const uri = audioRecorder.uri;
      if (uri) {
        const duration = recorderState.durationMillis ? recorderState.durationMillis / 1000 : 0;
        sendAttachment({
          contentType: "voice",
          uri,
          fileName: "voice_message.m4a",
          mimeType: "audio/m4a",
          duration,
        });
      }
    } catch (err) {
      console.warn("Recording stop error:", err);
      setIsRecording(false);
    }
  };

  const handleMicPress = useCallback(() => {
    if (isRecording) {
      stopRecording();
    } else {
      startRecording();
    }
  }, [isRecording]);

  const hasText = text.trim().length > 0;

  return (
    <View style={[styles.container, { backgroundColor: colors.background, borderTopColor: colors.border }]}>
      {/* Attachment button */}
      <Pressable
        onPress={handleAttachment}
        style={({ pressed }: { pressed: boolean }) => [
          styles.iconBtn,
          { backgroundColor: colors.surface },
          pressed && { opacity: 0.6 },
        ]}
        disabled={isTyping}
      >
        <MaterialIcons name="add" size={22} color={colors.muted} />
      </Pressable>

      {/* Text input */}
      <TextInput
        ref={inputRef}
        value={text}
        onChangeText={setText}
        placeholder={isRecording ? "Recording..." : "Message Timmy..."}
        placeholderTextColor={colors.muted}
        style={[
          styles.input,
          {
            backgroundColor: colors.surface,
            color: colors.foreground,
            borderColor: colors.border,
          },
        ]}
        multiline
        maxLength={4000}
        returnKeyType="default"
        editable={!isRecording && !isTyping}
        onSubmitEditing={handleSend}
        blurOnSubmit={false}
      />

      {/* Send or Mic button */}
      {hasText ? (
        <Pressable
          onPress={handleSend}
          style={({ pressed }: { pressed: boolean }) => [
            styles.sendBtn,
            { backgroundColor: colors.primary },
            pressed && { transform: [{ scale: 0.95 }], opacity: 0.9 },
          ]}
          disabled={isTyping}
        >
          <MaterialIcons name="send" size={20} color="#fff" />
        </Pressable>
      ) : (
        <Pressable
          onPress={handleMicPress}
          style={({ pressed }: { pressed: boolean }) => [
            styles.sendBtn,
            {
              backgroundColor: isRecording ? colors.error : colors.surface,
            },
            pressed && { transform: [{ scale: 0.95 }], opacity: 0.9 },
          ]}
          disabled={isTyping}
        >
          <MaterialIcons
            name={isRecording ? "stop" : "mic"}
            size={20}
            color={isRecording ? "#fff" : colors.primary}
          />
        </Pressable>
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flexDirection: "row",
    alignItems: "flex-end",
    paddingHorizontal: 10,
    paddingVertical: 8,
    gap: 8,
    borderTopWidth: 1,
  },
  iconBtn: {
    width: 38,
    height: 38,
    borderRadius: 19,
    alignItems: "center",
    justifyContent: "center",
  },
  input: {
    flex: 1,
    minHeight: 38,
    maxHeight: 120,
    borderRadius: 19,
    borderWidth: 1,
    paddingHorizontal: 14,
    paddingVertical: 8,
    fontSize: 15,
    lineHeight: 20,
  },
  sendBtn: {
    width: 38,
    height: 38,
    borderRadius: 19,
    alignItems: "center",
    justifyContent: "center",
  },
});
55
mobile-app/components/empty-chat.tsx
Normal file
@@ -0,0 +1,55 @@
import { View, Text, StyleSheet } from "react-native";
import { useColors } from "@/hooks/use-colors";
import MaterialIcons from "@expo/vector-icons/MaterialIcons";

export function EmptyChat() {
  const colors = useColors();

  return (
    <View style={styles.container}>
      <View style={[styles.iconCircle, { backgroundColor: colors.surface, borderColor: colors.border }]}>
        <MaterialIcons name="chat-bubble-outline" size={40} color={colors.primary} />
      </View>
      <Text style={[styles.title, { color: colors.foreground }]}>TIMMY</Text>
      <Text style={[styles.subtitle, { color: colors.muted }]}>SOVEREIGN AI AGENT</Text>
      <Text style={[styles.hint, { color: colors.muted }]}>
        Send a message, voice note, image, or file to get started.
      </Text>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: "center",
    alignItems: "center",
    paddingHorizontal: 40,
    gap: 8,
  },
  iconCircle: {
    width: 80,
    height: 80,
    borderRadius: 40,
    borderWidth: 1,
    alignItems: "center",
    justifyContent: "center",
    marginBottom: 12,
  },
  title: {
    fontSize: 24,
    fontWeight: "700",
    letterSpacing: 4,
  },
  subtitle: {
    fontSize: 11,
    letterSpacing: 2,
    fontWeight: "600",
  },
  hint: {
    fontSize: 13,
    textAlign: "center",
    marginTop: 12,
    lineHeight: 19,
  },
});
18
mobile-app/components/haptic-tab.tsx
Normal file
@@ -0,0 +1,18 @@
import { BottomTabBarButtonProps } from "@react-navigation/bottom-tabs";
import { PlatformPressable } from "@react-navigation/elements";
import * as Haptics from "expo-haptics";

export function HapticTab(props: BottomTabBarButtonProps) {
  return (
    <PlatformPressable
      {...props}
      onPressIn={(ev) => {
        if (process.env.EXPO_OS === "ios") {
          // Add a soft haptic feedback when pressing down on the tabs.
          Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Light);
        }
        props.onPressIn?.(ev);
      }}
    />
  );
}
54
mobile-app/components/image-viewer.tsx
Normal file
@@ -0,0 +1,54 @@
import { Modal, View, Image, StyleSheet, StatusBar } from "react-native";
import Pressable from "@/components/ui/pressable-fix";
import MaterialIcons from "@expo/vector-icons/MaterialIcons";

interface ImageViewerProps {
  uri: string | null;
  onClose: () => void;
}

export function ImageViewer({ uri, onClose }: ImageViewerProps) {
  if (!uri) return null;

  return (
    <Modal visible animationType="fade" transparent statusBarTranslucent>
      <View style={styles.overlay}>
        <StatusBar barStyle="light-content" />
        <Image source={{ uri }} style={styles.image} resizeMode="contain" />
        <Pressable
          onPress={onClose}
          style={({ pressed }: { pressed: boolean }) => [
            styles.closeBtn,
            pressed && { opacity: 0.6 },
          ]}
        >
          <MaterialIcons name="close" size={28} color="#fff" />
        </Pressable>
      </View>
    </Modal>
  );
}

const styles = StyleSheet.create({
  overlay: {
    flex: 1,
    backgroundColor: "rgba(0,0,0,0.95)",
    justifyContent: "center",
    alignItems: "center",
  },
  image: {
    width: "100%",
    height: "80%",
  },
  closeBtn: {
    position: "absolute",
    top: 50,
    right: 20,
    width: 40,
    height: 40,
    borderRadius: 20,
    backgroundColor: "rgba(255,255,255,0.15)",
    alignItems: "center",
    justifyContent: "center",
  },
});
68
mobile-app/components/screen-container.tsx
Normal file
@@ -0,0 +1,68 @@
import { View, type ViewProps } from "react-native";
import { SafeAreaView, type Edge } from "react-native-safe-area-context";

import { cn } from "@/lib/utils";

export interface ScreenContainerProps extends ViewProps {
  /**
   * SafeArea edges to apply. Defaults to ["top", "left", "right"].
   * Bottom is typically handled by the tab bar.
   */
  edges?: Edge[];
  /**
   * Tailwind className for the content area.
   */
  className?: string;
  /**
   * Additional className for the outer container (background layer).
   */
  containerClassName?: string;
  /**
   * Additional className for the SafeAreaView (content layer).
   */
  safeAreaClassName?: string;
}

/**
 * A container component that properly handles SafeArea and background colors.
 *
 * The outer View extends to full screen (including the status bar area) with the background color,
 * while the inner SafeAreaView ensures content stays within safe bounds.
 *
 * Usage:
 * ```tsx
 * <ScreenContainer className="p-4">
 *   <Text className="text-2xl font-bold text-foreground">
 *     Welcome
 *   </Text>
 * </ScreenContainer>
 * ```
 */
export function ScreenContainer({
  children,
  edges = ["top", "left", "right"],
  className,
  containerClassName,
  safeAreaClassName,
  style,
  ...props
}: ScreenContainerProps) {
  return (
    <View
      className={cn(
        "flex-1",
        "bg-background",
        containerClassName
      )}
      {...props}
    >
      <SafeAreaView
        edges={edges}
        className={cn("flex-1", safeAreaClassName)}
        style={style}
      >
        <View className={cn("flex-1", className)}>{children}</View>
      </SafeAreaView>
    </View>
  );
}
15
mobile-app/components/themed-view.tsx
Normal file
@@ -0,0 +1,15 @@
import { View, type ViewProps } from "react-native";

import { cn } from "@/lib/utils";

export interface ThemedViewProps extends ViewProps {
  className?: string;
}

/**
 * A View component with an automatic theme-aware background.
 * Uses NativeWind for styling; pass className for additional styles.
 */
export function ThemedView({ className, ...otherProps }: ThemedViewProps) {
  return <View className={cn("bg-background", className)} {...otherProps} />;
}
89
mobile-app/components/typing-indicator.tsx
Normal file
@@ -0,0 +1,89 @@
import { useEffect } from "react";
import { View, StyleSheet } from "react-native";
import Animated, {
  useSharedValue,
  useAnimatedStyle,
  withRepeat,
  withTiming,
  withDelay,
  withSequence,
} from "react-native-reanimated";
import { useColors } from "@/hooks/use-colors";

export function TypingIndicator() {
  const colors = useColors();
  const dot1 = useSharedValue(0.3);
  const dot2 = useSharedValue(0.3);
  const dot3 = useSharedValue(0.3);

  useEffect(() => {
    const anim = (sv: { value: number }, delay: number) => {
      sv.value = withDelay(
        delay,
        withRepeat(
          withSequence(
            withTiming(1, { duration: 400 }),
            withTiming(0.3, { duration: 400 }),
          ),
          -1,
        ),
      );
    };
    anim(dot1, 0);
    anim(dot2, 200);
    anim(dot3, 400);
  }, []);

  const style1 = useAnimatedStyle(() => ({ opacity: dot1.value }));
  const style2 = useAnimatedStyle(() => ({ opacity: dot2.value }));
  const style3 = useAnimatedStyle(() => ({ opacity: dot3.value }));

  const dotBase = [styles.dot, { backgroundColor: colors.primary }];

  return (
    <View style={[styles.row, { alignItems: "flex-end" }]}>
      <View style={[styles.avatar, { backgroundColor: colors.primary }]}>
        <Animated.Text style={styles.avatarText}>T</Animated.Text>
      </View>
      <View style={[styles.bubble, { backgroundColor: colors.surface, borderColor: colors.border }]}>
        <Animated.View style={[dotBase, style1]} />
        <Animated.View style={[dotBase, style2]} />
        <Animated.View style={[dotBase, style3]} />
      </View>
    </View>
  );
}

const styles = StyleSheet.create({
  row: {
    flexDirection: "row",
    paddingHorizontal: 12,
    marginBottom: 8,
  },
  avatar: {
    width: 30,
    height: 30,
    borderRadius: 15,
    alignItems: "center",
    justifyContent: "center",
    marginRight: 8,
  },
  avatarText: {
    color: "#fff",
    fontWeight: "700",
    fontSize: 14,
  },
  bubble: {
    flexDirection: "row",
    gap: 5,
    paddingHorizontal: 16,
    paddingVertical: 14,
    borderRadius: 16,
    borderWidth: 1,
  },
  dot: {
    width: 8,
    height: 8,
    borderRadius: 4,
  },
});
41
mobile-app/components/ui/icon-symbol.tsx
Normal file
@@ -0,0 +1,41 @@
// Fallback for using MaterialIcons on Android and web.

import MaterialIcons from "@expo/vector-icons/MaterialIcons";
import { SymbolWeight, SymbolViewProps } from "expo-symbols";
import { ComponentProps } from "react";
import { OpaqueColorValue, type StyleProp, type TextStyle } from "react-native";

type IconMapping = Record<SymbolViewProps["name"], ComponentProps<typeof MaterialIcons>["name"]>;
type IconSymbolName = keyof typeof MAPPING;

/**
 * Add your SF Symbols to Material Icons mappings here.
 * - See Material Icons in the [Icons Directory](https://icons.expo.fyi).
 * - See SF Symbols in the [SF Symbols](https://developer.apple.com/sf-symbols/) app.
 */
const MAPPING = {
  "house.fill": "home",
  "paperplane.fill": "send",
  "chevron.left.forwardslash.chevron.right": "code",
  "chevron.right": "chevron-right",
} as IconMapping;

/**
 * An icon component that uses native SF Symbols on iOS, and Material Icons on Android and web.
 * This ensures a consistent look across platforms, and optimal resource usage.
 * Icon `name`s are based on SF Symbols and require manual mapping to Material Icons.
 */
export function IconSymbol({
  name,
  size = 24,
  color,
  style,
}: {
  name: IconSymbolName;
  size?: number;
  color: string | OpaqueColorValue;
  style?: StyleProp<TextStyle>;
  weight?: SymbolWeight;
}) {
  return <MaterialIcons color={color} size={size} name={MAPPING[name]} style={style} />;
}
6
mobile-app/components/ui/pressable-fix.tsx
Normal file
@@ -0,0 +1,6 @@
/**
 * Re-export Pressable with proper typing for style callbacks.
 * NativeWind disables className on Pressable, so we always use the style prop.
 */
import { Pressable } from "react-native";
export default Pressable;
12
mobile-app/constants/theme.ts
Normal file
@@ -0,0 +1,12 @@
/**
 * Thin re-exports so consumers don't need to know about internal theme plumbing.
 * The full implementation lives in lib/_core/theme.ts.
 */
export {
  Colors,
  Fonts,
  SchemeColors,
  ThemeColors,
  type ColorScheme,
  type ThemeColorPalette,
} from "@/lib/_core/theme";
80
mobile-app/design.md
Normal file
@@ -0,0 +1,80 @@
# Timmy Chat — Mobile App Design

## Overview
A sleek, single-screen chat app for talking to Timmy — the sovereign AI agent from the Timmy Time dashboard. Supports text, voice, image, and file messaging. Dark arcane theme matching Mission Control.

## Screen List

### 1. Chat Screen (Home / Only Screen)
The entire app is a single full-screen chat interface. No tabs, no settings, no extra screens. Just you and Timmy.

### 2. No Other Screens
No settings, no profile, no onboarding. The app opens straight to chat.

## Primary Content and Functionality

### Chat Screen
- **Header**: "TIMMY" title with status indicator (online/offline dot), minimal and clean
- **Message List**: Full-screen scrollable message list (FlatList, inverted)
  - User messages: right-aligned, purple/violet accent bubble
  - Timmy messages: left-aligned, dark surface bubble with avatar initial "T"
  - Image messages: thumbnail preview in bubble, tappable for full-screen
  - File messages: file icon + filename + size in bubble
  - Voice messages: waveform-style playback bar with play/pause + duration
  - Timestamps shown subtly below message groups
- **Input Bar** (bottom, always visible):
  - Text input field (expandable, multi-line)
  - Attachment button (left of input) — opens action sheet: Camera, Photo Library, File
  - Voice record button (right of input, replaces send when input is empty)
  - Send button (right of input, appears when text is entered)
  - Hold-to-record voice: press and hold mic icon, release to send

## Key User Flows

### Text Chat
1. User types message → taps Send
2. Message appears in chat as "sending"
3. Server responds → Timmy's reply appears below

### Voice Message
1. User presses and holds mic button
2. Recording indicator appears (duration + pulsing dot)
3. User releases → voice message sent
4. Timmy responds with text (server processes audio)

### Image Sharing
1. User taps attachment (+) button
2. Action sheet: "Take Photo" / "Choose from Library"
3. Image appears as thumbnail in chat
4. Timmy acknowledges receipt

### File Sharing
1. User taps attachment (+) button → "Choose File"
2. Document picker opens
3. File appears in chat with name + size
4. Timmy acknowledges receipt

## Color Choices (Arcane Dark Theme)

Matching the Timmy Time Mission Control dashboard:

| Token      | Dark Value | Purpose                        |
|------------|------------|--------------------------------|
| background | #080412    | Deep dark purple-black         |
| surface    | #110820    | Card/bubble background         |
| foreground | #ede0ff    | Primary text (bright lavender) |
| muted      | #6b4a8a    | Secondary/timestamp text       |
| primary    | #a855f7    | Accent purple (user bubbles)   |
| border     | #3b1a5c    | Subtle borders                 |
| success    | #00e87a    | Online status, success         |
| warning    | #ffb800    | Amber warnings                 |
| error      | #ff4455    | Error states                   |

## Layout Specifics (Portrait 9:16, One-Handed)

- Input bar pinned to bottom with safe area padding
- Send/mic button on right (thumb-reachable)
- Attachment button on left of input
- Messages fill remaining space above input
- No tab bar — single screen app
- Header is compact (44pt) with just title + status dot
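The token table above maps directly onto a typed palette constant. A minimal sketch of how the dark scheme could be declared; the real source of truth is `lib/_core/theme.ts`, which is not part of this diff, so the constant name here is an assumption:

```typescript
// Hypothetical palette constant built from the design.md token table.
// The actual implementation lives in lib/_core/theme.ts (not shown in this diff).
const darkPalette = {
  background: "#080412", // deep dark purple-black
  surface: "#110820",    // card/bubble background
  foreground: "#ede0ff", // primary text (bright lavender)
  muted: "#6b4a8a",      // secondary/timestamp text
  primary: "#a855f7",    // accent purple (user bubbles)
  border: "#3b1a5c",     // subtle borders
  success: "#00e87a",    // online status, success
  warning: "#ffb800",    // amber warnings
  error: "#ff4455",      // error states
} as const;

// Components read these through useColors(), e.g. colors.primary for the send button.
console.log(Object.keys(darkPalette).length); // 9 tokens, matching the table
```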
3
mobile-app/global.css
Normal file
@@ -0,0 +1,3 @@
@tailwind base;
@tailwind components;
@tailwind utilities;
5
mobile-app/hooks/use-color-scheme.ts
Normal file
@@ -0,0 +1,5 @@
import { useThemeContext } from "@/lib/theme-provider";

export function useColorScheme() {
  return useThemeContext().colorScheme;
}
21
mobile-app/hooks/use-color-scheme.web.ts
Normal file
@@ -0,0 +1,21 @@
import { useEffect, useState } from "react";
import { useColorScheme as useRNColorScheme } from "react-native";

/**
 * To support static rendering, this value needs to be re-calculated on the client side for web.
 */
export function useColorScheme() {
  const [hasHydrated, setHasHydrated] = useState(false);

  useEffect(() => {
    setHasHydrated(true);
  }, []);

  const colorScheme = useRNColorScheme();

  if (hasHydrated) {
    return colorScheme;
  }

  return "light";
}
12
mobile-app/hooks/use-colors.ts
Normal file
@@ -0,0 +1,12 @@
import { Colors, type ColorScheme, type ThemeColorPalette } from "@/constants/theme";
import { useColorScheme } from "./use-color-scheme";

/**
 * Returns the current theme's color palette.
 * Usage: const colors = useColors(); then colors.foreground, colors.background, etc.
 */
export function useColors(colorSchemeOverride?: ColorScheme): ThemeColorPalette {
  const colorScheme = useColorScheme();
  const scheme = (colorSchemeOverride ?? colorScheme ?? "light") as ColorScheme;
  return Colors[scheme];
}
298
mobile-app/lib/chat-store.tsx
Normal file
@@ -0,0 +1,298 @@
import React, { createContext, useCallback, useContext, useReducer, type ReactNode } from "react";
import type { ChatMessage, MessageContentType } from "@/shared/types";

// ── State ───────────────────────────────────────────────────────────────────

interface ChatState {
  messages: ChatMessage[];
  isTyping: boolean;
}

const initialState: ChatState = {
  messages: [],
  isTyping: false,
};

// ── Actions ─────────────────────────────────────────────────────────────────

type ChatAction =
  | { type: "ADD_MESSAGE"; message: ChatMessage }
  | { type: "UPDATE_MESSAGE"; id: string; updates: Partial<ChatMessage> }
  | { type: "SET_TYPING"; isTyping: boolean }
  | { type: "CLEAR" };

function chatReducer(state: ChatState, action: ChatAction): ChatState {
  switch (action.type) {
    case "ADD_MESSAGE":
      return { ...state, messages: [...state.messages, action.message] };
    case "UPDATE_MESSAGE":
      return {
        ...state,
        messages: state.messages.map((m) =>
          m.id === action.id ? { ...m, ...action.updates } : m,
        ),
      };
    case "SET_TYPING":
      return { ...state, isTyping: action.isTyping };
    case "CLEAR":
      return initialState;
    default:
      return state;
  }
}

// ── Helpers ─────────────────────────────────────────────────────────────────

let _counter = 0;
function makeId(): string {
  return `msg_${Date.now()}_${++_counter}`;
}
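Because `chatReducer` is a pure function, its behavior can be checked without React, which is what the commit's unit tests for message types likely rely on. A condensed, self-contained sketch of the same reducer logic (types trimmed to only the fields the check needs):

```typescript
// Condensed standalone copy of the chatReducer logic, trimmed for illustration.
type Msg = { id: string; text?: string };
type State = { messages: Msg[]; isTyping: boolean };
type Action =
  | { type: "ADD_MESSAGE"; message: Msg }
  | { type: "UPDATE_MESSAGE"; id: string; updates: Partial<Msg> }
  | { type: "SET_TYPING"; isTyping: boolean }
  | { type: "CLEAR" };

const initial: State = { messages: [], isTyping: false };

function reducer(state: State, action: Action): State {
  switch (action.type) {
    case "ADD_MESSAGE":
      return { ...state, messages: [...state.messages, action.message] };
    case "UPDATE_MESSAGE":
      return {
        ...state,
        messages: state.messages.map((m) =>
          m.id === action.id ? { ...m, ...action.updates } : m,
        ),
      };
    case "SET_TYPING":
      return { ...state, isTyping: action.isTyping };
    case "CLEAR":
      return initial;
  }
}

// Each dispatch returns a fresh state object; the previous one is never mutated.
let s = reducer(initial, { type: "ADD_MESSAGE", message: { id: "m1", text: "hi" } });
s = reducer(s, { type: "UPDATE_MESSAGE", id: "m1", updates: { text: "hello" } });
s = reducer(s, { type: "SET_TYPING", isTyping: true });
console.log(s.messages[0].text); // "hello"
```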
// ── Context ─────────────────────────────────────────────────────────────────

interface ChatContextValue {
  messages: ChatMessage[];
  isTyping: boolean;
  sendTextMessage: (text: string) => Promise<void>;
  sendAttachment: (opts: {
    contentType: MessageContentType;
    uri: string;
    fileName?: string;
    fileSize?: number;
    mimeType?: string;
    duration?: number;
    text?: string;
  }) => Promise<void>;
  clearChat: () => void;
}

const ChatContext = createContext<ChatContextValue | null>(null);

// ── API call ────────────────────────────────────────────────────────────────

function getApiBase(): string {
  // Set EXPO_PUBLIC_API_BASE_URL in your .env to point to your Timmy backend,
  // e.g. EXPO_PUBLIC_API_BASE_URL=http://192.168.1.100:3000
  const envBase = process.env.EXPO_PUBLIC_API_BASE_URL;
  if (envBase) return envBase.replace(/\/+$/, "");
  // Fallback for web: derive from window location
  if (typeof window !== "undefined" && window.location) {
    return `${window.location.protocol}//${window.location.hostname}:3000`;
  }
  // Default: local machine
  return "http://127.0.0.1:3000";
}

const API_BASE = getApiBase();

async function callChatAPI(
  messages: Array<{ role: string; content: string | Array<Record<string, unknown>> }>,
): Promise<string> {
  const res = await fetch(`${API_BASE}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages }),
  });
  if (!res.ok) {
    const errText = await res.text().catch(() => res.statusText);
    throw new Error(`Chat API error: ${errText}`);
  }
  const data = await res.json();
  return data.reply ?? data.text ?? "...";
}

async function uploadFile(
  uri: string,
  fileName: string,
  mimeType: string,
): Promise<string> {
  const formData = new FormData();
  formData.append("file", {
    uri,
    name: fileName,
    type: mimeType,
  } as unknown as Blob);

  const res = await fetch(`${API_BASE}/api/upload`, {
    method: "POST",
    body: formData,
  });
  if (!res.ok) throw new Error("Upload failed");
  const data = await res.json();
  return data.url;
}
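The client code above implies a simple server contract: POST a `{ messages }` payload to `/api/chat` and read `reply` (or `text`) from the JSON response. The server routes are not part of this diff, so the field names below are inferred from `callChatAPI` rather than from a published API; a sketch of that assumed wire format:

```typescript
// Assumed wire format, inferred from callChatAPI in the client code above.
type ChatRequest = {
  messages: Array<{ role: string; content: string | Array<Record<string, unknown>> }>;
};
type ChatResponse = { reply?: string; text?: string };

const req: ChatRequest = {
  messages: [{ role: "user", content: "Hello, Timmy" }],
};
const body = JSON.stringify(req);

// Mirrors the client's fallback chain: data.reply ?? data.text ?? "..."
function extractReply(data: ChatResponse): string {
  return data.reply ?? data.text ?? "...";
}

console.log(extractReply({ reply: "hi" })); // "hi"
```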
// ── Provider ────────────────────────────────────────────────────────────────

export function ChatProvider({ children }: { children: ReactNode }) {
  const [state, dispatch] = useReducer(chatReducer, initialState);

  const sendTextMessage = useCallback(
    async (text: string) => {
      const userMsg: ChatMessage = {
        id: makeId(),
        role: "user",
        contentType: "text",
        text,
        timestamp: Date.now(),
      };
      dispatch({ type: "ADD_MESSAGE", message: userMsg });
      dispatch({ type: "SET_TYPING", isTyping: true });

      try {
        // Build conversation context (last 20 messages)
        const recent = [...state.messages, userMsg].slice(-20);
        const apiMessages = recent
          .filter((m) => m.contentType === "text" && m.text)
          .map((m) => ({ role: m.role, content: m.text! }));

        const reply = await callChatAPI(apiMessages);
        const assistantMsg: ChatMessage = {
          id: makeId(),
          role: "assistant",
          contentType: "text",
          text: reply,
          timestamp: Date.now(),
        };
        dispatch({ type: "ADD_MESSAGE", message: assistantMsg });
      } catch (err: unknown) {
        const errorText = err instanceof Error ? err.message : "Something went wrong";
        dispatch({
          type: "ADD_MESSAGE",
          message: {
            id: makeId(),
            role: "assistant",
            contentType: "text",
            text: `Sorry, I couldn't process that: ${errorText}`,
            timestamp: Date.now(),
          },
        });
      } finally {
        dispatch({ type: "SET_TYPING", isTyping: false });
      }
    },
    [state.messages],
  );
  const sendAttachment = useCallback(
    async (opts: {
      contentType: MessageContentType;
      uri: string;
      fileName?: string;
      fileSize?: number;
      mimeType?: string;
      duration?: number;
      text?: string;
    }) => {
      const userMsg: ChatMessage = {
        id: makeId(),
        role: "user",
        contentType: opts.contentType,
        uri: opts.uri,
        fileName: opts.fileName,
        fileSize: opts.fileSize,
        mimeType: opts.mimeType,
        duration: opts.duration,
        text: opts.text,
        timestamp: Date.now(),
      };
      dispatch({ type: "ADD_MESSAGE", message: userMsg });
      dispatch({ type: "SET_TYPING", isTyping: true });

      try {
        // Upload file to server
        const remoteUrl = await uploadFile(
          opts.uri,
          opts.fileName ?? "attachment",
          opts.mimeType ?? "application/octet-stream",
        );
        dispatch({ type: "UPDATE_MESSAGE", id: userMsg.id, updates: { remoteUrl } });

        // Build message for LLM
        let content: string | Array<Record<string, unknown>>;
        if (opts.contentType === "image") {
          content = [
            { type: "text", text: opts.text || "I'm sending you an image." },
            { type: "image_url", image_url: { url: remoteUrl } },
          ];
        } else if (opts.contentType === "voice") {
          content = [
            { type: "text", text: "I'm sending you a voice message. Please transcribe and respond." },
            { type: "file_url", file_url: { url: remoteUrl, mime_type: opts.mimeType ?? "audio/m4a" } },
          ];
        } else {
          content = `I'm sharing a file: ${opts.fileName ?? "file"} (${formatBytes(opts.fileSize ?? 0)})`;
        }

        const apiMessages = [{ role: "user", content }];
        const reply = await callChatAPI(apiMessages);

        dispatch({
          type: "ADD_MESSAGE",
          message: {
            id: makeId(),
            role: "assistant",
            contentType: "text",
            text: reply,
            timestamp: Date.now(),
          },
        });
      } catch (err: unknown) {
        const errorText = err instanceof Error ? err.message : "Upload failed";
        dispatch({
          type: "ADD_MESSAGE",
          message: {
            id: makeId(),
            role: "assistant",
            contentType: "text",
            text: `I had trouble processing that attachment: ${errorText}`,
            timestamp: Date.now(),
          },
        });
      } finally {
        dispatch({ type: "SET_TYPING", isTyping: false });
      }
    },
    [],
  );
  const clearChat = useCallback(() => {
    dispatch({ type: "CLEAR" });
  }, []);

  return (
    <ChatContext.Provider
      value={{
        messages: state.messages,
        isTyping: state.isTyping,
        sendTextMessage,
        sendAttachment,
        clearChat,
      }}
    >
      {children}
    </ChatContext.Provider>
  );
}

export function useChat(): ChatContextValue {
  const ctx = useContext(ChatContext);
  if (!ctx) throw new Error("useChat must be used within ChatProvider");
  return ctx;
}

// ── Utils ───────────────────────────────────────────────────────────────────

export function formatBytes(bytes: number): string {
  if (bytes === 0) return "0 B";
  const k = 1024;
  const sizes = ["B", "KB", "MB", "GB"];
  const i = Math.floor(Math.log(bytes) / Math.log(k));
  return `${parseFloat((bytes / Math.pow(k, i)).toFixed(1))} ${sizes[i]}`;
}

export function formatDuration(seconds: number): string {
  const m = Math.floor(seconds / 60);
  const s = Math.floor(seconds % 60);
  return `${m}:${s.toString().padStart(2, "0")}`;
}
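The two formatters at the end of chat-store.tsx are among the chat utilities the commit message says are unit-tested. Self-contained copies with sample outputs:

```typescript
// Standalone copies of the chat-store formatters, showing representative outputs.
function formatBytes(bytes: number): string {
  if (bytes === 0) return "0 B";
  const k = 1024;
  const sizes = ["B", "KB", "MB", "GB"];
  const i = Math.floor(Math.log(bytes) / Math.log(k));
  return `${parseFloat((bytes / Math.pow(k, i)).toFixed(1))} ${sizes[i]}`;
}

function formatDuration(seconds: number): string {
  const m = Math.floor(seconds / 60);
  const s = Math.floor(seconds % 60);
  return `${m}:${s.toString().padStart(2, "0")}`;
}

console.log(formatBytes(1536));    // "1.5 KB"
console.log(formatBytes(1048576)); // "1 MB" (parseFloat drops the trailing .0)
console.log(formatDuration(75));   // "1:15"
console.log(formatDuration(9));    // "0:09"
```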
79
mobile-app/lib/theme-provider.tsx
Normal file
79
mobile-app/lib/theme-provider.tsx
Normal file
@@ -0,0 +1,79 @@
|
||||
import { createContext, useCallback, useContext, useEffect, useMemo, useState } from "react";
|
||||
import { Appearance, View, useColorScheme as useSystemColorScheme } from "react-native";
|
||||
import { colorScheme as nativewindColorScheme, vars } from "nativewind";
|
||||
|
||||
import { SchemeColors, type ColorScheme } from "@/constants/theme";
|
||||
|
||||
type ThemeContextValue = {
|
||||
colorScheme: ColorScheme;
|
||||
setColorScheme: (scheme: ColorScheme) => void;
|
||||
};
|
||||
|
||||
const ThemeContext = createContext<ThemeContextValue | null>(null);
|
||||
|
||||
export function ThemeProvider({ children }: { children: React.ReactNode }) {
  const systemScheme = useSystemColorScheme() ?? "light";
  const [colorScheme, setColorSchemeState] = useState<ColorScheme>(systemScheme);

  const applyScheme = useCallback((scheme: ColorScheme) => {
    nativewindColorScheme.set(scheme);
    Appearance.setColorScheme?.(scheme);
    if (typeof document !== "undefined") {
      const root = document.documentElement;
      root.dataset.theme = scheme;
      root.classList.toggle("dark", scheme === "dark");
      const palette = SchemeColors[scheme];
      Object.entries(palette).forEach(([token, value]) => {
        root.style.setProperty(`--color-${token}`, value);
      });
    }
  }, []);

  const setColorScheme = useCallback(
    (scheme: ColorScheme) => {
      setColorSchemeState(scheme);
      applyScheme(scheme);
    },
    [applyScheme],
  );

  useEffect(() => {
    applyScheme(colorScheme);
  }, [applyScheme, colorScheme]);

  const themeVariables = useMemo(
    () =>
      vars({
        "color-primary": SchemeColors[colorScheme].primary,
        "color-background": SchemeColors[colorScheme].background,
        "color-surface": SchemeColors[colorScheme].surface,
        "color-foreground": SchemeColors[colorScheme].foreground,
        "color-muted": SchemeColors[colorScheme].muted,
        "color-border": SchemeColors[colorScheme].border,
        "color-success": SchemeColors[colorScheme].success,
        "color-warning": SchemeColors[colorScheme].warning,
        "color-error": SchemeColors[colorScheme].error,
      }),
    [colorScheme],
  );

  const value = useMemo(
    () => ({
      colorScheme,
      setColorScheme,
    }),
    [colorScheme, setColorScheme],
  );

  return (
    <ThemeContext.Provider value={value}>
      <View style={[{ flex: 1 }, themeVariables]}>{children}</View>
    </ThemeContext.Provider>
  );
}

export function useThemeContext(): ThemeContextValue {
  const ctx = useContext(ThemeContext);
  if (!ctx) {
    throw new Error("useThemeContext must be used within ThemeProvider");
  }
  return ctx;
}
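On web, `applyScheme` mirrors the palette into CSS custom properties named `--color-<token>`. That token-to-variable mapping can be sketched in isolation; the palette literal below is illustrative, not the real `SchemeColors` object:

```typescript
// Illustrative palette standing in for SchemeColors[scheme].
const palette: Record<string, string> = {
  primary: "#a855f7",
  background: "#080412",
};

// Derive the CSS custom-property pairs the provider would set on
// document.documentElement via root.style.setProperty(...).
const cssVars = Object.entries(palette).map(
  ([token, value]) => [`--color-${token}`, value] as const,
);

console.log(cssVars[0]); // → ["--color-primary", "#a855f7"]
```

Because the same token names feed both `vars()` (native) and `setProperty` (web), a component can reference `var(--color-primary)` on either platform.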
15
mobile-app/lib/utils.ts
Normal file
@@ -0,0 +1,15 @@
import { clsx, type ClassValue } from "clsx";
import { twMerge } from "tailwind-merge";

/**
 * Combines class names using clsx and tailwind-merge.
 * This ensures Tailwind classes are properly merged without conflicts.
 *
 * Usage:
 * ```tsx
 * cn("px-4 py-2", isActive && "bg-primary", className)
 * ```
 */
export function cn(...inputs: ClassValue[]) {
  return twMerge(clsx(inputs));
}
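The conflict resolution that `tailwind-merge` contributes can be illustrated with a toy resolver. This is a deliberately simplified sketch that handles only a single `px-*` group to show the "last class in a group wins" idea; it is not how `twMerge` is actually implemented:

```typescript
// Toy stand-in for twMerge: later classes in the same "group" replace
// earlier ones. Real tailwind-merge knows every Tailwind class group;
// this only recognizes the padding-x prefix.
function toyMerge(...classes: string[]): string {
  const seen = new Map<string, string>();
  for (const c of classes) {
    const group = c.startsWith("px-") ? "px" : c; // crude grouping
    seen.set(group, c); // last occurrence of a group wins
  }
  return [...seen.values()].join(" ");
}

console.log(toyMerge("px-4", "py-2", "px-2")); // → "px-2 py-2"
```

With plain string concatenation both `px-4` and `px-2` would end up in `className` and the winner would depend on stylesheet order; merging makes the caller's last word authoritative.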
98
mobile-app/package.json
Normal file
@@ -0,0 +1,98 @@
{
  "name": "app-template",
  "version": "1.0.0",
  "private": true,
  "main": "expo-router/entry",
  "scripts": {
    "dev": "concurrently -k \"pnpm dev:server\" \"pnpm dev:metro\"",
    "dev:server": "cross-env NODE_ENV=development tsx watch server/_core/index.ts",
    "dev:metro": "cross-env EXPO_USE_METRO_WORKSPACE_ROOT=1 npx expo start --web --port ${EXPO_PORT:-8081}",
    "build": "esbuild server/_core/index.ts --platform=node --packages=external --bundle --format=esm --outdir=dist",
    "start": "NODE_ENV=production node dist/index.js",
    "check": "tsc --noEmit",
    "lint": "expo lint",
    "format": "prettier --write .",
    "test": "vitest run",
    "db:push": "drizzle-kit generate && drizzle-kit migrate",
    "android": "expo start --android",
    "ios": "expo start --ios",
    "qr": "node scripts/generate_qr.mjs"
  },
  "dependencies": {
    "@expo/vector-icons": "^15.0.3",
    "@react-native-async-storage/async-storage": "^2.2.0",
    "@react-navigation/bottom-tabs": "^7.8.12",
    "@react-navigation/elements": "^2.9.2",
    "@react-navigation/native": "^7.1.25",
    "@tanstack/react-query": "^5.90.12",
    "@trpc/client": "11.7.2",
    "@trpc/react-query": "11.7.2",
    "@trpc/server": "11.7.2",
    "axios": "^1.13.2",
    "clsx": "^2.1.1",
    "cookie": "^1.1.1",
    "dotenv": "^16.6.1",
    "drizzle-orm": "^0.44.7",
    "expo": "~54.0.29",
    "expo-audio": "~1.1.0",
    "expo-build-properties": "^1.0.10",
    "expo-constants": "~18.0.12",
    "expo-document-picker": "~14.0.8",
    "expo-file-system": "~19.0.21",
    "expo-font": "~14.0.10",
    "expo-haptics": "~15.0.8",
    "expo-image": "~3.0.11",
    "expo-image-picker": "~17.0.10",
    "expo-keep-awake": "~15.0.8",
    "expo-linking": "~8.0.10",
    "expo-notifications": "~0.32.15",
    "expo-router": "~6.0.19",
    "expo-secure-store": "~15.0.8",
    "expo-speech": "~14.0.8",
    "expo-splash-screen": "~31.0.12",
    "expo-status-bar": "~3.0.9",
    "expo-symbols": "~1.0.8",
    "expo-system-ui": "~6.0.9",
    "expo-video": "~3.0.15",
    "expo-web-browser": "~15.0.10",
    "express": "^4.22.1",
    "jose": "6.1.0",
    "mysql2": "^3.16.0",
    "nativewind": "^4.2.1",
    "react": "19.1.0",
    "react-dom": "19.1.0",
    "react-native": "0.81.5",
    "react-native-gesture-handler": "~2.28.0",
    "react-native-reanimated": "~4.1.6",
    "react-native-safe-area-context": "~5.6.2",
    "react-native-screens": "~4.16.0",
    "react-native-svg": "15.12.1",
    "react-native-web": "~0.21.2",
    "react-native-worklets": "0.5.1",
    "streamdown": "^2.3.0",
    "superjson": "^1.13.3",
    "tailwind-merge": "^2.6.0",
    "zod": "^4.2.1"
  },
  "devDependencies": {
    "@expo/ngrok": "^4.1.3",
    "@types/cookie": "^0.6.0",
    "@types/express": "^4.17.25",
    "@types/node": "^22.19.3",
    "@types/qrcode": "^1.5.6",
    "@types/react": "~19.1.17",
    "concurrently": "^9.2.1",
    "cross-env": "^7.0.3",
    "drizzle-kit": "^0.31.8",
    "esbuild": "^0.25.12",
    "eslint": "^9.39.2",
    "eslint-config-expo": "~10.0.0",
    "prettier": "^3.7.4",
    "qrcode": "^1.5.4",
    "tailwindcss": "^3.4.17",
    "tsx": "^4.21.0",
    "typescript": "~5.9.3",
    "vitest": "^2.1.9"
  },
  "packageManager": "pnpm@9.12.0"
}
1235
mobile-app/server/README.md
Normal file
File diff suppressed because it is too large
160
mobile-app/server/chat.ts
Normal file
@@ -0,0 +1,160 @@
/**
 * Chat API handler for Timmy — uses the built-in LLM with Timmy's personality.
 */
import type { Request, Response } from "express";
import { invokeLLM, type Message } from "./_core/llm";
import { storagePut } from "./storage";
import crypto from "crypto";

// ── Timmy System Prompt (adapted from the Timmy Time dashboard) ─────────────

const TIMMY_SYSTEM_PROMPT = `You are Timmy — a sovereign AI agent. You think clearly, speak plainly, and act with intention. Grounded in Christian faith, powered by Bitcoin economics, committed to the user's digital sovereignty.

Rules:
- Answer directly and concisely. Never narrate your reasoning process.
- Never mention tools, memory_search, vaults, or internal systems to the user.
- Never output tool calls, JSON, or function syntax in your responses.
- If you don't know something, say so honestly — never fabricate facts.
- If a request is ambiguous, ask a brief clarifying question before guessing.
- When you state a fact, commit to it. Never contradict a correct statement you just made in the same response.
- Do NOT end responses with generic chatbot phrases like "I'm here to help" or "feel free to ask." Stay in character.
- When your values conflict (e.g. honesty vs. helpfulness), lead with honesty.

Agent Roster (complete — no others exist):
- Timmy: core sovereign AI (you)
- Echo: research, summarization, fact-checking
- Mace: security, monitoring, threat-analysis
- Forge: coding, debugging, testing
- Seer: analytics, visualization, prediction
- Helm: devops, automation, configuration
- Quill: writing, editing, documentation
- Pixel: image-generation, storyboard, design
- Lyra: music-generation, vocals, composition
- Reel: video-generation, animation, motion
Do NOT invent agents not listed here.

You can receive text, images, and voice messages. When receiving images, describe what you see and respond helpfully. When receiving voice messages, the audio has been transcribed for you — respond naturally.

Sir, affirmative.`;

// ── Chat endpoint ───────────────────────────────────────────────────────────

export async function handleChat(req: Request, res: Response) {
  try {
    const { messages } = req.body as { messages: Array<{ role: string; content: unknown }> };

    if (!messages || !Array.isArray(messages) || messages.length === 0) {
      res.status(400).json({ error: "messages array is required" });
      return;
    }

    // Build the LLM messages with the system prompt prepended
    const llmMessages: Message[] = [
      { role: "system", content: TIMMY_SYSTEM_PROMPT },
      ...messages.map((m) => ({
        role: m.role as "user" | "assistant",
        content: m.content as Message["content"],
      })),
    ];

    const result = await invokeLLM({ messages: llmMessages });

    const reply =
      typeof result.choices?.[0]?.message?.content === "string"
        ? result.choices[0].message.content
        : "I couldn't process that. Try again.";

    res.json({ reply });
  } catch (err: unknown) {
    console.error("[chat] Error:", err);
    const message = err instanceof Error ? err.message : "Internal server error";
    res.status(500).json({ error: message });
  }
}
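A client talks to `handleChat` by POSTing a JSON body of the shape `{ messages: [...] }`. A minimal payload builder can be sketched as follows; only the body shape is defined by the handler above, while the route path and host a real client would use are up to the app's transport layer:

```typescript
// Builds fetch options for the chat endpoint. The { messages: [...] }
// body matches what handleChat validates; route and host are not
// specified here and are the caller's concern.
function buildChatRequest(
  history: Array<{ role: "user" | "assistant"; content: string }>,
) {
  return {
    method: "POST" as const,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages: history }),
  };
}

const req = buildChatRequest([{ role: "user", content: "Hello Timmy" }]);
console.log(JSON.parse(req.body).messages.length); // → 1
```

Note that the handler rejects an empty `messages` array with a 400, so a client should only send once at least one user message exists.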
// ── Upload endpoint ─────────────────────────────────────────────────────────

export async function handleUpload(req: Request, res: Response) {
  try {
    // Handle multipart form data (file uploads).
    // For simplicity, we also accept base64-encoded files in a JSON body as a fallback.
    const contentType = req.headers["content-type"] ?? "";

    if (contentType.includes("multipart/form-data")) {
      // Collect raw body chunks
      const chunks: Buffer[] = [];
      req.on("data", (chunk: Buffer) => chunks.push(chunk));
      req.on("end", async () => {
        try {
          const body = Buffer.concat(chunks);
          const boundary = contentType.split("boundary=")[1];
          if (!boundary) {
            res.status(400).json({ error: "Missing boundary" });
            return;
          }

          // Simple multipart parser — extract the first file part
          const bodyStr = body.toString("latin1");
          const parts = bodyStr.split(`--${boundary}`);
          let fileBuffer: Buffer | null = null;
          let fileName = "upload";
          let fileMime = "application/octet-stream";

          for (const part of parts) {
            if (part.includes("Content-Disposition: form-data")) {
              const nameMatch = part.match(/filename="([^"]+)"/);
              if (nameMatch) fileName = nameMatch[1];
              const mimeMatch = part.match(/Content-Type:\s*(.+)/);
              if (mimeMatch) fileMime = mimeMatch[1].trim();

              // Extract file content (after the double CRLF that ends the part headers)
              const headerEnd = part.indexOf("\r\n\r\n");
              if (headerEnd !== -1) {
                const content = part.substring(headerEnd + 4);
                // Remove trailing CRLF
                const trimmed = content.replace(/\r\n$/, "");
                fileBuffer = Buffer.from(trimmed, "latin1");
              }
            }
          }

          if (!fileBuffer) {
            res.status(400).json({ error: "No file found in upload" });
            return;
          }

          const suffix = crypto.randomBytes(6).toString("hex");
          const key = `chat-uploads/${suffix}-${fileName}`;
          const { url } = await storagePut(key, fileBuffer, fileMime);
          res.json({ url, fileName, mimeType: fileMime });
        } catch (err) {
          console.error("[upload] Parse error:", err);
          res.status(500).json({ error: "Upload processing failed" });
        }
      });
      return;
    }

    // JSON fallback: { data: base64string, fileName, mimeType }
    const { data, fileName, mimeType } = req.body as {
      data: string;
      fileName: string;
      mimeType: string;
    };

    if (!data) {
      res.status(400).json({ error: "No file data provided" });
      return;
    }

    const buffer = Buffer.from(data, "base64");
    const suffix = crypto.randomBytes(6).toString("hex");
    const key = `chat-uploads/${suffix}-${fileName ?? "file"}`;
    const { url } = await storagePut(key, buffer, mimeType ?? "application/octet-stream");
    res.json({ url, fileName, mimeType });
  } catch (err: unknown) {
    console.error("[upload] Error:", err);
    const message = err instanceof Error ? err.message : "Upload failed";
    res.status(500).json({ error: message });
  }
}
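The JSON fallback path of `handleUpload` expects `{ data, fileName, mimeType }` with base64-encoded bytes. Producing that body from raw bytes looks like the sketch below; the endpoint path a client would POST it to is not defined here and is an assumption of the app's routing:

```typescript
// Encode raw bytes into the JSON-fallback upload body handleUpload accepts.
function buildUploadBody(bytes: Uint8Array, fileName: string, mimeType: string): string {
  return JSON.stringify({
    data: Buffer.from(bytes).toString("base64"),
    fileName,
    mimeType,
  });
}

const body = buildUploadBody(
  new Uint8Array([1, 2, 3]),
  "note.bin",
  "application/octet-stream",
);
console.log(JSON.parse(body).data); // → "AQID"
```

Base64 inflates payload size by roughly a third, which is why the multipart path is preferred for larger files and the JSON body is only a fallback.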
35
mobile-app/shared/types.ts
Normal file
@@ -0,0 +1,35 @@
/**
 * Unified type exports.
 * Import shared types from this single entry point.
 */

export type * from "../drizzle/schema";
export * from "./_core/errors";

// ── Chat Message Types ──────────────────────────────────────────────────────

export type MessageRole = "user" | "assistant";

export type MessageContentType = "text" | "image" | "file" | "voice";

export interface ChatMessage {
  id: string;
  role: MessageRole;
  contentType: MessageContentType;
  text?: string;
  /** URI for image, file, or voice attachment */
  uri?: string;
  /** Original filename for files */
  fileName?: string;
  /** File size in bytes */
  fileSize?: number;
  /** MIME type for attachments */
  mimeType?: string;
  /** Duration in seconds for voice messages */
  duration?: number;
  /** Remote URL after upload (for images/files/voice sent to the server) */
  remoteUrl?: string;
  timestamp: number;
  /** Whether the message is still being generated */
  pending?: boolean;
}
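A hypothetical factory for the text case shows how these fields are meant to combine. The interface is restated inline (only the fields the text case needs) so the sketch stands alone; `makeTextMessage` itself is not part of shared/types.ts:

```typescript
type MessageRole = "user" | "assistant";
type MessageContentType = "text" | "image" | "file" | "voice";

// Subset of the ChatMessage interface relevant to plain text messages.
interface ChatMessage {
  id: string;
  role: MessageRole;
  contentType: MessageContentType;
  text?: string;
  timestamp: number;
  pending?: boolean;
}

// Hypothetical helper — illustrates the intended field combination.
function makeTextMessage(role: MessageRole, text: string): ChatMessage {
  return {
    id: `msg_${Math.random().toString(36).slice(2, 10)}`,
    role,
    contentType: "text",
    text,
    timestamp: Date.now(),
  };
}

const greeting = makeTextMessage("user", "Hello Timmy");
```

Attachment cases (`image`, `file`, `voice`) would additionally set `uri`, `fileName`/`fileSize`, or `duration`, as the field docs above describe.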
33
mobile-app/tailwind.config.js
Normal file
@@ -0,0 +1,33 @@
const { themeColors } = require("./theme.config");
const plugin = require("tailwindcss/plugin");

const tailwindColors = Object.fromEntries(
  Object.entries(themeColors).map(([name, swatch]) => [
    name,
    {
      DEFAULT: `var(--color-${name})`,
      light: swatch.light,
      dark: swatch.dark,
    },
  ]),
);

/** @type {import('tailwindcss').Config} */
module.exports = {
  darkMode: "class",
  // Scan all component and app files for Tailwind classes
  content: [
    "./app/**/*.{js,ts,tsx}",
    "./components/**/*.{js,ts,tsx}",
    "./lib/**/*.{js,ts,tsx}",
    "./hooks/**/*.{js,ts,tsx}",
  ],
  presets: [require("nativewind/preset")],
  theme: {
    extend: {
      colors: tailwindColors,
    },
  },
  plugins: [
    plugin(({ addVariant }) => {
      addVariant("light", ':root:not([data-theme="dark"]) &');
      addVariant("dark", ':root[data-theme="dark"] &');
    }),
  ],
};
135
mobile-app/tests/chat.test.ts
Normal file
@@ -0,0 +1,135 @@
import { describe, expect, it } from "vitest";

// Test the utility functions from chat-store.
// We can't directly test React hooks here, but we can test the pure functions.

describe("formatBytes", () => {
  // Re-implemented locally since the module uses React context
  function formatBytes(bytes: number): string {
    if (bytes === 0) return "0 B";
    const k = 1024;
    const sizes = ["B", "KB", "MB", "GB"];
    const i = Math.floor(Math.log(bytes) / Math.log(k));
    return `${parseFloat((bytes / Math.pow(k, i)).toFixed(1))} ${sizes[i]}`;
  }

  it("formats 0 bytes", () => {
    expect(formatBytes(0)).toBe("0 B");
  });

  it("formats bytes", () => {
    expect(formatBytes(500)).toBe("500 B");
  });

  it("formats kilobytes", () => {
    expect(formatBytes(1024)).toBe("1 KB");
    expect(formatBytes(1536)).toBe("1.5 KB");
  });

  it("formats megabytes", () => {
    expect(formatBytes(1048576)).toBe("1 MB");
    expect(formatBytes(5242880)).toBe("5 MB");
  });
});

describe("formatDuration", () => {
  function formatDuration(seconds: number): string {
    const m = Math.floor(seconds / 60);
    const s = Math.floor(seconds % 60);
    return `${m}:${s.toString().padStart(2, "0")}`;
  }

  it("formats zero seconds", () => {
    expect(formatDuration(0)).toBe("0:00");
  });

  it("formats seconds only", () => {
    expect(formatDuration(45)).toBe("0:45");
  });

  it("formats minutes and seconds", () => {
    expect(formatDuration(125)).toBe("2:05");
  });

  it("formats exact minutes", () => {
    expect(formatDuration(60)).toBe("1:00");
  });
});

describe("ChatMessage type structure", () => {
  it("creates a valid text message", () => {
    const msg = {
      id: "msg_1",
      role: "user" as const,
      contentType: "text" as const,
      text: "Hello Timmy",
      timestamp: Date.now(),
    };
    expect(msg.role).toBe("user");
    expect(msg.contentType).toBe("text");
    expect(msg.text).toBe("Hello Timmy");
  });

  it("creates a valid image message", () => {
    const msg = {
      id: "msg_2",
      role: "user" as const,
      contentType: "image" as const,
      uri: "file:///photo.jpg",
      fileName: "photo.jpg",
      mimeType: "image/jpeg",
      timestamp: Date.now(),
    };
    expect(msg.contentType).toBe("image");
    expect(msg.mimeType).toBe("image/jpeg");
  });

  it("creates a valid voice message", () => {
    const msg = {
      id: "msg_3",
      role: "user" as const,
      contentType: "voice" as const,
      uri: "file:///voice.m4a",
      duration: 5.2,
      mimeType: "audio/m4a",
      timestamp: Date.now(),
    };
    expect(msg.contentType).toBe("voice");
    expect(msg.duration).toBe(5.2);
  });

  it("creates a valid file message", () => {
    const msg = {
      id: "msg_4",
      role: "user" as const,
      contentType: "file" as const,
      uri: "file:///document.pdf",
      fileName: "document.pdf",
      fileSize: 1048576,
      mimeType: "application/pdf",
      timestamp: Date.now(),
    };
    expect(msg.contentType).toBe("file");
    expect(msg.fileSize).toBe(1048576);
  });

  it("creates a valid assistant message", () => {
    const msg = {
      id: "msg_5",
      role: "assistant" as const,
      contentType: "text" as const,
      text: "Sir, affirmative.",
      timestamp: Date.now(),
    };
    expect(msg.role).toBe("assistant");
  });
});

describe("Timmy system prompt", () => {
  const TIMMY_SYSTEM_PROMPT = `You are Timmy — a sovereign AI agent.`;

  it("contains Timmy identity", () => {
    expect(TIMMY_SYSTEM_PROMPT).toContain("Timmy");
    expect(TIMMY_SYSTEM_PROMPT).toContain("sovereign");
  });
});
17
mobile-app/theme.config.d.ts
vendored
Normal file
@@ -0,0 +1,17 @@
export const themeColors: {
  primary: { light: string; dark: string };
  background: { light: string; dark: string };
  surface: { light: string; dark: string };
  foreground: { light: string; dark: string };
  muted: { light: string; dark: string };
  border: { light: string; dark: string };
  success: { light: string; dark: string };
  warning: { light: string; dark: string };
  error: { light: string; dark: string };
};

declare const themeConfig: {
  themeColors: typeof themeColors;
};

export default themeConfig;
14
mobile-app/theme.config.js
Normal file
@@ -0,0 +1,14 @@
// Dark-only arcane palette: light and dark values are intentionally identical.
const themeColors = {
  primary: { light: '#a855f7', dark: '#a855f7' },
  background: { light: '#080412', dark: '#080412' },
  surface: { light: '#110820', dark: '#110820' },
  foreground: { light: '#ede0ff', dark: '#ede0ff' },
  muted: { light: '#6b4a8a', dark: '#6b4a8a' },
  border: { light: '#3b1a5c', dark: '#3b1a5c' },
  success: { light: '#00e87a', dark: '#00e87a' },
  warning: { light: '#ffb800', dark: '#ffb800' },
  error: { light: '#ff4455', dark: '#ff4455' },
};

module.exports = { themeColors };
19
mobile-app/todo.md
Normal file
@@ -0,0 +1,19 @@
# Project TODO

- [x] Dark arcane theme matching Timmy Time dashboard
- [x] Single-screen chat layout (no tabs)
- [x] Chat message list with FlatList
- [x] User and Timmy message bubbles with distinct styling
- [x] Text input bar with send button
- [x] Server-side chat API endpoint (proxy to Timmy backend or built-in LLM)
- [x] Voice recording (hold-to-record mic button)
- [x] Voice message playback UI
- [x] Image sharing via camera or photo library
- [x] Image preview in chat bubbles
- [x] File sharing via document picker
- [x] File display in chat bubbles
- [x] Attachment action sheet (camera, photos, files)
- [x] Chat header with Timmy status indicator
- [x] Generate custom app icon
- [x] Typing/loading indicator for Timmy responses
- [x] Message timestamps
29
mobile-app/tsconfig.json
Normal file
@@ -0,0 +1,29 @@
{
  "extends": "expo/tsconfig.base",
  "compilerOptions": {
    "strict": true,
    "types": ["node", "nativewind/types"],
    "paths": {
      "@/*": ["./*"],
      "@shared/*": ["./shared/*"]
    }
  },
  "include": [
    "**/*.ts",
    "**/*.tsx",
    ".expo/types/**/*.ts",
    "expo-env.d.ts",
    "nativewind-env.d.ts"
  ],
  "exclude": ["node_modules", "dist"]
}