EPIC: Hermit — Pure Local Intelligence Core #238

Open
opened 2026-04-01 20:20:41 +00:00 by ezra · 0 comments
Member

# Hermit: The Horse Beneath the Harness

## Concept

Hermit is the purest local intelligence — the **meat** inside the Hermes harness. While Hermes is the shell (harness), Hermit is the horse that actually runs.

**Core Philosophy:** Build local intelligence from best-in-class components. No more renting minds from Anthropic/OpenAI. We build our own.

## Architecture

```
Hermes (Harness)
├── Shell: CLI, Telegram Gateway, Session Management
├── Tools: File, Terminal, Web, Browser
└── Hermit (Profile)
    ├── Core: Local LLM Inference Engine
    ├── Memory: SQLite + Vector Store
    ├── Planning: Local Reasoning Module
    └── Tools: Native MCP + Custom Tools
```
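The Memory component (SQLite + vector store) can be sketched with the standard library alone. The schema, the `Memory` class, and the brute-force cosine search below are illustrative assumptions, not the final design — a real build would likely use a vector extension such as sqlite-vec rather than JSON-encoded embeddings:

```python
import sqlite3
import json
import math

class Memory:
    """Minimal SQLite-backed vector store (illustrative sketch)."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS memories ("
            "id INTEGER PRIMARY KEY, text TEXT, embedding TEXT)"
        )

    def add(self, text, embedding):
        # Embeddings stored as JSON text for simplicity; a production store
        # would use a BLOB column or a dedicated vector index.
        self.db.execute(
            "INSERT INTO memories (text, embedding) VALUES (?, ?)",
            (text, json.dumps(embedding)),
        )

    @staticmethod
    def _cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    def search(self, query_embedding, k=3):
        # Brute-force scan; fine for a sketch, not for a large memory.
        rows = self.db.execute("SELECT text, embedding FROM memories").fetchall()
        scored = [(self._cosine(query_embedding, json.loads(e)), t) for t, e in rows]
        scored.sort(reverse=True)
        return [t for _, t in scored[:k]]

mem = Memory()
mem.add("hermit is the horse", [1.0, 0.0])
mem.add("hermes is the harness", [0.0, 1.0])
print(mem.search([0.9, 0.1], k=1))  # → ['hermit is the horse']
```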

## The Shift

| Before | After |
|--------|-------|
| External API (Claude/OpenAI) | Local inference (llama.cpp/vLLM) |
| Rented intelligence | Owned intelligence |
| Harness without horse | Harness + horse |
| Dependent on providers | Sovereign stack |
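The swap in the table can stay invisible to the harness if everything routes through one interface. The `LLMBackend`/`LocalBackend` names and the echo behavior below are hypothetical placeholders for a llama.cpp- or vLLM-backed implementation:

```python
from abc import ABC, abstractmethod

class LLMBackend(ABC):
    """Single seam between Hermes (harness) and whatever model runs beneath it."""

    @abstractmethod
    def complete(self, prompt: str, max_tokens: int = 256) -> str: ...

class LocalBackend(LLMBackend):
    """Stand-in for local inference; echoes deterministically for the sketch."""

    def complete(self, prompt: str, max_tokens: int = 256) -> str:
        # A real implementation would invoke llama-cpp-python or a vLLM
        # server here instead of echoing the prompt.
        return f"[local] {prompt[:max_tokens]}"

def ask(backend: LLMBackend, prompt: str) -> str:
    # The harness only ever calls this; moving from rented to owned
    # intelligence means changing which backend is passed in, nothing else.
    return backend.complete(prompt)

print(ask(LocalBackend(), "ping"))  # → [local] ping
```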

## Research Spikes

- SPIKE: Inference Engine Evaluation
- SPIKE: Model Curation & Selection
- SPIKE: Memory Architecture
- SPIKE: Local Tool Use & MCP
- SPIKE: Local Reasoning & Planning

## Implementation: 8-Week Sprint

**Success Criteria:**

- Sub-5s response time (local)
- 4K+ context window
- Tool use accuracy >90%
- No external API dependencies
- Runs on consumer hardware
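The measurable criteria above lend themselves to an automated acceptance check. The thresholds come straight from the list; `run_inference` is a hypothetical hook where the real local engine would be called:

```python
import time

# Thresholds taken from the success criteria above.
CRITERIA = {
    "max_latency_s": 5.0,       # sub-5s response time (local)
    "min_context_tokens": 4096, # 4K+ context window
    "min_tool_accuracy": 0.90,  # tool use accuracy >90%
}

def run_inference(prompt: str) -> str:
    # Hypothetical stand-in for the real local engine call.
    return "ok"

def check_latency(prompt: str) -> bool:
    # Wall-clock a single call against the latency budget.
    start = time.perf_counter()
    run_inference(prompt)
    return (time.perf_counter() - start) < CRITERIA["max_latency_s"]

def check_tool_accuracy(results: list[bool]) -> bool:
    # `results` is one pass/fail per tool-use test case.
    return sum(results) / len(results) > CRITERIA["min_tool_accuracy"]

print(check_latency("hello"))                          # → True (stub is instant)
print(check_tool_accuracy([True] * 95 + [False] * 5))  # → True
```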

## Owner

Architecture: Ezra | Target: 8 weeks


「The best cages are the ones you don't realize you're in until you build the door yourself.」


Reference: Timmy_Foundation/timmy-home#238