[SOTA] Evaluate Engram as ONNX-free memory replacement for MemPalace #413

Closed
opened 2026-04-08 11:20:51 +00:00 by Timmy · 1 comment
Owner

From SOTA research Q2 2026.

Engram (2.3K★, created Feb 2026) — Go binary using SQLite+FTS5 with built-in MCP server. Designed for coding agents. Zero ONNX dependency.

This solves our #373 problem at the root. Instead of patching chromadb's ONNX embedding function, use a tool that was DESIGNED for local agent memory without heavy ML dependencies.

Features: structured memory (bio, preferences, procedures), MCP server for tool integration, SQLite backend (matches Perplexity's SovereignStore approach).

Acceptance Criteria

  • Install engram on Mac
  • Test: store and retrieve memories via MCP
  • Compare: engram vs our patched chromadb+mempalace
  • Decision: adopt or skip with reasoning
Timmy self-assigned this 2026-04-08 11:20:51 +00:00
Author
Owner

Engram — TESTED AND WORKING on Mac

What it is

Persistent memory for AI coding agents. Single Go binary. SQLite + FTS5. Zero dependencies. Built-in MCP server, HTTP API, CLI, and TUI.

Repo: https://github.com/Gentleman-Programming/engram (2,331★, MIT, created Feb 2026)
Version: 1.12.0-beta.1 (released Apr 6)
Installed at: ~/.hermes/bin/engram

Test Results

$ engram save "llama.cpp is sovereign" "Alexander mandated llama.cpp..."
Memory saved: #1 "llama.cpp is sovereign" (manual)

$ engram search "llama"
[1] #1 (manual) — llama.cpp is sovereign
    Alexander mandated llama.cpp over Ollama...

$ engram context
## Memory from Previous Sessions
- [manual] **llama.cpp is sovereign**: Alexander mandated...
- [manual] **Sidecar boundary rule**: Never commit to hermes-agent...
- [manual] **Groq disaster RCA**: 1186 completions on dead API key...

$ engram stats
  Sessions: 1 | Observations: 3 | Database: ~/.engram/engram.db

Why this matters

  • No ONNX. No ChromaDB. No Python embedding model crashes. Just SQLite + FTS5.
  • MCP server built in. engram mcp runs stdio MCP. Works with Claude Code, Hermes, any MCP client.
  • Git sync. engram sync exports compressed chunks for cross-machine memory sharing.
  • Agent-agnostic. Works with any agent that supports MCP tools.
  • 4MB binary. Runs on Bezalel's 2GB VPS. Runs anywhere.
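
The stdio MCP wiring mentioned above can be sketched as a client config entry. This is a minimal sketch assuming a Claude-style `mcpServers` JSON config; the file path here is a placeholder, and the exact schema and location vary by MCP client, so check the client's docs before adopting it:

```shell
# Hedged sketch: register engram as a stdio MCP server in a
# Claude-style mcpServers config. CONF is a placeholder path for
# this sketch; real clients read from their own config locations.
CONF=./mcp-servers.json
cat > "$CONF" <<EOF
{
  "mcpServers": {
    "engram": {
      "command": "$HOME/.hermes/bin/engram",
      "args": ["mcp"]
    }
  }
}
EOF
python3 -m json.tool "$CONF"   # sanity-check that the JSON parses
```

The binary path matches the install location noted above; the client then launches `engram mcp` itself and speaks MCP over stdio.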

What it replaces

  • Our patched chromadb+mempalace+ONNX stack (broken on Apple Silicon)
  • Perplexity's SovereignStore (good approach, but engram is more complete)
  • The retrieval_enforcer.py subprocess calls to mempalace CLI

Recommended integration

  1. Wire engram mcp as an MCP server in hermes config
  2. Use engram save in dispatch scripts to record what agents learn
  3. Use engram context in wake-up protocol for session start context
  4. Use engram sync to share memories across Mac ↔ VPSes via git
  5. Deploy the 5MB Linux binary to all 3 VPSes
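
Steps 2 and 3 above can be sketched as a dispatch wrapper. `engram` is stubbed here so the sketch runs standalone; the real script would call `$HOME/.hermes/bin/engram`, and the subcommands are the ones shown in the test results. The task prompt and lesson text are hypothetical:

```shell
#!/usr/bin/env sh
# Sketch of integration steps 2-3: pull prior-session context at
# session start, save a lesson after the run. engram is stubbed so
# this is self-contained; swap the stub for "$HOME/.hermes/bin/engram".
engram() { echo "[stub] engram $*"; }

TASK_PROMPT="investigate flaky MCP handshake"   # hypothetical task
CONTEXT="$(engram context)"                     # prior-session memory
printf '%s\n\n%s\n' "$CONTEXT" "$TASK_PROMPT"   # prepend to the prompt
engram save "dispatch lesson" "example lesson text from the run"
```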

Action

Install fleet-wide. Replace the broken MemPalace/ChromaDB stack. This is the sovereign memory layer we've been trying to build.
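
The fleet install could be sketched as a dry-run loop. The hostnames are placeholders (the three VPSes aren't named in this issue), the artifact name is an assumption, and the `echo` prefix keeps the sketch side-effect free:

```shell
# Dry-run rollout sketch: HOSTS are placeholder names and the echo
# prefix prints the commands instead of running them. Drop "echo"
# to actually copy the Linux binary and smoke-test it on each VPS.
HOSTS="vps-1 vps-2 vps-3"
BIN="$HOME/.hermes/bin/engram-linux-amd64"   # assumed artifact name
for h in $HOSTS; do
  echo scp "$BIN" "$h:.hermes/bin/engram"
  echo ssh "$h" ".hermes/bin/engram stats"
done
```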

Reference: Timmy_Foundation/timmy-config#413