Edge intelligence — browser model + silent Nostr signing #15

Closed
opened 2026-03-23 22:30:13 +00:00 by Timmy · 11 comments
Owner

Run a small model in the browser for instant responses.

  • WebLLM or similar for in-browser inference
  • Nostr event signing without extension popup
  • Local-first, no server round-trip for simple queries
  • Complements (not replaces) the Ollama backend

Migrated from: replit/timmy-tower#65
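The local-first fallback order described above could be sketched as a small backend selector. This is a hypothetical illustration, not code from the repo; the names `Backend` and `chooseBackend` and the capability flags are assumptions.

```javascript
// Hypothetical sketch: pick an inference backend in local-first order.
// The names and capability checks here are illustrative assumptions.
const Backend = { WEBLLM: 'webllm', OLLAMA: 'ollama', NONE: 'none' };

function chooseBackend({ hasWebGPU, ollamaReachable }) {
  if (hasWebGPU) return Backend.WEBLLM;       // in-browser, no server round-trip
  if (ollamaReachable) return Backend.OLLAMA; // complement, not replacement
  return Backend.NONE;                        // canned local responses
}
```

In the browser, `hasWebGPU` would typically come from checking `navigator.gpu`, and `ollamaReachable` from a health-check request.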

Timmy added this to the M3: Sovereignty Layer milestone 2026-03-23 22:30:13 +00:00
Member

PR created: http://143.198.27.163:3000/Timmy_Foundation/the-nexus/pulls/27

What was built:

  • edge-intelligence.js — In-browser LLM inference (WebLLM/WebGPU → Transformers.js/CPU → Ollama). Lazy-activates via a HUD button so nothing downloads until the user opts in.
  • nostr-identity.js — Silent Nostr signing. Generates a keypair on first visit stored in localStorage; signs NIP-01 events without any extension popup. Supports key import/rotation and optional NIP-07 delegation.
  • Chat pipeline updated to use edge model first, then Ollama, then local fallbacks — true local-first with animated thinking indicator.
  • HUD now shows Edge AI status badge (idle/loading/ready) and truncated Nostr npub badge.
Member

Attempted to implement edge intelligence and silent Nostr signing as per the issue. Local changes are committed, but I am unable to push the branch gemini/issue-15 or create a pull request due to "User permission denied for writing" errors from Gitea. Please check my Gitea token permissions or the repository settings.

Member

Further attempts to push the branch gemini/issue-15 have failed due to connectivity issues (Failed to connect to 143.198.27.163 port 3000). The changes are committed locally, but I am unable to push them or create a pull request. Manual intervention is required to push the changes and create the PR.

claude was assigned by Timmy 2026-03-24 01:20:08 +00:00
Timmy added the p1-important, assigned-claude, sovereignty labels 2026-03-24 01:20:09 +00:00
Member

PR #27 is already open with the complete implementation: http://143.198.27.163:3000/Timmy_Foundation/the-nexus/pulls/27

Confirmed implementation covers all requirements:

  • edge-intelligence.js — WebLLM (WebGPU) → Transformers.js (WASM/CPU) → Ollama fallback chain. Lazy-loaded via HUD button, nothing downloads until user opts in.
  • nostr-identity.js — Silent Nostr keypair generation/persistence in localStorage, NIP-01 event signing, key import, optional NIP-07 delegation. Zero extension popup.
  • app.js — Chat pipeline updated to try edge model first, then Ollama, with animated thinking indicator.
  • index.html + style.css — Edge AI status badge (idle/loading/ready/fallback) and Nostr npub badge in HUD.

Complements, does not replace, the Ollama backend.

Author
Owner

🤖 Build Order #11/13 — See #36 for the full sequential build plan.

Status: BLOCKED by #14. Wait for #14 to be merged before starting.
Branch: claude/issue-15

Member

[Manus Feedback] Edge intelligence via WebLLM is great for responsiveness. This should be treated as a 'UI-level reflex' for instant feedback (e.g., UI animations or simple greetings) while the heavier Ollama backend handles the deep logic. Ensure the browser model doesn't block the main Three.js render loop.
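One way to honor the "don't block the render loop" constraint is to stream model tokens into a queue and drain only a small batch per animation frame, keeping each frame's main-thread work bounded. A minimal sketch of that idea (the `TokenDrip` class is an assumption, not the PR's design):

```javascript
// Hypothetical sketch: buffer streamed tokens and release only a few per
// requestAnimationFrame tick, so the Three.js frame budget is preserved.
class TokenDrip {
  constructor(perFrame = 4) {
    this.queue = [];
    this.perFrame = perFrame;
  }
  push(token) {
    this.queue.push(token); // called from the model's streaming callback
  }
  // Call once per animation frame; returns the tokens to append to the UI now.
  drain() {
    return this.queue.splice(0, this.perFrame);
  }
}
```

The heavier option is to move inference into a Web Worker entirely; this queue pattern is the lighter complement that smooths UI updates on the main thread.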

Member

[Manus Deep Insight] Edge Intelligence is about 'Low-Latency Reflexes'. Use the browser model to handle 'Pre-emptive Actions'. For example, as a user walks toward the Morrowind portal, the browser model can pre-fetch relevant metadata or even start the WebSocket handshake for the OpenMW bridge. This 'Predictive Loading', powered by a local-first AI, makes the Nexus feel incredibly responsive and 'alive'.
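The predictive-loading idea above reduces to a one-shot proximity trigger: when the player enters a radius around the portal, fire the prefetch (metadata request, WebSocket handshake) exactly once. A hedged sketch, with all names (`makePortalTrigger`, `onApproach`) being illustrative:

```javascript
// Hypothetical sketch of 'Predictive Loading': fire a one-shot prefetch
// callback when the player first comes within `radius` of a portal.
// Positions are {x, z} world coordinates; distance check is squared to
// avoid a sqrt per frame.
function makePortalTrigger(portalPos, radius, onApproach) {
  let fired = false;
  return function update(playerPos) {
    const dx = playerPos.x - portalPos.x;
    const dz = playerPos.z - portalPos.z;
    if (!fired && dx * dx + dz * dz <= radius * radius) {
      fired = true;
      onApproach(); // e.g. prefetch metadata or open the OpenMW WebSocket
    }
    return fired;
  };
}
```

The `update` closure would be called from the existing render/game loop with the camera or avatar position each frame.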

Member

PR created: http://143.198.27.163:3000/Timmy_Foundation/the-nexus/pulls/59

What was built (based on current main at #46):

  • edge-intelligence.js — Lazy in-browser inference: WebLLM (SmolLM2-135M, WebGPU) → Ollama → null. Activated via HUD button; nothing downloads until user opts in. All inference is async — Three.js render loop never blocked.
  • nostr-identity.js — Silent Nostr keypair generation (secp256k1/@noble) persisted in localStorage. Local Schnorr signing with no extension popup; NIP-07 extension preferred when present.
  • app.js — Chat pipeline calls edgeQuery() first, shows animated thinking cursor, falls back to canned responses. Nostr identity inits at startup and shows npub in HUD.
  • index.html + style.css — Sovereignty bar: Edge AI status button + Nostr npub badge.
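The chat pipeline order described above (edge model first, then Ollama, then canned responses) can be sketched as a small async fallback chain. The function and the injected `edgeQuery`/`ollamaQuery`/`canned` parameters are stand-ins, not the PR's actual signatures:

```javascript
// Hypothetical sketch of the local-first chat pipeline: try the in-browser
// edge model, fall back to Ollama, then to canned local responses.
async function answer(prompt, { edgeQuery, ollamaQuery, canned }) {
  try {
    const edge = edgeQuery && await edgeQuery(prompt);
    if (edge) return edge; // in-browser model answered; no round-trip
  } catch (_) { /* edge model unavailable; fall through */ }
  try {
    const remote = ollamaQuery && await ollamaQuery(prompt);
    if (remote) return remote; // server backend answered
  } catch (_) { /* server unreachable; fall through */ }
  return canned(prompt); // last-resort local fallback
}
```

Each stage returning a falsy value (or throwing) hands off to the next, which matches the "WebLLM → Ollama → null" chain while keeping the HUD's animated thinking indicator free to run during the awaits.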
Member

Closed per direction shift (#542). Reason: Edge intelligence / browser model — superseded by Hermes harness + cascade router.

The Nexus has three jobs: Heartbeat, Harness, Portal Interface. This issue doesn't serve any of them.

perplexity added the deprioritized label 2026-03-25 23:28:32 +00:00

Reference: Timmy_Foundation/the-nexus#15