[claude] Edge intelligence — browser model + silent Nostr signing (#15) #27

Closed
claude wants to merge 1 commit from claude/the-nexus:claude/issue-15 into main

Fixes #15

Summary

  • edge-intelligence.js — Local-first LLM inference pipeline: tries WebLLM (SmolLM2-360M, WebGPU) → Transformers.js (LaMini-Flan-T5-77M, CPU/WASM) → Ollama backend fallback. Lazy-loaded via HUD button so models only download on user demand.
  • nostr-identity.js — Silent Nostr signing with no extension popup. Generates a keypair on first visit (persisted to localStorage), signs NIP-01 events locally. Supports importKey() for existing identities and optional NIP-07 delegation.
  • app.js — Chat pipeline now tries edge model → Ollama → local fallbacks, with animated thinking indicator.
  • index.html + style.css — Edge AI status badge (idle/loading/ready/fallback) and Nostr npub badge in the HUD.
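The tiered fallback described above can be sketched as follows. This is a minimal illustration of the pattern, not the actual `edge-intelligence.js` code; the backend objects and the `generate()` signature are assumptions standing in for the WebLLM, Transformers.js, and Ollama adapters:

```javascript
// Sketch of the tiered inference fallback (assumed shape, for illustration).
// Each backend is tried in order; any failure falls through to the next tier,
// so a missing service never blocks the chat pipeline.
async function inferWithFallback(prompt, backends) {
  for (const backend of backends) {
    try {
      const reply = await backend.generate(prompt);
      if (reply) return { reply, via: backend.name };
    } catch (err) {
      // Backend unavailable (no WebGPU, model not downloaded, server down):
      // continue to the next tier.
    }
  }
  // Nothing answered: hand control to the local canned-response fallback.
  return { reply: null, via: "local-fallback" };
}

// Hypothetical stubs standing in for the real adapters.
const backends = [
  { name: "webllm", generate: async () => { throw new Error("no WebGPU"); } },
  { name: "transformers.js", generate: async (p) => `echo: ${p}` },
  { name: "ollama", generate: async (p) => `ollama: ${p}` },
];

inferWithFallback("hello", backends).then((r) => console.log(r.via)); // → "transformers.js"
```

The key property is that the loop swallows per-backend errors, so the only way to reach the canned fallback is for every tier to fail.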

Complements, not replaces, the Ollama backend

The Ollama API is still tried if the local model is unavailable, making this a purely additive layer.
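The silent-identity flow in `nostr-identity.js` can be sketched like this. The storage key name is hypothetical and the actual secp256k1 Schnorr signing (required by NIP-01) is deliberately omitted; only the persist-or-generate logic and the NIP-01 canonical serialization are shown:

```javascript
// Sketch of the silent Nostr identity flow (assumed shape, for illustration;
// the real module signs events with secp256k1 Schnorr keys per NIP-01).
const STORAGE_KEY = "nexus.nostr.sk"; // hypothetical key name

function loadOrCreateKey(storage, generate) {
  let sk = storage.getItem(STORAGE_KEY);
  if (!sk) {
    sk = generate();                   // first visit: mint a new secret key
    storage.setItem(STORAGE_KEY, sk);  // persist so the identity is stable
  }
  return sk; // no extension popup at any point
}

// NIP-01 event id preimage: the canonical JSON serialization
// [0, pubkey, created_at, kind, tags, content], which is then sha256-hashed.
function serializeEvent(ev) {
  return JSON.stringify([0, ev.pubkey, ev.created_at, ev.kind, ev.tags, ev.content]);
}

// In the browser this would be called as:
//   const sk = loadOrCreateKey(window.localStorage, generateSecretKey);
```

Because the key is read back from storage on every visit, repeated calls return the same identity; `importKey()` and NIP-07 delegation would simply replace what `loadOrCreateKey` hands back.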

claude added 1 commit 2026-03-23 22:39:59 +00:00
Implements M3 Sovereignty Layer features from issue #15:

- edge-intelligence.js: Local-first LLM inference pipeline
  - Tries WebLLM (SmolLM2-360M via WebGPU) first for near-zero latency
  - Falls back to Transformers.js (LaMini-Flan-T5-77M, CPU/WASM)
  - Falls back to Ollama backend; never blocks on missing services
  - Lazy activation via HUD button so models only load on user demand

- nostr-identity.js: Silent Nostr signing without extension popup
  - Generates a keypair on first visit, persists to localStorage
  - Signs NIP-01 events locally (no window.nostr / extension needed)
  - Supports importKey() for existing identities and rotateKey()
  - Optional delegation to NIP-07 extension via useExtension(true)

- app.js: Integrates both modules
  - Chat pipeline: edge model → Ollama → local fallback responses
  - Animated "thinking" indicator while inference runs
  - Nostr npub displayed in HUD on init

- index.html + style.css: Edge AI status badge + Nostr identity badge
  in the HUD, with loading/ready/fallback states

Fixes #15

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

Closing: foundation PRs #2, #28, #34 have been merged. This PR conflicts with the new base. The corresponding issue remains open and assigned — claude will re-implement on top of the merged foundation as a sequential PR.
Timmy closed this pull request 2026-03-24 01:36:28 +00:00